space counterplan - Open Evidence Project

**notes
Twonily = Emily and Anthony
#gbnswag
**case
**federalism
1nc – federalism
Courts will always prevent snowball – healthcare proves
Nagel 1 (Robert F., Professor of Law – University of Colorado, Annals of the American Academy of
Political and Social Science, March, p. 53)//twonily
In what appears to be an ambitious campaign to enhance the role of the states in the federal system, the Supreme Court has
recently issued a series of rulings that limit the power of the national government. Some of these
decisions, which set boundaries to Congress's power to regulate commerce and to enforce the provisions of the Fourteenth
Amendment, establish areas that are subject (at least in theory) only to state regulation. Others protect the autonomy of state governments by
restricting congressional authority to expose state governments to suit in either state or federal courts and to "commandeer" state institutions
for national regulatory purposes. Taken
together, these decisions seem to reflect a judgment--held by a slight majority of
the justices--that the dramatic expansion of the national government during the twentieth century has put
in jeopardy fundamental principles of constitutional structure.
Federalism is resilient
Swaine 3 (Edward T., Assistant Professor in the Wharton School – University of Pennsylvania, “Does
Federalism Constrain the Treaty Power?”, Columbia Law Review, April, 103 Colum. L. Rev. 403,
Lexis)//twonily
Federalism is the vampire of U.S. foreign relations law: officially deceased or moribund at best, but in reality
surprisingly resilient and prone to recover
at unsettling intervals. Linked with a dark period in our constitutional prehistory, foreign
relations federalism was supposedly given a lasting burial by the Constitution's nationalization of foreign affairs authority; in foreign relations, the orthodox position
held, states1 simply ceased to exist.2 Nonetheless, rumors of their twilight existence persist. [*405] With lingering memories of previous scares,3 frightened law
professors have begun to huddle together in symposia to discuss a rash of recent sightings - especially in the form of state-conducted foreign relations, obstacles to
compliance with international agreements, and special exemptions in treaties and implementing statutes. 4
Squo solves – and no spillover
Young 03 (Ernest Young, Professor of Law at Texas, May 2003, Texas Law Review, Lexis)//twonily
One of the privileges of being a junior faculty member is that senior colleagues often feel obligated to read one's rough drafts. On many
occasions when I have written about federalism - from a stance considerably more sympathetic to the States than Judge Noonan's - my
colleagues have responded with the following comment: "Relax. The
States retain vast reserves of autonomy and
authority over any number of important areas. It will be a long time, if ever, before the national
government can expand its authority far enough to really endanger the federal balance. Don't make it sound like
you think the sky is falling."
No modeling
Moravcsik 05 (Andrew Moravcsik, “Dream On America”, Newsweek, January 31, 2005,
http://www.msnbc.msn.com/id/6857387/site/newsweek/)//twonily
Not long ago, the American dream was a global fantasy. Not only Americans saw themselves as a beacon unto nations. So did much of the rest
of the world. East Europeans tuned into Radio Free Europe. Chinese students erected a replica of the Statue of Liberty in Tiananmen Square.
You had only to listen to George W. Bush's Inaugural Address last week (invoking "freedom" and "liberty" 49 times) to appreciate just how
deeply Americans still believe in this founding myth. For many in the world, the president's rhetoric confirmed their worst fears of an imperial
America relentlessly pursuing its narrow national interests. But
the greater danger may be a delusional America—one
that believes, despite all evidence to the contrary, that the American Dream lives on, that America remains a
model for the world, one whose mission is to spread the word. The gulf between how Americans view themselves and how the world
views them was summed up in a poll last week by the BBC. Fully 71 percent of Americans see the United States as a source of good in the
world. More than half view Bush's election as positive for global security. Other studies report that 70 percent have faith in their domestic
institutions and nearly 80 percent believe "American ideas and customs" should spread globally. Foreigners take an entirely different view: 58
percent in the BBC poll see Bush's re-election as a threat to world peace. Among America's traditional allies, the figure is strikingly higher: 77
percent in Germany, 64 percent in Britain and 82 percent in Turkey. Among the 1.3 billion members of the Islamic world, public support for the
United States is measured in single digits. Only Poland, the Philippines and India viewed Bush's second Inaugural positively. Tellingly, the anti-
Bushism of the president's first term is giving way to a more general anti-Americanism. A plurality of voters (the average is 70 percent) in each
of the 21 countries surveyed by the BBC oppose sending any troops to Iraq, including those in most of the countries that have done so. Only
one third, disproportionately in the poorest and most dictatorial countries, would like to see American values spread in their country. Says
Doug Miller of GlobeScan, which conducted the BBC report: "President Bush has further isolated America from the world. Unless the
administration changes its approach, it will continue to erode America's good name, and hence its ability to effectively influence world affairs."
Former Brazilian president Jose Sarney expressed the sentiments of the 78 percent of his countrymen who see America as a threat: "Now that
Bush has been re-elected, all I can say is, God bless the rest of the world." The truth is that Americans are living in a dream world. Not only do
others not share America's self-regard, they no longer aspire to emulate the country's social and economic achievements.
The loss of faith in the American Dream goes beyond this swaggering administration and its war in Iraq. A President Kerry would have had to
confront a similar disaffection, for it grows from the success of something America holds dear: the spread of democracy, free markets and
international institutions—globalization, in a word. Countries
today have dozens of political, economic and social
models to choose from. Anti-Americanism is especially virulent in Europe and Latin America, where countries have
established their own distinctive ways—none made in America. Futurologist Jeremy Rifkin, in his recent book "The European Dream," hails an
emerging European Union based on generous social welfare, cultural diversity and respect for international law—a model that's caught on
quickly across the former nations of Eastern Europe and the Baltics. In Asia, the rise of autocratic capitalism in China or Singapore is as much a
"model" for development as America's scandal-ridden corporate culture. "First we emulate," one Chinese businessman recently told the board
of one U.S. multinational, "then we overtake." Many are tempted to write off the new anti-Americanism as a temporary perturbation, or mere
resentment. Blinded by its own myth, America has grown incapable of recognizing its flaws. For there is much about the American Dream to
fault. If the rest of the world has lost faith in the American model—political, economic, diplomatic—it's partly for the very good reason that it
doesn't work as well anymore. AMERICAN DEMOCRACY: Once upon a time, the U.S. Constitution was a revolutionary document, full of epochal
innovations—free elections, judicial review, checks and balances, federalism and, perhaps most important, a Bill of Rights. In the 19th and 20th
centuries, countries around the world copied the document, not least in Latin America. So did Germany and Japan after World War II. Today?
When nations write a new constitution, as dozens have in the past two decades, they seldom look to the American
model. When the Soviets withdrew from Central Europe, U.S. constitutional experts rushed in. They got
a polite hearing, and were sent home. Jiri Pehe, adviser to former president Vaclav Havel, recalls the Czechs' firm decision to
adopt a European-style parliamentary system with strict limits on campaigning. "For Europeans, money talks too much in
American democracy. It's very prone to certain kinds of corruption, or at least influence from powerful lobbies," he says. "Europeans
would not want to follow that route." They also sought to limit the dominance of television, unlike in American campaigns where, Pehe says,
"TV debates and photogenic looks govern election victories." So it is elsewhere. After American planes and bombs freed the country, Kosovo
opted for a European constitution. Drafting a post-apartheid constitution, South
Africa rejected American-style federalism in
favor of a German model, which leaders deemed appropriate for the social-welfare state they hoped to construct. Now fledgling
African democracies look to South Africa as their inspiration, says John Stremlau, a former U.S. State Department official who
currently heads the international relations department at the University of Witwatersrand in Johannesburg: "We can't rely on the Americans."
The new democracies are looking for a constitution written in modern times and reflecting their progressive concerns about racial and social
equality, he explains. "To borrow Lincoln's phrase, South Africa is now Africa's 'last great hope'." Much in American law and society troubles the
world these days. Nearly all countries reject the United States' right to bear arms as a quirky and dangerous anachronism. They abhor the death
penalty and demand broader privacy protections. Above all, once most foreign systems reach a reasonable level of affluence, they follow the
Europeans in treating the provision of adequate social welfare as a basic right. All this, says Bruce Ackerman at Yale University Law School,
contributes to the
growing sense that American law, once the world standard, has become "provincial." The
United States' refusal to apply the Geneva Conventions to certain terrorist suspects, to ratify global
human-rights treaties such as the innocuous Convention on the Rights of the Child or to endorse the International
Criminal Court (coupled with the abuses at Abu Ghraib and Guantanamo) only reinforces the conviction that America's
Constitution and legal system are out of step with the rest of the world.
The New York precedent doesn’t apply – NWPA doesn’t violate federalism
Yoo 14 – Prof. Law at UC Berkeley School of Law (John Yoo, 1/1/14, “Judicial Safeguards of Federalism
and the Environment: Yucca Mountain from a Constitutional Perspective,”
http://scholarship.law.berkeley.edu/cgi/viewcontent.cgi?article=2500&context=facpubs)//twonily
Currently, nuclear waste is stored in 131 facilities in thirty-nine states. Since the late 1970s, efforts have been underway to establish a single nuclear waste repository for the nation. Initially, Congress directed the Secretary of Energy to recommend to the President three potential sites to serve as a single nuclear waste repository.75 In 1987, Congress directed the Department of Energy to examine only Yucca Mountain, Nevada.76 In 2002, some fifteen years after that directive, the Secretary of Energy recommended to the President that Yucca Mountain be designated as the site of that single national repository for nuclear waste.77 President George W. Bush approved that recommendation and submitted it to Congress. The Governor of the State of Nevada submitted a notice of disapproval.78 Pursuant to 42 U.S.C. § 10135, Congress approved the siting of a nuclear repository at Yucca Mountain.79 Once Yucca Mountain receives a license from the Nuclear Regulatory Commission, the project will go forward.80
The protection of the environment is generally thought to provide a public good. The public undoubtedly benefits from an environment that does not pose a health risk of one sort or another. The decision to create a single nuclear waste repository at Yucca Mountain would provide a public good by establishing a single waste repository for the nation's nuclear waste rather than having it dispersed among thirty-nine different states. It centralizes that waste, offers a location away from densely populated areas, and provides distinct security advantages. The benefits from selecting a single site will accrue to a large number of people. This environmental policy choice, however, differs in key respects from common environmental regulatory schemes. The more typical environmental regulation ostensibly benefits a large number of people. No one person benefits any more than any other. For example, it can generally be said that we all derive roughly the same benefit from clean air or water. Though there may be negative externalities associated with environmental regulations, they are likely to be borne diffusely. For example, if the negative externality of an environmental regulatory scheme is to increase the costs of the operation of certain businesses, those businesses may be able to transfer some or all of those costs to the consumers of their products. This ability to transfer costs not only limits the impact on those businesses but further distributes the negative externality. Moreover, even in circumstances in which the impact of the negative externalities may be more concentrated, individuals bearing them are not excluded from the benefit. For example, if a business owner faces increased costs owing to an air pollution regulatory scheme, he nonetheless would benefit from the clean air that the regulation might provide. Additionally, because everyone stands to benefit equally, no one person has an incentive to pursue such protections. The enactment of such protections frequently presents a collective action problem.
In contrast to run-of-the-mill environmental regulation, the kind of diffuse benefits discussed above can only be achieved by consolidating the waste into a single location. As noted above, there are currently 131 such sites in thirty-nine states. The 161 million people located in and around those 131 sites all are exposed to the various risks that are associated with a nuclear waste repository.81 Moreover, additional national security negative externalities are created by the storage of nuclear waste at so many different sites. These sites present potential targets of opportunity for terrorist attacks. Such attacks could cause dire results. Selecting a single site for disposing of nuclear waste is designed to limit the number of persons that are exposed to the potential negative externalities associated with the long-term storage of nuclear waste. The question then becomes where that site should be located. Storing all the nation's nuclear waste at a single site also presents a classic not-in-my-backyard dilemma played out on the national level. Virtually everyone recognizes the benefits to be derived from having a single site for disposing of nuclear waste. Unlike the typical environmental regulation circumstance described above in which those suffering some of the negative externalities will nonetheless stand to receive the same benefits from the regulatory scheme, those limited number of individuals still bearing the potential negative externalities of having a nuclear waste repository site located near them will not receive the benefits that the rest of the nation will receive by virtue of storing nuclear waste in a single site. To be sure, to the extent that the risk of attack itself is reduced, it could be argued that even those living in close proximity to a repository benefit from that risk reduction. Nonetheless, those persons residing near the site bear the remaining risk alone. It is hard to imagine any state or locality that would welcome a nuclear waste dump. It is, therefore, hardly a shock that the State of Nevada and various local entities in Nevada have strenuously objected to placing that repository at Yucca Mountain. And it is clear that the circumstances of Yucca Mountain pit the interest of the State of Nevada against the federal government. Should judicial review be available for claims arising out of this conflict between Nevada and the federal government?
The political safeguards theory would, of course, tell us that any conflict arising out of Yucca Mountain should not be subject to judicial review. In support of that, the political safeguards would point to all of the formal mechanisms enacted by Congress to take account of the State of Nevada's views and interests. Specifically, the Secretary of Energy was required to "hold public hearings in the vicinity of the Yucca Mountain site, for the purpose of informing the residents of the area of such consideration and receiving their comments regarding the possible recommendation of the site."82 If the Secretary decided to recommend Yucca Mountain as the nuclear waste site to the President, the statute required the Secretary to notify the Governor and the legislature of the State of Nevada thirty days in advance of his transmittal of his recommendation to the President.83 Moreover, that recommendation must include "the views and comments of the Governor and legislature of any State."84 Only after receiving this recommendation can the President submit his approval of the site to Congress.85 Additionally, once the President submitted his approval to Congress, the statute permitted the Governor or the legislature of the state in which the site is located to submit "a notice of disapproval."86 Though the outcome proved undesirable to Nevada, the political safeguards theory would say this situation demonstrates that states are very much represented in the national legislature. After all, Congress enacted legislation codifying mechanisms for ensuring that Nevada's views would be considered. Thus, Yucca Mountain is a situation that proponents of this version of judicial review like Wechsler, Choper, and Kramer would argue demonstrates that political safeguards adequately protect federalism. For those of us who are proponents of the judicial review of federalism, the various formal mechanisms adopted by Congress to take into account the views of the State of Nevada matter little.
The question for our purposes is whether the decision to locate a single nuclear waste repository for the nation at Yucca Mountain violates the constitutional principles of federalism when analyzed through either of these models. We employ the two models of federalism discussed above, the political autonomy model and the dual sovereignty model, to analyze whether this decision violates the Constitution. We conclude that the federal government's decision to site a nuclear waste repository at Yucca Mountain does not violate the Constitution. As discussed above, the political autonomy model of federalism views states as autonomous, independent entities. They have fundamental attributes that exist without first having to examine the powers that the Constitution expressly delegates to the federal government. Under this model, Congress cannot legislate in such a way as to impress state officials into the service of the federal government or command a state to regulate in any given way. It is difficult to contend that locating the nuclear waste repository on the federal land at Yucca Mountain undermines Nevada's political autonomy. In no way has Nevada's ability to govern itself been obscured. A brief comparison with New York and Printz swiftly demonstrates that Nevada's ability has not been so obscured. In New York, Congress had usurped a state's legislative process through the take-title provision. That provision commanded the states to regulate in a manner determined by the federal government.87 In Printz, the federal government directed state officials to carry out a federal regulatory program. As the Court carefully explained in New York, Congress cannot regulate states qua states. Does the location of a nuclear repository for the entire nation at Yucca Mountain compel the state government of Nevada in any way? No. The federal government has not attempted to compel the state legislature to enact legislation implementing a federal regulatory scheme. The federal government has not treated the State of Nevada as a mere field office or administrative agency. Furthermore, the federal government's approval of the placement of the repository at Yucca Mountain does not require any action by state officials. The federal government has not attempted to require state officials to be responsible for managing the repository, providing its security, or administering regulations for the repository for any period of time.88 It has not delegated to state officials any ministerial task regarding the Yucca Mountain site. Congress has also provided for various forms of financial assistance to the state and local governments affected by the siting of the nuclear waste facility at Yucca Mountain.89 Thus, it has not asked the State of Nevada to absorb the financial cost of Yucca Mountain, but has instead minimized the financial costs to the state.90 Indeed, in its latest challenge to the siting of the nuclear waste repository at Yucca Mountain, Nevada has not even contended that the federal government has attempted to directly regulate the state or its officers.91 Instead, the federal government bears the entirety of the responsibility for the construction, management, maintenance, and regulation of the Yucca Mountain site. Because responsibility for the project rests squarely with the federal government, there is little risk that the political accountability of either the federal government or the State of Nevada would be undermined. Simply put, the actions taken by the federal government do not interfere in Nevada's relationship with its citizens. Those actions are consistent with the fundamental choice made by the Framers that the federal government would have the power to directly regulate the people, not the states. Consequently, when we view Yucca Mountain through the lens of the political autonomy model of federalism, the State of Nevada appears to have no constitutional claim.
A review of Yucca Mountain through the dual sovereignty model of federalism likewise demonstrates that there has not been a violation of the Constitution. As discussed above, the dual sovereignty model ... repository at Yucca Mountain would benefit the nation's security.98 By contrast, the Constitution expressly precludes the states from entering into treaties and from "keep[ing] troops, or Ships of War in time of Peace" without the consent of Congress.99 The Constitution further forbids the states from "engag[ing] in War" without congressional consent "unless actually invaded, or in such imminent Danger as will not admit of delay." Because of the authority the Constitution commits to the federal government for the nation's security, it is difficult to argue that where the federal government determines that a single nuclear waste dump, rather than numerous nuclear waste dumps, is in the nation's national security interests, and that locating the dump on federal land in Nevada serves that interest, the federal government lacks the power to act.
Under the Commerce Clause, the Constitution also provides Congress with the power "[t]o regulate Commerce ... among the several States."101 The Court has held that Congress's Commerce Clause powers extend to those activities substantially affecting interstate commerce. The waste to be stored at Yucca Mountain is waste from the generation of electricity through nuclear power. Electricity's generation and the waste it creates no doubt has substantial effects on interstate commerce. It is impossible to imagine day-to-day operations in today's economy that occur without the extensive use of electricity. Indeed, the Supreme Court has noted that "it is difficult to conceive of a more basic element of interstate commerce than electric energy, a product used in virtually every home, and every commercial or manufacturing facility."103 The use of nuclear power to generate electricity entails obvious externalities that require key policy decisions, such as where to store the waste that such usage creates.104 Even in New York v. United States, the petitioners did not dispute Congress's power to legislate regarding nuclear waste disposal. Instead, they contended solely that Congress could not do so through the means it had selected.
Democracy doesn’t solve war
Rosato 3 — international relations, international security, and qualitative methods expert, B.A. Cambridge
University, Oxford University, M.A. and Ph.D., University of Chicago (Sebastian, “The flawed logic of democratic
peace theory”, American Political Science Review, Vol. 97, No. 4, November 2003,
http://weber.ucsd.edu/~tkousser/Rosato%202003.pdf)
The causal logics that underpin democratic peace theory cannot explain why democracies remain at
peace with one another because the mechanisms that make up these logics do not operate as stipulated by the theory’s proponents. In the
case of the normative logic, liberal democracies do not reliably externalize their domestic norms of conflict
resolution and do not treat one another with trust and respect when their interests clash. Similarly, in the
case of the institutional logic, democratic leaders are not especially accountable to peace-loving publics or pacific
interest groups, democracies are not particularly slow to mobilize or incapable of surprise attack, and open political
competition offers no guarantee that a democracy will reveal private information about its level of resolve. In view of these findings there
are good reasons to doubt that joint democracy causes peace. Democratic peace theorists could counter this claim by
pointing out that even in the absence of a good explanation for the democratic peace, the fact remains that democracies have rarely fought
one another. In addition to casting doubt on existing explanations for the democratic peace, then, a comprehensive critique should also offer a
positive account of the finding. One potential explanation is that the democratic peace is in fact an imperial peace based on American power.
This claim rests on two observations. First, the
democratic peace is essentially a post-World War II phenomenon
restricted to the Americas and Western Europe. Second, the United States has been the dominant power in both these regions
since World War II and has placed an overriding emphasis on regional peace. There are three reasons we should expect democratic peace
theory’s empirical claims to hold only in the post-1945 period. First, as even proponents of the democratic peace have admitted, there were few
democracies
Democracy fails
Rothstein and Teorell 8 (Bo, August Röhss Chair in Political Science at University of Gothenburg, *AND
Jan Teorell, PhD in government, associate professor of political science at Lund University, research
fellow at the Quality of Government Institute, Gothenburg University, April 2008, “What Is Quality of
Government? A Theory of Impartial Government Institutions,” Governance: An International Journal of
Policy, Administration, and Institutions, Vol. 21, No. 2,
http://www.sahlgrenska.gu.se/digitalAssets/1358/1358049_what-is-quality-of-government.pdf)
Empirically, there is no straightforward relationship between democracy in the access to public power and
impartiality in the exercise of public power. On the contrary, democracy seems to be curvilinearly related to the level
of corruption (Montinola and Jackman 2002; Sung 2004). Empirical research indicates that some democratization
may at times be worse for impartiality than none. For example, some of the worst cases of corruption have appeared in
newly democratized countries, such as Peru under its former president Fujimori (McMillan and Zoido 2004). Conversely, some
undemocratic countries have shown impressive results in curbing corruption and establishing fairly
impartial bureaucracies, prime examples being Hong Kong and Singapore (Root 1996). Moreover, the track record of
democracy in terms of producing valued social outcomes is surprisingly uneven. The inherently ambiguous
results in the empirical research on whether democracy matters for growth is perhaps the most prominent example (see, e.g., Kurzman,
Werum, and Burkhart 2002; Przeworski and Limongi 1993; Sirowy and Inkeles 1990). True, democracy usually comes out as a strong predictor
of human rights (Davenport and Armstrong 2004; Poe, Tate, and Keith 1999). But democracy should arguably be defined at least partly in terms
of human rights such as personal integrity (Hadenius 1992; Hadenius and Teorell 2005), so this finding is not all that surprising. A case in point is
the relationship between democracy and the probability of civil war. Empirical research shows an inverted U-curve, with strong autocracies and
full democracies being least likely to engage in civil violence (Hegre et al. 2001). Curvilinearity is of course not tantamount to a null effect, but
this does indicate that
some democracy may at times be worse than none
(although a lot of democracy is better than
some). A related argument is the “democratic peace” theory, where the strong empirical regularity pertains to the dyadic level, that is, between
pairs of states, both of which are democracies (Oneal and Russett 1999). Monadically speaking, however,
democracies are not
significantly less aggressive than autocracies, whereas the incidence of incomplete democratization even
seems to make a country more likely to go to war (Mansfield and Snyder 2005). Finally, some recent work seriously
questions the presumed positive effects of democracy on human development, arguing that this is either extremely slow and evolving over
decades (Gerring, Thacker, and Alfaro 2005) or, even worse, vanishes completely once missing data bias have been corrected for (Ross 2006).
Simply put, knowing the extent to which a country is democratic or not cannot help in explaining the multitude of highly valued economic and
social consequences of QoG documented in the literature.
1nc – federalism bad
The entire aff is backwards – the plan decimates coherent nuclear waste disposal –
turns the case
Dreilinger 10 – J.D. @ Northwestern (Samantha Dreilinger, Spring 2010, “Fall-Out: New York v. United
States and the Low-Level Radioactive Waste Problem,”
http://scholarlycommons.law.northwestern.edu/cgi/viewcontent.cgi?article=1051&context=njlsp)//twonily
For over thirty years, the United States has failed to solve its low-level radioactive waste problem. Working
independently, state governments have not developed a single new disposal site for low-level radioactive
waste. Congress’ best efforts to address the growing quantities of waste included passing a law that, in part,
required states to accept responsibility for all low-level radioactive waste. Unsurprisingly, most states opposed this
provision
and were relieved
when the Supreme Court deemed it unconstitutional in New York v. United States. As the
states and the federal government remain at an impasse, low-level radioactive waste accumulates in
countless makeshift storage facilities across the country. The road to New York began in 1985 when Congress passed the Low-Level Radioactive
Waste Policy Amendments Act (Amendments Act). The Amendments Act put forward a series of incentives and penalties that Congress believed would persuade
states to develop disposal sites. The most severe and controversial penalty was the “take-title” provision, under which a state that failed to develop or secure access
to a disposal site by January 1, 1996, would be forced to accept legal ownership and liability for all of its low-level radioactive waste.8 ¶3 In the early 1990s, it was
clear that the states and the federal government were heading toward a confrontation. The Amendment Act’s deadline was rapidly approaching, yet community
opposition and legal challenges had prevented states from beginning the lengthy process required to develop disposal sites.9 Faced with accepting legal
responsibility for thousands of feet of privately-generated low-level radioactive waste, one state challenged the constitutionality of the Amendments Act: New
York.10 ¶4 New York’s challenge culminated in New York v. United States. 11 In New York, the Supreme Court struck down the take-title provision of the
Amendments Act.12 According to Justice O’Connor, the take-title provision was inconsistent with the Constitution’s division of authority between federal and state governments because it would “‘commandeer’ state governments into the service of federal regulatory purposes.”13 ¶5 New York is widely considered to be one of the Rehnquist Court’s most important decisions, as well as one of Justice
O’Connor’s most influential opinions.14 The New York decision is also credited with setting the limits on the affirmative powers granted by Commerce Clause,15
reinvigorating the Tenth Amendment,16 and signaling the resurgence of dual federalism.17 New York is a seminal decision, cited in cases dealing with issues as far-reaching as products liability,18 assisted suicide,19 gun control,20 and gambling on Native American Reservations.21 Unfortunately, the
New York
decision did not solve the United States’ low-level radioactive waste problem. Not a single state has developed a new
low-level radioactive waste disposal site,22 leading some to view the opinion as “‘good federalism but bad public policy. ’”23 ¶6 In
fact, in the almost twenty years since the New York decision, the United States’ low-level radioactive waste problem has
become increasingly dire. Between 1999 and 2003, the production of low-level radioactive waste increased by 200%.24 Yet, disposal capabilities
remained static. Without progress in the past two decades and with no solution on the horizon, the United States’ low-level radioactive waste problem has been silently, but steadily, coming to a head.25 ¶7 This Comment will reintroduce the problem of low-level radioactive waste
and emphasize the pressing need for a comprehensive solution. It will trace the history of low-level radioactive waste in the United States and the problems that led
to the Amendments Act. This Comment will then discuss the New
York decision and its impact on the current low-level
radioactive waste disposal process. It will show that without an immediate, workable solution, the low-level radioactive waste problem will
become progressively worse. The Comment concludes by offering a solution to the United States’ low-level radioactive waste problem through a federal-state
power sharing arrangement derived, ironically, from New York, the very case that exacerbated the dimensions of low-level radioactive waste disposal.
1nc – at: iraq federalism
Iraq doesn’t model the US –
a) Europe precedes
Rubini 03 (Daniel L. Rubini is the Senior Advisor to the Ministry of Justice in Iraq, “Ask the White House,”
11-03, http://www.whitehouse.gov/ask/20031113.html)//twonily
The first action taken by the Coalition was to roll away 35 years of thuggery and eliminate rule by decree. We did not “Americanize”
the system, but have sought to reintroduce universal concepts of fundamental fairness and due process. Rule of law can
and is being accomplished, and the Iraqi system resembles more the European civil code than the
American system.
b) Public opposition
Zeidel 08 (Ronen Zeidel, fellow of the Iraq Research Team at the Truman Institute and the Center of Iraq
Studies at the University of Haifa, “Iraq’s future: The War and Beyond” Right Side News, June 13, 2008,
http://www.rightsidenews.com/200806131177/global-terrorism/iraq-s-future-the-war-and-beyond.html)//twonily
Ronen Zeidel: I wanted to say it took me a great effort to say what I said about sectarianism in Iraq, because personally, as an Iraqi citizen, I
would be in favor of Iraqi national identity all out, without having this sectarian layer in between. I guess many Iraqis would agree. It's just that
reality does not always go our own way. I think Iraqi national identity is in the process of being renegotiated after April 2003, and the new
version, once it's out, would certainly have to find more space for the sectarian layer that exists within every Iraqi citizen--sectarian and ethnic
layer to include the Kurds here. We cannot be back into blurring sectarianism altogether, forbidding it. Millions of people go to Karbala every
year for Ashura; you cannot forbid these parades and marches altogether just because you have to go back to the old version--not a good one--of Iraqi national identity. Now I must go back to the long-term and say that if we do encourage this deconstruction of all common denominators,
like deconstruction of the Sunni and sectarian identity, Iraq will end up like Somalia. There
is already a very weak central
government with lots of tribes running or ruling the countryside, each with conflicting interests and
nothing understandable--true chaos. Whether it is good in the short-term, I don't know, but in the long-term it could be really
destructive, and many Iraqis fear that. Iraqis are strongly suspicious of federalism; most of them are in favor
of a strong central government and centralization, along the lines of what the Iraqi state looked like for 83 years.
**land
1nc – land
Squo solves environmental damage
WNA 13 – World Nuclear Association (August 2013, “Safe Management of Nuclear Waste and Used
Nuclear Fuel,” http://www.world-nuclear.org/WNA/Publications/WNA-Position-Statements/Safe-Management-of-Nuclear-Waste-and-Used-Nuclear-Fuel/)//twonily
The absence of any significant environmental impact from the nuclear industry demonstrates, in practice
and on the record, the
continued success of robust and well-proven nuclear power technologies, the reality
of competent safety and environmental oversight
by national and international authorities, and
the responsible
behaviour of well-established nuclear operators. Because the nuclear industry's strong performance in safety
yields nuclear "incidents" only rarely, the media often give greater attention to even a minor nuclear
incident causing little or no harm than to the frequent and seriously harmful accidents involving fossil fuel
production or use. For example, coalmining
accidents kill thousands of people each year. Indeed, the death rate from
worldwide coalmining exceeds, in just two days, the fewer than 50 persons who died from direct radiation exposure or fallout-induced thyroid
cancer as a consequence of the world's major nuclear accident at Chernobyl, Ukraine in 1986. (Reference: UNSCEAR, the UN Scientific
Committee on the Effects of Atomic Radiation.) Oil
and gas-related accidents kill many more, while large oil spills have
had a devastating environmental effect on sea coastlines and marine ecology. Arguably even more significant than
these specific fossil fuel-related accidents is the enormous worldwide discharge of pollutants into the
atmosphere from fossil fuel combustion - a stream of emissions that continues to degrade human health and the global environment.
Studies prove there’s no impact
WNA 13 – World Nuclear Association (August 2013, “Safe Management of Nuclear Waste and Used
Nuclear Fuel,” http://www.world-nuclear.org/WNA/Publications/WNA-Position-Statements/Safe-Management-of-Nuclear-Waste-and-Used-Nuclear-Fuel/)//twonily
Nuclear Waste: A Surprisingly Small Burden
Nuclear power produces huge quantities of energy from very small quantities of nuclear fuel.
In an industrial country, a typical city of one million people consumes the amount of electricity generated by a single 1,000 MWe (megawatt-electric) nuclear power
reactor. How much waste results? The annual operation of this reactor would typically create about 100 cubic meters of LLW (including some ILW that quickly
decays to LLW). This waste consists of (a) contaminated materials such as resins, filters, rags, metals, clothes and mud; and (b) material such as mortar that is added
to stabilise these wastes. Generating the city's electricity for one year would use about 20 tonnes of uranium-based fuel, leaving an equivalent amount of UNF. The
volume of highly radioactive material to be disposed of depends on how the UNF is treated: If UNF is conditioned for direct disposal, the 20 tonnes of UNF converts
to a volume of about 40 cubic metres. If UNF is reprocessed and conditioned for disposal, the resulting waste converts to volumes of 3 cubic metres of HLW and 4
cubic metres of ILW - a reduction of over 75%. The HLW consists mainly of fission products and the material added (usually glass or concrete) to stabilise the waste.
The ILW consists of parts of the UNF fuel cladding, which have been compacted. The LLW Burden: The 100 cubic metres of LLW can be disposed of in a surface or
shallow repository for this kind of material. Over 10 years, the LLW burden for a city of one million people would be 1,000 cubic metres. This volume would fit on a
football field with a depth of 15 centimetres - about ankle deep. The UNF-HLW and ILW Burden: The material requiring disposal in a deep geological repository is
the 40 cubic metres of UNF or, if there is reprocessing, the 7 cubic metres of HLW and ILW. Over 10 years, the accumulated volume for a city of one million people
would be either 400 or 70 cubic metres. If placed in a four-metre-high gallery in a deep geological repository, these 10-year volumes would require either 100
square metres of floor space (the size of a small home) or 17.5 square metres of floor space (the size of a bedroom). From these figures, it can be seen that
the
use of nuclear power to electrify a large city produces, even over the course of a decade, a remarkably small burden
of nuclear wastes of all kinds. To generate the same electricity using fossil fuel would require 160,000
times as much coal
- over 3 million tonnes annually. The associated
volume of waste from fossil fuel is correspondingly
large. Some of this waste (notably, coal ash) is often simply buried in landfill sites, even though it is toxic and slightly
radioactive. But much is not, and carbon dioxide and other pollutants are released freely into the air. In contrast, the nuclear
industry's waste is carefully confined, and the very minor radioactive discharges are rigorously limited to
ensure there is no adverse human health or environmental impact. This same city of 1 million people also generates a mass of industrial waste
each year that is typically 2,500 times greater than the nuclear waste created while electrifying the entire city. Of this industrial waste, the mass of the toxic waste
alone is some 100 times greater than the city's nuclear waste. Much of this toxic waste lasts indefinitely, never "decaying" as does nuclear waste. Thus, for reasons of volume and permanency, industrial wastes constitute a far greater challenge for safe management than nuclear waste, requiring both more space and also isolation for an indefinite time scale. In contrast, after a few decades, most nuclear waste contains only very low levels of radioactivity, and the small volume of waste with a long decay life is suitable for safe disposal in a geological repository. While the burden of nuclear waste is in any case remarkably small, reprocessing UNF offers a means
to reduce still further - by over 75% - the overall volume of material requiring disposal in a deep geological repository. Reprocessing yields HLW and ILW in a safe
and durable form of confinement, and meanwhile recovers the UNF's remaining energy value for potential recycling and reuse as nuclear fuel. A single nuclear
treatment plant that reprocesses 1,000 metric tonnes of UNF annually can, over a four-year period, recover the energy equivalent of 80-100 million tonnes of
petroleum - about the annual oil production of Kuwait. UNF contains valuable fissile material in two forms: plutonium (created during the fission of the original
uranium fuel) and still-unfissioned uranium. Recycling these fissile materials can improve the energy yield of the original uranium fuel and can also lead to creating
even more fuel. A plutonium-based Fast Breeder Reactor (FBR), while producing energy, can transform non-fissile uranium into fissile material, thus greatly
extending future world supplies of nuclear fuel. Widespread use of Fast Breeder Reactors is not on the near horizon. But anticipated advances in FBR technology
and in the use of nuclear power have begun to raise appreciation among planners that the nuclear waste of today could become an important energy source of
tomorrow. Meanwhile, in today's context, the
question of direct disposal of UNF versus reprocessing is not a matter of
safe waste management. Both methods are proven and highly sound.
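For anyone checking the WNA card's internal math, the quoted volume figures are self-consistent. A quick back-of-envelope script (a verification aid, not part of the evidence; every input is a number quoted in the excerpt above):

```python
# Recompute the waste-volume comparison for a 1,000 MWe reactor serving
# a city of one million people, using only the figures quoted in the card.
llw_per_year_m3 = 100                      # LLW (incl. short-lived ILW) per year
llw_decade_m3 = llw_per_year_m3 * 10       # 1,000 m3 of LLW over ten years

unf_direct_m3 = 40                         # 20 t of UNF conditioned for direct disposal
reprocessed_m3 = 3 + 4                     # 3 m3 HLW + 4 m3 ILW after reprocessing
reduction = 1 - reprocessed_m3 / unf_direct_m3   # 0.825, i.e. "over 75%"

gallery_height_m = 4                       # deep-repository gallery height
floor_direct_m2 = unf_direct_m3 * 10 / gallery_height_m        # 100 m2 per decade
floor_reprocessed_m2 = reprocessed_m3 * 10 / gallery_height_m  # 17.5 m2 per decade

print(llw_decade_m3, round(reduction, 3), floor_direct_m2, floor_reprocessed_m2)
```

The "over 75%" reduction claim works out to 82.5%, and the decade-long footprints (100 m2 versus 17.5 m2) match the "small home" versus "bedroom" comparison in the card.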
No extinction – empirics – reactors leak literally all the time
Nichols 13 – columnist @ Veterans Today (Bob Nichols, 4/6/13, “All Nuclear Reactors Leak All of the
Time,” http://www.veteranstoday.com/2013/04/06/all-reactors-leak-all-the-time/)//twonily
(San Francisco) Reportedly Americans widely believe in God and lead the world in the percentage of citizens in prison and on parole. That is
actual reality from an imaginary character in a TV show. The Gallup Poll also says it is true and has been for years. Most
Americans
believe that nuke reactors are safe and quite sound, too. Wonder why they do that? Most people at one time in
their lives watched as steam escapes from a pressure cooker and accept it as real and true. A reactor is very
much the same thing. The “cooks,” called “Operators,” even take the lid off from time to time too. A nuclear reactor is just
an expensive, overly complicated way to heat water to make steam. Of course all reactors leak! All nuclear
reactors also actually manufacture more than 1,946 dangerous and known radioactive metals, gases and aerosols. Many isotopes, such
as radioactive hydrogen, simply cannot be contained. So, they barely even try. It is mostly just a show for the rubes.[1]
Even explosions don’t cause leaks – empirics
Bellona News 11 – (9/12/11, “Breaking: Explosion rocks French nuclear facility; no radiation leaks
apparent,” http://bellona.org/news/nuclear-issues/accidents-and-incidents/2011-09-breaking-explosion-rocks-french-nuclear-facility-no-radiation-leaks-apparent)//twonily
There is no immediate evidence of a radioactive leak after a blast at the southern French nuclear
facility of Marcoule near Nimes which killed one person and injured four others, one seriously, French media have reported and
safety officials have confirmed. There was no risk of a radioactive leak after the blast, caused by a fire near a furnace in the
Centraco radioactive waste storage site, said officials according to various media reports. The plant’s owner, national electricity provider EDF, said it had been
“an industrial accident, not a nuclear accident.” “For the time being nothing has made it outside,” said one spokesman
for France’s Atomic Energy Commission who spoke anonymously to the BBC. The Centraco treatment centre, which has been operational since February of 1999,
belongs to a subsidiary of EDF. It produces MOX fuel, which recycles plutonium from nuclear weapons. “[Marcoule] is French version of Sellafield. It is difficult to
evaluate right now how serious the situation is based on the information we have at the moment. But it can develop further,” said Bellona nuclear physicist Nils
Bøhmer. The local Midi Libre newspaper, on its web site, said an oven exploded at the plant, killing one person and seriously injuring another. No
radiation
leak was reported, the report said, adding that no quarantine or evacuation orders were issued for neighboring
towns. A security perimeter has been set up because of the risk of leakage. The explosion hit the site at 11:45 local time. The EDF spokesman said the
furnace affected had been burning contaminated waste, including fuels, tools and clothing, which had been used in nuclear
energy production. “The fire caused by the explosion was under control,” he told the BBC. The International Atomic Energy Agency (IAEA) said it was in touch with
the French authorities to learn more about the nature of the explosion. IAEA Director General Yukiya Amano said the organisation’s incident centre had been
“immediately activated,” Reuters reports. A statement issued by the Nuclear Safety Authority also said there have been no radiation leaks outside of the plant. Staff
at the plant reacted to the accident according to planned procedures, it said. France’s Nuclear Safety Authority, however, is not noted for its transparency.
Operational since 1956, the Marcoule plant is a major site involved with the decommissioning of nuclear facilities, and operates a pressurised water reactor used to
produce tritium. The site has also been used since 1995 by French nuclear giant Areva to produce MOX fuel at the site’s MELOX factory, which recycles plutonium
from nuclear weapons. Part of the process involves firing superheated plutonium and uranium pellets in an oven. The Marcoule plant is located in the Gard
department in Languedoc-Roussillon region, near France’s Mediterranean coast. Marcoule: Sellafield’s French brother Its first major role upon opening was
weapons production as France sought a place among nuclear nations. Its reactors
generated the first plutonium for France’s first
nuclear weapons test in 1960. Its reactor producing tritium as fuel for hydrogen as well as other weapons related reactors sprang up as the arms race
gained international traction. The site also houses an experimental Phenix fast-breeder reactor which since 1995 has combined fissile uranium and plutonium into
mixed oxide or MOX fuel that can be used in civilian nuclear power stations.
Coal plants disprove the impact – they emit way more radiation than a global meltdown
Worstall 13 – Forbes Contributor focusing on business and technology (Tim Worstall, 8/10/13, “The
Fukushima Radiation Leak Is Equal to 76 Million Bananas,”
http://www.forbes.com/sites/timworstall/2013/08/10/the-fukushima-radiation-leak-is-equal-to-76-million-bananas/)//twonily
Not that Greenpeace is ever going to say anything other than that nuclear power is the work of the very devil of course. And the headlines do indeed
seem alarming: Radioactive Fukushima groundwater rises above barrier – Up to 40 trillion becquerels released into Pacific ocean so far – Storage for
radioactive water running out. Or: Tepco admitted on Friday that a cumulative 20 trillion to 40 trillion becquerels of radioactive
tritium may have leaked into the sea since the disaster. Most of us haven’t a clue what that means of course. We don’t instinctively understand what
a becquerel is in the same way that we do pound, pint or gallons, and certainly trillions of anything sounds hideous. But don’t forget that trillions of picogrammes of
dihydrogen monoxide is also the major ingredient in a glass of beer. So what we
really want to know is whether 20 trillion
becquerels of radiation is actually an important number. To which the answer is no, it isn’t. This is actually
around and about (perhaps a little over) the amount of radiation the plant was allowed to dump into the environment
before the disaster. Now there are indeed those who insist that any amount of radiation kills us all stone dead while we
sleep in our beds but I’m afraid that this is incorrect. We’re all exposed to radiation all the time and we all
seem to survive long enough to be killed by something else so radiation isn’t as dangerous as all that. At which point we can offer
a comparison. Something to try and give us a sense of perspective about whether 20 trillion nasties of radiation is something to get all concerned about or not. That
comparison being that the radiation leakage from Fukushima appears to be about the same as that from 76 million bananas. Which is a lot of bananas I agree, but
again we can put that into some sort of perspective. Let’s start from the beginning with the banana equivalent dose, the BED. Bananas contain potassium, some
portion of potassium is always radioactive, thus bananas contain some radioactivity. This gets into the human body as we digest the lovely fruit (OK, bananas are an
herb but still…): Since a typical banana contains about half a gram of potassium, it will have an activity of roughly 15 Bq. Excellent, we now have a unit that we can
grasp, one that the human mind can use to give a sense of proportion to these claims about radioactivity. We know that bananas are good for us on balance, thus
this amount of radioactivity isn’t all that much of a burden on us. We also have that claim of 20 trillion becquerels of radiation
having been dumped into the Pacific Ocean in the past couple of years. 20 trillion divided by two years by 365 days by 24 hours gives us an hourly rate of
1,141,552,511 becquerels per hour. Divide that by our 15 Bq per banana and we can see that the radiation spillage from Fukushima is running at 76 million bananas
per hour. Which is, as I say above, a lot of bananas. But it’s not actually that many bananas. World production of them is some 145 million tonnes a year. There’s a
thousand kilos in a tonne, say a banana is 100 grammes (sounds about right, four bananas to the pound, ten to the kilo) or 1.45 trillion bananas a year eaten around
the world. Divide again by 365 and 24 to get the hourly consumption rate and we get 165 million bananas consumed per hour. We can do this slightly differently
and say that the 1.45 trillion bananas consumed each year have those 15 Bq giving us around 22 trillion Bq each year. The Fukushima leak is 20 trillion Bq over two
years: thus our two calculations agree. The current
leak is just under half that exposure that we all get from the global
consumption of bananas. Except even that’s overstating it. For the banana consumption does indeed get into our bodies: the Fukushima
leak is getting into the Pacific Ocean where it’s obviously far less dangerous. And don’t forget that all that radiation in the bananas ends up in the oceans as well,
given that we do in fact urinate it out and no, it’s not something that the sewage treatment plants particularly keep out of the rivers. There are some who are
viewing this radiation leak very differently: Arnold Gundersen, Fairewinds Associates: [...] we are contaminating the Pacific Ocean which is extraordinarily serious.
Evgeny Sukhoi: Is there anything that can be done with that, I mean with the ocean? Gundersen: Frankly, I don’t believe so. I think we will continue to release
radioactive material into the ocean for 20 or 30 years at least. They have to pump the water out of the areas surrounding the nuclear reactor. But frankly, this water
is the most radioactive water I’ve ever experienced. I have to admit that I simply don’t agree. I’m not actually arguing that radiation is good for us but I really don’t
think that half the radiation of the world’s banana crop being diluted into the Pacific Ocean is all that much to worry about. And why we really shouldn’t worry about it all that much. The radiation that fossil fuel plants spew into the environment each year is around 0.1 EBq. That’s
ExaBecquerel, or 10 to the power of 18. Fukushima
is pumping out 10 trillion becquerels a year at present. Or 10 TBq, or 10 of 10 to the power of 12.
Or, if you prefer, one ten thousandth of the amount that the world’s coal plants are doing. Or even, given that
there are only about 2,500 coal plants in the world, Fukushima is, in this disaster, pumping out around one
quarter of the radiation that a coal plant does in normal operation. You can worry about it if you want but it’s not
something that’s likely to have any real measurable effect on anyone or anything.
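The Worstall card's banana-equivalent arithmetic can be reproduced with a short script (a verification sketch only; all inputs are the article's own figures, not new data):

```python
# Re-derive Worstall's banana-equivalent comparison from the figures he cites.
leak_bq_total = 20e12              # lower-bound tritium leak, becquerels, over ~2 years
hours = 2 * 365 * 24               # two years, in hours
bq_per_banana = 15                 # activity of one banana (~0.5 g potassium)

leak_bq_per_hour = leak_bq_total / hours              # ~1.14e9 Bq per hour
bananas_per_hour = leak_bq_per_hour / bq_per_banana   # ~76 million bananas/hour

# World banana consumption: 145 million tonnes/year, ~10 bananas per kilogram
bananas_per_year = 145e6 * 1000 * 10
world_banana_bq_per_year = bananas_per_year * bq_per_banana  # ~22 trillion Bq/year

print(round(bananas_per_hour / 1e6), round(world_banana_bq_per_year / 1e12))
```

Both of the article's headline numbers check out: roughly 76 million bananas per hour from the leak, against roughly 22 trillion Bq per year from global banana consumption, which is why the two-year 20 trillion Bq leak is "just under half" the world's annual banana dose.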
Nuke power’s not inevitable – Fukushima devastated public confidence and caused restrictive overregulation
Moniz 11 – Energy Secretary of the US, American nuclear physicist (Ernest Moniz, Nov/Dec 2011, Council
on Foreign Relations Foreign Affairs Magazine, “Why We Still Need Nuclear Power,”
http://www.foreignaffairs.com/articles/136544/ernest-moniz/why-we-still-need-nuclearpower)//twonily
In the years following the major accidents at Three Mile Island in 1979 and Chernobyl in 1986, nuclear power fell out of
favor, and some countries applied the brakes to their nuclear programs. In the last decade, however, it began experiencing
something of a renaissance. Concerns about climate change and air pollution, as well as growing demand for electricity, led many
governments to reconsider their aversion to nuclear power, which emits little carbon dioxide and had built up an impressive safety and reliability
record. Some countries reversed their phaseouts of nuclear power, some extended the lifetimes of existing reactors, and many developed plans for new ones.
Today, roughly 60
nuclear plants are under construction worldwide, which will add about 60,000 megawatts of generating capacity--equivalent to a sixth of the world's current nuclear power capacity. But the movement lost momentum in March, when a 9.0-magnitude
earthquake and the massive tsunami it triggered devastated Japan's Fukushima nuclear power plant. Three reactors
were severely damaged, suffering at least partial fuel meltdowns and releasing radiation at a level only a few
times less than Chernobyl. The event caused widespread public doubts about the safety of nuclear power to
resurface. Germany
announced an accelerated shutdown of its nuclear reactors, with broad public support,
and Japan made a similar declaration, perhaps with less conviction. Their decisions were made easier thanks to the fact that electricity
demand has flagged during the worldwide economic slowdown and the fact that global regulation to limit climate change seems
less imminent now than it did a decade ago. In the United States, an already slow approach to new nuclear plants slowed
even further in the face of an unanticipated abundance of natural gas.
Studies prove the impact’s impossible
Alvarez et al 3 (*Robert, a Senior Scholar at IPS, where he is currently focused on nuclear disarmament,
environmental, and energy policies, served as a Senior Policy Advisor to the Secretary and Deputy
Assistant Secretary for National Security and the Environment. *Jan Beyea, PhD, earth science and
environmental studies *Klaus Janberg, *Jungmin Kang, *Ed Lyman, *Allison Macfarlane, *Gordon
Thompson, *Frank N. von Hippel, PhD Princeton University. “Reducing the Hazards from Stored Spent
Power-Reactor Fuel in the United States” Science and Global Security, 11:1–51, 2003 www.irssusa.org/pages/documents/11_1Alvarez.pdf) //twonily
The U.S. Nuclear Regulatory Commission (NRC) has estimated the probability of a loss of coolant from a spent-fuel
storage pool to be so small (about 10−6 per pool-year) that design requirements to mitigate the consequences
have not been required.1 As a result, the NRC continues to permit pools to move from open-rack configurations, for which naturalconvection air cooling would have been effective, to “dense-pack” configurations that eventually fill pools almost wall to wall. A 1979 study
done for the NRC by the Sandia National Laboratory showed that, in case of a sudden loss of all the water in a pool, dense-packed spent fuel,
even a year after discharge, would likely heat up to the point where its zircaloy cladding would burst and then catch fire.2 This would result in
the airborne release of massive quantities of fission products.
1nc – at: middle east war (scenario 1)
No escalation – leaders check
Maloney 7 (Suzanne, Senior Fellow – Saban Center for Middle East Policy, Steve Cook, Fellow – Council
on Foreign Relations, and Ray Takeyh, Fellow – Council for Foreign Relations, “Why the Iraq War Won’t
Engulf the Mideast”, International Herald Tribune, 6-28, http://www.brookings.edu/views/oped/maloney20070629.htm)//twonily
Underlying this anxiety was a scenario in which Iraq's sectarian and ethnic violence spills over into neighboring countries, producing conflicts
between the major Arab states and Iran as well as Turkey and the Kurdistan Regional Government. These wars then destabilize the entire
region well beyond the current conflict zone, involving heavyweights like Egypt. This is scary stuff indeed, but with the exception of the conflict
between Turkey and the Kurds, the scenario is far from an accurate reflection of the way Middle Eastern leaders view the situation in
Iraq and calculate their interests there. It is abundantly clear that major outside powers like Saudi Arabia, Iran and Turkey are heavily involved
in Iraq. These countries have so much at stake in the future of Iraq that it is natural they would seek to influence political developments in the
country. Yet, the Saudis, Iranians, Jordanians, Syrians, and others are
very unlikely to go to war either to protect their own
sect or ethnic group or to prevent one country from gaining the upper hand in Iraq. The reasons are fairly
straightforward. First, Middle Eastern leaders, like politicians everywhere, are primarily interested in one thing: self-preservation. Committing forces to Iraq is an inherently risky proposition, which, if the conflict went badly, could
threaten domestic political stability. Moreover, most Arab armies are geared toward regime protection
rather than projecting power and thus have little capability for sending troops to Iraq. Second, there is cause for concern about the
so-called blowback scenario in which jihadis returning from Iraq destabilize their home countries, plunging the region into conflict. Middle
Eastern leaders are preparing for this possibility. Unlike in the 1990s, when Arab fighters in the Afghan jihad against the Soviet Union returned
to Algeria, Egypt and Saudi Arabia and became a source of instability, Arab security services are being vigilant about who is coming in and going
from their countries. In the last month, the Saudi government has arrested approximately 200 people suspected of ties with militants. Riyadh is
also building a 700 kilometer wall along part of its frontier with Iraq in order to keep militants out of the kingdom. Finally, there is
no
precedent for Arab leaders to commit forces to conflicts in which they are not directly involved. The Iraqis
and the Saudis did send small contingents to fight the Israelis in 1948 and 1967, but they were either ineffective or never made it. In the 1970s
and 1980s, Arab countries other than Syria, which had a compelling interest in establishing its hegemony over Lebanon, never committed
forces either to protect the Lebanese from the Israelis or from other Lebanese. The civil war in Lebanon was regarded as someone else's fight.
Indeed, this is the way many leaders view the current situation in Iraq. To Cairo, Amman and Riyadh, the situation in Iraq is worrisome, but in
the end it is an Iraqi and American fight. As far as Iranian mullahs are concerned, they have long preferred to press their interests through
proxies as opposed to direct engagement. At a time when Tehran has access and influence over powerful Shiite militias, a massive cross-border
incursion is both unlikely and unnecessary. So Iraqis will remain locked in a sectarian and ethnic struggle that outside powers may abet, but will
remain within the borders of Iraq. The
Middle East is a region both prone and accustomed to civil wars. But given its experience with
ambiguous conflicts, the region has also developed an intuitive ability to contain its civil strife and prevent local conflicts
from enveloping the entire Middle East.
Cooperation solves
Fettweis 7 (Christopher, Asst Prof Poli Sci – Tulane, Asst Prof National Security Affairs – US Naval War
College, “On the Consequences of Failure in Iraq,” Survival, Vol. 49, Iss. 4, December, p. 83-98)//twonily
Without the US presence, a second argument goes, nothing would prevent Sunni-Shia violence from sweeping into every country where the religious divide exists.
A Sunni bloc with centres in Riyadh and Cairo might face a Shia bloc headquartered in Tehran, both of which would face enormous pressure from their
own people to fight proxy wars across the region. In addition to intra-Muslim civil war, cross-border warfare could
not be ruled out. Jordan might be the first to send troops into Iraq to secure its own border; once the dam breaks, Iran, Turkey, Syria and Saudi Arabia
might follow suit. The Middle East has no shortage of rivalries, any of which might descend into direct conflict after a destabilising US
withdrawal. In the worst case, Iran might emerge as the regional hegemon, able to bully and blackmail its neighbours with its new nuclear arsenal. Saudi Arabia and
Egypt would soon demand suitable deterrents of their own, and a
nuclear arms race would envelop the region. Once again, however,
none of these outcomes is particularly likely. Wider war No matter what the outcome in Iraq, the region is not likely to
devolve into chaos. Although it might seem counter-intuitive, by most traditional measures the Middle East is very
stable. Continuous, uninterrupted governance is the norm, not the exception; most Middle East regimes have
been in power for decades. Its monarchies, from Morocco to Jordan to every Gulf state, have generally been in power since
these countries gained independence. In Egypt Hosni Mubarak has ruled for almost three decades, and Muammar Gadhafi in Libya for almost
four. The region's autocrats
have been more likely to die quiet, natural deaths than meet the hangman or post-coup
one of the few exceptions to this pattern
of stability, and he met an end unusual for the modern Middle East. Its regimes have survived potentially destabilising shocks before,
and they would be likely to do so again. The region actually experiences very little cross-border warfare, and even
less since the end of the Cold War. Saddam again provided an exception, as did the Israelis, with their adventures in Lebanon. Israel fought four wars
with neighbouring states in the first 25 years of its existence, but none in the 34 years since. Vicious civil wars that once engulfed
Lebanon and Algeria have gone quiet, and its ethnic conflicts do not make the region particularly unique. The biggest risk of an American
withdrawal is intensified civil war in Iraq rather than regional conflagration. Iraq's neighbours will likely not prove eager to fight each
other to determine who gets to be the next country to spend itself into penury propping up an unpopular puppet regime next door. As much as the
Saudis and Iranians may threaten to intervene on behalf of their co-religionists, they have shown no eagerness to
replace the counter-insurgency role that American troops play today. If the United States, with its remarkable military and unlimited resources,
could not bring about its desired solutions in Iraq, why would any other country think it could do so?17 Common interest, not the presence of the US
military, provides the ultimate foundation for stability. All ruling regimes in the Middle East share a common (and
understandable) fear of instability. It is the interest of every actor - the Iraqis, their neighbours and the rest of the world - to see a
stable, functioning government emerge in Iraq. If the United States were to withdraw, increased regional cooperation to address
that common interest is far more likely than outright warfare.
firing squads. Saddam's rather unpredictable regime, which attacked its neighbours twice, was
1nc – at: terrorists steal fuel (scenario 1)
Spent fuel can’t be turned into weapons
McGregor 1 Douglas S. McGregor, director of the Semiconductor Materials and Radiological
Technologies Laboratory at the University of Michigan, 4/23/01 [The New American, “Rethinking
Nuclear Power,” http://www.thenewamerican.com/tna/2001/04-232001/vo17no09_nuclear_print.htm]//twonily
• Plutonium build-up: Western nuclear power reactors are constructed and engineered in a manner that minimizes plutonium build-up, and much of the plutonium
that is produced inside the reactor is used during an ordinary fuel cycle. Moreover, it should be kept in mind that using
fissile material for reactor
fuel is a far better method of preventing nuclear proliferation than storage or burying those materials.
After the fissile material has been used as nuclear fuel, it cannot possibly be used for weapons, thereby
eliminating the possibility of use by potential terrorists.
Reactor grade plutonium can’t do any damage
Cohen 90 Bernard L. Cohen, Professor of Physics at the University of Pittsburgh, 1990 [The Nuclear
Energy Option, http://home.pacbell.net/sabsay/nuclear/index.html]//twonily
However, there is a subtle aspect to producing plutonium by the reactor - reprocessing method, and to explain it we will divert briefly to review our Chapter 7 discussion of how a plutonium
bomb works. There are two stages in its operation: first, there is an implosion in which the plutonium is blown together and powerfully compressed by chemical explosives that surround it,
and then there is the explosion in which neutrons are introduced to start a rapidly escalating chain reaction of fission processes that release an enormous amount of energy very rapidly to
blow the system apart. All of this takes place within a millionth of a second, and the timing must be precise — if the explosion phase starts much before the implosion process is completed, the power of the bomb is greatly reduced. In fact, one of the principal methods that has been considered for defending against nuclear bombs is to shower them with neutrons to start the explosion early in the implosion process, thereby causing the bomb to fizzle. For a bomb to work properly, it is important that no neutrons come upon the scene until the implosion process approaches completion. Plutonium fuel, Pu-239, is produced in a reactor from U-238, but if it remains in the reactor it may be converted into Pu-240, which happens to be a prolific emitter of neutrons. In a U.S. power plant, the fuel typically remains in the reactor for 3 years, as a consequence of which something like 30% of the plutonium produced comes out as Pu-240. If this material is used in a bomb, the Pu-240 produces a steady shower of 2 million neutrons per second,12 which on an average would reduce the power of the explosion tenfold, but might cause a much worse fizzle. In short, a bomb made of this material, known as "reactor-grade plutonium," has a relatively low explosive power and is highly unreliable. It is also far more difficult to design and construct.
1nc – at: terrorists attack reactors (scenario 2)
They can’t solve unless they get rid of all nuclear power plants
Blair 1, Dr. Bruce 10/1/1 (President of Center for Defense Information “What if the terrorist go nuclear?”
http://www.cdi.org/terrorism/nuclear.cfm Accessed on 9/12/11//KH)
A terrorist attack on a commercial nuclear power plant with a commercial jet or heavy munitions could have a similar effect to a radiological bomb, and cause far greater casualties. If such an attack were to cause either a meltdown of the
reactor core (similar to the Chernobyl disaster), or a dispersal of the spent fuel waste on the site, extensive casualties could be
expected. In such an instance, the power plant would be the source of the radiological contamination,
and the plane or armament would be the explosive mechanism for spreading lethal radiation over large
areas.
Russia makes the impact inevitable
Blair 1, Dr. Bruce 10/1/1 (President of Center for Defense Information “What if the terrorist go nuclear?”
http://www.cdi.org/terrorism/nuclear.cfm Accessed on 9/12/11//KH)
In Russia, security for nuclear waste is especially poor, and the potential for diversion and actual use by
Islamic radicals has been shown to be very real indeed. In 1996, Islamic rebels from the break-away province of Chechnya
planted, but did not detonate, such a device in Moscow's Izmailovo park to demonstrate Russia's vulnerability. This dirty bomb
consisted of a deadly brew of dynamite and one of the highly radioactive by-products of nuclear fission
— Cesium 137.
1nc – at: enviro racism
Natives want the waste – it bolsters their economies
AP 6 (“Utah Tribe Divided Over Nuclear Waste,” 6/25/2006, Fox News,
http://www.foxnews.com/story/2006/06/25/utah-tribe-divided-over-nuclearwaste/)//twonily
SKULL VALLEY, Utah – Leon Bear, a stocky man in T-shirt and jeans, peers across the sagebrush-pocked valley where his ancestors once chased Pony Express riders and sees the future for his dwindling tribe.¶ Nuclear waste.¶ Just west of the gun-barrel straight, two-lane road that darts through the Skull Valley Goshute Reservation, Bear wants to store 4,000 steel and concrete canisters of highly radioactive used fuel from nuclear power plants. The American Indian tribe would reap tens of millions of dollars in rent over the next 40 years.¶ "I've been shown there's no problem. The way they plan to handle it, it's safe," the 46-year-old tribal leader insists, escorting a visitor around the reservation in a glistening new pickup truck.¶ The truck is an example of the largess the tribe already has received from a consortium of eight electric utilities. Nine years ago, the companies signed a lease with the tribe to put 40,000 tons of reactor waste on the reservation.¶ It is the kind of deal that other tribes have rejected, that most communities would oppose, that spells "not in my back yard" in the brightest of colors. Utah's establishment in Salt Lake City, the capital 45 miles away, is enraged. Critics, including some within the tribe, call it environmental racism at its rawest.¶ Bear says it is the way to riches that will mean new homes, new jobs and better health care for the 118 members of his tribe. Only about two dozen — including children — still live on the 18,000-acre reservation, but this project will bring many of the others back, he predicts.¶
No internal link – new legislation ensures safety
Hing 11 (Julianne, reporter and blogger for Colorines.com, “GOP Reopens Fight Over
Nuclear Waste in Sacred Yucca Mountain,” 3/15/2011, ColorLines,
http://colorlines.com/archives/2011/03/japans_nuclear_power_crisis_may_halt_yucca
_mountain_waste.html)//twonily
A little over a week before an 8.9-magnitude earthquake ripped open a fissure in the Earth, triggered a deadly tsunami and set off a potential worldwide nuclear catastrophe, House Republicans introduced a bill to permit 200 more commercial nuclear reactors in the U.S., "enough to triple current megawatt capacity, by 2040." Tucked into that bill is a clause that revives the long debate around nuclear waste storage in Nevada's Yucca Mountain, a move that Native American and environmental groups have been
resisting for decades.¶ Nuclear power may not produce pollution like fossil fuels, but it does produce waste that carries with it the risk of radioac tive contamination. There’s no expanding nuclear power without pinning down a nuclear waste storage site, which is one of the reasons
the House bill calls on the Nuclear Regulatory Commission to complete a review of the Yucca Mountain
site “without political interference.”¶
Native American groups have long opposed the construction of a nuclear waste storage site in Yucca Mountain, which is a sacred spiritual and religious site for local Western Shoshone
and Pauite tribes.¶ “A Yucca Mountain nuclear waste repository will leak, impacting the land and people of the Great Basin sooner or later,” testified Margene Bullcreek, president of the Native Community Action Council, at the Nuclear Regulatory Commission Atomic Safety Licensing
Board Panel Construction Authorization Board in 2010.¶ Bullcreek’s group represents local tribes that have suffered from radiation exposure after U.S. nuclear weapons testing in the area. They say storing nuclear waste in the mountain would desecrate the sacred lands, and also expose
local residents to significant health risks.¶ In 2010 the Department of Energy withdrew its application to pursue Yucca Mountain as a site for a nuclear waste dump, but Republicans have not abandoned the idea.¶ "It was a political, not scientific, decision," said Republican Sen. Lindsey Graham, McClatchy reported. "It is incumbent on the administration to come up with a disposal plan for this real problem facing our nation."¶
Today, however, the future of the
House bill and the fate of the tenuous bipartisan coalition pushing for nuclear power expansion in the U.S. are in question as Japan battles its largest nuclear power crisis since World War II.¶ On Tuesday, a third and the most serious explosion at the Fukushima Daiichi nuclear power
plant had engineers scrambling anew to keep the core in the most damaged reactor cool enough to avoid a nuclear catastrophe. The explosion also resulted in a fire in a fourth reactor, which triggered short-term spikes in radiation in the vicinity. The explosions were not nuclear
explosions—last Friday’s earthquake and the subsequent tsunami jammed the reactors’ backup cooling systems, causing a pressure buildup that scientists suspect caused the explosion.¶ It’s a dangerous, precarious rush to contain the damage right now. Tens of thousands of people
have been evacuated from the surrounding area, and at least 22 people have been exposed to radiation. That number is expected to climb. ¶ The disaster has led to widespread panic about nuclear power, even as Japanese officials and industry leaders have maintained that the health
risks are minimal. After the third explosion, Prime Minister Naoto Kan acknowledged “a very high risk” of more radiation leakage, suggesting that things would get worse before they got any better, the New York Times reported. Officials have warned residents within 20 miles to stay
indoors and stop using their air conditioning.¶ On Sunday Sen. Joseph Lieberman said that the U.S. ought to reassess plans to expand nuclear power, which President Obama has been pushing vocally.¶ “The reality is that we’re watching something unfold,” Lieberman said on “Face the
Nation.” “We don’t know where it’s going with regard to the nuclear power plants in Japan right now. I think it calls on us here in the U.S.—naturally not to stop building nuclear power plants, but to put the brakes on right now until we understand the ramifications of what’s happened in
Japan.”¶ On Monday the Obama administration said despite the crisis, it still remains committed to nuclear power as a part of its “clean energy” plan. The Obama administration did not respond to Lieberman’s call today, but maintained its line that nuclear power is a secure option for
the country.¶ "Right now we continue to believe that nuclear power plants in this country operate safely and securely," Nuclear Regulatory Commission Chairman Gregory Jaczko said today, Politico reported.¶ In recent years the nuclear power industry has successfully rebranded itself as a low-cost, clean, non-polluting alternative energy source. Obama has pledged $8 billion in guaranteed loans for the construction of a nuclear power plant, the first to be built in the country in over 30 years.¶ Native Americans Seek Alternatives¶ Native Americans have been at the front lines of alternative energy conversations in the country as
developers try to move in to reservations. In 2010 the Black Mesa Water Coalition in northern Arizona successfully defeated a coal mining operation that was set to move into Navajo and Hopi land. Last week, Denison Mines Corp, a Canadian company, obtained permits from an Arizona
state environmental agency to reopen three mines near the Grand Canyon, Indian Country Today reported. Denison still needs to get federal approval to move ahead, but the approval is especially controversial since the Department of the Interior instituted a two-year moratorium in
2009 on uranium mining exploration within a million acres of the Grand Canyon.¶ There
are currently 104 licensed nuclear power plants in the country.
On Monday the New
York Times reported that most of them share “some or all of the risk factors that played a role at Fukushima Daiichi: locations on tsunami-prone coastlines or near earthquake faults, aging plants and backup electrical systems that rely on diesel generators and batteries that could fail in
extreme circumstances.”¶ Nonetheless, the U.S. is highly dependent on nuclear power. The U.S. gets 20 percent of its electrical output from nuclear power production—Japan gets 30 percent of its energy from nuclear power. Native American environmental groups and anti-nuclear
power activists have said that instead of pushing ahead with dangerous and hazardous energy exploration, the country ought to develop the political will to get serious about energy conservation and sustainable alternative energy sources.
Plants are only there if natives agree – independently net-beneficial for their culture
Wolfson 2k (Hannah, Staff Writer for Mindfully.org, “Goshute Native American Tribe
Turns to Nuclear Waste,” December 2000, Mindfully.org,
http://www.mindfully.org/Nucs/Goshute-Tribe-Nuc-Waste.htm)//twonily
SKULL VALLEY INDIAN RESERVATION, Utah -- Leon Bear knows the boundaries of his tribe's land by heart.¶ ¶ ``They want us to be self-determined and they want us to be self-governed, and yet when we make these judgments, they don't like it,'' -- Bear, Goshute tribal chairman¶ ¶ From the reservoir that provides water to his tiny village, Bear sweeps his arm across the parched valley, pointing out fences and smokestacks that ring the last remnant of his tribe's traditional lands.¶ ¶ To
the north, a magnesium plant sits on
the shore of the Great Salt Lake; to the south, the Army tests equipment for exposure to nerve gas on a stretch of desert as large as Rhode
Island. A bombing range and hazardous waste incinerator lie over the Cedar Mountains to the west; a stockpile of chemical weapons and the
incinerator that destroys them sit to the east.¶ ¶ Now the
tiny Skull Valley Band of Goshutes has agreed to turn its
reservation into one of the country's largest nuclear waste dumps.¶ ¶ Opponents, including other tribe members, say
the plan could endanger people, the wildlife of the West Desert and the region's economy.¶ ¶ But that hasn't stopped Bear from pressing
forward with the project, which he says could be the only salvation for his dying tribe.¶ ¶ ``They made that an industrial waste zone out there,''
said Bear, the Goshutes' tribal chairman and the project's main supporter. ``Nobody asked the Goshutes, 'Do you mind if we do this out here on
your traditional territory?' Nobody said, 'Hey, it could be dangerous for you guys to be out here.'''¶ ¶ ``When a neighbor does that to you, you
don't want to be like them,'' he added. ``So we gave our neighbor, the state of Utah, an opportunity to be a part of this, and the first reaction
was 'Over my dead body.'''¶ ¶ If Bear gets his way, about a
square mile of the reservation will be fenced off for nuclear
waste, and 450 acres will be covered with concrete pads. On top will sit 16-foot tall, concrete-and-steel casks filled
with radioactive rods -- as many as 4,000 of them holding 40,000 metric tons of used-up nuclear reactor fuel.¶ ¶ The fuel will
come from Private Fuel Storage, a consortium of eight power companies from California, New York, Minnesota, Wisconsin,
Michigan, Georgia, Pennsylvania, Florida and Alabama. Neither the consortium or the Goshutes will say what the deal costs.¶ ¶ The
consortium has promised to build a cultural center on the reservation to revive the tribe's fading
language and crafts, Bear says, and has pledged to give Goshutes and other tribes the first shot at about 40
jobs at the site.¶ ¶ The money is sorely needed. Most of the estimated 150 Goshutes have fled the 17,000-acre reservation.
Fewer than 30 remain, most living in a tiny cluster of run-down trailers. Jobs are virtually nonexistent.¶ ¶ It's not that the tribe hasn't
tried. At the village entrance, the last examples of one failed project -- portable toilets and showers built for the military -- sit unused.¶ ¶ Only
two real options remained: nuclear waste and gambling, an industry Mormon-dominated Utah considers nearly as toxic.¶ ¶ ``How can you
blame Leon?'' said Chip Ward, author of an environmental history of the West Desert and a project opponent.
``What's he going to do? Grow food? No one's going to buy a tomato off this land.''¶ ¶
2nc – at: accidents
Negative chance of an accident
Alvarez et al 3 (*Robert, a Senior Scholar at IPS, where he is currently focused on nuclear disarmament,
environmental, and energy policies, served as a Senior Policy Advisor to the Secretary and Deputy
Assistant Secretary for National Security and the Environment. *Jan Beyea, PhD, earth science and
environmental studies *Klaus Janberg, *Jungmin Kang, *Ed Lyman, *Allison Macfarlane, *Gordon
Thompson, *Frank N. von Hippel, PhD Princeton University. “Reducing the Hazards from Stored Spent
Power-Reactor Fuel in the United States” Science and Global Security, 11:1–51, 2003 www.irssusa.org/pages/documents/11_1Alvarez.pdf) //twonily
The cooling water in a spent-fuel pool could be lost in a number of ways, through accidents or malicious acts. Detailed discussions of
sensitive information are not necessary for our purposes. Below, we provide some perspective for the following generic
cases: boil-off; drainage into other volumes through the opening of some combination of the valves, gates and pipes that hold the
water in the pool; a fire resulting from the crash of a large aircraft; and puncture by an aircraft turbine shaft or a
shaped charge. Boil Off Keeping spent fuel cool is less demanding than keeping the core in an operating reactor cool.
Five minutes after shutdown, nuclear fuel is still releasing 800 kilowatts of radioactive heat per metric ton of uranium (kWt/tU)30 . However,
after several days, the decay heat is down to 100 kWt/tU and after 5 years the level is down to 2–3 kWt/tU (see Figure 5). In
case of a loss
of cooling, the time it would take for a spent-fuel pool to boil down to near the top of the spent fuel
would be more than 10 days if the most recent spent-fuel discharge had been a year before. If the entire
core of a reactor had been unloaded into the spent fuel pool only a few days after shutdown, the time
could be as short as a day.31 Early transfer of spent fuel into storage pools has become common as
reactor operators have reduced shutdown periods. Operators often transfer the entire core to the pool
in order to expedite refueling or to facilitate inspection of the internals of the reactor pressure vessel and
identification and replacement of fuel rods leaking fission products.32 Even a day would allow considerable time to provide
emergency cooling if operators were not prevented from doing so by a major accident or terrorist act
such as an attack on the associated reactor that released a large quantity of radioactivity. In this article, we do not discuss scenarios in which
spent-fuel fires compound the consequences of radioactive releases from reactors. We therefore focus on the possibility of an accident or
terrorist act that could rapidly drain a pool to a level below the top of the fuel. Drainage All spent-fuel pools are connected via fuel-transfer
canals or tubes to the cavity holding the reactor pressure vessel. All can be partially drained through failure of interconnected piping systems,
moveable gates, or seals designed to close the space between the pressure vessel and its surrounding reactor cavity.33 A 1997 NRC report
described two incidents of accidental partial drainage as follows:34 Two loss of SFP [spent fuel pool] coolant inventory events occurred in which
SFP level decrease exceeded 5 feet [1.5 m]. These events were terminated by operator action when approximately 20 feet [6 m] of coolant
remained above the stored fuel. Without operator actions, the inventory loss could have continued until the SFP level had dropped to near the
top of the stored fuel resulting in radiation fields that would have prevented access to the SFP area. Once the pool water level is below the top
of the fuel, the gamma radiation level would climb to 10,000 rems/hr at the edge of the pool and 100’s of rems/hr in regions of the spent-fuel
building out of direct sight of the fuel because of scattering of the gamma rays by air and the building structure (see Figure 6).35 At the lower
radiation level, lethal doses would be incurred within about an hour.36 Given such dose rates, the NRC staff assumed that further ad hoc
interventions would not be possible.37 Fire A
crash into the spent fuel pool by a large aircraft raises concerns of both
puncture (see below) and fire. With regard to fire, researchers at the Sandia National Laboratory, using water to simulate kerosene,
crashed loaded airplane wings into runways. They concluded that at speeds above 60 m/s (135 mph), approximately 50% of the liquid is
so finely atomized that it evaporates before reaching the ground. If this were fuel, a fireball would certainly have been
the result, and in the high-temperature environment of the fireball a substantially larger fraction of the mass would have evaporated.39 The
blast that would result from such a fuel-air explosion might not destroy the pool but could easily
collapse the building above, making access difficult and dropping debris into the pool. A potentially destructive fuel-air
deflagration could also occur in spaces below some pools. Any remaining kerosene would be expected to pool and burn at
a rate of about 0.6 cm/minute if there is a good air supply.40 The burning of 30 cubic meters of kerosene—about one third as
much as can be carried by the type of aircraft which struck the World Trade Center on September 11, 200141 —would release about
10^12 joules of heat—enough to evaporate 500 tons of water. However, under most circumstances, only a relatively small
fraction of the heat would go into the pool. Puncture by an Airplane Engine Turbine Shaft, Dropped Cask or Shaped Charge As
Figure 2 suggests, many spent-fuel pools are located above ground level or above empty cavities. Such pools
could drain completely if their bottoms were punctured or partially if their sides were punctured. Concerns that the
turbine shaft of a crashing high-speed fighter jet or an act of war might penetrate the wall of a spent-fuel storage pool and cause a loss of
coolant led Germany in the 1970s to require that such pools be sited with their associated reactors inside thick-walled containment buildings.
When Germany decided to establish large away-from-reactor spent-fuel storage facilities, it rejected large spent-fuel storage pools and decided
instead on dry storage in thick-walled cast-iron casks cooled on the outside by convectively circulating air. The casks are stored inside
reinforced-concrete buildings that provide some protection from missiles.42 Today,
the turbine shafts of larger, slower-moving
passenger and freight aircraft are also of concern. After the September 11, 2001 attacks against the World Trade Center, the
Swiss nuclear regulatory authority stated that From the construction engineering aspect, nuclear power plants (worldwide) are not protected
against the effects of warlike acts or terrorist attacks from the air. . . . one cannot rule out the possibility that fuel elements in the fuel pool or
the primary cooling system would be damaged and this would result in a release of radioactive substances [emphasis in original]43 The NRC
staff has decided that it is prudent to assume that a turbine shaft of a large aircraft engine could penetrate and drain a spent-fuel-storage
pool.44 Based on calculations using phenomenological formulae derived from experiments with projectiles incident on reinforced concrete, penetration cannot be ruled out for a high-speed crash but seems unlikely for a low-speed crash.45 This is consistent with the results of a highly-constrained analysis recently publicized by the Nuclear Energy Institute (NEI).46 The analysis itself has not been made available for independent peer review "because of security considerations." According to the NEI press release, however, it concluded that the engine of an aircraft traveling at the low speed of the aircraft that struck the Pentagon on Sept. 11, 2001 (approximately 350 miles/hr or 156 m/s) would not penetrate the wall of a spent-fuel-storage pool. Crashes at higher speed such as that against the World Trade Center South Tower (590 miles/hr or 260 m/s), which had about three times greater kinetic energy, were ruled out because the "probability of the aircraft striking a specific point on a structure—particularly one of the small size of a nuclear plant—is significantly less as speed increases."
2nc – xt: nuclear inevitable
Nuclear transition is inevitable and triggers the impact
Haider 14 – Prof Physics @ Fordhams Univ. (Quamrul Haider, 2/27/14, “Nuclear power plant: Security,
dirty bombs, and civil rights,” http://www.thedailystar.net/nuclear-power-plant-security-dirty-bombsand-civil-rights-13144)//twonily
ONE consequence of nuclear power that dominates all others is the safety and security of a nuclear
reactor
facility. The
use of nuclear power inevitably brings an unquantifiable but real danger of nuclear blackmail
and sabotage from terrorists, extremists, criminals and lunatics. The safe and secure transportation
of nuclear materials is also of great concern. Decisions on policy regarding the development of nuclear energy involve judgments
concerning the hazards of plutonium and other actinides produced as radioactive wastes in a reactor. According to the World Nuclear Association, total world
generation of plutonium in spent fuel rods is about 70 thousand kilograms per year. It takes approximately 10 kilograms of nearly pure plutonium-239 to make a
bomb. The production
of a staggering amount of plutonium gives rise to the risk of its diversion to make nuclear
weapons by rogue nations and terrorists. The grim reality is that any country that has nuclear power plants will have
access to
the materials and technology needed for developing
nuclear bombs. In 1974, India exploded a "peaceful nuclear device," and with it also
exploded the belief that there is a practical distinction between peaceful and military uses of nuclear energy. In the often heated controversy over the future of
nuclear power, it is the risk of proliferation of nuclear weapons that appears to be the one most intractable to technical resolution and, as well, most insistently
fundamental to the way people feel about nuclear power. If a worldwide plutonium industry develops, then theft of plutonium, or even growth of an international black market in plutonium, seems quite likely. A market of a few hundred kilograms worth millions of dollars per year is large
enough to interest criminal groups and to have a major impact on nuclear terrorism. The information and non-nuclear materials needed to make a “dirty” fission
bomb is now widely distributed and available to the general public on the internet. Dozens
of nations have or could acquire skills and
facilities required to design and build dirty bombs using plutonium diverted from their civilian nuclear power programmes. Although
crude, inefficient and unpredictable, such devices would nonetheless be highly destructive. Furthermore, fission explosives small enough to be
transported by automobile could be built by small groups of people, even conceivably by individuals working alone, if they somehow manage to acquire the needed
10 kilograms of plutonium. The concerns
about plutonium arise not only for its explosive properties but also for its
extreme radiotoxicity. Dispersed into the atmosphere by nuclear explosive devices, a small quantity of plutonium could cause an indeterminate number of
deaths from lung cancer or fibrosis of the lung. The psychological impact of such a situation would be profound, normal activity in the affected area would be
disrupted and decontamination could be very expensive. Thus if
nuclear power plants are to be well enough protected to be
totally immune to the above risks, the unavoidable consequence is a society dominated by prohibitions,
surveillance and constraints, all justified by the magnitude of the danger. Consequently, it is inevitable that preference should be given to pliant and
obedient character type workers. The use of nuclear energy, therefore, epitomises the centralisation of the
government's power, thereby resulting in infringement on the civil rights of the citizens.
2nc – xt: ssd not inevitable
No SSD absent the plan – means there’s only a risk they destroy the environment
Sands 14 – editor @ Foundation for Environmental Law and Development (Philippe Sands, 1/14/14,
“Greening International Law,”
http://books.google.com/books?id=JmmYAgAAQBAJ&printsec=frontcover&source=gbs_ge_summary_r
&cad=0#v=onepage&q&f=false)//twonily
Since a number of delegations were concerned about forcing the issue to a formal vote, particularly given the high
tension that was experienced at the 1983 meeting with the moratorium on LLW dumping, no vote was called. Instead, both resolutions
were attached to the final report of the 8th Consultative Meeting, and tabled. It is thought that, partly as a result of the debate within
the 1972 London Convention, and the moratorium on the dumping of LLW at sea, as well as its extremely high cost, the
subseabed disposal programme has been virtually abandoned.
Only the plan causes SSD
Bala their author 4/11/14—editor in chief at the Environmental Affairs Law Review, law student at
Boston College Law School, (Amal, "Sub-Seabed Burial of Nuclear Waste: If the Disposal Method Could
Succeed Technically, Could It Also Succeed Legally?", Boston College Environmental Affairs Law Review,
41 B.C. Envtl. Aff. L. Rev. 455,
lawdigitalcommons.bc.edu/cgi/viewcontent.cgi?article=2147&context=ealr)//twonily
193 If the United States were to reconsider sub-seabed disposal as a potential option for disposal of the nation’s
collective SNF, America would need to restart the research that was previously abandoned. 194 Restarting such
research could benefit from a fresh look at national and international laws that could apply. 195 The feasibility of sub-seabed disposal as a
national solution for SNF requires careful consideration of applicable laws that could make the disposal method impractical from a
legal standpoint. 196
2nc – xt: no leaks
No leaks – and even if there are, no impact – their authors are alarmist hacks
Pearce 12 – freelance author and journalist, environmental consultant for New Scientist magazine (Fred
Pearce, 10/22/12, “Why Are Environmentalists Taking Anti-Science Positions?”
http://e360.yale.edu/feature/why_are_environmentalists_taking_antiscience_positions/2584/)//twonily
From Rachel Carson’s Silent Spring to James Hansen’s modern-day tales of climate apocalypse, environmentalists have long looked to good science and good
scientists and embraced their findings. Often we have had to run hard to keep up with the crescendo of warnings coming out of academia about the perils facing the
world. A generation ago, biologist Paul Ehrlich’s The Population Bomb and systems analysts Dennis and Donella Meadows’ The Limits to Growth shocked us with
their stark visions of where the world was headed. No wide-eyed greenie had predicted the opening of an ozone hole before the pipe-smoking boffins of the British
Antarctic Survey spotted it when looking skyward back in 1985. On issues ranging from ocean acidification and tipping points in the Arctic to the dangers of
nanotechnology, the scientists have always gotten there first — and the environmentalists have followed. And yet, recently, the
environment
movement seems to have been turning up on the wrong side of the scientific argument. We have been
making claims that simply do not stand up. We are accused of being anti-science — and not without reason. A few, even close friends, have
begun to compare this casual contempt for science with the tactics of climate contrarians. That should hurt. Three current issues suggest
that the risks of myopic adherence to ideology over rational debate are real: genetically modified (GM) crops, nuclear power,
and shale gas development. The conventional green position is that we should be opposed to all three. Yet the voices of
those with genuine environmental credentials, but who take a different view, are being drowned out
by
sometimes abusive and
irrational argument. In each instance, the issue is not so much which side environmentalists should be on, but rather the
mind-set behind those positions and the tactics adopted to make the case. The wider political danger is that by
taking anti-scientific positions, environmentalists end up helping the anti-environmental sirens of the new
right. Most major environmental groups — from Friends of the Earth to Greenpeace to the Sierra Club — want a ban or moratorium on GM crops, especially for
food. They fear the toxicity of these “Frankenfoods,” are concerned the introduced genes will pollute wild strains of the crops, and worry that GM seeds are a
weapon in the takeover of the world’s food supply by agribusiness. For myself, I am deeply concerned about the power of business over the world’s seeds and food
supply. But GM crops are an insignificant part of that control, which is based on money and control of trading networks. Clearly there are issues about gene
pollution, though research suggesting there is a problem is still very thin. Let’s do the research, rather than trash the test fields, which has been the default response
of groups such as Greenpeace, particularly in my home country of Britain. As for the Frankenfoods argument, the evidence is just not there. As the British former
campaigner against GMs, Mark Lynas, points out: “Hundreds of millions of people have eaten GM-originated food without a single substantiated case of any harm
done whatsoever.” The most recent claim, published in September in the journal Food and Chemical Toxicology, that GM corn can produce tumors in rats, has
been attacked as flawed in execution and conclusion by a wide range of experts with no axe to grind. In any event, the controversial study was primarily about the
potential impact of Roundup, a herbicide widely used with GM corn, and not the GM technology itself. Nonetheless, the reaction of some in the environment
community to the reasoned critical responses of scientists to the paper has been to claim a global conspiracy among researchers to hide the terrible truth. One
scientist was dismissed on the Web site GM Watch for being “a longtime member of the European Food Safety Authority, i.e. the very body that approved the GM
corn in question.” That’s like dismissing the findings of a climate scientist because he sits on the Intergovernmental Panel on Climate Change — the “very body” that
warned us about climate change. See what I mean about aping the worst and most hysterical tactics of the climate contrarians? Stewart Brand wrote in his 2009
book Whole Earth Discipline: “I dare say the environmental movement has done more harm with its opposition to genetic engineering than any other thing we’ve
been wrong about.” He will see nods of assent from members of a nascent “green genes” movement — among them environmentalist scientists, such as Pamela
Ronald of the University of California at Davis — who say GM crops can advance the cause of sustainable agriculture by improving resilience to changing climate and
reducing applications of agrochemicals. Yet such people are routinely condemned as apologists for an industrial conspiracy to poison the world. Thus, Greenpeace
in East Asia claims that children eating nutrient-fortified GM “golden rice” are being used as “guinea pigs.” And its UK Web site’s introduction to its global campaigns
says, “The introduction of genetically modified food and crops has been a disaster, posing a serious threat to biodiversity and our own health.” Where, ask their
critics, is the evidence for such claims? The problem is the same in the energy debate. Many
environmentalists who argue, as I do, that
climate change is probably the big overarching issue facing humanity in the 21st century, nonetheless often refuse
to recognize that nuclear power could have a role in saving us from the worst. Nuclear power is the only large-scale source of low-carbon electricity
that is fully developed and ready for major expansion. Yes, we need to expand renewables as fast as we can. Yes, we need to reduce further the already small risks
of nuclear accidents and of leakage of fissile material into weapons manufacturing. But as George Monbiot, Britain’s most prominent environment columnist, puts
it:
“To abandon our primary current source of low carbon energy during a climate change emergency
is madness.”
Monbiot attacks the gratuitous misrepresentation of the risks of radiation from nuclear plants. It is widely suggested, on the basis of a
thoroughly discredited piece of Russian head-counting, that up to a million people were killed by the Chernobyl nuclear accident in 1986. In fact, it
is far
from clear that many people at all — beyond the 28 workers who received fatal doses while trying to douse the flames at the stricken reactor —
actually died from Chernobyl radiation. Certainly, the death toll was nothing remotely on the scale claimed. “We have a moral
duty,”
Monbiot says,
“not to spread unnecessary and unfounded fears.
If we persuade people that they or their children are
likely to suffer from horrible and dangerous health problems, and if these fears are baseless, we cause great distress and anxiety, needlessly damaging the quality of
people’s lives.” Many people have a visceral fear of nuclear power and its invisible radiation. But for environmentalists to fan
the flames — especially when it gets in the way of fighting a far more real threat, from climate change — seems reckless, anti-scientific and deeply damaging to the
world’s climate future. One
sure result of Germany deciding to abandon nuclear power in the wake of last year’s Fukushima
nuclear accident (calamitous, but any death toll will be tiny compared to that from the tsunami that caused it) will
be rising carbon emissions
from a revived coal industry. By one estimate, the end of nuclear power in Germany will result in an extra 300 million tons of carbon dioxide
reaching the atmosphere between now and 2020 — more than the annual emissions of Italy and Spain combined.
2nc – xt: empirics disprove
Empirics disprove the impact even if it spreads
Wald 6 – staff writer @ NYT (Matthew Wald, 3/17/6, “Nuclear Reactors Found to Be Leaking Radioactive
Water,” http://www.nytimes.com/2006/03/17/national/17nuke.html)//twonily
WASHINGTON, March 16 — With power cleaner than coal and cheaper than natural gas, the nuclear industry, 20
years past its last meltdown, thinks it is ready for its second act: its first new reactor orders since the 1970's. But
there is a catch. The public's acceptance of new reactors depends in part on the performance of the old ones,
and lately several of those have been discovered to be leaking radioactive water into the ground. Near
Braceville, Ill., the Braidwood Generating Station, owned by the Exelon Corporation, has leaked tritium into underground
water that has shown up in the well of a family nearby. The company, which has bought out one property owner and is
negotiating with others, has offered to help pay for a municipal water system for houses near the plant that have private wells. In a survey of all
10 of its nuclear plants, Exelon found tritium in the ground at two others. On Tuesday, it said it had had another spill at
Braidwood, about 60 miles southwest of Chicago, and on Thursday, the attorney general of Illinois announced she was filing a lawsuit against
the company over that leak and five earlier ones, dating to 1996. The suit demands among other things that the utility provide substitute water
supplies to residents. In New York, at the Indian
Point 2 reactor in Buchanan, workers digging a foundation adjacent to the
plant's spent fuel pool found wet dirt, an indication that the pool was leaking. New monitoring wells are tracing the
tritium's progress toward the Hudson River. Indian Point officials say the quantities are tiny, compared with the amount of tritium that Indian
Point is legally allowed to release into the river. Officials said they planned to find out how much was leaking and declare the leak a "monitored
release pathway." Nils J. Diaz, the chairman of the Nuclear Regulatory Commission, said he would withhold judgment on the proposal until after
it reached his agency, but he added, "They're going to have to fix it." This month, workers at the Palo Verde plant in New Mexico found tritium
in an underground pipe vault. The Union of Concerned Scientists, which is critical of nuclear power safety arrangements, said recently that in
the past 10 years, tritium had leaked from at least seven reactors. It called for a systematic program to ensure there were no more leaks. Tami
Branum, who lives close to the Braidwood reactor and owns property in the nearby village of Godley, said in a telephone interview, "It's just
absolutely horrible, what we're trying to deal with here." Ms. Branum and her children, 17-year-old twin girls and a 7-year-old boy, drink only
bottled water, she said, but use municipal water for everything else. "We're bathing in it, there's no way around it," she said. Ms. Branum said
that her property in Godley was worth about $50,000 and that she wanted to sell it, but that no property was changing hands now because of
the spill. A spokesman for Exelon, Craig Nesbit, said that neither Godley's water nor Braidwood's water system was threatened, but that the
company had lost credibility when it did not publicly disclose a huge fuel oil spill and spills of tritium
from
1996 to 2003. No well outside company property shows levels that exceed drinking water standards, he said. Mr. Diaz of the regulatory agency,
speaking to a gathering of about 1,800 industry executives and government regulators last week, said utilities were planning to apply for 11
reactor projects, with a total of 17 reactors. The Palo Verde reactor was the last one that was ordered, in October 1973, and actually built. As
the agency prepares to review license applications for the first time in decades, it is focusing on "materials
degradation," a catch-all term for
cracks, rust and
other ills to which nuclear plants are susceptible. The old metal has to hold together, or be
patched or replaced as required, for the industry to have a chance at building new plants, experts say. Tritium, a form of hydrogen with two
additional neutrons in its nucleus, is especially vexing. The atom is
unstable and returns to stability by emitting a
radioactive particle. Because the hydrogen is incorporated into a water molecule, it is almost impossible to filter
out.
2nc – xt: status quo solves
Interim storage is coming now – solves sufficiently
WNA 13 – World Nuclear Association (August 2013, “Safe Management of Nuclear Waste and Used
Nuclear Fuel,” http://www.world-nuclear.org/WNA/Publications/WNA-Position-Statements/SafeManagement-of-Nuclear-Waste-and-Used-Nuclear-Fuel/)//twonily
NUCLEAR WASTE AND USED NUCLEAR FUEL 1) Origin of Nuclear Waste and UNF. Nuclear power comes from the huge
amount of energy, stored in the atomic nucleus, which is released as heat under controlled conditions in a reactor. This energy release results
from the splitting of atoms of uranium in a process known as "fission". Uranium is one of the "radioactive" elements. Also referred to as
radionuclides or radioisotopes, these are atoms that continue to transform themselves into other elements while decaying to a stable (nonradioactive) state. Naturally occurring uranium consists of three radioisotopes: uranium-238 (99.3%), uranium-235 (0.7%) and uranium-234
(trace amounts), with the difference lying in the number of neutrons in the atomic nucleus. Of them, only U-235 is fissile, meaning able to be
split. The end products of controlled nuclear fission contain a diverse group of radioactive elements that
decay at greatly differing rates. These end products are classified either as nuclear waste or as used nuclear fuel (UNF). 2) Categories of Nuclear
Waste. Nuclear waste is categorised according to its radioactivity levels in three broad classes: low level waste (LLW), intermediate level waste
(ILW), and high level waste (HLW). Some ILW decays rapidly to become LLW; some ILW, such as parts of UNF fuel cladding removed during
reprocessing, decays slowly. Heat generation is a relevant concern only with HLW-UNF. This heat is described as the "thermal burden" in
managing and disposing of these materials. 3) Energy Value in Used Nuclear Fuel. UNF contains radioactive substances that still have a great
deal of energy potential. Some 96% of the mass of UNF can potentially be recovered and recycled for further use as nuclear fuel. 4) Role
of
Interim Storage of Reactor End Products. UNF-HLW is generally stored for several years in a pond at the power
plant or at a reprocessing plant. On-site storage or storage at an interim surface-storage facility allows for
natural radioactive decay to reduce both the radioactivity and the associated thermal burden of this
end product.
**ilaw
1nc – ilaw
Legitimacy doesn’t matter – we’ll never ratify LOST regardless
Groves 12 – Bernard and Barbara Lomas Fellow at the Margaret Thatcher Center for Freedom at The
Heritage Foundation (Steven Groves, 5/18/12, “Law of the Sea Treaty once again rears its ugly head in
U.S. Senate,” http://www.deseretnews.com/article/765576815/Law-of-the-Sea-Treaty-once-againrears-its-ugly-head-in-US-Senate.html?pg=all)//twonily
A proposed treaty would redirect countless U.S. dollars to an international organization that could then
redistribute that money to corrupt developing countries around the world. It's bad enough when American tax dollars are blown on government-created debacles such as Solyndra and "Operation Fast and Furious." But at least in those instances the expenditures carried a bare modicum of democratic legitimacy. What if, on the other hand,
the U.S. Treasury was raided for billions of dollars, which were then redistributed to the rest of the world by
an international bureaucracy headquartered in Kingston, Jamaica? That's what will surely happen if the U.S. Senate
gives its advice and consent to the United Nations Convention on the Law of the Sea, a deeply flawed treaty that was
rejected by President Ronald Reagan in 1982. (The treaty was revived by President Clinton, who sent it to the Senate in 1994. It has languished there ever since.) Like a vampire,
the Law of the Sea Treaty (a.k.a. "LOST") is never quite dead. It rises from the grave every few years for Senate
hearings, as it has done in 1994, 2003 and 2007. And so it is again in 2012. The Obama administration is pushing for Senate action on the treaty, and Sen. John Kerry, D-Mass., is
currently scheduling a series of hearings to extol the purported benefits of LOST, the first of which is set for May 23. Of course, the vampire must feed, and its sustenance is American dollars,
sucked out of the U.S. Treasury by a provision of LOST known as Article 82. If the U.S. joins LOST, it will be required by Article 82 to forfeit royalties generated from oil and gas development on
the continental shelf beyond 200 nautical miles — an area known as the "extended continental shelf" (ECS).
Currently, oil companies pay 100 percent of
royalties generated from such development to the U.S. Treasury based on the value of oil and natural gas extracted from the Gulf of Mexico and in the Arctic Ocean.
The Treasury retains a part of those royalties, and the remainder is divided between Gulf states and the National Historic Preservation Fund. But under LOST, the United
States would be forced to transfer a part of that revenue to the International Seabed Authority, a new international bureaucracy
created by the treaty and based in Jamaica. Voila! What was once income paid into the Treasury for the benefit of the American people is transformed into "international royalties" by LOST. To
borrow a phrase from former presidential candidate Ross Perot, that "giant sucking sound" you hear is American dollars heading from Washington to Kingston. How much blood, ahem, money
are we talking about? While it's difficult to estimate the total value of all the oil and gas on the vast areas of U.S. ECS, an interagency study group known as the Extended Continental Shelf Task
Force estimates that the ECS resources "may be worth many billions, if not trillions of dollars." The royalties that
the American people stand to lose are obviously significant. Where would all of these American dollars go? Well, LOST directs that the revenue be
distributed to "developing States" (such as Somalia, Burma ... you get the picture) and "peoples who have not attained full independence" (such as the Palestinian Liberation Organization ...
hey, don't they sponsor terrorism?). The assembly — the "supreme organ" of the International Seabed Authority in which the United States has a single vote to cast — has the final say
regarding the distribution of America's transmogrified "international" royalties. The assembly may vote to distribute royalties to undemocratic, despotic or brutal governments in Belarus,
China or Zimbabwe — all members of LOST. Perhaps those dollars will go to regimes that are merely corrupt; 13 of the world's 20 most corrupt nations, according to Transparency
International, are parties to LOST. Even Cuba and Sudan, both considered state sponsors of terrorism, could receive dollars fresh from the U.S. Treasury. Unfortunately no one will hear about
Article 82 at the May 23rd hearing. That's because Sen. Kerry is permitting testimony only from witnesses who already favor LOST: Secretary of State Hillary Clinton, Secretary of Defense Leon
Panetta, and Joint Chiefs Chairman Martin Dempsey. No Abraham Van Helsing, ahem, opposition witnesses have been invited to testify. After all, it is in the interests of those who favor U.S.
membership in LOST that the treaty not be exposed to direct sunlight.
This is absurd – Supreme Court decisions don’t send an international signal – no cohesion
Stark 2 (Barbara, Visiting Professor of Law – Hofstra Law School, “Violations of Human Dignity and
Postmodern International Law, Yale Journal of International Law, 27 Yale J. Int'l L. 315, Summer,
Lexis)//twonily
Unlike domestic law, international law remains fragmentary: there is no Supreme Court to reconcile
warring districts, no legislature to fill in doctrinal gaps. Indeed, international "law-making" is often so
contentious that no law is made at all; in many areas there are more gaps than law. International law is
unapologetically "discontinuous"; the decisions of the International Court of Justice have no
precedential value, and those of the International Criminal Tribunal for the Former Yugoslavia (ICTY) and the International Criminal
Tribunal for Rwanda (ICTR) are similarly ad hoc. Treaty law applies only to the specific subject the particular treaty addresses and is binding
only on the parties to the treaty. While customary international law ("CIL") applies more broadly, states may persistently dissent from CIL and
exempt themselves from its coverage. Many of the broad general principles that comprise CIL, moreover, such as the duty to avoid harm to
neighboring states, prove difficult to apply in specific cases.
CIL fails – vagueness
McGinnis 3 (John, Prof of Law – Northwestern, Fall, 44 Va. J. Int'l L. 229, Lexis)//twonily
This essay is divided into two parts. First, I compare and contrast the process for generating global multilateral treaties and for customary international law. I
suggest that provisions of multilateral agreements should be followed even if they conflict with customary international law.n9 Multilateral agreements are likely to
produce more legitimate and beneficent norms than customary international law. The process of discovering customary
international law is
fraught with difficulty and uncertainty and tends to result in principles with vague and uncertain contours. Both
its process of derivation and its vagueness tend to undermine its legitimacy and claim to beneficence. In contrast, multilateral
agreements are generated by a process not unlike a supermajoritarian legislative process. Like legislatures, multilateral bargaining sessions among nation-states can
generate codes of conduct rather than vague principles.
Supreme Court incorporation of international law now
Spiro 9 (Peter, Professor of Law – Temple University, “Wishing International Law Away”, Yale Law
Journal, 9-29, http://www.yalelawjournal.org/content/view/821/20/)//twonily
As for courts, they are more evidently recognizing international law’s consequence. Though not subject to direct
leveraging by international actors, the courts have long been sensitive to international norms, even to the end of
diluting constitutional rights. In recent years, federal judges have been more directly socialized to the
reality of international law with the emergence of an international community of courts. International
human rights practice was decisive in the Supreme Court’s invalidation of the death penalty against
juvenile offenders in Roper v. Simmons. It was also an important atmospheric in the detainee cases. In none of
those decisions did international law supply the primary analytical hook. Once again, however, that fact does little to defeat the broader idea
that judges are increasingly sensitive to international law. As with Congress and the Law of Nations Clause, Paulsen enables the courts as
players in matters relating to international law and foreign relations by dismissing the political question doctrine. That position makes sense as
interstate relations become more stable, but it also removes an important barrier to the assimilation of international norms. Deprived of a
jurisdictional shield, the anti-internationalists will inevitably suffer greater losses as courts add international law to their decisional armory.
International law doesn’t deter conflict
Wippman 96 (David, Associate Professor – Cornell Law School, Columbia Human Rights Law Review, 27
Colum. Human Rights L. Rev. 435, Spring, Lexis)//twonily
What international law has long attempted to prohibit, or at least to regulate, is foreign involvement in
internal conflict. Foreign [*436] participation in an internal conflict heightens the risk that the conflict will
spread to other states and transform an internal struggle into an interstate war. In addition, foreign involvement may deny
the people of the affected state the right to determine their own political future. As a result, foreign involvement in internal conflicts often
undermines two of the principal goals of the international legal order: the containment of conflict 4 and the preservation of the internal
autonomy of each state. 5 Accordingly, contemporary international law is formally non-interventionist: no state is supposed to interfere in civil strife in another state. 6 Nonetheless, foreign
intervention in internal conflicts is more the rule than the exception. 7 In the past, foreign intervention
consisted almost exclusively of unilateral acts by individual states. During the Cold War, political
polarization between East and West made it virtually impossible to achieve the consensus necessary to
support collective interventions. With the end of the Cold War, however, collective interventions have
become more common. When individual states intervene unilaterally in internal conflicts, they typically seek to justify their involvement under legal principles deemed
consistent with, or in some cases, deemed more important than, the principle of non-intervention. In some cases, states rely on consent of the affected state, on the theory that the principle
of non-intervention only bars conduct that amounts to "dictatorial interference" in a state's internal affairs. 8 States also frequently justify intervention as necessary to insulate a state from the
effects of another state's prior, illegal intervention, or as necessary to defend a state from an illegal external attack. 9 On occasion, states rely on international
human rights norms or democratic principles to justify their support for one faction or another in a
particular conflict.
Double bind—either nodule mining will be safe or public protests will kill it
Santo et al 13 – Nat. Univ. Singapore AND Univ. West. Australia AND Bharati Shipyard Ltd AND Univ.
Southampton AND Nanyang Technological Univ. (H. Santo, P. Hu, B. Agarwal, M. Placidi, J. Zhou, June
9-14, 2013, “A Proposed Concept Design Solution Towards Full-Scale Manganese Nodule Recovery,”
http://proceedings.asmedigitalcollection.asme.org/proceeding.aspx?articleid=1786265)//twonily
This paper, a product of an intensive eight-week Lloyd’s Register Educational Trust (LRET) Collegium held during July – September 2012 in Southampton, UK,
presents an innovative engineering system concept design for manganese nodule recovery. Issues
associated with environmental impacts,
such as insufficient or lack of transparent impact studies of any potential full-scale seabed mining, are identified as the key obstacles which
could lead to public protest, thus preventing the mining project from taking place. Hence, the proposed system
introduces an environmentally friendly solution with the innovative concept of a black box, which performs in-situ nodule-sediment separation and waste discharge,
and allows recirculation of waste water. The use of a
modularised mining system with small, active hydraulic, crawler-type collectors is
proposed to minimise environmental footprint and increase system redundancy. This yields a comparable estimated
sediment-to-dry nodule ratio with previous studies in sediment plume impact assessment. The proposed system is a big leap
towards a more environmentally friendly solution
for achieving (the first) full-scale manganese nodule recovery. Together with the
intended small production scale of 0.5 million dry nodules per year, the proposed system can also be considered as a full-scale experiment or field measurement: a
platform for full-scale research concurrently, particularly in the area of environmental impacts. The proposed system, intended to spur more interest in
environmental impact studies and to be more transparent to the public, could benefit both industry and research institutes, for the benefit of everybody.
No impact
Amos and Roels 8 – research associate @ Univ. Texas Marine Science Institute AND Professor and
Director of the Port Aransas Marine Laboratory @ Univ. Texas Marine Science Institute (Anthony F.
Amos, Oswald A. Roels, 5/2/8, “Environmental aspects of
manganese nodule mining,”
http://www.sciencedirect.com/science/article/pii/S0422989408710274)//twonily
The disturbances caused by deep-ocean mining of ferro-manganese nodules are generally extremely small
compared to natural , large-scale processes of oceanic circulation and sediment redistribution by turbidity currents. If mining
effluents are discharged at the surface of the ocean they will remain within the upper mixed layer but will mix rapidly with
the surrounding waters and be transported out of the mining area by prevailing equatorial surface currents. Such
currents will transport the effluent far from any continental regions, dilution of the effluent will be
considerable, and the ultimate concentrations of dissolved effluent constituents will be barely
measurable. Furthermore, some of the predicted changes can only be verified by monitoring actual mining operations or pilot tests so
that processes potentially hazardous to the marine environment can be detected and recommendations made for
their elimination.
Their evidence is about nodule mining – it won’t happen – crusts are more economical and solve the
impact
ISA 8, (International Seabed Authority, 3/8/2008, “Cobalt-Rich Crusts,”
http://www.isa.org.jm/files/documents/EN/Brochures/ENG9.pdf)//twonily
Besides the high cobalt content compared to abyssal manganese nodules, exploitation of crusts is viewed as advantageous
because high-quality crusts occur within the exclusive economic zones of island nations, in shallower
waters closer to shore facilities. Recognition in the late 1970s of the economic potential of crusts was enhanced by
the fact that the price of cobalt skyrocketed in 1978 as the result of civil strife in the mining areas of Zaire (now the Democratic
Republic of the Congo), then the world’s largest producer of cobalt. By 2005, the Democratic Republic of Congo, Zambia and Canada together
accounted for more than half of world mine production of about 53,500 tonnes. Historically the price
of cobalt has tended to
be volatile: during the 1979 disturbances in Shaba Province of the former Zaire, the price quadrupled within a matter of
weeks. At that time Zaire provided almost half of world supply. Output is now much less geographically concentrated, but demand tends
to be price-inelastic in the short to medium term. After reaching peak price in 1995, the price of cobalt slumped steadily
and came down to 1990 levels in 2002-03. However, over the past four years there has been a sharp increase in cobalt
prices, which stand now at around 54.5 $/kg. If demand continues to increase, or if a supply problem is perceived,
the price may increase further over a relatively short period. Since 2001 there has been steady increase in demand for
both copper and cobalt metal which is evident from the increased production. In 2001, the world cobalt metal production was 38,000 tonnes
whereas in 2005 it was 53,500 tonnes. Demand
for one or more of the many metals concentrated in crusts, in
addition to that of cobalt, may ultimately be the driving force for seabed mining. Despite the
economic and technological uncertainties, at least three companies have expressed interest in crust
mining. Several evolving circumstances may change the economic environment and promote mining in the
oceans - for example, land-use priorities, fresh-water issues and environmental concerns in areas of land-based mines. There is a
growing recognition that cobalt-rich crusts are an important potential resource. Accordingly, it is
necessary to fill the information gap concerning various aspects of crust mining through research,
exploration and technological development.
2nc – xt: status quo solves
Spatial planning solves the impact
Schlacher et al 13, (Thomas A. Schlacher, Faculty of Science, Health and Education, University of
Sunshine Coast, Amy R. Baco, Department of Oceanography at Florida State University, Ashley A.
Rowden, National Institute for Water and Atmospheric Research, Timothy D. O’Hara, Museum Victoria,
Malcolm R. Clark, National Institute for Water and Atmospheric Research, Chris Kelley, Hawaii Undersea
Research Laboratory, and John F. Dower, Department of Biology, University of Victoria, 2013,
“Seamount benthos in a cobalt-rich crust region of the central Pacific: conservation challenges for future
seabed mining,” http://www.isa.org.jm/files/documents/EN/Press/Schlache-etal2013.pdf)//twonily
The International Seabed Authority (ISA) is progressing plans to provide guidance to future mining contractors
through the development of environmental guidelines (ISA, 2007), and has progressed regulations for prospecting and
exploration (ISA, 2012). As part of its responsibility under UNCLOS, the ISA is charged with ensuring effective protection of
the marine environment from the effects of mining activities, and the protection and conservation of the flora and fauna
of the marine environment. In recently developing an Environmental Management Plan for the Clarion-Clipperton
Zone (an area of manganese nodule abundance in the eastern Pacific), it was acknowledged that best-practice generally involves
the use of spatial management tools, including the protection of areas thought to be representative of
the full range of habitats, biodiversity and ecosystem structure and function within the management
area (ISA, 2011). The cobalt-rich crust regulations further emphasize minimizing impacts on vulnerable marine
ecosystems, in particular those associated with seamounts and cold-water corals.
**fossil fuels
1nc – fossil fuels
Can’t solve warming – nuke power causes a net increase in emissions
Kivi 14 – contributor @ USAToday specializing in nuclear energy and habitat conservation (Rose Kivi,
2014, “How does Nuclear Energy Affect the Environment?” http://www.ehow.com/how-does_4566966_nuclear-energy-affect-environment.html)//twonily
Introduction Nuclear energy has been proposed as an answer to the need for a clean energy source as opposed to
CO2-producing plants. Nuclear energy is not necessarily a clean energy source. The effects nuclear energy has
on the environment pose serious concerns that need to be considered, especially before the decision to build additional nuclear
power plants is made. Carbon Dioxide Nuclear power has been called a clean source of energy because the power plants
do not release carbon dioxide. While this is true, it is deceiving. Nuclear power plants may not emit carbon dioxide
during operation, but high amounts of carbon dioxide are emitted in activities related to building and
running the plants. Nuclear power plants use uranium as fuel. The process of mining uranium releases high
amounts of carbon dioxide into the environment. Carbon dioxide is also released into the environment when new nuclear
power plants are built. Finally, the transport of radioactive waste also causes carbon dioxide emissions.
Fossil fuel usage is inevitable—empirics prove previous transitions have failed
Lorenzini, 11/27/13—a retired PacifiCorp executive and former general manager of contract operations
at DOE’s nuclear defense facilities (Paul, "A Second Look at Nuclear Power", Issues in Science and
Technology, issues.org/21-3/lorenzini/)//twonily
Because the market has failed, efforts are now being made to force a shift to renewable energy through
legislated mandates coupled with direct subsidies. The European Union has set an aggressive target of 22 percent of
electricity from renewable sources by 2020. Many countries, including Denmark and the United Kingdom, have enacted targets into law. A
dozen U.S. states have followed suit, legislating goals for renewable supplies, with penalties if they are not achieved.
It is doubtful that
these mandates will be fully successful. Unless the penalties are very high, it is often cheaper to pay the
penalty than the high price of renewable energy. But even if they succeed, the energy future would
not change dramatically. The IEA forecasts that, even with such mandates, more than 60 percent of all new
energy will still come from fossil fuels during the 30-year forecast period, and such fuels will still supply roughly 80
percent of all energy in the final year. And this projection applies only to the developed countries, where renewable energy
mandates have been popularized. Globally, 87 percent of incremental new energy will still come from fossil fuels
during the period, and coal consumption is expected to increase by 42 percent. The grim conclusion is
unavoidable. Both in the United States and around the globe, our hope that renewable energy will displace fossil
fuels has left us with a de facto fossil fuel energy policy.
Even if anti-prolif tech exists, it’s not being implemented
Lowe 5 – Emeritus Professor of Science, Technology, and Society @ Griffith University, President of the
Australian Conservation Foundation (Prof. Ian Lowe, September 2005, “Nuclear Power: No Solution to
Climate Change,”
http://www.acfonline.org.au/sites/default/files/resources/Nuclear_Power_No_Solution_to_Climate_Ch
ange.pdf)//twonily
The weapons proliferation problem cannot be satisfactorily resolved. Proliferation-resistant technologies
are the subject of much discussion and some research (a number of examples are discussed in Australian Safeguards and Non-Proliferation Office, n.d.). However, there is little reason to believe that minimising proliferation risks will be a priority
in the evolution of nuclear power technology. The growing stockpiles of unirradiated and separated plutonium provide
compelling evidence of the low priority given to non-proliferation initiatives. Further, a number of the
‘advanced’ reactor concepts being studied involve the large-scale use of plutonium and the operation of fast breeder
reactors (Burnie, 2005). The only way to avoid reliance on enrichment plants (with the capacity to produce HEU) is to use non-enriched uranium
fuel, which maximises production of the other main alternative ingredient for (fission) nuclear weapons – plutonium. On the other hand, a
complete cessation of reprocessing in favour of a once-through cycle would represent a major step forward in
relation to overall proliferation risks, but it would require greater uranium resources and potentially lead
to the expansion and spread of enrichment technology. (Feiveson, 2001.) Technical developments in the field of
enrichment technology – such as the development of laser enrichment technology by the Silex company at Lucas Heights in Australia – could
worsen the situation. Silex will potentially provide proliferators with an ideal enrichment capability as it is
expected to have relatively low capital cost and low power consumption, and it is based on relatively simple and
practical separation modules. (Greenpeace, 2004; Boureston and Ferguson, 2005.) back to table of contents 7 Dr. Tilman Ruff, president-elect
of the Medical Association for the Prevention of War, has called on the Australian government to end support for enrichment research in
Australia: “The technology which is being developed here in a publicly funded facility ... is of profound concern. What this Silex technology
brings is an easier, smaller, simpler, cheaper and more concealable way to enrich uranium. That’s of grave concern from a proliferation point of
view.” (Quoted in Greenpeace, 2005.)
Ocean acidification is empirically denied and good – internal link turns marine biodiversity and
overfishing
Eschenbach 11 - B.A. Psychology, Sonoma State University (Willis, “The Ocean Is Not Getting Acidified”
December 27, 2011 http://wattsupwiththat.com/2011/12/27/the-ocean-is-not-getting-acidified/)//gingE
There’s an interesting study out on the natural pH changes in the ocean. I discussed some of these pH changes a year
ago in my post “The Electric Oceanic Acid Test“. Before getting to the new study, let me say a couple of things about pH.¶ The pH scale
measures from zero to fourteen. Seven is neutral, because it is the pH of pure water. Below seven is acidic. Above seven is
basic. This is somewhat inaccurately but commonly called “alkaline”. Milk is slightly acidic. Baking soda is slightly basic (alkaline).¶ The first thing
of note regarding pH is that alkalinity is harder on living things than is acidity. Both are corrosive of living tissue, but alkalinity has a stronger
effect. It seems counterintuitive, but it’s true. For example, almost all of our foods are acidic. We eat things with a pH of 2, five units below the
neutral reading of 7 … but nothing with a corresponding pH of 12, five units above neutral. The most alkaline foods are eggs (pH up to 8) and
dates and crackers (pH up to 8.5). Heck, our stomach acid has a pH of 1.5 to 3.0, and our bodies don’t mind that at all … but don’t try to drink
Drano, the lye will destroy your stomach.¶ That’s why when you want to get rid of an inconvenient body, you put lye on it, not acid. It’s also
why ocean fish
often have a thick mucus layer over their skin, inter alia to protect them from the
alkalinity. Acidity is no problem for life compared to alkalinity.¶ Next, a question of terminology. When a base is
combined with an acid, for example putting baking soda on spilled car battery acid, that is called “neutralizing” the acid. This is because it is
moving towards neutral. Yes, it increases the pH, but despite that, it is called “neutralizing”, not “alkalizing”.¶ This same terminology is used
when measuring pH. In a process called “titration”, you measure how much acid it takes to neutralize an unknown basic solution. If you add too
much acid, the pH drops below 7.0 and the mixture becomes acidic. Add too little acid, and the mixture remains basic. Your goal in titration is
to add just enough acid to neutralize the basic solution. Then you can tell how alkaline it was, by the amount of acid that it took to neutralize
the basic solution.¶ Similarly, when rainwater (slightly acidic) falls on the ocean (slightly basic), it has a neutralizing effect on the slightly alkaline
ocean. Rainwater slightly decreases the pH of the ocean. Despite that, we don’t normally say that rainwater is “acidifying” the ocean. Instead,
because it is moving the ocean towards neutral, we say it is neutralizing the ocean.¶ The problem with using the term “acidify” for what
rainwater does to the ocean is that people misunderstand what is happening. Sure, a hard-core scientist hearing “acidify” might think
“decreasing pH”. But most people think “Ooooh, acid, bad, burns the skin.” It
leads people to say things like the following
gem that I came across yesterday:¶ Rapid increases in CO2 (such as today) overload the system, causing
surface waters to become corrosive.¶ In reality, it’s quite the opposite. The increase in CO2 is making the
ocean, not more corrosive, but more neutral. Since both alkalinity and acidity corrode things, the truth is that
rainwater (or more CO2) will make the ocean slightly less corrosive, by marginally neutralizing its slight
alkalinity. That is the problem with the term “acidify”, and it is why I use and insist on the more accurate term “neutralize”. Using “acidify”,
is both alarmist and incorrect. The ocean is not getting acidified by additional CO2. It is getting neutralized by
additional CO2.¶ With that as prologue, let me go on to discuss the paper on oceanic pH.¶ The paper is called “High-Frequency Dynamics
of Ocean pH: A Multi-Ecosystem Comparison” (hereinafter pH2011). As the name suggests, they took a look at the actual variations of pH in a
host of different parts of the ocean. They show 30-day “snapshots” of a variety of ecosystems. The authors comment:¶ These biome-specific pH
signatures disclose current levels of exposure to both high and low dissolved CO2, often demonstrating that resident organisms are already
experiencing pH regimes that are not predicted until 2100.¶ First, they show the 30-day snapshot of both the open ocean and a deepwater
open ocean reef:¶ I note that even in
the open ocean, the pH is not constant, but varies by a bit over the thirty
days. These changes are quite short, and are likely related to rainfall events during the month. As
mentioned above, these slightly (and temporarily) neutralize the ocean surface, and over time mix in to the lower waters. Over Kingman reef,
there are longer lasting small swings.¶ Compare the two regions shown in Fig. 1 to some other coral reef “snapshots” of thirty days worth of
continuous pH measurements.¶ There are a couple of things of note in Figure 3. First, day-to-night variations in pH are from the CO2 that is
produced by the reef life as a whole. Also, day-to-night swings on the Palmyra reef terrace are about a quarter of a pH unit … which is about
60% more than the projected change from CO2 by the year 2100.¶ Moving on, we have the situation in a couple of upwelling areas off of the
California coast:¶ Here we see even greater swings of pH, much larger than the possible predicted change from CO2. Remember that this is only
over the period of a month, so there will likely be an annual component to the variation as well.¶ Again we see a variety of swings of pH, both
long- and short-term. Inshore, we find even larger swings, as shown in Figure 6.¶ Again we see large pH changes in a very short period of time,
both in the estuary and the near-shore area.¶ My conclusions from all of this?¶ First, there are a number of places in the ocean where the pH
swings are both rapid and large. The life in those parts of the ocean doesn’t seem to be bothered by either the size or the speed these swings.¶
Second, the size of the possible pH change by 2100 is not large compared to the natural swings.¶ Third, due to a host of buffering mechanisms
in the ocean, the possible pH change by 2100 may be smaller, but is unlikely to be larger, than the forecast estimate shown above.¶ Fourth, I
would be very surprised if we’re still burning much fossil fuel ninety years from now. Possible, but doubtful in my book. So from this effect as
well, the change in oceanic pH may well be less than shown above.¶ Fifth, as the authors commented, some parts of the ocean are already
experiencing conditions that were not forecast to arrive until 2100 … and are doing so with no ill effects.¶ As
a result, I’m not
particularly concerned about a small change in oceanic pH from the change in atmospheric CO2. The
ocean will adapt, some creatures’ ranges will change a bit, some species will be slightly advantaged
and others slightly disadvantaged. But CO2 has been high before this.
Overall, making the ocean slightly more
neutral will likely be beneficial to life, which doesn’t like alkalinity but doesn’t mind acidity at all.
The impact of climate change is hype
IBD 14 (5/13/2014, Investor’s Business Daily, “Obama Climate Report: Apocalypse Not,” Factiva, JMP)
Climate: Not since Jimmy Carter falsely spooked Americans about overpopulation, the world running out of
food, water and energy, and worsening pollution, has a president been so filled with doom and gloom as
this one. Last week's White House report on climate change was a primal scream to alarm Americans
into action to save the earth from a literal meltdown. Maybe we should call President Obama the
Fearmonger in Chief. While scientists can argue until the cows come home about what will happen in the future with the planet's
climate, we do have scientific records on what's already happened. Obama moans that the devastation from climate change is already here as
more severe weather events threaten to imperil our very survival. But, according to the government's own records — which
presumably the White House can get — severe weather events are no more likely now than they were 50 or 100
years ago and the losses of lives and property are much less devastating. Here is what government data reports
and top scientists tell us about extreme climate conditions: • Hurricanes: The century-long trend in Hurricanes is slightly
down, not up. According to the National Hurricane Center, in 2013, "There were no major hurricanes in the North Atlantic Basin for the
first time since 1994. And the number of hurricanes this year was the lowest since 1982." According to Dr. Ryan Maue at Weather Bell Analytics,
"We are currently in the longest period since the Civil War Era without a major hurricane strike in the U.S. (i.e., category 3, 4 or 5)" • Tornadoes:
Don't worry, Kansas. The
National Oceanic and Atmospheric Administration says there has been no change in severe
tornado activity. "There has been little trend in the frequency of the stronger tornadoes over the past 55 years." • Extreme heat and cold
temperatures: NOAA's U.S. Climate Extremes Index of unusually hot or cold temperatures finds that over the last 10 years, five years have been
below the historical mean and five above the mean. • Severe drought/extreme moisture: While higher than average portions of the country
were subjected to extreme drought/moisture in the last few years, the 1930's, 40's and 50's were more extreme in this regard. In fact, over the
last 10 years, four years have been below the average and six above the average. • Cyclones: Maue reports: "the global frequency of tropical
cyclones has reached a historical low." • Floods: Dr. Roger Pielke Jr., past
chairman of the American Meteorological
Society Committee on Weather Forecasting and Analysis, reports, "floods have not increased in the U.S. in frequency or
intensity since at least 1950. Flood losses as a percentage of U.S. GDP have dropped by about 75% since 1940." • Warming: Even NOAA
admits a "lack of significant warming at the Earth's surface in the past decade" and a pause "in global
warming observed since 2000." Specifically, NOAA last year stated, "since the turn of the century, however, the change in Earth's
global mean surface temperature has been close to zero." Pielke sums up: "There is no evidence that disasters are getting
worse because of climate change. ... It is misleading, and just plain incorrect, to claim that disasters
associated with hurricanes, tornadoes, floods or droughts have increased on climate time scales either
in the U.S. or globally." One big change between today and 100 years ago is that humans are much more capable of dealing with
hurricanes and earthquakes and other acts of God. Homes and buildings are better built to withstand severe storms and alert systems are much
more accurate to warn people of the coming storms. As a result, globally, weather-related losses have actually decreased by about 25% as a
proportion of GDP since 1990. The liberal hubris is that government can do anything to change the earth's climate or prevent the next big
hurricane, earthquake or monsoon. These are the people in Washington who can't run a website, can't deliver the mail and can't balance a
budget. But they are going to prevent droughts and forest fires. The
President's doomsday claims last week served mostly to
undermine the alarmists' case for radical action on climate change. Truth always seems to be the first
casualty in this debate. This is the tactic of tyrants. Americans are wise to be wary about giving up our basic freedoms and
lowering our standard of living to combat an exaggerated crisis.
Warming not real – no scientific consensus – fluctuation within natural range
Kelly 14 – 5/29/14 (Jack Kelly, “The facts don’t add up for human-caused global warming,” Pittsburgh Post-Gazette,
http://www.post-gazette.com/opinion/jack-kelly/2014/05/29/The-facts-don-t-add-up-for-human-caused-global-warming/stories/201405290275) //JL
The first five months of 2014 have been the coldest since the National Weather Service began keeping
records in 1888. If “climate change” alarmists got out more, they might have noticed. Between 1979 —
when weather satellites started measuring temperatures in the lower troposphere — and 1997, they rose about 1.1 degrees Celsius (1.98
degrees Fahrenheit). Temperatures stopped rising then, and have fallen since 2012. The “pause” in warming (212 months) is now longer
than the warming trend was (211 months). The earth has warmed about 16 degrees F since the last ice age. The net
increase since 1979 — 0.19 degrees C (0.34 F) — is well within the range of natural fluctuation. So why, as
President Barack Obama says so often, do 97 percent of scientists agree climate change is “real, man-made, and
dangerous?” They don’t. This bogus stat is derived from two questions University of Illinois researchers asked in a survey of earth scientists in 2008: 1.
“When compared with pre-1800s levels, do you think that mean global temperatures have generally risen, fallen, or remained relatively constant?” 2. “Do you think
human activity is a significant contributing factor in changing mean global temperatures?” The researchers culled from 3,146 responses those of 79 climate
scientists who’d been published in peer reviewed journals. Seventy-six answered “risen” to the first question; 75 “yes” to the second. Temperatures have risen since
the Little Ice Age ended around 1870, skeptics agree. Most think the activities of humans have some effect on them. The
key question is whether
that effect is big enough to do harm, but that’s not what the scientists were asked. John Cook, climate
communication fellow (a publicist, not a climate scientist) at the University of Queensland in Australia and eight colleagues examined abstracts of 11,944 articles on
climate published between 1991 and 2011. “Among abstracts expressing a position . . . 97.1
percent endorsed the consensus position
that humans are causing global warming,” they concluded in a paper last May. Which is as meaningless as the “consensus” in the
two-question survey, for the same reason. Skeptics agree humans cause some warming. Mr. Cook et. al. included papers by prominent skeptics Willie Soon, Craig
Idso, Nicola Scafetta, Nir Shaviv, Nils-Axel Morner and Alan Carlin in their 97.1 percent “consensus.” Only 41 papers (0.3 percent) explicitly state support for Mr.
Cook’s assertion that humans have caused most of the warming since 1950, former Delaware state climatologist David Legates and three colleagues found in a peer
reviewed study last September. “It is astonishing that any journal could have published a paper claiming a 97 percent climate consensus when on the authors’ own
analysis the true consensus was well below 1 percent,” Mr. Legates said. Carbon
dioxide in the atmosphere has increased from
about 285 parts per million 250 years ago to about 380 ppm today. CO2 is a “greenhouse” gas -- it holds
heat in the atmosphere -- so if humans are generating more, it should have a warming effect. But
probably not much of one. Greenhouse gases comprise less than 1 percent of the earth’s atmosphere; carbon dioxide is less than 4 percent of
greenhouse gases; 96 percent of CO2 in the atmosphere was put there by Mother Nature. Compared to variations in solar radiation and other natural forces, the
effect of greenhouse gases on climate is trivial. “There
is no convincing scientific evidence that human release of carbon
dioxide, methane, or other greenhouse gases is causing or will, in the foreseeable future, cause
catastrophic heating of the Earth’s atmosphere and disruption of the Earth’s climate,” says a petition
signed by more than 31,000 American scientists in climate-related disciplines. That’s rather more than
79 or 41. There is no scientific consensus on human-caused global warming, and there shouldn’t be. “If it’s
science, it isn’t consensus,” said Mr. Soon, a solar expert at the Harvard-Smithsonian Center for Astrophysics. “If it’s consensus, it isn’t science.” Scientists search for
truth by observation and experimentation, not by taking polls. Consensus is a political concept. The skeptics are true to the scientific method. The abusers of science
are those who politicize it.
Asia pollution offsets any US action – global warming is inevitable
Knappenberger 12 – Mr. Paul Knappenberger is the Assistant Director of the Cato Institute’s Center for
the Study of Science. He holds an M.S. degree in Environmental Sciences (1990) from the University of
Virginia as well as a B.A. degree in Environmental Sciences (1986) from the same institution. His over 20
years of experience as a climate researcher have included 10 years with the Virginia State Climatology
Office and 13 years with New Hope Environmental Services, Inc. June 7th, 2012, "Asian Air Pollution
Warms U.S. More than Our GHG Emissions (More futility for U.S. EPA)"
www.masterresource.org/2012/06/asian-air-pollution-warming/
“The whims of foreign nations, not to mention Mother Nature, can completely offset any climate changes induced by U.S. greenhouse gas
emissions reductions…. So, what’s the point of forcing Americans into different energy choices?”¶ A
new study provides evidence
that air pollution emanating from Asia will warm the U.S. as much or more than warming from U.S.
greenhouse gas (GHG) emissions. The implication? Efforts by the U.S. Environmental Protection Agency
(and otherwise) to mitigate anthropogenic climate change are moot.¶ If the future temperature rise in
the U.S. is subject to the whims of Asian environmental and energy policy, then what sense does it make
for Americans to have their energy choices regulated by efforts aimed at mitigating future temperature
increases across the country—efforts which will have less of an impact on temperatures than the
policies enacted across Asia?¶ Maybe the EPA should reconsider the perceived effectiveness of its
greenhouse gas emission regulations—at least when it comes to impacting temperatures across the U.S.¶ New Study¶ A new
study just published in the scientific journal Geophysical Research Letters is authored by a team led by
Haiyan Teng from the National Center for Atmospheric Research, in Boulder, Colorado. The paper is titled “Potential
Impacts of Asian Carbon Aerosols on Future US Warming.Ӧ Skipping the details of this climate modeling study and cutting to the chase, here is
the abstract of the paper:¶ This
study uses an atmosphere-ocean fully coupled climate model to investigate
possible remote impacts of Asian carbonaceous aerosols on US climate change. We took a 21st century mitigation
scenario as a reference, and carried out three sets of sensitivity experiments in which the prescribed carbonaceous aerosol concentrations over
a selected Asian domain are increased by a factor of two, six, and ten respectively during the period of 2005–2024.¶ The
resulting
enhancement of atmospheric solar absorption (only the direct effect of aerosols is included) over Asia
induces tropospheric heating anomalies that force large-scale circulation changes which, averaged over
the twenty-year period, add as much as an additional 0.4°C warming over the eastern US during
winter and over most of the US during summer. Such remote impacts are confirmed by an atmosphere
stand-alone experiment with specified heating anomalies over Asia that represent the direct effect of
the carbon aerosols.¶ Usually, when considering the climate impact from carbon aerosol emissions (primarily in the form of black
carbon, or soot), the effect is thought to be largely contained to the local or regional scale because the atmospheric
lifetime of these particulates is only on the order of a week (before they are rained out). Since Asia lies on the far side of the Pacific Ocean—a
distance which requires about a week for air masses to navigate—we usually aren’t overly concerned about the quality of Asian air or the
quantity of junk that they emit into it. By the time it gets here, it has largely been naturally scrubbed clean. ¶ But in the Teng et al. study, the
authors find that, according to their climate model, the
local heating of the atmosphere by the Asian carbon aerosols
(which are quite good at absorbing sunlight) can impart changes to the character of the larger-scale
atmospheric circulation patterns. And these changes to the broader atmospheric flow produce an
effect on the weather patterns in the U.S. and thus induce a change in the climate here characterized by “0.4°C [surface air
temperature] warming on average over the eastern US during winter and over almost the entire US during summer” averaged over the 2005–
2024 period.¶ While most of the summer warming doesn’t start to kick in until Asian carbonaceous aerosol emissions are upped in the model to
10 times what they are today, the winter warming over the eastern half of the country is large (several tenths of a °C) even at twice the current
rate of Asian emissions.¶ Now let’s revisit just how much “global warming” that stringent U.S. greenhouse gas emissions reductions may avoid
averaged across the country.¶ In my Master Resource post “Climate Impacts of Waxman-Markey (the IPCC-based arithmetic of no gain)” I
calculated that a
more than 80% reduction of greenhouse gas emissions in the U.S. by the year 2050 would
result in a reduction of global temperatures (from where they otherwise would be) of about 0.05°C.
Since the U.S. is projected to warm slightly more than the global average (land warms faster than the
oceans), a 0.05°C of global temperature reduction probably amounts to about 0.075°C of temperature
“savings” averaged across the U.S., by the year 2050.¶ Comparing the amount of warming in the U.S.
saved by reducing our greenhouse gas emissions by some 80% to the amount of warming added in the
U.S. by increases in Asian black carbon (soot) aerosol emissions (at least according to Teng et al.) and
there is no clear winner. Which points out the anemic effect that U.S. greenhouse gas reductions will
have on the climate of the U.S. and just how easily the whims of foreign nations, not to mention
Mother Nature, can completely offset any climate changes induced by our greenhouse gas emissions
reductions.¶ And even if the traditional form of air pollution (e.g., soot) does not increase across Asia (a slim chance of that), greenhouse
gases emitted there certainly will. For example, at the current growth rate,
new greenhouse gas emissions from China will
completely subsume an 80% reduction in U.S. greenhouse gas emissions in just over a decade. Once again, pointing out that a reduction in domestic greenhouse gases is for naught, at least when it comes to mitigating climate change.¶ So,
what’s the point, really, of forcing Americans into different energy choices? As I have repeatedly pointed out,
nothing we do here
(when it comes to greenhouse gas emissions) will make any difference either domestically, or
globally, when it comes to influences on the climate.
What the powers-that-be behind emissions reduction schemes in
the U.S. are hoping for is that 1) it doesn’t hurt us too much, and 2) that China and other large developing nations will follow our lead.¶ Both
outcomes seem dubious at time scales that make a difference.
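The card's comparison is back-of-envelope arithmetic, sketched below. The 0.05°C global figure, the land-warms-faster scaling (~1.5×), and the 0.4°C Teng et al. number come from the text above; the Chinese emissions baseline and growth rate are purely illustrative round-number assumptions, since the card gives no specific figures for them.

```python
# Back-of-envelope arithmetic behind the card's comparison.
# Values marked "card" come from the quoted text; the rest are
# illustrative assumptions for this sketch only.

GLOBAL_SAVINGS_C = 0.05      # card: global cooling from an 80% US cut by 2050
US_LAND_SCALING = 1.5        # card: US warms ~1.5x the global average
ASIAN_SOOT_WARMING_C = 0.4   # card: Teng et al. eastern-US winter warming

us_savings = GLOBAL_SAVINGS_C * US_LAND_SCALING
print(f"US temperature 'savings': {us_savings:.3f} C")   # 0.075 C
print(f"Asian soot warming:       {ASIAN_SOOT_WARMING_C} C")
print(f"Soot warming exceeds savings by {ASIAN_SOOT_WARMING_C / us_savings:.1f}x")

# How fast could growth in Chinese emissions offset an 80% US cut?
# Baseline and growth rate are hypothetical round numbers.
us_cut_gt = 0.8 * 5.5          # assumption: US emits ~5.5 GtCO2/yr
china_gt, growth = 8.0, 0.08   # assumption: ~8 GtCO2/yr, growing ~8%/yr
years, added = 0, 0.0
while added < us_cut_gt:
    years += 1
    # growth in China's *annual* emissions relative to today
    added = china_gt * ((1 + growth) ** years - 1)
print(f"Years for Chinese growth to offset the US cut: ~{years}")
```

With these placeholder inputs the offset takes well under a decade, consistent with the card's "just over a decade" claim for somewhat slower growth rates.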
Squo solves warming – volcanoes
Santer 14 – PhD in Climatology, climate researcher at Lawrence Livermore National Laboratory and
former researcher at the University of East Anglia's Climatic Research Unit
(Benjamin, “Volcanic contribution to decadal changes in tropospheric temperature,” Nature Geoscience,
doi:10.1038/ngeo2098)//BB
Despite continued growth in atmospheric levels of greenhouse gases, global mean surface and
tropospheric temperatures have shown slower warming
since 1998 than previously1, 2, 3, 4, 5. Possible
explanations for the slow-down include internal climate variability3, 4, 6, 7, external cooling influences 1, 2, 4, 8, 9,
10, 11 and observational errors12, 13. Several recent modelling studies have examined the contribution of early
twenty-first-century volcanic eruptions1, 2, 4, 8 to the muted surface warming. Here we present a detailed analysis of the impact
of recent volcanic forcing on tropospheric temperature, based on observations as well as climate model simulations. We identify
statistically significant correlations between observations of stratospheric aerosol optical depth and satellite-based estimates of both tropospheric temperature and short-wave fluxes at the top of the atmosphere. We show that climate
model simulations without the effects of early twenty-first-century volcanic eruptions overestimate the
tropospheric warming
observed since 1998. In two simulations with more realistic volcanic influences following the 1991 Pinatubo
eruption, differences between simulated and observed tropospheric temperature trends over the period 1998 to 2012 are up to 15% smaller,
with large uncertainties in the magnitude of the effect. To reduce these uncertainties, better observations of eruption-specific properties of
volcanic aerosols are needed, as well as improved representation of these eruption-specific properties in climate model simulations.
U.S. climate leadership ineffective – can’t solve poor countries and developed
countries refuse to curb emissions.
Loris and Schaefer 13 – Loris is Herbert and Joyce Morgan Fellow in the Thomas A. Roe Institute for Economic Policy Studies and Schaefer is Jay Kingham Fellow in International Regulatory Affairs in the Margaret Thatcher Center for Freedom, a division of the Kathryn and Shelby Cullom Davis Institute for International Studies (Nicolas Loris and Brett D. Schaefer, 1/24/13, “Climate Change: How the U.S. should lead,” http://www.heritage.org/research/reports/2013/01/climate-change-how-the-usshould-lead) //JL
During his 2013 inaugural address, President Obama told Americans that the United States “will respond to the threat of climate change” and will take the lead for
other countries to follow suit. Even
assuming the accuracy of climate change models, unilateral action by the U.S. is
a costly symbolic gesture that would do nothing to successfully resolve climate challenges. However, there are
sufficient questions about the underlying climate change assumptions, evidence and predictions to justify caution before implementing a policy response.
Moreover, without concerted commitments by the major economies, including those in the developing
world, Congress would be foolish to impose unilateral restrictions on the U.S. economy and the past
four years have demonstrated conclusively that there is no international consensus for action.
Nonetheless, the President is instead pursuing costly regulatory actions to unilaterally reduce U.S.
greenhouse gas emissions, which impose vast costs on Americans and slow economic growth without
significantly reducing global greenhouse gas emissions. Congress should demonstrate leadership by removing costly backdoor
climate regulations imposed by this Administration. Failed Legislative Attempts with Even Dimmer Prospects Now Despite having a Democrat-controlled Congress in
his first two years, President Obama failed to push through cap-and-trade legislation that would have adopted a goal of reducing carbon dioxide and other greenhouse gas emissions to 83 percent below 2005 levels by the year 2050. The reason for this failure was clear: The proposed law would greatly harm the U.S. economy and undermine job growth. Indeed, The Heritage Foundation’s Center for Data Analysis estimated that the cap-and-trade bill proposed by Senators Barbara Boxer (D–CA) and newly nominated Secretary of State John Kerry (D–MA) would have resulted in income losses of nearly $10 trillion and over 2.5 million lost jobs.[1] Reviving cap and trade or implementing a similarly costly carbon tax[2] is even less likely today than four years ago, as the House of Representatives is controlled by Republicans, many of whom are unconvinced of the magnitude of climate change or of the efficacy of unilateral action. Similarly, interfering in the market through preferential policies and subsidies financed by
taxpayer dollars designed to support green energy projects—adding to the tens of billions of dollars passed in the 2009 American Recovery and Reinvestment Act—
is unlikely to elicit bipartisan support when Congress is grappling with how to reduce spending. A Failed International Approach The
past four years
have seen successive annual U.N. conferences (Copenhagen in 2009, Cancun in 2010, Durban in 2011,
and Doha in 2012) frantically trying to reach agreement among nearly 200 countries on a successor to
the Kyoto Protocol. In essence, these conferences have succeeded only in wresting vague pledges from
developed countries to reduce emissions, contribute funds to help developing countries adapt to climate change, and meet again to try to negotiate a
binding treaty by 2015. The problem is that the basic approach is unworkable. International negotiations have centered on placing the economic
burden of addressing climate change on a few dozen developed countries while asking nothing from more than 150 developing countries. But the primary
source of greenhouse gas emissions is increasingly the developing world. Any approach to effectively
address increasing emissions of greenhouse gases must capture emissions from developed and
developing countries. This notion was the central feature of the 1997 Byrd–Hagel Resolution, which unanimously passed the Senate, establishing
conditions for the U.S. becoming a signatory to the Kyoto Protocol and remains the primary reason why the U.S. never ratified that treaty. But developing
countries, primarily India and China, have made it quite clear that they have no appetite to slow
economic growth or curb use of conventional fuels to control emissions. For this reason, Canada, Japan,
and Russia refused to sign onto a new agreement committing them to emissions reductions unless
major developing country emitters were also included. Until and unless this issue is resolved, the U.S.
would be foolish to consider unilateral restrictions on the U.S. economy that, in the end, would be
merely symbolic without significant effect on global emissions reductions. When You Can’t Legislate, Regulate The
Environmental Protection Agency (EPA), the Department of Interior, and the Department of Labor are all promulgating stringent emission standards for new power
plants that would effectively prohibit construction of new coal-fired electricity generating capacity unless it is equipped with carbon-capture technology—a
prohibitively costly technological requirement. The EPA has also finalized new federal air quality standards for hydraulically fractured wells, a critical extraction
process necessary to help tap vast supplies of oil and natural gas. The EPA contends that the regulations are necessary to reduce emissions of volatile organic
compounds and hazardous air pollutants. However, the EPA quantifies only environmental benefits from methane, clearly indicating this rule was more about
regulating a greenhouse gas. Furthermore, although the Obama Administration was not the first to implement federal fuel efficiency standards, the President’s EPA
and Department of Transportation required the first-ever greenhouse gas emissions standards for vehicles. The Obama Administration recently finalized new fuel-efficiency rules for cars and light trucks for model years 2017–2025 that require a near doubling of the current standards. Combined with the more stringent rules
for 2011–2016, the new standards will increase the average cost of a new car by $3,000 by 2025. [3] The
onslaught of regulations to reduce
greenhouse gas emissions will hurt consumers directly through higher energy costs and indirectly
through higher prices for goods and services. Like cap and trade, the result is higher unemployment and
less economic output. How the U.S. Can Truly Lead President Obama said in his inaugural address that if the U.S. leads in reducing emissions, the rest of
the world would follow in America’s footsteps. The statements and actions of other countries over the past four years reveal that there is no nascent consensus for
effective international action that could be catalyzed by unilateral U.S. action. Instead, the U.S. should demonstrate real leadership by: Undertaking independent
efforts to more accurately determine the severity of climate change and verify U.N. claims. The lack of warming in recent years is raising fundamental questions
about the underlying assumptions of climate-change predictions. Undertaking actions with grave implications for the U.S. economy without greater confidence is
irresponsible. Working with a smaller group of nations through informal arrangements such as the Major Economies Forum to undertake appropriate steps that are
cost-effective, verifiable, and effectual. Calling for a moratorium on U.N. climate change conferences that emphasize financial transfers and reinforce the flawed,
ineffective Kyoto methodology of differentiated responsibilities. Prohibiting the EPA and other agencies from regulating greenhouse gas emissions and prohibiting
the EPA and other agencies from using any funds to promulgate or enforce any regulation intended to mitigate global warming unless it is expressly authorized to
do so by Congress. Follow the Leader? Restricting
greenhouse gas emissions, whether unilaterally or multilaterally,
would result in significant economic costs for the U.S. economy. This is a serious decision with grave
consequences. The U.S. should not unilaterally assume these burdens as a symbolic gesture hoping that
other countries might emulate our example—repeated U.N. negotiations demonstrate the small
likelihood of that outcome. The U.S. should act prudently by increasing its certainty about the underlying assumptions and resulting predictions of
climate change, ensuring that the benefits justify the costs, and undertaking concrete commitments and actions only in the context of an overarching strategy that
would actually have an impact on climate change if action is necessary.
2nc – xt: no impact
No impact – warming will take centuries and adaptation solves
Mendelsohn 9 – Robert O. Mendelsohn, the Edwin Weyerhaeuser Davis Professor, Yale School of Forestry and Environmental Studies, Yale University, June 2009, “Climate Change and Economic Growth,” online: http://www.growthcommission.org/storage/cgdev/documents/gcwp060web.pdf
These statements are largely alarmist and misleading. Although climate change is a serious problem that deserves attention, society’s immediate behavior has an extremely low probability of leading to catastrophic consequences. The science and economics of climate change is quite clear that emissions over the next few decades will lead to only mild consequences. The severe impacts predicted by alarmists require a century (or two in the case of Stern 2006) of no mitigation. Many of the predicted impacts assume there will be no or little adaptation. The net economic impacts from climate change over the next 50 years will be small regardless. Most of the more severe impacts will take more than a century or even a millennium to unfold and many of these “potential” impacts will never occur because people will adapt. It is not at all apparent that immediate and dramatic policies need to be developed to thwart long‐range climate risks. What is needed are long‐run balanced responses.
Even rapid intervals of extreme warming don’t cause extinction – the fossil record
clearly disproves any existential impact
Willis et al. ’10 (Kathy J. Willis, Keith D. Bennett, Shonil A. Bhagwat & H. John B. Birks (2010): 4 °C and beyond: what did this mean for biodiversity in the past?, Systematics and Biodiversity, 8:1, 3-9, http://www.tandfonline.com/doi/pdf/10.1080/14772000903495833)
The most recent climate models and fossil evidence for the early Eocene Climatic Optimum (53–51
million years ago) indicate that during this time interval atmospheric CO2 would have exceeded
1200 ppmv and tropical temperatures were between 5–10 °C warmer than modern values (Zachos et al.,
2008). There is also evidence for relatively rapid intervals of extreme global warmth and massive carbon
addition when global temperatures increased by 5 °C in less than 10 000 years (Zachos et al., 2001). So
what was the response of biota to these ‘climate extremes’ and do we see the large-scale extinctions
(especially in the Neotropics) predicted by some of the most recent models associated with future
climate changes (Huntingford et al., 2008)? In fact the fossil record for the early Eocene Climatic
Optimum demonstrates the very opposite. All the evidence from low-latitude records indicates that, at
least in the plant fossil record, this was one of the most biodiverse intervals of time in the
Neotropics(Jaramillo et al., 2006). It was also a time when the tropical forest biome was the most
extensive in Earth’s history, extending to mid-latitudes in both the northern and southern hemispheres
– and there was also no ice at the Poles and Antarctica was covered by needle-leaved forest (Morley,
2007). There were certainly novel ecosystems, and an increase in community turnover with a mixture of
tropical and temperate species in mid latitudes and plants persisting in areas that are currently polar
deserts. [It should be noted; however, that at the earlier Palaeocene–Eocene Thermal Maximum (PETM)
at 55.8 million years ago in the US Gulf Coast, there was a rapid vegetation response to climate change.
There was major compositional turnover, palynological richness decreased, and regional extinctions
occurred (Harrington & Jaramillo, 2007). Reasons for these changes are unclear, but they may have
resulted from continental drying, negative feedbacks on vegetation to changing CO2 (assuming that CO2
changed during the PETM), rapid cooling immediately after the PETM, or subtle changes in plant–animal
interactions (Harrington & Jaramillo, 2007).]
2nc – xt: no warming
Warming not real – their authors have a political agenda and skew data – temperatures haven’t increased in 15 years
Chasmar 14 – 2/26/14 (Jessica Chasmar; Patrick Moore – PhD in ecology, University of British Columbia, “Greenpeace co-founder says ‘no scientific proof’ humans cause climate change”, Washington Times, http://www.washingtontimes.com/news/2014/feb/26/greenpeace-co-founder-says-no-scientific-proofhum/)//JL
A co-founder of Greenpeace told a Senate panel on Tuesday that there is no scientific evidence to back
claims that humans are the “dominant cause” of climate change. Patrick Moore, a Canadian ecologist who was a member of
Greenpeace from 1971-86, told members of the Senate Environment and Public Works Committee environmental groups like
Greenpeace use faulty computer models and scare tactics in further promoting a political agenda, Fox News
reported. “There is no scientific proof that human emissions of carbon dioxide (CO2) are the dominant cause
of the minor warming of the Earth’s atmosphere over the past 100 years,” Mr. Moore said. “Today, we live in an unusually
cold period in the history of life on earth and there is no reason to believe that a warmer climate would be anything but beneficial
for humans and the majority of
other species. “It is important to recognize, in the face of dire predictions about a [two degrees Celsius] rise in global average temperature, that humans
are
a tropical species,” he continued. “We evolved at the equator in a climate where freezing weather did not
exist. The only reasons we can survive these cold climates are fire, clothing, and housing. “The fact that we had
both higher temperatures and an ice age at a time when CO2 emissions were 10 times higher than they are today fundamentally contradicts the certainty that
human-caused CO2 emissions are the main cause of global warming,” he said. Mr. Moore left Greenpeace in 1986, accusing the organization of political activism.
“After 15 years in the top committee I had to leave as Greenpeace took a sharp turn to the political left, and began to adopt policies that I could not accept from my
scientific perspective,” he said. “Climate change was not an issue when I abandoned Greenpeace, but it certainly is now.” A
United Nations report by
the Intergovernmental Panel on Climate Change released in September indicated that global surface
temperatures had not increased for the past 15 years.
Global warming is a fallacy – prefer newer, qualified, and unbiased studies
Bast et al. 14
(Joseph, president and CEO of The Heartland Institute, “Global Warming: Not a Crisis”, http://heartland.org/ideas/global-warming-not-crisis, n.d. 2014, ak.) We do not endorse the gendered language in this card.
Isn’t There a Consensus? Science doesn’t advance by “consensus.” A single scientist or study can
disprove a theory that is embraced by the vast majority of scientists. The search for a consensus is
actually part of what philosophers call “post-normal science,” which isn’t really science at all. Still, many
people ask: What do scientists believe? Most surveys cited by those who claim there is a consensus ask
questions that are too vague to settle the matter. It is important to distinguish between the statement
that global warming is a crisis and the similar-sounding but very different statements that the climate is
changing and that there is a human impact on climate. Climate is always changing, and every scientist
knows this. Our emissions and alterations of the landscape are surely having impacts on climate, though
they are often local or regional (like heat islands) and small relative to natural variation. There is plenty
of evidence that there is no scientific consensus that climate change is man-made and dangerous. The
multi-volume Climate Change Reconsidered series cites thousands of articles appearing in peer-reviewed journals that challenge the basic underlying assumptions of AGW (Climate Change
Reconsidered 2008, 2009, 2011, 2013, 2014). More than 30,000 scientists have signed a petition saying
there is no threat that man-made global warming will pose a threat to humanity or nature (Petition
Project). Alarmists often cite an essay by Naomi Oreskes claiming to show that virtually all articles about
global warming in peer-reviewed journals support the so-called consensus. But a no-less-rigorous study
by Benny Peiser that attempted to replicate her results searched the abstracts of 1,117 scientific journal
articles on “global climate change” and found only 13 (1 percent) explicitly endorse the “consensus
view” while 34 reject or cast doubt on the view that human activity has been the main driver of warming
over the past 50 years. A more recent search by Klaus-Martin Schulte of 928 scientific papers published
from 2004 to February 2007 found fewer than half explicitly or implicitly endorse the so-called
consensus and only 7 percent do so explicitly (Schulte, 2008). A survey that is frequently cited as
showing consensus actually proves just the opposite. German scientists Dennis Bray and Hans von
Storch have surveyed climate scientists three times, in 1996, 2003, and 2007 (Bray and von Storch,
2010). Their latest survey found most of these scientists say they believe global warming is man-made
and is a serious problem, but most of these same scientists do not believe climate science is sufficiently
advanced to predict future climate conditions. For two-thirds of the science questions asked, scientific
opinion is deeply divided, and in half of those cases, most scientists disagree with positions that are at
the foundation of the alarmist case (Bast, 2011). On August 2, 2011, von Storch posted the following
comment on a blog: “From our own observations of discussions among climate scientists we also find
hardly consensus [sic] on many other issues, ranging from changing hurricane statistics to the speed of
melting Greenland and Antarctica, spreading of diseases and causing mass migration and wars” (von
Storch, 2011). These are not minor issues. Extreme weather events, melting ice, and the spread of
disease are all major talking points for Al Gore and other alarmists in the climate debate. If there is no
consensus on these matters, then “skeptics” are right to ask why we should believe global warming is a
crisis. Cognitive Dissonance? How can scientists say they believe global warming is a problem, but at the
same time not believe there is sufficient scientific evidence to predict future climate conditions?
2nc – xt: nuclear inevitable
Nuclear transition’s inevitable – India proves
ET 8 – The Economic Times (7/9/8, “Nuclear energy an inevitable option for India: official,”
http://articles.economictimes.indiatimes.com/2008-07-09/news/27696228_1_nuclear-power-nuclearenergy-atomic-energy)//twonily
BANGALORE: Nuclear energy is an "inevitable option" for India as the country's capacity to produce nuclear power could be as high as 2,00,000 MW by the year 2050 with the addition of more fast breeder reactors, a top government official has said. Nuclear power currently accounts for around 4,000 MW of the total 1,40,000 MW installed electricity generating capacity in the country excluding captive power plants, said Chidambaram, Principal Scientific Advisor to the Government of India. "The country's main thrust
would be coal-based power in the next 20 years," he said. Installed capacity of nuclear energy is projected to go up to 20,000 MW by the year 2020, Chidambaram told a China-India-US science, technology and innovation workshop here on Tuesday.
"Thereafter it would grow very rapidly with more fast breeder reactors being introduced in the system. The
capacity can grow to as much as 2,00,000 MW by the year 2050", said Chidambaram, who is also professor at the Department of the Atomic
Energy, BARC. But he hastened to add: "It (India's nuclear power production capacity going up to 2,00,000 MW by 2050) depends upon how international situation changes. It depends on how quickly proliferation misconceptions are removed from the system." Emphasising that nuclear power is an inevitable option to meet India's rapid growth in future, he said the
current production would not be able to cater the surging energy demand.
2nc – xt: warming inevitable
We’re already past the tipping point
Guterl 9 – Fred Guterl, Executive Editor of Scientific American, “Will Climate Go Over The Edge?”, 2009, http://www.newsweek.com/id/185822
Since the real world is so messy, climate scientists Gerard Roe and Marcia Baker turned for insight to the distinctly neater world of
mathematics. Last year, they published an analysis in the journal Science arguing that climate models were skewed in
the direction of underestimating the warming effect of carbon. The report reasoned that carbon emissions have the
potential to trigger many changes that amplify the warming effect—water absorbs more sunlight than ice, humidity traps more heat, and
so on—but few that would mitigate it. The odds, they figure, are about one in three that temperatures will rise by 4.5 degrees C (the top of
the IPCC's range), but there's little chance at all that they'll rise by less than 2 degrees C. "We've had a hard time
eliminating the possibility of very large climate changes," says Roe. The answer is still couched in probabilities, but they've
shifted in a worrying direction.¶ What can be done? Can a diplomatic miracle in Copenhagen save the planet from the dreaded tipping
point? Sea ice in the Antarctic was supposed to last for 5,000 years until scientists found that the melting was proceeding at a faster pace
than expected. Now it will all be gone in a mere 850 years. Bringing it back would require something like 10,000 years of cooler
temperatures.
Is there any way to halt the process before it goes too far?¶ No, says Susan Solomon, a climate scientist at the National Oceanic and Atmospheric Administration in Boulder, Colorado. In a recent study in the Proceedings of the National Academies of Science, she found that most of the carbon we've already released into the atmosphere will hang around for another 1,000 years. Even if world leaders somehow managed to persuade everybody to stop driving cars and heating their homes—bringing carbon emissions down to zero immediately—the Earth would continue to warm for centuries. The effect of rising temperatures on rainfall patterns is also irreversible, says Solomon. Parts of the world that tend to be dry (Mexico, north Africa, southern Europe and the
western parts of Australia and the United States) will continue to get drier, while wet areas (the South Pacific islands, the horn of Africa)
will keep getting wetter. "You have to think of it as being like a dial that can only turn one way," she says. "We've
cranked up the dial, and we don't get to crank it back." The point of a climate treaty, then, isn't so much to roll things back as
to keep them from getting a whole lot worse—a worthy and important goal, if not a particularly inspiring one.
2nc – xt: no ocean acidification impact
Any claims to the contrary are exaggerated
Ridley 10 – PhD in Zoology, visiting professor at Cold Spring Harbor Laboratory (Matt, June 15, 2010,
“Threat From Ocean Acidification Greatly Exaggerated,” http://www.thegwpf.org/theobservatory/1106-matt-ridley-threat-from-ocean-acidification-greatly-exaggerated.html)//gingE
Lest my critics still accuse me of cherry-picking studies, let me refer them also to the results of Hendriks et al. (2010, Estuarine, Coastal and Shelf Science 86:157). Far from being a cherry-picked study, this is a massive meta-analysis. The
authors observed that ‘warnings that
ocean acidification is a major threat to marine biodiversity are largely based on the analysis of predicted
changes in ocean chemical fields’ rather than empirical data . So they constructed a database of 372 studies in which
the responses of 44 different marine species to ocean acidification induced by equilibrating seawater with CO2-enriched air had been actually
measured. They found that only
a minority of studies demonstrated ‘significant responses to acidification’ and
there was no significant mean effect even in these studies. ¶ They concluded that the world's marine biota are
‘more resistant to ocean acidification than suggested by pessimistic predictions identifying ocean
acidification as a major threat to marine biodiversity’ and that ocean acidification ‘may not be the
widespread problem conjured into the 21st century…Biological processes can provide homeostasis
against changes in pH in bulk waters of the range predicted during the 21st century.’ ¶ This important
paper alone contradicts Hoegh-Guldberg’s assertion that ‘the vast bulk of scientific evidence shows that calcifiers…
are being heavily impacted already’. In conclusion, I rest my case. My five critics have not only failed to contradict, but have
explicitly confirmed the truth of every single one of my factual statements. We differ only in how we interpret the facts. It is hardly surprising
that my opinion is not shared by five scientists whose research grants depend on funding agencies being persuaded that there will be a severe
and rapid impact of carbon dioxide emissions on coral reefs in coming decades. I merely report accurately that the latest empirical and theoretical research suggests that the likely impact has been exaggerated.
2nc – xt: can’t solve prolif
Expanding nuclear energy makes prolif inevitable – regulations are irrelevant – NPT legitimacy makes
it worse
Lowe 5 – Emeritus Professor of Science, Technology, and Society @ Griffith University, President of the
Australian Conservation Foundation (Prof. Ian Lowe, September 2005, “Nuclear Power: No Solution to
Climate Change,”
http://www.acfonline.org.au/sites/default/files/resources/Nuclear_Power_No_Solution_to_Climate_Ch
ange.pdf)//twonily
Global expansion of nuclear power could contribute to an increase in the number of nuclear weapons
states
– as it has in the past. It would
probably lead to an increase in the number of ‘threshold’ or ‘breakout’
nuclear states which could quickly produce weapons drawing on expertise, facilities and materials
from their ‘civil’ nuclear program. Nuclear expansion would also increase the availability of nuclear materials for
use in nuclear weapons or radioactive ‘dirty bombs’ by terrorist groups. Supposedly ‘peaceful’ nuclear facilities and
materials have been used in various ways in secret weapons programs, including the production of highly enriched uranium
(used in the Hiroshima bomb) and plutonium (used in the Nagasaki bomb). Of the 60 countries which have built nuclear power or research
reactors, over 20 are known to have used their ‘peaceful’ nuclear facilities for covert weapons research and/or
production. (Nuclear Weapon Archive, n.d.; Green, 2002; Institute for Science and International Security, n.d.) In some cases the military R&D
was small-scale and short-lived, but in other cases nation
states have succeeded in producing nuclear weapons under
cover of a peaceful nuclear program – India, Pakistan, Israel, South Africa and possibly North Korea. In
other cases, substantial progress had been made towards a weapons capability before the weapons program
was terminated, with Iraq’s nuclear program from the 1970s until 1991 being the most striking of several examples. The current tensions
around the nuclear programs in Iran and North Korea further highlight the potential use of ‘peaceful’ nuclear facilities for nuclear weapons
production. The International Atomic Energy Agency’s (IAEA)
safeguards system still suffers from flaws and limitations
despite improvements over the past decade. At least eight Nuclear Non-Proliferation Treaty (NPT) member states have
carried out weapons-related projects in violation of their NPT agreements, or have carried out permissible (weapons-related) activities but failed to meet their reporting requirements to the IAEA – Egypt, Iraq, Libya, North Korea, Romania, South Korea, Taiwan,
and Yugoslavia. Recent statements from the IAEA and US President George W. Bush about the need to limit the spread of enrichment and
reprocessing technology, and to establish multinational control over sensitive nuclear facilities, are an effective acknowledgement of the
fundamental flaws and limitations of the international non-proliferation system. The NPT enshrines
an ‘inalienable right’ of
member states to all ‘civil’ nuclear technologies, including dual-use technologies with both peaceful and military capabilities.
In other words, the NPT enshrines the ‘right’ to develop a nuclear weapons threshold or breakout capability.
Another serious deficiency is that the NPT places no stronger obligation on the five ‘declared’ nuclear weapons states – the US, Russia, the UK,
France and China – than to engage in negotiations on nuclear disarmament. The intransigence of the nuclear weapons states provides
incentives and excuses for other states to pursue nuclear weapons – and civil programs can provide the expertise, the facilities and the
materials to pursue military programs. IAEA Director-General Mohamed El Baradei noted in a 2004 speech to the Council on Foreign Relations
in New York: “There are some who have continued to dangle a cigarette from their mouth and tell everybody else not to smoke.” (Quoted in Traub, 2004.)
**nuclear leadership
1nc – nuclear leadership
Other sources of fissile material are alt causes
World Nuclear Organization 11 [International organization studying nuclear power and weapons,
“Safeguards to Prevent Nuclear Proliferation”, June 2011, http://www.world-nuclear.org/info/inf12.html]//twonily
Civil nuclear power has not been the cause of or route to nuclear weapons in any country that has nuclear weapons, and no uranium traded for electricity production has ever been diverted for military use. All nuclear weapons programmes have either preceded or risen independently of civil nuclear power*, as shown most recently by North Korea. No country is without plenty of uranium in the small quantities needed for a few weapons. Former US Vice-President Al Gore said (18/9/06) that "During my eight years in the White House, every nuclear weapons proliferation issue we dealt with was connected to a nuclear reactor program. Today, the dangerous weapons programs in both Iran and North Korea are linked to their civilian reactor programs." He is not correct. Iran has failed to convince anyone that its formerly clandestine enrichment program has anything to do with its nuclear power reactor under construction (which is fuelled by Russia), and North Korea has no civil reactor program. In respect to India and Pakistan, which he may have had in mind, there is evidently a link between military and civil, but that is part of the reason they are outside the NPT. Perspective is relevant: As little as five tonnes of natural uranium is required to produce a nuclear weapon. Uranium is ubiquitous, and if cost is no object it could be recovered in such quantities from most granites, or from sea water - sources which would be quite uneconomic for commercial use. In contrast, world trade for electricity production is almost 70,000 tonnes of uranium per year, all of which can be accounted for. There is no chance that the resurgent problem of nuclear weapons proliferation will be solved by turning away from nuclear power or ceasing trade in the tens of thousands of tonnes each year needed for it.
Squo solves – even with court rulings a shift from Yucca Mountain is inevitable
Silverstein 13 – Forbes contributor, global energy business specialist (Ken Silverstein, 8/24/13, “Nuclear
Waste Will Never Be Laid To Rest At Yucca Mountain,”
http://www.forbes.com/sites/kensilverstein/2013/08/24/nuclear-waste-will-never-be-laid-to-rest-at-yucca-mountain/)//twonily
Some thought the idea of using Yucca Mountain, 90 miles outside of Las Vegas, as a permanent nuclear waste disposal site
was, well, buried. Not so, now that a federal appeals court has ordered the Nuclear Regulatory Commission to continue the licensing
process. The big question is whether the U.S. Congress will pony up so that those
nuclear regulators can finish the job. Right now, the fund to continue the research has withered to just $11 million. Potentially wasting more
money is one issue. So is “flouting the law,” which is what one of the judges said has been going on — a slap in the face to “our constitutional
system.” Congress authorized the study of this site back in 1987 and unless or until it pulls the plug on it, lawmakers must bankroll it and
regulators must carry out their will. “By making it a permanent repository, we opened the flood gates for every theory as to why not to do it,”
says former Energy Secretary Spencer Abraham, in a phone conversation with this reporter. “We have given critics and opponents the grounds
on which to make their case. But centralizing a facility, as opposed to on-site situations, is a much safer approach. Settings in metropolitan
areas are not safer than storing nuclear waste under a mountain that is 1,000 feet below the earth.” Abraham, who is now the board chair for
Uranium Energy Corp., goes on to say that the “wiser approach” would have been to recommend a 250-year repository to store nuclear waste. Yucca Mountain, by comparison, is a 10,000-year site. Such a “compromise” would be politically feasible, he says, adding that national policy provides incentives to New Mexico to store low-level radioactive waste.
Abraham’s vision is similar to that of one by a blue ribbon commission appointed by the U.S. Department of Energy. That body, which crafted its
positions two years ago, says that these decisions must be removed from the political realm and put into the hands of those who have the
authority to take action.
Best idea: Take the nuclear waste from the current interim storage facilities and move it into a series of regional repositories. The political undertone: Since 1987, when Congress first approved the study of
Yucca Mountain, the American people have contributed $31 million — a process in which the state of Nevada fully complied, and one in which
it has received plenty of financial benefits. But the folks there — and it’s hard to blame them — don’t want 70,000 plus tons of nuclear waste in
their backyard. So, President Obama joined forces with Senate Majority Leader Harry Reid to “kill” Yucca Mountain in 2008. In 2009, Reid’s
former staffer, Greg Jaczko, headed the Nuclear Regulatory Commission. The Energy Department then nixed the licensing process in 2010. But
this past week, the D.C. Court of Appeals ruled in a 2-to-1 decision that the nuclear regulators must follow the law. Doing otherwise is simply an act of defiance. A year earlier, this same court had ruled that the Nuclear Regulatory Commission
could not extend the current interim storage facilities from 30-years to 60-years without more detailed environmental analyses. Today’s
storage is on-site and in above-ground concrete-encased casks. In 1983, nuclear energy companies such as Dominion Resources, Duke Energy, Entergy Corp., Exelon Corp. and Southern Co. began signing contracts and paying fees to enable a permanent waste disposal site. In 2002, Yucca Mountain became the designated site. Today, electric utility customers pay a fraction of their bill towards
the study of this site. Beside the political obstacles, engineers have expressed concerns that some of the spent nuclear material could
eventually escape from its encasements and cause damage to ground water supplies in the area. That may be true but it does not obviate the
need to comply with the law and to officially complete the analysis as to whether Yucca Mountain would be a viable location. “This case is not
so much about Yucca Mountain as it is about due process,” says Philip Jones, president of the National Association of Regulatory Utility
Commissioners. “Existing law requires the Nuclear Regulatory Commission to determine whether the facility and location is safe for storing
spent-nuclear fuel. Even if it does, the fate of Yucca Mountain remains uncertain.” Yucca Mountain may get studied some more. But it will never be used as a permanent nuclear waste storage facility. Too many political, economic and engineering hurdles stand in the way. The time spent examining the issue, though, has not been wasted. The lessons learned are that 250-year disposal sites that are regionally placed could be more practical.
Laundry list of reasons prove the aff is a non-starter – and other countries solve
Lowe 5 – Emeritus Professor of Science, Technology, and Society @ Griffith University, President of the
Australian Conservation Foundation (Prof. Ian Lowe, September 2005, “Nuclear Power: No Solution to
Climate Change,”
http://www.acfonline.org.au/sites/default/files/resources/Nuclear_Power_No_Solution_to_Climate_Ch
ange.pdf)//twonily
First, a range of alternative technologies (e.g. transmutation – discussed below) or options (e.g. sea-bed disposal) have
been discussed. However, all are seen to be non-starters for economic, technological or political reasons. Putting a positive spin on this situation, there is said to be an ‘international consensus’ on the
wisdom of placing high-level waste in deep underground repositories. Second, deep repositories are promoted
as final disposal sites and contrasted with storage or other options which require ongoing vigilance for long periods
into the future. However there is some movement within the nuclear industry towards accepting the need for monitoring and ‘retrievability’ of
radioactive waste in case of leaks and other problems. This shift in favour of retrievable waste management is generally supported by
environmental organisations, but it undercuts the alleged ‘benefit’ of disposal by conceding that high-level waste will be a burden on future
generations whether or not it is placed in repositories. Third, partly driven by the failure to establish national repositories, there has been growing interest in attempting to establish multinational/international repositories. However, there is also
acknowledgement that multinational repositories could generate more intense public opposition than national repositories, e.g. the fierce
opposition to Pangea Resources in Australia. Russia
may accept foreign-origin high-level waste for disposal, and the
UK may dispose of some wastes previously destined for return to their country of origin.
Laundry list of alt causes
Rus and Wadley 13 – Executive Director of Nuclear AND Nuclear Business Development Director @
Black and Veatch (Steve Rus, Mike Wadley, 2013, “Q&A: Nuclear Industry Wrestles with Low Natural Gas
Prices, Post-Fukushima Changes,” http://bv.com/Home/news/solutions/energy/nuclear-industry-wrestles-with-low-natural-gas-prices)//twonily
You’ve characterized the nuclear industry as being in a “state of uncertainty.” What specific issues are causing that? Steve
Rus: There are a number of issues affecting the nuclear industry. In a nutshell, you have questions about post-Fukushima
requirements, court rulings over waste disposal, low natural gas prices that make nuclear investment difficult,
the overall world economic conditions, questions about whether the demand for power will resume, the issues
surrounding the debt ceiling and environmental regulations. I think uncertainty summarizes it quite well. The
industry is influenced by parameters that are very volatile right now.
Nuclear power can’t solve climate change
Lowe 5 – Emeritus Professor of Science, Technology, and Society @ Griffith University, President of the
Australian Conservation Foundation (Prof. Ian Lowe, September 2005, “Nuclear Power: No Solution to
Climate Change,”
http://www.acfonline.org.au/sites/default/files/resources/Nuclear_Power_No_Solution_to_Climate_Ch
ange.pdf)//twonily
There are significant constraints on the growth of nuclear power, such as its high capital cost and, in many countries, lack of
public acceptability. As a method of reducing greenhouse gas emissions, nuclear power is further limited because
it is used almost exclusively for electricity generation, which is responsible for less than one third of global
greenhouse gas emissions. Because of these problems, the potential for nuclear power to help reduce greenhouse gas emissions by
replacing fossil fuels is limited. Few predict a doubling of nuclear power output by 2050, but even if it did
eventuate it would still only reduce greenhouse gas emissions by about 5% – less than one tenth of the
reductions required
to stabilise atmospheric concentrations of greenhouse gases. Nuclear
power is being promoted as the
solution to climate change, as a technical fix or magic bullet. Clearly it is no such thing. As a senior analyst from the
International Atomic Energy Agency, Alan McDonald (2004), said: “Saying that nuclear power can solve global warming by itself is way over the top”. Nuclear power is not a ‘renewable’ energy source. High-grade, low-cost uranium ores are limited and will be exhausted in about 50 years at the current rate of consumption. The total of all conventional uranium reserves is estimated to be sufficient for about 200 years at the current rate of consumption. (Nuclear Energy Agency and International Atomic
Energy Agency, 2004.) But in
a scenario of nuclear expansion, these reserves will be depleted more rapidly. Claims
that nuclear power is ‘greenhouse free’ are incorrect as substantial greenhouse gas emissions are generated across the
nuclear fuel cycle. Fossil-fuel generated electricity is more greenhouse intensive than nuclear power, but this comparative
benefit will be eroded as higher-grade uranium ores are depleted. Most of the earth’s uranium is found in very poor grade ores,
and recovery of uranium from these ores is likely to be considerably more greenhouse intensive. (van Leeuwen and Smith,
2004.) Nuclear power emits more greenhouse gases per unit energy than most renewable energy sources, and that comparative deficit will
widen as uranium ore grades decline.
No international cooperation – domestic politics block sharing agreements
Lowe 5 – Emeritus Professor of Science, Technology, and Society @ Griffith University, President of the
Australian Conservation Foundation (Prof. Ian Lowe, September 2005, “Nuclear Power: No Solution to
Climate Change,”
http://www.acfonline.org.au/sites/default/files/resources/Nuclear_Power_No_Solution_to_Climate_Ch
ange.pdf)//twonily
Radioactive wastes arise across the nuclear fuel cycle. High-level waste – which includes spent nuclear fuel and the waste
stream from reprocessing plants – is by far the most hazardous of the waste types. A typical power reactor produces
25-30 tonnes of spent fuel annually. Annually, about 12,000 to 14,000 tonnes of spent fuel are produced by power reactors worldwide.
About 80,000 tonnes of spent fuel have been reprocessed, representing about one third of the global output of spent fuel.
Reprocessing poses a major proliferation risk because it involves the separation of plutonium from spent fuel. It
also poses major public health and environmental hazards as reprocessing plants release significant quantities of radioactive
wastes into the sea and gaseous radioactive discharges into the air. Cogema’s reprocessing plant at La Hague in France, and British
Nuclear Fuel’s plant at Sellafield in the UK, are the largest source of radioactive pollution in the European environment.
(WISE-Paris, 2001.) Not a single repository exists anywhere in the world for the disposal of high-level waste
from nuclear power. Only a few countries – such as Finland, Sweden, and the US – have identified potential sites
for a high-level waste repository. The legal limit for the proposed repository at Yucca Mountain in the US is less than the
projected output of high-level waste from currently operating reactors in the US. If global nuclear output was
increased three-fold, new repository storage capacity equal to the legal limit for Yucca Mountain would have to be created somewhere in the
world every 3-4 years. (Ansolabehere et al., 2003.) With a ten-fold increase in nuclear power, new repository storage capacity equal to the legal
limit for Yucca Mountain would have to be created somewhere in the world every single year. Attempts
to establish international
repositories are likely to be as unpopular and unsuccessful
as was the attempt by Pangea Resources to win support for
such a repository in Australia. Synroc – the ceramic waste immobilisation technology developed in Australia – seems destined to be a
permanently ‘promising’ technology. As nuclear advocate Leslie Kemeny (2005) notes, Synroc “showed great early promise but so far its international marketing and commercialisation agendas have failed”. The nuclear industry transfers risks and costs to future generations. As AMP Capital Investors (2004) notes in its Nuclear Fuel Cycle Position Paper: “The waste problems of the uranium mining and power generation are numerous and long lasting. Due to the long half lives and inability ... to find an acceptable final disposal method for radioactive materials, the problem will continue for a long time without a solution. Therefore there are significant concerns about whether an acceptable waste disposal option currently exists. From a sustainability perspective, while the nuclear waste issues remain unresolved, the uranium/nuclear power industry is transferring the risks, costs and responsibility to future generations.”
Prolif doesn’t cause war
Waltz 7 (Kenneth, Professor – UC Berkeley, “A Nuclear Iran”, Journal of International Affairs, 3-22, Lexis)
First, nuclear proliferation is not a problem because nuclear weapons have not proliferated. "Proliferation" means to spread like
wildfire. We have had nuclear military capability for over fifty years, and we have a total of nine militarily capable nuclear states. That's hardly
proliferation; that is, indeed, glacial spread. If another country gets nuclear weapons, and if it does so for good reasons, then that isn't an object
of great worry. Every once in a while, some prominent person says something that's obviously true. Recently, Jacques Chirac [president of
France] said that if Iran had one or two nuclear weapons, it would not pose a danger. Well, he was right. Of course, he had to quickly retract it
and say, "Oh no, that slipped out, I didn't know the microphone was on!" Second, it doesn't matter who has nuclear weapons.
Conversely, the spread of conventional weapons makes a great deal of difference. For instance, if a Hitler-type begins to establish conventional
superiority, it becomes very difficult to contain and deter him. But, with nuclear weapons,
it's been proven without
exception that whoever gets nuclear weapons behaves with caution and moderation. Every country--whether they are countries we trust and think of as being highly responsible, like Britain, or countries that we distrust greatly, and for very good reasons, like China during the Cultural Revolution--behaves with such caution. It is now fashionable for political scientists to test
hypotheses. Well, I have one: If a country has nuclear weapons, it will not be attacked militarily in ways that threaten its
manifestly vital interests. That is 100 percent true, without exception, over a period of more than fifty years. Pretty
impressive.
No impact --- tech diffusion is inevitable but prolif is still limited and slow
Hymans 12 (Jacques E.C., Assistant Professor in the School of International Relations at the University of
Southern California, May/June 2012, “Botching the Bomb,” Foreign Affairs, Vol. 91, No. 3)
"TODAY, ALMOST any industrialized country can produce a nuclear weapon in four to five years," a former chief of
Israeli military intelligence recently wrote in The New York Times, echoing a widely held belief. Indeed, the more nuclear technology
and know-how have diffused around the world, the more the timeline for building a bomb should have
shrunk. But in fact, rather than speeding up over the past four decades, proliferation has gone into slow motion.
Seven countries launched dedicated nuclear weapons projects before 1970, and all seven succeeded in relatively short order. By contrast, of
the ten countries that have launched dedicated nuclear weapons projects since 1970, only three have
achieved a bomb. And only one of the six states that failed -- Iraq -- had made much progress toward its ultimate goal by the time it gave
up trying. (The jury is still out on Iran's program.) What is more, even the successful projects of recent decades have
needed a long time to achieve their ends. The average timeline to the bomb for successful projects launched before 1970 was about
seven years; the average timeline to the bomb for successful projects launched after 1970 has been about 17 years.
International security experts have been unable to convincingly explain this remarkable trend. The first and most credible
conventional explanation is that the Nuclear Nonproliferation Treaty (NPT) has prevented a cascade of new nuclear weapons states by
creating a system of export controls, technology safeguards, and on-site inspections of nuclear facilities. The NPT regime has certainly closed off
the most straightforward pathways to the bomb. However, the
NPT became a formidable obstacle to would-be nuclear states only
in the 1990s, when its export-control lists were expanded and Western states finally became serious about enforcing them and when
international inspectors started acting less like tourists and more like detectives. Yet the proliferation slowdown started at least
20 years before the system was solidified. So the NPT, useful though it may be, cannot alone account for this
phenomenon.
2nc – xt: alt causes to prolif
Specifically, research reactors are an alt cause
Cohen 90 Bernard L. Cohen, Professor of Physics at the University of Pittsburgh, 1990 [The Nuclear
Energy Option, http://home.pacbell.net/sabsay/nuclear/index.html]
Another alternative would be to use a research reactor, designed to provide radiation for research
applications* rather than to generate electricity. At least 45 nations now have research reactors, and in at
least 25 of these there is a capability of producing enough plutonium to make one or more bombs every 2 years. Research reactors are usually
designed with lots of flexibility and space, so it would not be difficult to use them for plutonium
production. A plant for generating nuclear electricity is by necessity large and highly complex, with most of the
size and complexity due to reactor operation at a very high temperature and pressure, the production and handling of steam, and the equipment for generation and
distribution of electricity. It would be impossible to keep construction or operation of such a plant secret. Moreover, only a very few of the most technologically advanced nations are capable of constructing one. No nation with this capability would provide one for a foreign country without requiring elaborate international inspection to assure that its plutonium is not misused. A production or research reactor, on the other hand, can be small and unobtrusive. It has no high pressure or temperature, no steam, and no electricity generation or distribution equipment. Almost any nation has, or could easily acquire, the capability of constructing one, and it probably could carry out the entire project in secret. There would be no compulsion to submit to outside inspection. In view of the above considerations, it would be completely illogical for a nation bent on making nuclear weapons to obtain a power reactor for that purpose. It would be much cheaper, faster, and easier to obtain a plutonium production reactor; the plutonium it produces would make much more powerful and reliable bombs with much less effort and expense.
2nc – xt: prolif is inevitable
Prolif inevitable --A) Tech and materials are widespread
Ellis 3 (Jason D., Senior Research Professor – Center for Counterproliferation Research, National Defense
University, “The Best Defense: Counterproliferation and U.S. National Security”, Washington Quarterly,
Spring, p. 119-120)
The Bush administration’s national security strategy starts with the reality of a post-proliferated
international security environment. The intricate network of nonproliferation treaties and regimes
established over the past several decades share one key feature: failure to prevent determined states
from developing nuclear, chemical, or biological weapons as well as increasingly capable missile and
related delivery systems. South Africa, for instance, successfully developed and produced six nuclear
devices despite its purported adherence to the Nuclear Non-Proliferation Treaty (NPT). Similarly, Iraq
was well on its way when the Gulf War interrupted its progress, and North Korea also sought
clandestinely to develop nuclear weapons in contravention of its international obligations. At the same
time, the voluntary and unenforceable gentleman’s agreement among supplier states to refrain from
exporting ballistic-missile development technologies to aspirant states has hardly kept key states—
whether Iran, North Korea, Pakistan, India, or others—from making steady, incremental progress toward
such developments. Several additional states also will develop the ability to produce land-attack cruise
missiles indigenously over the next several years. All told, nuclear- and missile-related treaties and
regimes have not prevented the acquisition or development of weapon capabilities, although they have
arguably served to slow the pace of development in the past. In the years ahead, foreign assistance—the
transfer or sale of technologies, material, or expertise with possible weapons-related applications by key
suppliers—and the growing phenomenon of secondary supply—exports or cooperative development of
WMD or missile delivery systems, their constituent enabling or production technologies, or the requisite
expertise necessary to their development or production by nontraditional suppliers—pose severe
challenges to the nonproliferation regime. At the same time, the continued insecurity (and large
quantity) of fissile material in the former Soviet Union and other regions, evident advancements in
indigenous weapons-related technology among less-developed states, and the potential availability of
germane technical expertise together suggest that existing multilateral constraining mechanisms are
bound to prove even less effective in the years ahead. In this context, traditional supply-side constraints
have and will continue to erode.
B) Conventional superiority
Gerson and Boyars 7 (Michael Gerson, Member – CNA’s Center for Strategic Studies, MA in
International Relations – University of Chicago, and Jacob, Intern – CNA’s Center for Strategic Studies,
MA in Security Studies – Georgetown University, “The Mix of New Subjects” and “Deterrent Against US
Power”, 9-18, http://www.cna.org/documents/D0017171.A2.pdf)
The mix of new subjects of U.S. deterrence, such as emerging peer competitors, belligerent rogues, and
terror groups, demands new thinking about old assumptions. Chief among these is the rational actor
model upon which deterrence theory is predicated. Effective deterrence requires identifying who
matters and how to influence them. Before, it was assumed that adversaries would utilize centralized
decision-making and exhibit unity of authority, and American assumptions about deterrence rested on
the belief that the enemy would have a high-confidence chain of command and would act in “rational”
ways that carefully considered cost/benefit calculations. In today’s threat environment, however, it is
possible that U.S. capabilities could cause adversaries to disperse their authority in a way that decreases
their control of the chain of command, causing the U.S. to lose confidence in its assessments of the
relevant actors and their individual goals and objectives. Individuals within a regime might act in ways
that are “rational” to them given their culture, worldview, and strategic objectives, but which may not
be in keeping with the predictions of the rational actor model. As one panelist argued, an opponent with
nothing to lose, such as a North Korean regime on its last legs, might, even without being suicidal,
undertake seemingly ill-advised actions with grave consequences. In this new deterrence calculus,
adversary intentions are just as important as their capabilities. The more diverse sources and types of
threats require the U.S. to think not just about how many nuclear weapons an adversary has in its
arsenal, but how a state or group might intend to use its more limited capabilities in ways we hope to
prevent. As one panelist argued, rogue states’ interest in nuclear weapons is partly a function of
America’s overwhelming conventional superiority. Just as American conventional forces were
outmatched in Western Europe during the Cold War, necessitating a strategic shift to reliance on
nuclear weapons to deter Soviet conventional aggression in Europe, so too rogues have shifted their
focus to nuclear weapons in response to U.S. conventional superiority. This suggests that U.S. nuclear
weapons strategies, whatever they may be, play no role in influencing whether rogues are interested in
obtaining nuclear weapons of their own. U.S. conventional, not nuclear, forces are the primary threat to
rogues, and as long as the U.S. maintains substantial conventional forces, rogue states will continue to
view nuclear weapons as the only possible deterrent against U.S. power.
C) Other incentives
Martel 94 (William and William Pendley, Associate Professors, Air War College, “Nuclear Coexistence:
Rethinking US Policy to Promote Stability in an Era of Proliferation,” Air War College Studies in National
Security #1, April, p. 26)
Summary. The prospect is that the already strong incentives for nuclear proliferation will increase the
value of nuclear weapons in the emerging international security environment. As long as nuclear
weapons deter other nuclear powers, equate with the achievement of regional or global major power
status, balance potentially overwhelming regional military odds, maintain a state’s freedom of action by
deterring intervention by major powers, and contribute to economic development and modernization—
states may well feel powerful incentives to possess nuclear weapons.
2nc – xt: prolif is slow
It’s slow
Yusuf 9 (Moeed, Fellow and Ph.D. Candidate in the Frederick S. Pardee Center for the Study of the
Longer-Range
Future – Boston University, “Predicting Proliferation: The History of the Future of Nuclear Weapons”,
Brookings Policy Paper 11, January,
http://www.brookings.edu/~/media/Files/rc/papers/2009/01_nuclear_proliferation_
yusuf/01_nuclear_proliferation_yusuf.pdf)
It is a paradox that few aspects of international security have been as closely scrutinized, but as
incorrectly forecast, as the future nuclear landscape. Since the advent of nuclear weapons in 1945, there
have been dozens, if not hundreds of projections by government and independent analysts trying to
predict horizontal and vertical proliferation across the world. Various studies examined which countries
would acquire nuclear weapons, when this would happen, how many weapons the two superpowers as
well as other countries would assemble, and the impact these developments might have on world
peace. The results have oscillated between gross underestimations and terrifying overestimations.
Following the September 11, 2001 attacks, the fear that nuclear weapons might be acquired by so-called
“rogue states” or terrorist groups brought added urgency – and increased difficulty – to the task of
accurately assessing the future of nuclear weapons. A survey of past public and private projections
provides a timely reminder of the flaws in both the methodologies and theories they employed. Many of
these errors were subsequently corrected, but not before they made lasting impressions on U.S.
nuclear (and non-nuclear) policies. This was evident from the time the ‘Atoms for Peace’ program was
first promulgated in 1953 to the 1970 establishment of the Nuclear Non-Proliferation Treaty (NPT), and
more recently during the post-Cold War disarmament efforts and debates surrounding U.S. stance
towards emerging nuclear threats. This study offers a brief survey of attempts to predict the future of
nuclear weapons since the beginning of the Cold War.1 The aim of this analysis is not merely to review
the record, but to provide an overall sense of how the nuclear future was perceived over the past six
decades, and where and why errors were made in prediction, so that contemporary and future
predictive efforts have the benefit of a clearer historical record. The survey is based on U.S. intelligence
estimates as well as the voluminous scholarly work of American and foreign experts on the subject. Six
broad lessons can be gleaned from this history. First, it reveals consistent misjudgments regarding the
extent of nuclear proliferation. Overall, projections were far more pessimistic than actual developments;
those emanating from independent experts more so than intelligence estimates. In the early years of
the Cold War, the overly pessimistic projections stemmed, in part, from an incorrect emphasis on
technology as the driving factor in horizontal proliferation, rather than intent, a misjudgment, which
came to light with the advent of a Chinese bomb in 1964. The parallel shift from developed-world
proliferation to developing-world proliferation was accompanied by greater alarm regarding the impact
of proliferation. It was felt that developing countries were more dangerous and irresponsible nuclear
states than developed countries. Second, while all the countries that did eventually develop nuclear
weapons were on the lists of suspect states, the estimations misjudged when these countries would go
nuclear. The Soviet Union went nuclear much earlier than had been initially predicted, intelligence
estimates completely missed China’s nuclear progress, and India initially tested much later than U.S.
intelligence projections had anticipated and subsequently declared nuclear weapon status in 1998 when
virtually no one expected it to do so. Third, the pace of proliferation has been consistently slower than
has been anticipated by most experts due to a combination of overwhelming alarmism, the intent of
threshold states, and many incentives to abstain from weapons development. In the post-Cold War
period, the number of suspected threshold states has gradually decreased and the geographical focus
has shifted solely to North-East Asia, South Asia, and the Middle East. There is also much greater
concern that a nuclear chain reaction will break out than was the case during the Cold War.
History proves
Hersman 9 (Rebecca, Senior Research Professor – Center for Counterproliferation Research, “Trend
Lines and Tipping Points for Nuclear Proliferation”, Stimson Center, 1-14,
http://www.stimson.org/events.cfm?ID=655)
Rebecca Hersman noted that proliferation is a multi-step process, and that this ‘dial-up’ or ‘dial-down’
process is not linear. A national proliferation strategy can therefore take a few or many years. In her
view, cascades require that multiple countries match capability to intent at an accelerating rate. She
noted that synchronizing capability and intent is very difficult. The concept of “tipping points” is
problematic in that it suggests sudden and rapid decisions by multiple countries to cross a singular
proliferation boundary. The historical record suggests otherwise – that nuclear decision-making is
usually incremental, and could stall or reverse course at many stages. There is little historical evidence
to suggest that a rapid expansion in the number of nuclear-armed states is likely in the future, let alone
inevitable.
Alarmist predictions are wrong
Mueller 7 (John, Professor of Political Science – Ohio State University, “Radioactive Hype”, National
Interest, September / October, http://polisci.osu.edu/faculty/jmueller/NINFINL2.PDF)
As Langewiesche points out, the nuclear genie is out of the bottle, and just about any state can
eventually obtain nuclear weapons if it really wants to make the effort – although in many cases that
might involve, as a former president of Pakistan once colorfully put it, "eating grass" to pay for it.
Despite the predictions of generations of alarmists, nuclear proliferation has proceeded at a remarkably
slow pace. In 1958 the National Planning Association predicted "a rapid rise in the number of atomic
powers . . . by the mid-1960s", and a couple of years later, John Kennedy observed that there might be
"ten, fifteen, twenty" countries with a nuclear capacity by 1964. But over the decades a huge number of
countries capable of developing nuclear weapons has not done so – Canada, Sweden and Italy, for
example – and several others – Brazil, Argentina, South Africa, South Korea and Taiwan – have backed away
from or reversed nuclear-weapons programs. There is, then, no imperative for countries to obtain
nuclear weapons once they have achieved the appropriate technical and economic capacity to do so.
Insofar as states that considered acquiring the weapons, they came to appreciate several defects: The
weapons are dangerous, distasteful, costly and likely to rile the neighbors. If one values economic growth and
prosperity above all, the sensible thing is to avoid the weapons unless they seem vital for security. It has often been assumed that nuclear
weapons would prove to be important status symbols. However, as Columbia's Robert Jervis has observed, "India, China, and Israel may have
decreased the chance of direct attack by developing nuclear weapons, but it is hard to argue that they have increased their general prestige or
influence." How much more status would Japan have if it possessed nuclear weapons? Would anybody pay a great deal more attention to
Britain or France if their arsenals held 5,000 nuclear weapons, or would anybody pay much less if they had none? Did China need nuclear
weapons to impress the world with its economic growth? Perhaps the only such benefit the weapons have conferred is upon contemporary
Russia: With an economy the size of the Netherlands, it seems unlikely that the country would be invited to participate in the G-8 economic club if it didn't have an atomic arsenal. It is also difficult to see how nuclear weapons benefited their owners in specific military ventures. Israel's nuclear weapons did not restrain the Arabs from attacking in 1973, nor did Britain's prevent Argentina's seizure of the Falklands in 1982. Similarly, the tens of thousands of nuclear weapons in the arsenals of the
enveloping allied forces did not cause Saddam Hussein to order his occupying forces out of Kuwait in 1990. Nor did the bomb benefit America in
Korea or Vietnam, France in Algeria or the Soviet Union in Afghanistan. The handful of countries that have pursued nuclear-weapons programs
seem to have done so as an ego trip (think, again, of France) or else (or additionally) as an effort to deter a potential attack on themselves:
China, Israel, India, Pakistan and now North Korea. Although there were doubtless various elements in their motivations, one way to reduce the
likelihood such countries would go nuclear is a simple one: Stop threatening them. From this perspective, Bush's 2002 declaration grouping
Iraq, Iran and North Korea into an "axis of evil" was, to put it mildly, foolish. However, many of his supporters, particularly in the
neoconservative camp, went quite a bit further. In an article in this journal in the fall of 2004 proposing what he calls "democratic realism",
Charles Krauthammer urged taking "the risky but imperative course of trying to reorder the Arab world", with a "targeted, focused" effort on
"that Islamic crescent stretching from North Africa to Afghanistan." And in a speech in late 2006, he continued to champion what he calls "the
only plausible answer", an amazingly ambitious undertaking that involves "changing the culture of that area, no matter how slow and how
difficult the process. It starts in Iraq and Lebanon, and must be allowed to proceed." Any other policy, he has divined, "would ultimately bring
ruin not only on the U.S. but on the very idea of freedom." In their 2003 book, The War Over Iraq, Lawrence Kaplan and William Kristol stress
that, "The mission begins in Baghdad, but does not end there. . . . War in Iraq represents but the first installment. . . .Duly armed, the United
States can act to secure its safety and to advance the cause of liberty-in Baghdad and beyond." At a speech given at the Army War College as
Baghdad was falling in 2003, Richard Perle triumphantly issued an extensive litany of targets, adding for good measure, and possibly in jest,
France and the State Department. Most interesting is a call issued in Commentary by neoconservatism's champion guru, Norman Podhoretz, in
the run-up to the war. He strongly advocated expanding Bush's "axis of evil" beyond Iraq, Iran and North Korea "at a minimum" to "Syria and
Lebanon and Libya, as well as 'friends' of America like the Saudi royal family and Egypt's Hosni Mubarak, along with the Palestinian Authority."
More realistic about democracy than other neoconservatives, Podhoretz pointedly added, "the alternative to these regimes could easily turn
out to be worse, even (or especially) if it comes into power through democratic elections." Accordingly, he emphasized, "it will be necessary for
the United States to impose a new political culture on the defeated parties." These men, with their extravagant fantasies, do not, of course,
directly run the Bush Administration. However, given the important role such people have played in the administration's intellectual
development and military deployments, the designated target regimes would be foolish in the extreme not to take such existential threats very
seriously indeed. It is certainly preferable that none of these regimes (and quite a few others) ever obtain nuclear weapons. But if they do so
they are by far most likely to put them to use the same way other nuclear countries have: to deter. Nonetheless, even threatened states may not develop nuclear weapons. In the wake of the Iraq disaster, an invasion by the ever-threatening Americans can
probably now be credibly deterred simply by maintaining a trained and well-armed cadre of a few thousand troops dedicated to, and capable
of, inflicting endless irregular warfare on the hapless and increasingly desperate and ridiculous invaders. The Iranians do not yet seem to have
grasped this new reality, but perhaps others on the Bush Administration's implicit hit list will.
No political will
Waltz 00 (Kenneth, Professor of Political Science at UC Berkeley, Georgetown Journal of International
Affairs, v1 n1, Winter/Spring, http://www.ciaonet.org/olj/gjia/gjia_winspr00f.html, accessed 8/11/02)
It is now estimated that about twenty–five countries are in a position to make nuclear weapons rather
quickly. Most countries that could have acquired nuclear military capability have refrained from doing
so. Most countries do not need them. Consider Argentina, Brazil, and South Africa. Argentina and Brazil
were in the process of moving toward nuclear military capability, and both decided against it–wisely I
believe–because neither country needs nuclear weapons. South Africa had about half a dozen warheads
and decided to destroy them. You have to have an adversary against whom you think you might have to
threaten retaliation, but most countries are not in this position. Germany does not face any security
threats–certainly not any in which a nuclear force would be relevant. I would expect the pattern of the
past to be the same as the pattern in the future, in which one or two states per decade gradually
develop nuclear weapons.
Prolif decreasing
Riecke 00 (Henning, Post-Doctoral Fellow – Weatherhead Center for International Affairs, Assistant
Professor International Relations – Schiller International University, Preventing the Use of Weapons of
Mass Destruction, p. 46)
Nuclear weapons proliferation has slowed down. Some possible candidates for proliferation have been
either forced to destroy their program, like Iraq, or have dropped the nuclear option. This is a sign that
the non-use of nuclear weapons, the ‘nuclear taboo’ is gaining ground. This finding is in contradiction to
the signal sketched out above, that the use of atomic weapons in certain cases has a legitimate
character. The high costs in each case, however, might weigh heavier than the idea of appropriateness.
Chemical and biological weapons programs are still pursued by a small number of states that remain
unimpressed by the NATO campaign. They show no sign of entering the relevant non-proliferation
regimes (or, as in the case of Iran, they do with obvious qualification).
2nc – xt: alt causes to nuclear power
Specifically, low natural gas prices trigger the impact
Rus and Wadley 13 – Executive Director of Nuclear AND Nuclear Business Development Director @
Black and Veatch (Steve Rus, Mike Wadley, 2013, “Q&A: Nuclear Industry Wrestles with Low Natural Gas
Prices, Post-Fukushima Changes,” http://bv.com/Home/news/solutions/energy/nuclear-industry-wrestles-with-low-natural-gas-prices)//twonily
Low natural gas prices are dramatically changing the demand for coal in the United States. Is the same happening with
nuclear power? Mike Wadley: You can’t ignore low gas prices, nor the low start-up costs compared to a nuclear
plant. The “dash to gas” makes investment in nuclear difficult for some. However, many clients understand the long-term
benefits of fuel diversification.
2nc – xt: status quo solves
Squo solves nuclear leadership – plus economics take out solvency
Nivola 4 – senior fellow @ governance studies program @ Brookings (Pietro Nivola, May 2004, “The
Political Economy of Nuclear Energy in the US,”
http://www.ifri.org/files/CFE/Nivola_NuclearEnergy_US.pdf)//twonily
A tendency among commentators, even experts like the author of the sentence above, is to regard the complicated story of nuclear energy
in the United States as exceptionally troubled and frustrating. The root cause of the troubles and frustrations, moreover, is commonly
thought to be more political than economic. The promise of nuclear power in this country is said to have been dimmed primarily by an eccentrically risk-averse public and an
unusually hostile regulatory climate. Practically nowhere else, it is said, have political and legal institutions been so uncooperative. Supposedly the central governments of most other advanced
countries have lent far more support to their nuclear industries. And because those governments are assumed to be more aggressive in combating pollution, including greenhouse gas
emissions from burning fossil fuels, surely “the rest of the world” has been doing much more than America to level the playing field for the development of nuclear energy.
The following paper challenges this conventional picture. With more than a hundred reactors currently in operation, the American nuclear power industry remains the world’s largest. Continued reliance on nuclear energy is actually
more in question in several European countries than it is here. Because electric utilities in the United States, unlike those in, say, France and Japan, have access to vast reserves of coal and natural gas as well as huge hydro-electric facilities in certain regions, the United States depends on nuclear generators to meet “only” about one-fifth of its demand for electricity. That share (20 percent) is still greater than the worldwide average (17 percent), however, and in half a dozen U.S. states—some of which are large enough to compare to important nations abroad—nuclear energy supplies over half of the electricity consumed.2 The percentage of electricity generated by nuclear plants in Illinois, for example, exceeds the percentages in the Netherlands, the United Kingdom, Spain, or Germany. At any rate, even at one-fifth, total U.S. electricity production from nuclear reactors approximately equals the combined total of the world’s two other nuclear giants, France and Japan.3 In fact, America’s nuclear generating capacity amounts to more than a third of the entire installed nuclear capacity of the industrial nations in the Organization for Economic Cooperation and Development (OECD).4 A nuclear sector of such magnitude hardly suggests that American governmental institutions and policies, national or local, have always proven particularly unreceptive to nuclear plants. The great majority of American states have accommodated such plants. Arguably, U.S. energy
policies and environmental-protection efforts at all levels of government have done at least as much to sustain as to
hinder the viability of these facilities along with their fuel suppliers, waste management requirements, and other supporting industries. And if anything like the
energy legislation that the Bush administration and the House of Representatives advanced in 2003 were to come to fruition, the amount of assistance would expand significantly. To be sure, the circumstances for nuclear energy in recent decades are a far cry from the extraordinarily favorable conditions that prevailed before the energy crisis of the 1970s. Major additions to America’s already sizeable nuclear presence have not been in the offing for some time. (No new nuclear generating
station has been completed here since 1996.) But
the holding pattern into which nuclear power finds itself in the United States is hardly unique. A pause in new plant construction has extended to many other countries. In America, it is safe to say,
the halt has to do with basic economic considerations, not just political obstacles, and how those economic
considerations are likely to play out ten or twenty years out is nigh-impossible to predict.
Squo solves the entire aff – on-site storage prevents terrorism and is sufficient for storage
AP 5/27 – Associated Press (AP, 5/27/14, “U.S. plants prepare long-term nuclear waste storage,”
http://www.clarionledger.com/story/news/2014/05/27/long-term-nuclear-storage/9639735/)//twonily
WATERFORD, Conn. (AP) — Nuclear power plants across the United States are building or expanding storage facilities to
hold their spent fuel — radioactive waste that by now was supposed to be on its way to a national dump. The steel and
concrete containers used to store the waste on-site were envisioned as only a short-term solution when introduced in the
1980s. Now they are the subject of reviews by industry and government to determine how they might hold up — if needed — for decades or
longer. With nowhere else to put its nuclear waste, the Millstone Power Station overlooking Long Island Sound is sealing it up in massive steel
canisters on what used to be a parking lot. The storage
pad, first built in 2005, was recently expanded to make room for seven
times as many canisters filled with spent fuel. Dan Steward, the first selectman in Waterford, which hosts Millstone, said he raises
the issue every chance he can with Connecticut's congressional members. "We do not want to become a nuclear waste site as a community,"
Steward said. The government is pursuing a new plan for nuclear waste storage, hoping to break an impasse left by the collapse of a proposal
for Nevada's Yucca Mountain. The Energy Department says it expects other states will compete for a repository, and the accompanying
economic benefits, and it's already heard from potential hosts in New Mexico, Texas and Mississippi. But the plan faces hurdles including a
need for new legislation that has stalled in Congress. So plants are preparing to keep the high-level nuclear waste in their backyards indefinitely.
Most of it remains in pools, which cool the spent fuel for several years once it comes out of the reactors. But with the pools at or nearing
capacity, the majority is expected within a decade to be held in dry casks, or canisters, which are used in 34 states. Only three of the 62
commercial nuclear sites in the U.S. have yet to announce plans to build their own. In the past few years since the Yucca Mountain plan was
abandoned, the government and industry have opened studies to address unanswered questions about the long-term performance of dry cask
storage. The
N uclear R egulatory C ommission in 2011 began offering 40-year license renewals for casks, up from 20-
year intervals. The tests are focusing on how to monitor degradation inside the canisters, environmental requirements for storage sites, and
how well the canisters hold up with "high burnup," or longer-burning fuels that are now widely used by American plants. "Now that we've
shown that the
national policy is shifting, we’re having to relook at these systems to make sure they still meet
the regulations for longer and longer periods of time," said Eric Benner, an NRC official who has served as the inspections branch
chief with its spent fuel storage division. At Millstone, 19 canisters loaded with spent fuel are arrayed on a concrete pad, which was expanded
in October to make room for as many as 135 canisters by 2045. The canisters, which are cooled by air circulation, seal
the waste with
inert gas inside an inner chamber and are themselves loaded into concrete modules. Workers regularly inspect
temperature gauges and, during the winter, shovel snow off the vents. Millstone's low-level nuclear waste is shipped to a disposal facility in
Barnwell, South Carolina. The spent fuel is piling up at a rate of about 2,200 tons a year at U.S. power-plant sites. The industry and government
decline to say how much waste is currently stored at individual plants. The U.S. nuclear industry had 69,720 tons of uranium waste as of May
2013, with 49,620 tons in pools and 20,100 in dry storage, according to the Nuclear Energy Institute industry group. Spent nuclear fuel is about
95 percent uranium. About 1 percent is other heavy elements such as curium, americium and plutonium-239. Each has an extremely long half-life — some take hundreds of thousands of years to lose all of their radioactive potency. Watchdog groups say the dry
storage poses
fewer safety concerns than the reactors themselves, and many have pushed for spent fuel to be transferred more quickly
from the pools. Heavy security is in place to deter sabotage by terrorists. The administration's strategy calls for an
interim storage facility by 2025 and a geologic repository by 2048. Peter Lyons, an assistant secretary for nuclear energy at the U.S. Energy
Department, said it cannot make plans for individual sites until the passage of legislation creating a new framework for waste policy. But he said
the groups in southeastern New
Mexico, western Texas and Mississippi are only the most public of potential hosts to
express interest in taking in high-level waste. The idea for the interim facility is to take spent fuel left behind from reactors that
have already shut down, as is the case at sites in California, Maine, Massachusetts, Michigan, Wisconsin, Connecticut, Colorado and Oregon.
**off cases
**kritik links
cap link
Nuclear energy relies on capitalist institutional structures—turns the case
Lorenzini, 11/27/13—a retired PacifiCorp executive and former general manager of contract operations
at DOE’s nuclear defense facilities (Paul, "A Second Look at Nuclear Power", Issues in Science and
Technology, issues.org/21-3/lorenzini/)//twonily
Ideological blinders
This deeply felt philosophical position could help explain the harsh rhetoric. It is “modern technology with
its ruthlessness toward nature,” as University of California, Los Angeles, historian Lynn White characterized it in a 1967 essay. The
prominent psychologist Abraham Maslow attacked science as a “dead end” that had become a “threat and a
danger to mankind.” E. F. Schumacher complained in his influential 1973 critique of modern society, Small is Beautiful, that
humans are “dominated by technology,” and called technology a “force that is out of control … [It] tends to
develop its own laws and principles, and these are very different from human nature.” The troubling consequence of these
declarations has been a tendency to trivialize the enormous benefits in public health, material prosperity, and
lengthened lifespan that science and technology have made possible. As a result, these ideologies have too often become
barriers to developing and using the technologies humans really need. A particularly revealing aspect of this has been the
singular intensity with which environmentalists have opposed nuclear power, knowing full well it would
mean a wider use of coal with its known environmental and human health disadvantages. Why would nuclear
power receive such intense scrutiny since coal too supports industrial growth? A partial explanation for the difference in
treatment is that coal combustion is a comfortingly familiar technology, whereas nuclear power symbolizes as
nothing else the new world of technological advancement. But nuclear power touches an even deeper
ideological chord: mistrust of modern institutions. Nuclear power depends on functioning public
institutions to ensure plant safety and to protect the public from radiation hazards . The political left ,
where environmental lobbies are most comfortable, doesn’t trust these institutions . More basically,
they mistrust the values of modern Western society that these institutions embody, particularly their
capitalist economics and their reliance on science and technology. This philosophical predisposition
against technology explains, at least to some extent, why virtually the entire environmental lobby would have
opposed nuclear power when the overwhelming proportion of scientists was on the other side of the
issue. Many people today remain skeptical about nuclear power, even though recent polls show that as many as 73 percent of college
graduates favor nuclear power, as do 65 percent of the general population. Much of the skepticism about nuclear power has been influenced
by a relatively small activist environmental lobby that is motivated as much by ideology as by concerns with the technology itself. These
ideological differences make it difficult, if not impossible, to find a common ground and work collaboratively to use technologies such as
nuclear power to their full advantage. Rather than seeing nuclear power as a beneficial technology with problems we could solve together,
they view it as anathema and oppose it without regard to its benefits. As one example, the legal system of reviews
intended to protect the public became for them a vehicle for blocking nuclear power. As a result, by the 1980s the process had become so
cumbersome that it took more than 15 years for most nuclear projects to be completed. That economic burden was too much to handle, so no
new U.S. nuclear plants have been ordered since the 1970s.
coercion link
The plan is a violent act of coercion
Marshall 5 – Prof Dept. Humanities @ Masaryk University, Peer Reviewed (Alan Marshall, 2005, “The
Social and Ethical Aspects of Nuclear Waste,” Electronic Green Journal,
https://escholarship.org/uc/item/2hx8b0fp)//twonily
In the case of nuclear waste planning, it is an accepted belief within social science circles that a facility that
imposes risks on a community should be built only if the members of that community give their consent
(Gowda & Easterling, 2000). But an important issue that emerges involves the way that a potential nuclear host community may be
pressured into offering up their consent. Many prospective facilities have come across stiff opposition when
proposed by governmental or private bodies. Despite this, though, the resources and funds that nuclear resistance groups are able to
muster compared to the nuclear industry and government is very small. Governments and business can inject funds into their
side of the proposal to produce advertisements, campaigns, education projects, and so forth, all aimed at fostering a public opinion conducive to their plans. If
consent is given within such an atmosphere of often subtle but perfectly legal coercion, then what is
the ethical status of the facility? Normally we would regard all players in technology and environment debates as rational and well-informed actors
capable of making up their own minds. For instance, if a radioactive waste facility was planned in a disused metro station in central New York or London and then
opposed by the local people, we’d regard the people as being quite rational and informed. But as Blowers and Shrader-Frechette have illustrated, the
communities subjected to waste facility plans (and the workers who are promised jobs in these facilities) may be regarded as
peripheralized communities and economically disadvantaged workers, unable to access all the
information they need, unable to access independent points of view, and unable to fully judge the economic
benefits versus the radiological risk. All this gives rise to what Shrader-Frechette (1991) and Wigley (Wigley & Shrader-Fechette, 1994) would call the
consent dilemma: wherein the siting of nuclear waste facilities and the employing of nuclear waste workers requires the consent of those who are put at risk; yet those most able to give free, informed consent are usually
unwilling to do so, and those least able to validly consent are often willing to do so because they are unaware of the dangers. These
problems then beg us to ask the following questions with regards to siting nuclear waste facilities. * What is an adequate level of information and understanding for
people to make a decision? * Do all stakeholders have equal access to adequate information and assistance in understanding? * Who should be in charge of
ensuring adequate and equally-accessed information and understanding?
fem link
Fem link
Marshall 5 – Prof Dept. Humanities @ Masaryk University, Peer Reviewed (Alan Marshall, 2005, “The
Social and Ethical Aspects of Nuclear Waste,” Electronic Green Journal,
https://escholarship.org/uc/item/2hx8b0fp)//twonily
A general feminist critique would posit that a lot of environmental and technology policy is biased towards male interests
and perpetuates a patriarchal society (Buckingham-Hatfield, 2000; Everts, 1998). As a possible example of the gendered nature of radioactive
waste, the report to the 3rd COWAM Seminar (History and some facts to Wellenberg, 2002) indicates that only 41% of women polled in a
potential repository site accepted the idea of a nuclear waste repository in their area compared to 52% of males. Other
commentators, like Gregory and Satterfield (2002), have noted that woman have a greater degree of sensitivity to risk in various
hazardous environmental projects. Undoubtedly, there are a myriad of reasons for such situations: the sensitivity of women as a
social group to environmental issues due to their self-perceived social roles, the sensitivity of men as a social group to technical
issues due to their jobs, the higher expectations within men that economic benefits will actually help them and their
families compared to a lower expectation among women for the same thing.
politics links
Philosophical objections and environmental lobbies’ opposition spark massive backlash over the
aff—empirical evidence and turns the case
Lorenzini, 11/27/13—a retired PacifiCorp executive and former general manager of contract operations
at DOE’s nuclear defense facilities (Paul, "A Second Look at Nuclear Power", Issues in Science and
Technology, issues.org/21-3/lorenzini/)//twonily
Ideological blinders
This deeply felt philosophical position could help explain the harsh rhetoric. It is “modern technology with
its ruthlessness toward nature,” as University of California, Los Angeles, historian Lynn White characterized it in a 1967 essay. The
prominent psychologist Abraham Maslow attacked science as a “dead end” that had become a “threat and a
danger to mankind.” E. F. Schumacher complained in his influential 1973 critique of modern society, Small is Beautiful, that
humans are “dominated by technology,” and called technology a “force that is out of control … [It] tends to
develop its own laws and principles, and these are very different from human nature.” The troubling consequence of these
declarations has been a tendency to trivialize the enormous benefits in public health, material prosperity, and
lengthened lifespan that science and technology have made possible. As a result, these ideologies have too often become
barriers to developing and using the technologies humans really need. A particularly revealing aspect of this has been the
singular intensity with which environmentalists have opposed nuclear power, knowing full well it would
mean a wider use of coal with its known environmental and human health disadvantages. Why would nuclear
power receive such intense scrutiny since coal too supports industrial growth? A partial explanation for the difference in
treatment is that coal combustion is a comfortingly familiar technology, whereas nuclear power symbolizes as
nothing else the new world of technological advancement. But nuclear power touches an even deeper ideological
chord: mistrust of modern institutions. Nuclear power depends on functioning public institutions to ensure
plant safety and to protect the public from radiation hazards. The political left, where environmental lobbies are most
comfortable, doesn’t trust these institutions. More basically, they mistrust the values of modern Western society that these institutions
embody, particularly their capitalist economics and their reliance on science and technology. This
philosophical predisposition
against technology explains, at least to some extent, why virtually the entire environmental lobby would have
opposed nuclear power when the overwhelming proportion of scientists was on the other side of the
issue. Many people today remain skeptical about nuclear power, even though recent polls show that as many as 73 percent of college
graduates favor nuclear power, as do 65 percent of the general population. Much of the skepticism about nuclear power
has been influenced by a relatively small activist environmental lobby that is motivated as much by
ideology as by concerns with the technology itself. These ideological differences make it difficult, if not
impossible, to find a common ground and work collaboratively to use technologies such as nuclear
power to their full advantage. Rather than seeing nuclear power as a beneficial technology with problems we could solve together,
they view it as anathema and oppose it without regard to its benefits. As one example, the legal system of reviews
intended to protect the public became for them a vehicle for blocking nuclear power. As a result, by the 1980s the process had
become so cumbersome that it took more than 15 years for most nuclear projects to be completed. That
economic burden was too much to handle, so no new U.S. nuclear plants have been ordered since the 1970s.
They can’t overcome the nuclear stigma – the public’s not informed enough
Marshall 5 – Prof Dept. Humanities @ Masaryk University, Peer Reviewed (Alan Marshall, 2005, “The
Social and Ethical Aspects of Nuclear Waste,” Electronic Green Journal,
https://escholarship.org/uc/item/2hx8b0fp)//twonily
One of the concerns that arises from the side of the nuclear industry regarding nuclear waste management is
that the public does not fully understand the technical issues at hand. This makes it impossible for the nuclear
industry to garner full public acceptance of their plans. This perceived public deficit of knowledge gives rise to
what Alan Irwin and Brian Wynne label the public ignorance model of citizen participation. If only the public can be rescued
from their ignorance, this model suggests, they would be freed of their irrational dread associated with nuclear operations. The public
ignorance model, which advocates a form of public participation based upon education, has its roots in the presumption held by many scientists
and technologists that the reason people do not fully trust the scientifically-proven point of view is because the public don’t fully understand it.
For example, Sundqvist (2002) says: There
is a widely held image, in the
rhetoric of decision makers, of lay people as uninformed, ignorant and fearful of the unknown. This image
suggests that if the level of information is raised, lay people will accept the proposals from decision makers. (p.
14) Rosa et al. (1993) echo this point with regard to the 50 years of nuclear facility siting in the United States: The nuclear subgovernment, then as now, was guided by the unshakeable belief that increased public understanding—the
knowledge fix—would translate into support for nuclear technologies. All that was required was thoughtful public relations to
convert the dull, scientific knowledge into interesting, convincing public knowledge. (p. 77) Susana Hornig Priest (Hornig Priest, Bonfadelli &
Rusanen, 2003), drawing from her social studies of biotechnology, points out that any determined effort to use public relations
to educate the public about controversial science and technology is prone to backfiring. Rosa et al. (1993, p. 315) have found that the
same thing happens when the nuclear industry starts up campaigns aimed at using the media to disseminate
information.
Opposition is ideological—the aff’s new standards can’t solve—best studies
Lorenzini, 11/27/13—a retired PacifiCorp executive and former general manager of contract operations
at DOE’s nuclear defense facilities (Paul, "A Second Look at Nuclear Power", Issues in Science and
Technology, issues.org/21-3/lorenzini/)//twonily
Ideological blinders
Many analysts have attempted to explain the visceral hostility toward nuclear power, and the most common
explanation is that people link nuclear power with nuclear weapons. Others say it is simply irrational fear. Although
fear of unfamiliar technology is understandable, it hardly explains the organized opposition from those who are well educated and technologically literate and who
have given the movement its legitimacy. There is, however, a different question one might ask: To what extent have such fears
been exploited and encouraged by nuclear opponents for reasons that are more ideological than
scientific? Two surveys taken in the early 1980s speak volumes on this question. In 1982, a random survey of scientists listed in American
Men and Women of Science sought to describe with some objectivity the attitudes of scientists toward nuclear power. The survey was
conducted roughly a year and a half after the accident at Three Mile Island, a time when virtually every environmental organization, claiming to
act on the best science, had lined up in opposition. At the time the survey was taken, a poll had reported that almost one
in four
Americans believed that a majority of scientists who are energy experts opposed further development of
nuclear energy. For years the media had hammered home the message that there were deep divisions
within the scientific community about nuclear power, a message that reinforced the legitimacy of the
antinuclear movement. But the results of the scientist survey showed overwhelming support for nuclear
power. Nearly 90 percent of the scientists surveyed believed nuclear power should proceed, with 53 percent saying it should proceed
rapidly. So why would nearly the entire environmental community be on one side of the nuclear question while, overwhelmingly, scientists
were on the other? Six months later, another survey of attitudes toward nuclear power development focused
on “opinion
leaders.” Seven different groups were surveyed, each of which was assumed to play a key role in
shaping opinions on nuclear power. Those surveys included directors of major national organizations such as the Natural
Resources Defense Council, Friends of the Earth, the Sierra Club, and Critical Mass, as well as important regional antinuclear groups. Those
surveyed were asked to rate the relative importance of 13 different areas of concern about nuclear power, including plant safety, risks to
workers, high-level and low-level waste disposal, transportation, decommissioning, and proliferation. Supporters of nuclear power differentiated among these concerns, rating some quite important and others
of little import. Opponents of nuclear power, on the other hand, considered virtually every item to be of critical importance. “Clearly the anti’s
make few distinctions in their assessments of nuclear power’s dangers,” the researchers noted, “which raises the possibility that their views on
these problems may be less the cause of their opposition to the development of nuclear energy than its consequence.” In other words,
although the debate over nuclear power had been waged primarily on a technical front with arguments
focused exclusively on technical issues, it seems likely that for many antinuclear activists their ideological
position came first and the technical arguments were adopted to fit it. These surveys have not been updated, so it is
possible that attitudes may have shifted somewhat over the years. Even so, the rather remarkable alignment at the height of
the controversy—virtually the entire environmental lobby on one side while virtually the entire group of
scientists was on the other—strongly points to an ideological polarization that existed at the time and likely
continues today.
The link here is to a line of thought going clear back to Rousseau, with its evolutions through 19th-century romantics,
20th-century existentialists, and other individual thinkers, most prominently Nietzsche. The consistent theme has been hostility toward the
“mechanical and soulless” world of science and the technologies that flow from it. During the 1960s, it resonated with writers such as Jacques
Ellul and Herbert Marcuse, who saw our technological society as dehumanizing. Others such as Paul Ehrlich and Barry Commoner equated
technological growth with a pending environmental crisis. Environmentalism
itself changed, from a pre-1960s
preservationist posture to a post-1960s attack on Enlightenment visions of progress, identified especially
with technology.
NIMBYism means there’s only a risk of a link
Marshall 5 – Prof Dept. Humanities @ Masaryk University, Peer Reviewed (Alan Marshall, 2005, “The
Social and Ethical Aspects of Nuclear Waste,” Electronic Green Journal,
https://escholarship.org/uc/item/2hx8b0fp)//twonily
Negative public reactions to radioactive waste facilities are often construed as an operation of the NIMBY (Not-In-My-Backyard) syndrome. NIMBYism, under this interpretation, is the emotive, reactionary impulse of local citizens to a
project they would probably agree with were it placed somewhere else. Some, like Rosa, Dunlap, and Kraft (1993), feel that such
NIMBYism may just be the predictable result of the alienation that people feel to national decision-making
processes, a natural response to their resignation that their views will not ever be considered. According to some research, the whole
concept of NIMBYism has little explanatory power when used to interpret the politics of managing and siting radioactive waste facilities. The NIMBY concept predicts that those people
physically closest to any planned facility should be those most objecting to it, but when Krannich, Little, and Cramer (1993) studied the phenomenon as applied to the Yucca Mountain
repository in Nevada they found that opposition and concern are strongest in the communities farthest from Yucca Mountain.
Another theme that the faltering NIMBY concept predicts is that the arguments of opponents will be emotionally driven by fear and dread and that they will be lacking in technical
sophistication. But according to Kraft and Clary (1993), who were studying repository-siting meetings, only 14% of those members of the public testifying made declarations of this kind.
Emotive themes were present for only a relatively small number of those making statements; the vast majority did not appeal to emotionalism. Kraft and Clary also repeat the idea forged by
numerous previous studies that a great amount of public testimony from non-expert individuals and groups is of comparable technical sophistication to that of the experts (Martin, 1996).
After reviewing the way public acceptance of a facility is either forthcoming or not within various affected communities
across the United States, Rosa et al. (1993) come to the conclusion that resistance to nuclear waste is so widespread that it does not conform
to NIMBYism at all but to NIABYism: Not In Anyone’s Backyard (p. 318). Although NIMBYism is denounced by many project planners as the irrational knee-jerk
reaction of technically unsophisticated locals acting out of self-interest, if we trust the research outlined above, it seems as though the quick and indiscriminate
labeling of resistance as NIMBYism is but the knee-jerk reaction of politically unsophisticated project
planners who themselves are reacting under self-interest. A number of works, like, for instance, those of Rabe (1994), Dunion (2003), and McAvoy (1999), would confirm this view.
The plan triggers massive backlash – psychology proves nuclear waste invokes moral stigmas
Marshall 5 – Prof Dept. Humanities @ Masaryk University, Peer Reviewed (Alan Marshall, 2005, “The
Social and Ethical Aspects of Nuclear Waste,” Electronic Green Journal,
https://escholarship.org/uc/item/2hx8b0fp)//twonily
According to work by Slovic, Layman, and Flynn (1993, p. 64) nuclear waste can be regarded as the top neighbor from
hell, ranking higher than oil refineries, chemical plants, garbage dumps and even nuclear power stations as the most undesirable
facility to live beside. The aversion to things nuclear, including nuclear waste, is often referred to as
nuclear stigma and it has a number of possible effects: economic, social, political, cultural and psychological.
With regard to the last of these, while there may be a case to state that the people of nuclear host communities are active in the construction
of a positive nuclear identity, it is apparent that some members
of the public are concerned about the mental stress of
living close to a nuclear site (or the prospect of the same) (Dunlap, Rosa, Baxter & Mitchell, 1993; Edelstein, 1988). In such
circumstances, if nuclear waste managers are to take social issues seriously then maybe they should consider the ideas
brought out by the likes of Lois Wilson (2000, p. 87), and Wendy Oser and Molly Young Brown (1996) who suggest professional
counseling in some form should be provided to local individuals or groups. Kristen Shrader-Frechette (1993) suggests
also that giving citizens funding for education and health might alleviate this problem, as might delegating authority to monitor stress to the
community itself. This would allow local people to have some degree of self-help
capacity over their own psychological and stress problems. Another type of stigma that may rear its head in the siting of radioactive
waste facilities is that associated with moral stigma. Easterling and Kunreuther (1995, p. 137) indicate that the moral qualms
that people feel toward nuclear weapons seem to have generalized to civilian nuclear power. And thence,
to anything nuclear, such as the radioactive waste left over from nuclear weapons and nuclear power production. In this case,
if a nuclear waste management facility goes against the morals of individuals, it is not only politically
problematic, giving rise to resistance, but ethically problematic, asking people to live with a facility they
find morally objectionable. As far as these people are concerned, it is flippant for nuclear waste facility planners to
derail weapons/waste connections by indicating that they are only involved in the rear-end of the nuclear cycle,
when so much of the waste was produced for military purposes. Nuclear stigma has also been identified as having identifiably
negative economic consequences. New industries may be reluctant to set up near nuclear waste facilities
in fear that their products will suffer negative nuclear stereotyping (Great Britain, Parliament, House of Lords, Select Committee on Science and
Technology, 1999, p. 43). In the states of Nevada and Texas, for example, pre-emptive concerns were
expressed regarding
the reputations of the tourist and cattle industries when sites in these states were considered for nuclear waste facilities
proposed by the U.S. Department of Energy (Brody & Fleishman, 1993, p. 117; Slovic & Flynn, 1991; Easterling & Kunreuther, 1993). Similarly
agricultural communities in eastern Washington state were concerned that the establishment of a nuclear repository
at Hanford would be seen as leading to the contamination of fruits and wines grown in the area, thereby causing a decline in the economy
(Easterling & Kunreuther, 1995, p. 137).
The plan’s not enough – lack of transparency dooms solvency
Bunn 99 – Professor of Practice; Co-Principal Investigator; Project on Managing the Atom (Matthew
Bunn, August 30, 1999, “Enabling A Significant Future For Nuclear Power: Avoiding Catastrophes,
Developing New Technologies, Democratizing Decisions -- And Staying Away From Separated
Plutonium,”
http://belfercenter.hks.harvard.edu/publication/2014/enabling_a_significant_future_for_nuclear_powe
r.html)//twonily
Nuclear technology -- both military and civilian -- is inherently a difficult subject for democracies to grapple with.
Today, much of the information needed for informed decisions is either secret, commercially proprietary, or too
complex for most members of the public to fully understand. We should work to reduce these barriers, declassifying information, releasing
broad categories of data now considered proprietary, and organizing the information in ways that allow it to be found and understood. (I
regard it as particularly remarkable, for example, that although both COGEMA and BNFL are state-owned firms operated for the benefit of the
French and British taxpayers, those taxpayers are not allowed to know even the basics of the detailed cost-benefit calculations that factor into
decisions on whether to build and operate new plants.) Most important, if
the nuclear industry is ever to win the public's
confidence, it will have to do far better at honestly, candidly, and quickly relaying the bad news, relating to accidents,
releases, and the like. The kinds of lies and cover-ups that followed the Chernobyl accident, or even the recent incidents at plutonium facilities
in Japan, simply cannot be allowed to recur. Ultimately, I believe that even
with radically simpler, safer, cheaper, more
proliferation-resistant technology, nuclear power will not achieve the broad public, government, and utility
support needed to be a major player in the 21st century unless there is a radical democratization of nuclear
decision-making, putting a well-informed public in a position to ensure that its concerns are adequately addressed at every step of the
way. Many of the nuclear countries are groping toward this general idea, trying to find ways to integrate public concerns into decision-making -- from the round-table talks in Japan, to the inter-party consensus talks that were attempted in Germany, to the DOE's Openness Initiative and
enormously expanded use of the public consultation provisions of the National Environmental Policy Act in the United States. I don't believe
any country or organization is fully satisfied that they've found a workable solution. This is one of the great intellectual challenges for those of
us who think about nuclear policy.15
**disads
epa turn
The plan undermines EPA environmental regulation in the long term – this destroys the ozone layer
McGarity 13 – Joe R. and Teresa Lozano Long Endowed Chair in Administrative Law @ Univ. Texas
School of Law (Thomas McGarity, “EPA AT HELM'S DEEP: SURVIVING THE FOURTH ATTACK ON
ENVIRONMENTAL LAW,” 24 Fordham Envtl. Law Rev. 205)//twonily
Measured by the changes it has induced in the environmental statutes, the Fourth Assault on regulation
has thus far been a failure. None of the statutes has been amended, and even riders in appropriations bills
have thus far run aground. The Fourth Assault has, however, had a discernable impact on EPA's efforts to implement those statutes under Administrator
Lisa Jackson. The White House stopped the ozone rulemaking dead in its tracks, and several other rulemaking initiatives,
like the coal ash rule, have slowed down considerably. The "look back" exercise required by President Obama's executive order, like
similar exercises required by nearly all of his predecessors, diverted precious time and resources away from the agency's
primary mission and did little to mollify skeptical companies. The 2012 elections did little to change the political dynamic underlying the Fourth Assault.
The Tea Party faction of the [*241] Republican Party was somewhat less in evidence during the 2012 elections than in the 2010 elections, but Republican candidates
throughout the country were especially careful not to stray far from the Tea Party line. This almost certainly contributed to the Republican Party's failure to regain
control of the Senate in a year in which more Democratic seats were open than Republican seats. The fact that the Republican Party retained control of the House
of Representatives, combined with the fact that Tea Party advocates make up a significant proportion of that majority, should guarantee that the Fourth Assault will
continue in the House for at least another two years. Whether the assault will be as aggressive in the House as it was during the 112th Congress will depend on
whether the Republican leadership feels sufficiently chastised by the outcome of the presidential and Senate races to attempt to moderate the tone of the vocal
EPA critics in the membership. The
battered occupants of the Federal Triangle Complex (the site of EPA's headquarters) have
survived three powerful assaults from the business community and its allies in Congress, conservative think tanks, the conservative echo
chamber, and at times even from within its own walls. In 2009, it seized the offensive with a number of major rulemaking efforts, some of which (like the
greenhouse gas initiative) have become law, but many of which remain bottled up within the administration. With the vote of confidence that the administration
received in 2012, EPA
should remain on the offensive by completing important regulatory initiatives, like the coal ash
disposal regulations and the new source performance standards for fossil fuel-fired power plants, while the forces aligned against it are in
some disarray. There are few indications that the business community and its allies plan to moderate the Fourth Assault in light of the 2012 elections.
EPA and its allies should meet it head on with new and stronger protections to allow the environment
upon which we all so greatly depend to flourish.
Ozone depletion causes extinction
Greenpeace 95 (“Full of Holes: Montreal Protocol and the Continuing Destruction of the Ozone Layer”,
http://archive.greenpeace.org/ozone/holes/holebg.html)//twonily
When chemists Sherwood Rowland and Mario Molina first postulated a link between chlorofluorocarbons and ozone layer depletion in 1974,
the news was greeted with scepticism, but taken seriously nonetheless. The
vast majority of credible scientists have since
confirmed this hypothesis. The ozone layer around the Earth shields us all from harmful ultraviolet radiation from
the sun. Without the ozone layer, life on earth would not exist. Exposure to increased levels of ultraviolet radiation can
cause cataracts, skin cancer, and immune system suppression in humans as well as innumerable effects on other living
systems. This is why Rowland's and Molina's theory was taken so seriously, so quickly - the stakes are literally the continuation of
life on earth.
iraq federalism turn
Iraq federalism causes secession and civil war
Walen 3 (Alec Walen, professor at the University of Baltimore, “Federalism For Postwar Iraq,” April 10,
2003, http://writ.news.findlaw.com/commentary/20030410_walen.html#bio)//twonily
Unfortunately for the DPWG, federalism in Iraq seems to carry a huge cost: devolving power to the provinces
threatens to lead to the disintegration of Iraq as a country. Each province could grow to feel that it has its own
distinct identity, and that it would be better off governing itself without any restrictions from the center.
The dangers of fragmentation are quite real. Fragmentation would likely result in a series of bloody civil wars, made
especially grave as groups struggle to control Iraq's vast oil reserves. In addition, the secession of the Kurds in particular would likely
draw Turkey into the fray. Turkey has a large Kurdish population of its own, and it does not want to see an independent Kurdistan on
its borders, tempting its own Kurds to try to secede in order to create a greater Kurdistan.
federalism turn
The plan’s application of federalism is incorrect and uniquely disrupts environmental regulation
Yoo 4 – Prof. Law at UC Berkeley School of Law (John Yoo, 1/1/14, “Judicial Safeguards of Federalism
and the Environment: Yucca Mountain from a Constitutional Perspective,”
http://scholarship.law.berkeley.edu/cgi/viewcontent.cgi?article=2500&context=facpubs)//twonily
In Part I of this article, we lay out our argument that judicial review must include review of federalism questions. We
begin by describing the political safeguards theory of federalism, which postulates that political safeguards offer sufficient protection of
federalism and should be the exclusive protection for federalism. We then discuss why the
political safeguards theory is
incorrect. Specifically, we contend that the text of the Constitution does not support the exclusion of federalism from judicial review, and
that the review of federalism issues is inherent in deciding supremacy conflicts. We also contend that the political safeguards theory is
consistent with coordinate branch review. In Part II, we discuss two models of federalism: the political autonomy model and the dual
sovereignty model. We explain that each of these models has
important impacts on how federalism questions are
analyzed by the U.S. Supreme Court. In Part III, using these models, we examine claims that the siting of the nation's
nuclear repository at Yucca Mountain violates core constitutional principles of federalism. We conclude that
the siting of the repository at Yucca Mountain does not offend the balance of power between the federal
government and the states as established by the Constitution. Instead, we find that the issues raised regarding this
decision are well within the federal government's enumerated powers. As a result, the question is not whether
the federal government has the power to locate the repository at Yucca Mountain, but whether it should, leaving it
a question for policymakers, not constitutional scholars.
The plan destroys long-term federalism
Butler and Harris 14 – Prof. Law and Executive Director @ Law and Economy Center @ George Mason
Univ. School of Law AND Law Clerk to The Honorable Harris L Hartz, United States Court of Appeals for
the Tenth Circuit (Henry N. Butler, Nathaniel J. Harris, 2014, “Sue, Settle, and Shut Out the States:
Destroying the Environmental Benefits of Cooperative Federalism,”
http://www.law.gmu.edu/assets/files/publications/working_papers/1357.pdf)//twonily
In many circumstances, the EPA’s failure is a failure to act when states themselves miss deadlines imposed by the varying
environmental statutes. After the state fails, various statutes require the EPA to impose a federal implementation
plan (FIP) that states must follow. At other times, though, and a major point of this paper, it is the EPA’s failings — completely independent of
the states — that lead to a consent decree. The EPA and the advocacy group then settle the lawsuit — without
any input from the states that were responsible in the first place and are now responsible for implementing the terms of
the settlement. In the settlement agreement, the EPA is required to implement its own standard if the states fail to
develop a standard by a deadline imposed by the settlement. The standard, or at least the nature of the standard, is also frequently
established by the settlement agreement. The settlement is then entered as a consent decree and the EPA is bound by the terms under
court order. After the consent decree is entered, the EPA issues an FIP because the states were unable to meet the
settlement’s deadlines, standards, or both. Just like that, states — statutorily charged with implementing pollution controls themselves — are
circumvented and the EPA takes over and imposes FIPs. Paradoxically, the EPA’s “surrendering” of its
discretionary authority to work cooperatively with the states leads to more, not less, control at the
federal level. Thus, as a result of being sued, the agency actually has more power relative to the states.
Instead of allowing the states the flexibility to continually experiment with different approaches, standards, implementation plans, and so forth, the settlement
agreements between the advocacy group and the EPA increase direct EPA control over the states. Of course, the advocacy groups that bring these suits are
generally pleased with the result. The fact that both parties get what they want as a result of the filing of the lawsuit should raise some suspicion about what is
actually going on. Consider the case of Defenders of Wildlife v. Perciasepe. On November 8, 2010, two events occurred: (1) Defenders of Wildlife filed its complaint
against EPA; and (2) EPA and Defenders of Wildlife filed a consent decree and a joint motion to enter the consent decree with the court. Although simultaneously
filing a lawsuit and a consent decree does not necessarily imply foul play, it does illustrate
how little impact states may have in the
consent decree process that may ultimately dictate what and when a state is required to do by statute.
warming turn
It’s a net increase in emissions
Lowe 5 – Emeritus Professor of Science, Technology, and Society @ Griffith University, President of the
Australian Conservation Foundation (Prof. Ian Lowe, September 2005, “Nuclear Power: No Solution to
Climate Change,”
http://www.acfonline.org.au/sites/default/files/resources/Nuclear_Power_No_Solution_to_Climate_Ch
ange.pdf)//twonily
Claims that nuclear power is ‘greenhouse free’ are false. Substantial greenhouse gas generation occurs across
the nuclear fuel cycle – uranium mining, milling, conversion, and enrichment; reactor construction, refurbishment and decommissioning;
and waste management (e.g. reprocessing, and/or encasement in glass or cement). In addition, transportation is extensive – for
example, Australian uranium may be converted to uranium hexafluoride in Canada, then enriched in France, then
fabricated into fuel rods in Japan, and the spent fuel may be reprocessed in the UK or
France resulting in plutonium, uranium and waste streams which may be subject to further international
transportation. Lifecycle estimates of greenhouse gas emissions per kilowatt-hour of nuclear electricity vary dramatically – from 2-60
grams of carbon dioxide (equivalent) per kilowatt-hour of electricity. A detailed study by the Oko-Institute calculates the figure at 34 grams
(Fritsche and Lim, 1997). Other studies calculate the figure at 30-60 grams (WISE/NIRS, 2005). At the moment, using comparatively rich uranium
ores, nuclear power generally emits far less greenhouse gases compared to fossil fuels – about 12 times less than gas power stations and about
30 times less than coal stations (WISE/NIRS, 2005). Again, the figures vary. Nuclear emits just three times less emissions per kilowatt-hour of
electricity than large, modern natural gas stations according to van Leeuwen & Smith (2004). Further, if comparing natural gas cogeneration
(electricity plus useful heat) with nuclear (for electricity) plus oil (for heat), gas
cogeneration is more greenhouse ‘friendly’
than nuclear-plus-oil, and biogas cogeneration plants even more so (Fritsche and Lim, 1997). Greenhouse gas emissions per
kilowatt-hour of electricity from nuclear are generally greater than for most renewable energy sources,
especially wind and hydroelectricity, though the differences are not great and the emissions from all three sources are far less than most fossil
fuel sources. The Oko-Institut study calculates emissions for nuclear at 34 grams/kWh, wind power 20 grams, and hydroelectricity 33 grams
(Fritsche and Lim, 1997).
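Analyst note: the card’s comparisons can be made concrete. Using the cited Öko-Institut figure of 34 g CO2-e/kWh for nuclear, the “about 12 times less than gas” and “about 30 times less than coal” multipliers imply rough absolute values for gas and coal; those two numbers are back-calculated from the multipliers, not stated in the card. A minimal sketch:

```python
# Lifecycle CO2-equivalent emissions per kWh, from the figures quoted in
# the card. Gas and coal are back-calculated from the card's "12 times" /
# "30 times" multipliers, not stated directly in the source.
nuclear = 34  # g CO2-e/kWh (Fritsche and Lim, 1997)
wind = 20     # g CO2-e/kWh
hydro = 33    # g CO2-e/kWh
gas = nuclear * 12   # implied by "about 12 times less than gas"
coal = nuclear * 30  # implied by "about 30 times less than coal"

for name, grams in [("wind", wind), ("hydro", hydro), ("nuclear", nuclear),
                    ("gas", gas), ("coal", coal)]:
    print(f"{name:8s} {grams:5d} g CO2-e/kWh")
```

This makes the card’s internal tension visible: nuclear beats fossil fuels by a wide margin on these figures, while the turn rests on nuclear still trailing wind and (narrowly) hydro.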
terror turn
Increasing nuclear power leads to terrorist attacks on nuclear reactors—causes extinction
Hodges 14 – Editor and Host of The Common Sense Show, citing FEMA inspectors of nuclear power
plants (Dave Hodges, 4/18/14, “Nuclear Power Plants Will Become America’s Extinction Level Event,”
http://www.dcclothesline.com/2014/04/18/nuclear-power-plants-will-become-americas-extinction-level-event/)//twonily
Subsequently, I decided to examine the likelihood of a monumental nuclear catastrophe in this country due to a take down of
the power grid. All discussions about a catastrophic failure of any nuclear facility must begin with a cursory
analysis of what we have learned about the Fukushima event. Does anyone else smell the presence of the Chinese through all of this given what we
know about Chinese Solar Energy Zones in the United States and their proximity to several nuclear power plants? Lessons Learned from Fukushima Fukushima is often spoken of by many as a
possible extinction level event because of the radiation threat.
Fukushima continues to wreak havoc upon the world and in the United States as we
are being bathed in deadly radiation from this event. Coming to a neighborhood near you. Because of Fukushima,
fish are becoming inedible and the ocean
currents as well as the prevailing ocean winds are carrying deadly radiation. Undoubtedly, by this time, the radioactivity has made its way into the
transpiration cycle which means that crops are being doused with deadly radiation. The radiation has undoubtedly made its way into the
water table in many areas and impacts every aspect of the food supply. The health costs to human beings are
incalculable. However, this article is not about the devastation at Fukushima; instead, this article focuses on the fact that North America could have a total of 124 Fukushima events if
the necessary conditions were present. A Festering Problem Long before Fukushima, American regulators knew that a power failure lasting for days involving the power grid
connected to a nuclear plant, regardless of the cause, would most likely lead to a dangerous radioactive leak in at least several
nuclear power plants. A complete loss of electrical power poses a major problem for nuclear power plants because the reactor core must be kept cool as well as the back-up
cooling systems, all of which require massive amounts of power to work. Heretofore, all the NERC drills which test the readiness of a nuclear power plant are predicated on the notion that a
blackout will only last 24 hours or less. Amazingly, this is the sum total of a NERC litmus test. Although we have the technology needed to harden and protect our grid from an EMP event,
whether natural or man-made, we have failed to do so. The cost for protecting the entire grid is placed at about the cost for one B-1 Stealth Bomber. Yet, as a nation, we have done nothing.
This is inexplicable and inexcusable. Our collective inaction against protecting the grid prompted Congressman Franks to write a scathing letter to the top officials of NERC. However, the good
Congressman failed to mention the most important aspect of this problem. The problem is entirely fixable and NERC and the US government are leaving the American people and its
infrastructure totally unprotected from a total meltdown of nuclear power plants as a result of a prolonged power failure. Critical Analyses According to Judy Haar,
a recognized
expert in nuclear plant failure analyses, when a nuclear power plant loses access to off-grid electricity, the event is referred to as a “station blackout”. Haar
states that all 104 US nuclear power plants are built to withstand electrical outages without experiencing any core damage, through
the activation of an automatic start up of emergency generators powered by diesel. Further, when emergency power kicks in, an automatic shutdown of the nuclear power plant commences.
The dangerous control rods are dropped into the core, while
water is pumped by the diesel power generators into the reactor to
reduce the heat and thus, prevent a meltdown. Here is the catch in this process, the spent fuel rods are encased in both a primary and secondary containment structure which is
designed to withstand a core meltdown. However, should the pumps stop because either the generators fail or diesel fuel is not
available, the fuel rods are subsequently uncovered and a Fukushima type of core meltdown commences immediately. At this
point, I took Judy Haar’s comments to a source of mine at the Palo Verde Nuclear power plant. My source informed me that as per NERC policy, nuclear power plants are required to have
enough diesel fuel to run for a period of seven days. Some plants have thirty days of diesel. This is the good news, but it is all downhill from here. The Unresolved Power Blackout Problem A
long-term loss of outside electrical power will most certainly interrupt the circulation of cooling water to the pools. Another one of my Palo Verde nuclear power plant sources informed me
that
there is no long term solution to a power blackout and that all bets are off if the blackout is due to an EMP attack.
A more detailed analysis reveals that the
spent fuel pools carry depleted fuel for the reactor. Normally, this spent fuel has had time to considerably
decay and therefore, reducing radioactivity and heat. However, the newer discharged fuel still produces heat and needs cooling. Housed in
high density storage racks, contained in buildings that vent directly into the atmosphere, radiation containment is not accounted for with regard to the spent fuel racks. In other words, there is
no capture mechanism. In this scenario, accompanied by a lengthy electrical outage, and with the emergency power waning due to either generator failure or a lack of diesel needed to power
the generators,
the plant could lose the ability to provide cooling. The water will subsequently heat up, boil away and
uncover the spent fuel rods which required being covered in at least 25 feet of water to remain benign from any deleterious effects. Ultimately, this would
lead to fires as well and the release of radioactivity into the atmosphere. This would be the beginning of another
Fukushima event right here on American soil. Both my source and Haar shared exactly the same scenario about how a meltdown would occur. Subsequently, I
spoke with Roger Landry who worked for Raytheon in various Department of Defense projects for 28 years, many of them in this arena and Roger also confirmed this information and that the
above information is well known in the industry. When I examine Congressman Franks letter to NERC and I read between the lines, it is clear that Franks knows of this risk as well, he just stops
short of specifically mentioning it in his letter. Placing Odds On a Failure Is a Fools Errand An analysis of individual plant risks released in 2003 by the Nuclear Regulatory Commission shows that
for 39 of the 104 nuclear reactors, the risk of core damage from a blackout was greater than 1 in 100,000. At 45 other plants the risk is greater than 1 in 1 million, the threshold NRC is using to
determine which severe accidents should be evaluated in its latest analysis. According to the Nuclear Regulatory Commission, the Beaver Valley Power Station, Unit 1, in Pennsylvania has the
greatest risk of experiencing a core meltdown, 6.5 in 100,000, according to the analysis. These odds don’t sound like much until you consider that we have 124 nuclear power generating plants
in the US and Canada and when we consider each individual facility, the odds of failure climb. How many meltdowns would it take in this country before our citizens would be condemned to
the hellish nightmare, or worse, being experienced by the Japanese? The Question That’s Not Being Asked None of the NERC, or the Nuclear Regulatory tests of handling a prolonged blackout
at a nuclear power plant has answered two critical questions, “What happens when these nuclear power plants run out of diesel fuel needed to run the generators”, and “What happens when
some of these generators fail”? In the event of an EMP attack, can tanker trucks with diesel fuel get to all of the nuclear power plants in the US in time to re-fuel them before they stop
running?
Will tanker trucks even be running themselves in the aftermath of an EMP attack? And in the event of an EMP attack, it
is not likely that any plant which runs low on fuel, or has a generator malfunctions, will ever get any help to mitigate the
crisis prior to a plethora of meltdowns occurring. Thus, every nuclear power plant in the country has the potential to cause a
Chernobyl or Fukushima type accident if our country is hit by an EMP attack. CAN YOU EVEN IMAGINE 124 FUKUSHIMA EVENTS IN NORTH AMERICA
HAPPENING AT THE SAME TIME? THIS WOULD CONSTITUTE THE ULTIMATE DEPOPULATION EVENT. …And There Is More… The
ramifications raised in the previous paragraphs are significant. What if the blackout lasts longer than 24 hours? What if the reason for the blackout is an EMP burst caused by a high altitude
nuclear blast and transportation comes to a standstill? In this instance, the cavalry is not coming. Adding fuel to the fire lies in the fact that the power transformers presently take at least one
year to replace. Today, there is a three year backlog on ordering because so many have been ordered by China. This makes one wonder what the Chinese are preparing for with these multiple
orders for both transformers and generators. In short,
our unpreparedness is a prescription for disaster. As a byproduct of my investigation, I have
discovered that most, if not all, of the nuclear power plants are on known earthquake fault lines. All of California’s nuclear power plants are located on an earthquake fault line. Can anyone tell
me why would anyone in their right mind build a nuclear power plant on a fault line? To see the depth of this threat you can visit an interactive, overlay map at this site. Conclusion I have
studied this issue for almost nine months and this is the most elusive topic that I have ever investigated. The more facts I gather about the threat of a mass nuclear meltdown in this country,
the more questions I realize that are going unanswered. With regard to the nuclear power industry we have the proverbial tiger by the tail. Last August, Big Sis stated that
it is not
a matter of if we have a mass power grid take down, but it is a matter of when. I would echo her concerns and apply
the “not if, but when” admonition to the possibility of a mass meltdown in this country. It is only a matter of time until this
scenario for disaster comes to fruition. Our collective negligence and high level of extreme depraved indifference on the part of NERC
is criminal because this is indeed an Extinction Level Event. At the end of the day, can anyone tell me why would any country be so negligent
as to not provide its nuclear plants a fool proof method to cool the secondary processes of its nuclear materials at all of its plants? Why would ANY nuclear power plant be built on an
earthquake fault line? Why are we even using nuclear energy under these circumstances? And why are we allowing the Chinese to park right next door to so many nuclear power plants?
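Analyst note: the card’s “when we consider each individual facility, the odds of failure climb” reasoning is the standard aggregation of small independent risks: P(at least one event) = 1 − Π(1 − pᵢ). The sketch below is illustrative only; the per-plant probabilities are assumptions loosely patterned on the NRC figures quoted in the card (Beaver Valley at 6.5 in 100,000, 38 more near 1 in 100,000, the rest of the 124 near 1 in 1,000,000), and independence across plants is assumed.

```python
# Illustrative aggregation of per-plant annual core-damage risk across
# 124 North American plants. Per-plant probabilities are assumptions
# based on the NRC figures quoted in the card, not official data.
probs = [6.5e-5] + [1e-5] * 38 + [1e-6] * 85  # 124 plants total

p_none = 1.0
for p in probs:
    p_none *= 1.0 - p  # probability that no plant has an event this year

p_any = 1.0 - p_none   # probability at least one plant has an event
print(f"Aggregate annual P(at least one core-damage event): {p_any:.4%}")
```

The point the card gestures at falls out directly: the aggregate probability is close to the sum of the individual risks, so it is roughly an order of magnitude larger than the worst single-plant figure.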
naval readiness turn
LOST legitimacy kills naval readiness
Bell 12 – Forbes Contributor specializing in aerospace, environment, energy, and Second Amendment
policy (Larry Bell, 5/20/12, “Will U.S. Sovereignty Be LOST At Sea? Obama Supports U.N. Treaty That
Redistributes Drilling Revenues,” http://www.forbes.com/sites/larrybell/2012/05/20/will-u-s-sovereignty-be-lost-at-sea-obama-signs-u-n-treaty-that-redistributes-drilling-revenues/)//twonily
A proposed Law of the Sea Treaty (LOST), which is supported by President Obama but has not yet been ratified by Congress, will subordinate U.S.
naval and drilling operations beyond 200 miles of our coast to a newly established U.N. bureaucracy. If
approved, it will grant a Kingston, Jamaica-based International Seabed Authority (ISA) the power to regulate deep-sea oil exploration, seabed mining, and fishing
rights. As part of the deal, as much as 7%
of U.S. government revenue that is collected from oil and gas companies operating off our coast will
be forked over to ISA for redistribution to poorer, landlocked countries. This apparently is in penance for America’s
audacity in perpetuating prosperity yielded by our Industrial Revolution. Under current law, oil companies are required to pay royalties to the
U.S. Treasury (typically at a rate of 12 ½% to 18%) for oil and gas exploration in the Gulf of Mexico and off the northern coast of Alaska. Treasury keeps a portion,
and the rest goes to Gulf states and to the National Historic Preservation Fund. But if LOST is ratified, about half of those Treasury revenues, amounting to billions, if
not trillions of dollars, would go to the ISA. We will be required to pay 1% of those “international royalties” beginning in the sixth year of production at each site,
with rates increasing at 1% annual increments until the 12th year when they would remain at 7% thereafter. Like the U.N.’s Kyoto Protocol debacle that preceded it,
this most recent LOST cause embodies
the progressive ideal of subordinating the sovereignty of nation states to
authoritarian dictates of a world body. The U.S. would have one vote out of 160 regarding where the money would go, and be obligated to hand over
offshore drilling technology to any nation that wants it… for free. And who are those lucky international recipients? They will most likely include such undemocratic,
despotic and brutal governments as Belarus, Burma, China, Cuba, Sudan and Zimbabwe…all current voting members of LOST. The treaty was originally drafted in
1968 at the behest of Soviet bloc and Third World dictators interested in implementing a scheme to weaken U.S. power and transferring wealth from industrialized
countries to the developing world. It had been co-authored by Elisabeth Mann Borgese, a socialist and admirer of Karl Marx who ran the World Federation of
Canada. In a 1999 speech she declared: “The world ocean has been and is so to speak, our great laboratory for making a new world order.” Recognizing this as a
global grab, President Reagan thought it was such a lousy idea that he not only refused to sign, but actually fired the State Department staff that helped negotiate it.
Former U.N. Ambassador John Bolton warns that world circumstances are even much less favorable to the U.S. for LOST enactment now: “With China emerging as a
major power, ratifying the treaty would encourage Sino-American strife, constrain U.S. naval activities and do nothing to resolve China’s expansive maritime
territorial claims.” The treaty has been pitched as an effort to protect the world’s oceans from environmental damage and to avoid potential conflicts between
nations. Accordingly, ISA
would settle international maritime and jurisdictional disputes, possibly even to the extent of
overriding our U.S. Navy’s freedom of navigation and governing where ships can and cannot go. ISA’s
prerogative to do so would be entirely consistent with a “global test” definition advocated by key LOST proponent Senator John Kerry in 2004. The treaty contains a
clause empowering the ISA to take whatever steps it deems necessary to stop “marine pollution.” According to William C. G. Burns of the Monterey Institute of
International Studies, its expansive definition of pollution could be read to include “…the potential impact of rising sea surface temperature, rising sea levels, and
changes in ocean pH as a consequence of rising levels of carbon dioxide in sea water.” Burns warns that this could “give rise to actions under the Convention’s
marine pollution provisions to reduce carbon emissions worldwide.” He warns that this can easily be expanded to include anti-global warming measures, and since
it would be “self-executing”, U.S. courts can be used to enforce it. Powerful environmental organizations love LOST because it will afford a legal system for dispute
resolution which culminates in a 21-member international tribunal (ITLOS) based in Hamburg which can be enforced against American companies without
possibilities of U.S. court appeal. Numerous lawsuits charging global warming dangers linked to greenhouse emissions from ships will most likely supersede binding
rules of the discredited Kyoto Protocol which the U.S. wisely never ratified. The U.S. Navy maintains that we need LOST to guarantee free transit in dangerous
waters, such as in the Strait of Hormuz, which Iran has threatened to block, and in the South China Sea which is dominated by China. Yet freedom of
navigation has been recognized under international law for centuries.
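Analyst note: the royalty escalation the card describes (no international royalty for the first five years of production at a site, 1% of revenue in year six, rising one percentage point per year until a flat 7% from year twelve onward) is easy to state precisely. This sketch simply encodes the card’s own schedule; the function name is ours.

```python
def isa_royalty_rate(year: int) -> float:
    """ISA royalty rate (fraction of revenue) for a given year of
    production at a site, per the schedule described in the card."""
    if year <= 5:
        return 0.0                        # no international royalty in years 1-5
    return min(year - 5, 7) / 100.0       # 1% in year 6, +1%/yr, capped at 7%

# Example: the full schedule for the first 14 years of a site.
schedule = [isa_royalty_rate(y) for y in range(1, 15)]
```

Laying the schedule out this way makes the link quantitative: combined with the current 12.5%-18% federal royalty figures quoted above, a mature site would cede roughly half the Treasury’s take to the ISA.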
environment turn
The plan devastates the environment – their defense relies on models which assume non-living
systems—leaks and radiation are inevitable
Edwards 5 – President of the Canadian Coalition for Nuclear Responsibility (Dr. Gordon Edwards, 2005,
“Alternative Proposals to “Dispose” of Radioactive Wastes,”
http://www.ccnr.org/radwaste_readings.pdf)//twonily
This is a kind of "advanced dumping" option. It has often been suggested that the oceans are so vast that nuclear waste could be
safely disposed of at sea; the radioactive poisons would become so diluted and so dispersed, reaching such a low level of
concentration, that the danger would become negligible. This idea dramatizes the difference between physical sciences
and biological sciences. In a non-living environment, material that is dissolved in water spreads out uniformly in all
directions, resulting in very low concentrations at any one place. However, living organisms have the ability to seek out and
concentrate dilute materials (nutrients) into their bodies. Thus biological organisms can often reverse expectations
that are based on the study of non-living systems. This is the principle behind bio-accumulation and bio-magnification. Many radioactive
materials that enter the food chain can be reconcentrated by factors of thousands or hundreds of thousands as they
work their way up the food chain. Think of mercury concentrations in fish, or DDT concentrations in birds of prey such as eagles or falcons. Thus
we cannot predict the end result of a "dilute and disperse" approach, and no nation is pursuing this idea. It is
important to remember that there are literally hundreds of different radioactive materials in irradiated nuclear fuel, and
these materials behave exactly the same as their non-radioactive cousins. Thus radioactive iodine behaves just the same way as non-radioactive iodine -- it goes straight to the thyroid gland. Once there, however, the radioactivity damages the thyroid;
it can cause thyroid disorders which impair the growth, well-being, or even the intelligence of a child, as well as causing
tumors (both cancerous and non-cancerous). Other radioactive materials mimic non-radioactive materials. Our digestive system
cannot tell the difference between potassium and cesium, so radioactive cesium is stored in our muscle
tissues when it gets into our food supply. Similarly, radioactive strontium is stored in our bones, teeth, and mother's milk,
because our body cannot tell the difference between it and non-radioactive calcium. In short, our bodies have not evolved in a way
that will allow our digestive systems to detect or reject radioactive materials in our food; the same can be said for all other living
things, as far as we can tell. Sub-seabed disposal would involve placing the wastes in containers below the seabed, so that it will take a long time (hopefully thousands of years) for the containers to disintegrate and the waste materials to be dissolved in
the ocean water. Thus it is a "dilute and disperse" option with a time delay built in. Sub-seabed disposal was
investigated extensively in the 1980's by the Nuclear Energy Agency of the OECD (Organization for Economic Cooperation and Development).
Canada participated in this work, along with Japan, USA, UK, and other countries. This research was ended in the 1990s when it became clear
that there would always be intense political opposition internationally to such an option.
Extinction
Nissani 92 (Moti, writer who has examined the reception new scientific discoveries have received in
history, “Lives in the Balance: the Cold War and American Politics, 1945-1991”,
http://www.is.wayne.edu/mnissani/PAGEPUB/CH2.html)
Some 15 percent of the bomb's energy is taken up by ionizing radiation. From the psychological point of view, and from the point of view of
humankind's long-term future, radiation
is perhaps the most frightening direct effect of nuclear explosions. We can sense blast,
heat, and fire, but we
can't detect ionizing radiation (except at very high intensities when it produces a tingling sensation4) without
the aid of special instruments; we can be irradiated to death without knowing it. Unlike fire and blast, ionizing
radiation not only damages our health, but, through its potential impact on fetuses and on reproductive
cells, it may damage the health of our descendants. Though the heat and the blast wreak incredible havoc, their direct
effects are gone within seconds, or, in the case of the fires they cause, within hours or days. In contrast, poisonous radioactivity may
linger for years. X-rays are the most familiar type of ionizing radiation. Owing to their ability to penetrate the human body, they are
widely used as a diagnostic tool. But even when used in minuscule doses (as in dental examinations), X-rays can cause slight problems by
damaging, or ionizing, the chemical constituents of our bodies. Two overlapping schemes are used to classify the ionizing radiations produced
by nuclear bombs. The first, which will not be taken up here, is based on their ability to penetrate matter. The second scheme is based on their
order of appearance. Initial radiation is released within the first minute of an explosion. It accounts for about 5 percent of the bomb's energy.
The initial radiation of a 12.5 kt explosion will
knock unconscious people standing in the open at a distance of less than half a mile
from ground zero. These people will die from radiation sickness within two days (even if they somehow managed to escape the
heat and blast). People standing in the open three-quarters of a mile away will die within one month.6b Given
these three powerful effects-blast, heat, initial radiation-the chances of survival are slim for anyone within a one mile radius of a small nuclear
explosion. With larger explosions, or with multiple detonations in one area, the lethal range is greater. Those who manage to survive all three
must still deal with radioactive fallout (also called residual radiation). Fallout takes some 10 percent of the bomb's energy. Fallout is emitted by
fission products such as radioactive iodine, weapon residues such as plutonium and radioactive hydrogen, and substances in the vicinity of the
explosion which became radioactive as a result of exposure to the bomb's initial radiation. Radioactive
fallout is usually classified into
two components, early and delayed. Early fallout reaches the ground with in 24 hours of the explosion. Delayed fallout
reaches the ground after 24 hours. Early fallout is also called local fallout because it tends to remain in the vicinity of the explosion site. Delayed
fallout is also called global fallout because it can take mo
**counterplans
generic counterplan framing
Presumption stays with the team that transports waste the least – transportation is inherently more
risky than storage
Marshall 5 – Prof Dept. Humanities @ Masaryk University, Peer Reviewed (Alan Marshall, 2005, “The
Social and Ethical Aspects of Nuclear Waste,” Electronic Green Journal,
https://escholarship.org/uc/item/2hx8b0fp)//twonily
**best against land counterplan
Within and outside of the industry, the transport of nuclear waste has been perceived as inherently riskier than its
storage or disposal. The risk of such accidents has driven some writers to declare that waste transport should be
regarded as the last resort ( Nuclear Guardianship Project, 2002). According to studies by Slovic et al. (1993), somewhere between 70% and 80% of
people questioned in Nevada and California were convinced that railway and highway accidents were going to occur on route to any operating nuclear waste
facility. The public
perception of transportation as being a problem arises in part from the acknowledged
dangers emerging from industry watchdogs, the media, and the industry itself. For instance, the Association of Electronic Journalists declares that
“from 1971 to 1998, there were 1,936 accidents and incidents involving radioactive materials transport” ( Nuclear Shipping Accidents:
Rare but Regular , 2002) . When forecasting the transport problems of the proposed Yucca Mountain repository in Nevada, the U.S.
Department of Energy (DOE) predicted there will be 100 accidents over the lifetime of the project (the State of Nevada
predicts 400 accidents during the same period) (Wile & Cox, 2002). Most of these accidents would result in no, or negligible, harm to human
health and the environment. However, Wile and Cox used published DOE figures to study what that
agency calls a “moderate” accident. Wile and Cox concluded that under such an event: * A small number of first responders may be fatally affected. * Around 200 to
1,200 latent fatal cancers of nearby citizens would eventuate. * Nearly 600 million dollars would be needed to clean up the contaminated area over a 14 month
period. In the event of a transport accident it is fairly certain that local fire, police, and ambulance services might be among the first upon the scene. An ethical issue
that must be investigated here is whether all the emergency personnel from the local communities that line the proposed routes of the transported radioactive
waste should be trained in some way to deal with accidents that may involve that waste. If so, this will have ramifications concerning the security and financial
regimes under which such training might be given. Some people have argued that the transportation
of waste is so dangerous that it
should not be undertaken. The Nevada-based Citizen Alert group, for instance, points out that transportation massively
increases all the risks associated with radioactive waste handling (High level radioactive waste transportation factsheet, 2000)
. Physical, or passive, security, for instance, at stationary sites involves much more robust physical protection from
human interference and natural disaster since the strength of buildings and mobile wastes. Nevada’s Nuclear Waste Project Office confirm this when they declare
that if
transport casks were designed to protect the waste to the same degree as stationary facilities, they’d be too heavy
to be transported ( Nevada Nuclear Waste Project Office, 1999). When it comes to active security, mobile
radioactive waste cannot favorably
compare to the stationary waste either, since the former does not have the police presence, and the emergency personnel, that
regularly accompanies the latter. The Nuclear Information and Resource Service (NIRS) ( Mariotte, 1998) also points out that mobile
radioactive waste is more vulnerable to external factors than stationary waste since, as safe as we can get the transportation
system, external factors (such as drunken drivers, weather extremes, traffic emergencies—all of which have caused
accidents in radioactive transport in the past) cannot be eliminated. Another important issue regarding transport of radioactive waste is
whether the route should be openly declared. To discuss this particular issue necessitates an engagement with the never-ending balancing
act of working with security concerns versus fairness/democratic concerns. To minimize the risk of
terrorist action or theft, the usual approach is to keep the routes secret. To maximize the democratic impulse of people to
know about threats to their health and their environment, the routes should be declared. This balance may be made more complex by acknowledging that some
along the route are more concerned about nuclear stigma affecting property prices than about any health risk or environmental danger. Thus, under
the
rhetoric of fairness, there may be social pressure (and also political back-up) for the routes to remain unnamed
(Gawande & Jenkins Smith, 2001).
The conditions CP is legitimate and predictable in the context of the affirmative
Davenport 93 – Private Practice, Olympia, Washington; Of Counsel, Riddell, Williams, Bullitt &
Walkinshaw, Seattle, Washington; Special Deputy Attorney General, State of Nevada, 1983 to present;
Staff of the Environment and Public Works Committee, U.S. Senate, 1980-82 (James Davenport, Summer
1993, “THE FEDERAL STRUCTURE: CAN CONGRESS COMMANDEER NEVADA TO PARTICIPATE IN ITS
FEDERAL HIGH LEVEL WASTE DISPOSAL PROGRAM?” 12 Va. Envtl. L.J. 539)//twonily
Another possibility is the Spending or General Welfare Clause, which arguably is the constitutional basis for several environmentally salient
statutes, including, inter alia, the Clean Water Act of 1977. n106 But the Spending
or General Welfare Clause cannot have
been a basis for the originally-enacted NWPA or, more specifically, its notice of disapproval and congressional
override provisions. U.S.C. 10135 and 10136(b) provide no financial incentives to states. Discussing the Spending
Clause, Justice O'Connor states in New York: First, under
Congress' spending power, "Congress may attach
conditions on the receipt of federal funds." Such conditions must (among other requirements) bear some
relationship to the purpose of the federal spending; otherwise, of course, the spending power could render academic the
Constitution's other grants and limits of federal authority. Where the recipient of federal funds is a State, as is not unusual
today, the conditions attached to the funds by Congress may influence a State's legislative choices. In the 1987
Amendments Act, apparently recognizing that the carrot may be constitutional when the stick is not, Congress authorized
the Secretary of Energy to enter benefits agreements with prospective repository states, and Nevada in particular. n109 These agreements
could authorize the annual payment of millions of dollars to the state [*565] in return for "the acceptance of high-level radioactive waste or
spent nuclear fuel in that State". n110 Nevada has refused to enter any such agreement or receive any such benefit from the federal
government. Even though Congress may have intended to cure the coercive problem posed by the notice of disapproval and congressional
override provisions by coupling them with "benefits" provisions in the NWPA, Nevada's refusal to accept benefits leaves the notice of
disapproval and congressional override provisions bare of the constitutional assistance of the Spending Clause. n111 VI. Environmental Subsidy
On one hand, the
Constitution would not permit Congress simply to transfer radioactive waste from
generators to state governments. Such a forced transfer, standing alone, would in principle be no different than
a congressionally compelled subsidy from state governments to radioactive waste producers. The same is true of the provision
requiring the States to become liable for the generators' damages. n112
**land counterplan
1nc – land counterplan
Text: The Nuclear Regulatory Commission should designate a centralized dry cask
storage interim site as the sole candidate for spent nuclear fuel disposal.
Consent-based land storage solves the case and their DAs
Audgette 5/20 – staff writer @ Brattleboro Reformer Magazine (Bob Audgette, 5/20/14, “Spent nuclear
fuel may stay on site long after Vermont Yankee shuts down,”
http://www.berkshireeagle.com/news/ci_25798437/spent-nuclear-fuel-may-stay-site-longafter)//twonily
In fact, the Nuclear Regulatory Commission released last year a revised waste confidence rule that stated
impacts would be small if spent fuel had to be stored at nuclear sites "indefinitely." Ernest Moniz, the secretary of the U.S.
Department of Energy, was in Vermont last week. During a phone interview with the Reformer, Moniz said his department is focused on developing a way to take care of the nation's nuclear
waste. However, noted Moniz, DOE needs the go-ahead from Congress. Senate Bill 1240, which has been in the Energy and Natural Resources Committee since 2013, would establish a new
organization to manage nuclear waste, provide a consensual process for siting nuclear waste facilities and ensure adequate funding for managing nuclear waste. "The legislation has been
crafted and is totally consistent with administration policy," said Moniz. "We certainly hope to see it marked up in committee and hopefully passed." In 1983, the Department
of Energy entered into contracts with the operators of the nation's nuclear power plants and agreed to take possession of all
nuclear waste produced as a result of their operations. The plan was to move the waste to a centralized storage facility for long-term disposal, and after a siting process, Yucca Mountain
in Nevada was chosen. But after $9 billion was invested in the project, the Obama administration pulled the plug due to local opposition, environmental concerns and pressure from Harry
Reid, the Senate majority leader and democrat from Nevada. Despite all the money spent on Yucca Mountain, said Moniz, it's not a viable project. "It
certainly did not follow the consent-based process." In January of 2012, the Blue Ribbon Commission on America's Nuclear Future, of which Moniz
was a member, released a report concluding a repository needed to be established as quickly as possible, but not without
local input. Currently, all the waste produced by the power plants is being stored onsite in either spent nuclear fuel pools or dry casks. In late August 2013, Entergy announced it
would be closing Yankee at the end of 2014 because it was no longer financially viable due to the fact that natural gas has driven down the costs of producing electricity. Late last week,
Entergy announced that it would soon be asking for permission to construct an additional dry cask storage facility at Yankee. The pad will be used for the placement of 100-ton dry casks, which
will each contain up to 25 tons of spent nuclear fuel once it has cooled down enough to be removed from the fuel pool located inside the plant's reactor building. The first storage pad at
Vermont Yankee was constructed in 2006 and now holds 13 dry casks, with room for 23 more. Each cask contains 68 fuel assemblies, meaning there are now 884 assemblies in dry cask
storage. There are another 2,627 spent fuel assemblies in the pool in the reactor building and another 368 assemblies currently in the reactor vessel. The proposed new pad will be similar in
size and storage capacity to the one already on site. Senate Bill 1240 calls for the construction of a pilot facility for the storage of
priority waste, one or more additional storage facilities for the storage of nonpriority nuclear waste, and one or more repositories for the permanent disposal of nuclear waste. The
pilot facility would be used "to demonstrate the safe transportation of spent nuclear fuel and high-level
radioactive waste ... [and] to demonstrate the safe storage of spent nuclear and high-level radioactive waste ... at the one or more storage
facilities, pending the construction and operation of deep geologic disposal capacity for the permanent disposal of the spent nuclear
fuel or high-level radioactive waste." If Congress approves Senate Bill 1240, Moniz said it is hoped a pilot facility can be established early in the 2020s. He said a pilot facility should have been
part of the nation's waste storage strategy since 1983. "We should have been pursuing consolidated storage facilities in parallel with repository
development," said Moniz. The bill also calls for the development of the Nuclear Waste Administration, taking the responsibility for moving and storing the nuclear rods and other high-level
waste out of the hands of the Department of Energy. The Nuclear Waste Administration would also be responsible for finding a geological repository. Moniz said that wherever a spent fuel
repository is established, it needs to be established with the consent of the hosting community. "The consent-based approach is very crucial to us," said
Moniz. "The hosting community and the state and the federal government must be aligned if we are given Congressional
authority to pursue this work with communities that are interested. We fully expect that there will be multiple interested communities."
The process is intended to allow prospective host communities to decide whether, and on what terms they will
host a nuclear waste facility; is open to the public and allows interested persons to be heard in a meaningful way; is flexible and allows decisions to be reviewed and
modified in response to new information or new technical, social, or political developments; and is based on sound science and meets public health, safety, and environmental standards. Sen.
Bernie Sanders, I-Vt., said it's very important that states are involved in all decisions related to decommissioning, and not just the siting of a spent nuclear fuel storage facility. Under current
rules, public hearings can be held to take input, but in the end, the operator and the Nuclear Regulatory Commission are the only entities that have any real say in how a plant is
decommissioned, said Sanders. "We need to make sure that states that are undergoing decommissioning have a real seat at the table so they can participate in the best way to decommission a
plant." Mike Twomey, Entergy's vice president for external affairs, told the Reformer he and other industry executives expect that the federal government will eventually fulfill its obligation to
remove the spent fuel from Vermont Yankee and sites around the country. "Until it does, we are confident that we are storing it safely within the
spent fuel pool or in dry cask storage. This has been extensively reviewed by the NRC and we are very confident that both methods provide safe storage until
such a time as the federal government removes the spent fuel."
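As a quick check on the storage figures in the card above, the assembly counts are internally consistent. This is a throwaway sketch: every number comes from the article; only the arithmetic is ours.

```python
# Sanity-check of the Vermont Yankee storage figures quoted in the Audgette card.
ASSEMBLIES_PER_CASK = 68
casks_loaded = 13

in_casks = casks_loaded * ASSEMBLIES_PER_CASK
print(in_casks)  # 884 -- matches the card's "884 assemblies in dry cask storage"

in_pool = 2627      # spent fuel assemblies in the reactor-building pool
in_reactor = 368    # assemblies still in the reactor vessel
print(in_casks + in_pool + in_reactor)  # 3879 assemblies on site in total
```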
2nc – xt: solvency
Dry cask storage solves the case
SP 14 – Sheboygan Press (7/12/14, “Our View: Find a long-term solution to nuclear waste storage,”
http://www.sheboyganpress.com/story/opinion/2014/07/12/view-find-long-term-solution-nuke-wastestorage/12585369/)//twonily
Dominion Resources Inc. deserves credit for listening to Town of Carlton residents concerned that it might take the full 60 years allowed by
federal law to decommission the Kewaunee Nuclear Power Plant, which has not produced electricity for a year. Residents and town
officials are rightfully concerned about the negative economic impact of a shuttered nuclear plant, once so vital to the local economy, in their
midst. The company in response recently announced
plans to speed the process — by about four years — of moving spent
nuclear fuel rods from a large storage pool at the plant to more secure, long-term storage in 24 concrete
casks, each standing 18 feet tall. Company officials say the accelerated fuel storage schedule could also speed up
the plant decommissioning process. It is an expensive — and safer — proposition. The company told federal regulators it would
spend $103 million through 2016 to manage the spent fuel, according to the Milwaukee Journal Sentinel. Decommissioning the plant will cost
an estimated $884 million by 2073. Concrete cask storage is safer than leaving spent fuel cooling in water for many years. With spent
nuclear fuel, however, there is no panacea. Nobody feels comfortable with it in their back yard, no matter the method employed to store it
there. After years of planning, the
federal government halted efforts to open a nuclear waste disposal site at
Yucca Mountain in Nevada, and a blue ribbon commission formed by the Obama administration recommends that the nation set up several
regional sites to store used nuclear fuel. Nuclear experts are in general agreement that long-term storage of spent
nuclear fuel at a permanent site is the safest bet in the long term. That it hasn’t happened after many years of trying
speaks to the difficulty in establishing such a site, or series of sites. For now, on-site storage in dry casks is the next best
alternative. Such storage is being used by many nuclear sites throughout the country, including Point Beach in Manitowoc County and at a
reactor in La Crosse. The snail's pace at which these things take place can be frustrating. State, federal and local agencies all play a role, and
nothing happens quickly. That is why Dominion's announcement of an intent
to speed up the waste storage process —
however minimal the impact in the grand scheme of decommissioning — is welcome. Our only wish is that
safety is not compromised anywhere in the process.
More evidence – dry cask solves
Sweet 11 – staff writer for IEEE Spectrum (Bill Sweet, 6/7/11, “Case for Accelerating Dry Cask Storage of
Spent Nuclear Fuel,” http://spectrum.ieee.org/energywise/energy/nuclear/case-for-accelerating-drycask-storage-of-spent-nuclear-fuel-)//twonily
A newly released report from the International Panel on Fissile Materials contains information that implicitly bolsters
the case for moving spent fuel out of cooling ponds and into dry cask storage, both in the United States and in
most other parts of the world as well. After 9/11 it already was apparent that fuel in cooling ponds could make a tempting target for terrorists-and one much easier to hit than reactor cores. Now, in the wake of the dangerous fire in the Fukushima cooling pond,
the case for
accelerating dry cask storage is inescapable. With plans for permanent disposal of nuclear wastes stalled
just about everywhere except for Finland and Sweden, spent fuel should be moved as fast as possible out of cooling ponds and
into dry casks. What does that mean? As the Fissile Materials report usefully explains, "In dry cask storage, spent fuel
assemblies are typically placed in steel canisters that are surrounded by a heavy shielding shell of
reinforced concrete, with the shell containing vents allowing air to flow through to the wall of the canister and cool the fuel. A typical
dry cask for Pressurized Water Reactor fuel contains about 10 tonnes of spent fuel, roughly one half of an
annual discharge
from a 1 GWe reactor." The large cylindrical
containers (seen in the Nuclear Regulatory Commission photo
above) generally are located close to reactor sites in the United States, but
are much "harder" than the spent fuel ponds
also typically found at the sites . Worldwide, about 90 percent of spent fuel is in vulnerable cooling ponds
and only a tenth in dry casks, according to the report. The numbers are somewhat better for the United States, where,
of roughly 64,500 tonnes of heavy metal (uranium and plutonium, basically), 15,250--almost a quarter of the total--is in dry casks.
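The card's "almost a quarter" figure checks out arithmetically. A minimal sketch using only the tonnages quoted above:

```python
# Share of U.S. spent fuel in dry casks, per the Fissile Materials report
# figures quoted in the Sweet card.
dry_cask_tonnes = 15_250
total_tonnes = 64_500

share = dry_cask_tonnes / total_tonnes
print(f"{share:.1%}")  # 23.6% -- "almost a quarter of the total"
```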
On-site storage solves 100% of the impact – zero risk of their impacts
Lydersen 13 – reporter specializing in energy, environment, labor, public health, and immigration, staff
writer for Midwest Energy News (Kari Lydersen, 11/15/13, “In Illinois, nuclear industry sees no urgency
on waste storage,” http://www.midwestenergynews.com/2013/11/15/in-illinois-nuclear-industry-seesno-urgency-on-waste-storage/)//twonily
While nuclear critics at the hearing described possible nightmare scenarios, nuclear plant employees provided a
polar opposite view. A Boilermakers union member extolled the quality of dry casks, and challenged anyone who questions their safety
to meet him in the parking lot after the hearing. Young power plant employees said they have no concerns residing
near and working at the reactors. They are members of a group of “nuclear enthusiasts” called North American Young Generation in Nuclear
(NAYGN). “I currently live within 50 miles of three nuclear power plants as I’m sure many of you do,” said Samantha Schussele, a reactor
engineer at the LaSalle reactor in Illinois, southwest of Chicago. “I plan to get married there, I plan to raise my family there, and I have the
utmost confidence that my family will live in a safe community enhanced by those nuclear power plants.” Chris Rosso described a
“shocking safety culture” at the Braidwood reactor in Illinois, where he is an associate project manager. He said statistics show
the industry is very safe for workers especially compared to the sector he had considered entering, construction. His co-worker
Amanda Stenson, 25, has worked at Braidwood for three and a half years as a radiation protection technical specialist and engineer. “A lot of
you guys were mentioning terrorist attacks,” she said of industry critics. “Every three years the government
comes up with a team of military individuals to break into our plant…they really try to break in…they shoot fake
weapons at each other… it’s like high-tech laser tag.” She said Braidwood has repeatedly passed this safety test,
and that she “absolutely” feels safe at work there.
**space counterplan
1nc – space counterplan (version 1)
Text: The United States federal government should substantially increase its nuclear waste disposal
development beyond the Earth’s mesosphere.
Sub seabed disposal is nowhere near far and fast enough—space waste disposal is feasible and
doesn’t affect the environment
Simberg 2 – aerospace engineer and consultant in space commercialization, space tourism, and Internet
security (Rand Simberg, 2/28/2, “Nuclear Waste Should Be Stored on the Moon,”
http://www.foxnews.com/story/2002/02/28/nuclear-waste-should-be-stored-on-moon/)//twonily
Unfortunately, nuclear energy and nuclear waste are not issues amenable to decisions based on sound
science — people tend to get too emotional about things that they don't understand. There aren't any simple solutions to this
policy problem. Nuclear energy is potentially the most environmentally benign source available in the near term (though the federal policy on it has been idiotic
since the inception of the industry, making it much more hazardous and expensive than it need be, by mandating intrinsically bad plant designs). But waste
disposal is probably the most pressing problem, and it's one that's independent of plant design. And even if we were to renounce
nuclear power today (with the attendant economic and environmental damage as we either destroy local economies from energy shortages, or increase production
from much dirtier coal plants which produce the evil CO2, and actually put out more radiation than properly-operating nukes), we still have tens of thousands of
tons of waste sitting in unsafe conditions at existing plants. Every
criticism of Yucca Mountain applies in spades to the available
alternative — continuing to accumulate it at the plants in a wide range of conditions, few of them good. If Nevada wants to fight this
decision, they'll have to do more than simply naysay it and declare that, after over two decades and billions of dollars, it needs more study. They have to offer a
viable alternative. And any alternative should consider the following: one generation's waste is another's commodity. Before the
invention of the internal combustion engine, gasoline was a waste byproduct of cracking oil for other purposes. Thus, one
of the features of the Yucca Mountain solution is that the waste will be available to us in the future when we may find it useful, and any alternative
should ideally have that feature as well. But on the bright side, another feature (well, actually, it's a bug) of the Yucca Mountain plan is that it
will cost billions of dollars and take several years to implement. This effectively lowers the evaluation bar for competing concepts — they don't have to be either
cheap or fast, as long as they're better. Those of you who read my ravings regularly probably know where I'm going with this. Many eons ago, when I was an
undergraduate, I took a course in aerospace systems design. The class project was to come up with a
way to dispose of nuclear waste — in
space. While it was (of course) a brilliant study, it has also been more recently analyzed by people who both knew what they were doing and got paid for it. It
turns out to be (at least technically — politics are another matter) a non-ridiculous idea. These are the basic options: —
dropping it into good ol' Sol, which is really really expensive, and puts it totally out of the reach of our smarter
descendants; — lofting it out of Sol's system completely, which is cheaper than putting it in the Sun, but still expensive, and practically
if not theoretically out of reach of future recyclers; — a long-term orbit, which is accessible, but long term can't be
guaranteed to be long-enough term; and finally, — on some planetary surface, most likely the Moon because it's
the most convenient. Lunar storage sounds like a winner to me. There's no ecology to mess up there, the
existing natural radiation environment will put that particular grade of nuclear waste to shame when it comes to particle
dispensing, and we can retrieve it any time we want, while making it hard (at least right now) for terrorists to get their
hands on it. So, great storage location. Now, how do we get it there? Aye, there's the rub. The two problems, of course, are cost and safety. It turns out that
both are tractable, as long as one doesn't use Shuttle, or any existing launcher, as a paradigm for the achievable. The key to both reducing cost
and increasing reliability is high flight rate of reusable systems — what I call space transports. Fortunately, like space tourism,
hazardous waste disposal may be a large enough market to allow such a system to be developed. A thousand
tons is a thousand flights of a vehicle with a one-ton payload. And there are many thousands of tons of nuclear waste in storage.
And the tonnage will only increase if it's further processed for safe handling and storage (such as vitrification, in which it is encased in glass). Preliminary
estimates indicate that it can in fact be done economically in the context of the current nuclear industry
operating costs; the major issue is safety. This issue has been addressed as well, and it's something that Nevada (a state
that also offers high potential as a home for rocket racing and the space tourism industry) should take seriously as a possible alternative to terrestrial storage. It
might allow them to make the lemon that they've been stuck with into the lemonade of a whole new 21st-century industry.
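Simberg's flight-rate arithmetic generalizes. A small helper makes the scale of the launch campaign concrete; the payload size beyond his one-ton example is an illustrative assumption, not a figure from the card.

```python
import math

def flights_needed(waste_tons: float, payload_tons_per_flight: float) -> int:
    """Launches required to lift a given waste inventory at a fixed payload per flight."""
    return math.ceil(waste_tons / payload_tons_per_flight)

# Simberg's example: a thousand tons at one ton per flight is a thousand flights.
print(flights_needed(1_000, 1.0))   # 1000

# The ~64,500-tonne U.S. spent fuel inventory cited elsewhere in this file, same vehicle:
print(flights_needed(64_500, 1.0))  # 64500
```

The point of the exercise is the one the card makes: disposal demand alone implies a flight rate high enough to justify a reusable launch system.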
1nc – space counterplan (version 2)
Text: The United States federal government should develop beamed thermal
propulsion technology that disposes of fissile waste materials beyond the Earth’s
mesosphere.
There has been a global shift towards reprocessing technology—makes collapse of the
non-proliferation regime inevitable
Lyman and von Hippel, April 2008—*scientist at the UCS **professor of international affairs at Princeton
(Edwin Lyman and Frank N. von Hippel, “Reprocessing Revisited: The International Dimensions of the
Global Nuclear Energy Partnership,” Arms Control Today,
http://www.armscontrol.org/act/2008_04/LymanVonHippel)//twonily
For decades, the United States had opposed the ambitions of South Korea and several other non-nuclear-weapon
states to begin civil spent fuel reprocessing programs. Washington rightly feared that allowing these
states to separate plutonium from highly radioactive spent fuel would destabilize the nonproliferation regime by
drastically reducing the time between a decision to acquire nuclear weapons and having a large nuclear
arsenal. This would make both internal and external constraints on proliferation much less effective. Yet only two
years after Bush’s speech, spurred by the fear that the inability to remove spent nuclear fuel piling up at reactor sites in the United States and
many other countries would threaten a nuclear renaissance, the Bush
administration subsumed its initial proposal into a
new scheme known as the Global Nuclear Energy Partnership (GNEP). One of the chief objectives of GNEP was to
promote the virtues of spent nuclear fuel reprocessing and the civil use of plutonium as a nuclear waste
management strategy. Although GNEP represented a reversal of previous U.S. policies that opposed the spread of reprocessing, the
Bush administration billed GNEP as a nonproliferation initiative because it would still limit reprocessing facilities
to the nuclear-weapons states and Japan and would use reprocessing technologies that would not separate pure plutonium, unlike
the PUREX (plutonium and uranium extraction) technology in use today. GNEP member states without reprocessing plants
would be encouraged to send their spent fuel to other countries for reprocessing. Today, GNEP no longer
adheres to these constraints. Eager for support from reprocessing states such as France, Japan, and Russia, the Bush administration has
stopped warning about the dangers of separated plutonium. It now advocates the quick deployment of a minor variant of PUREX for
reprocessing U.S. power reactor fuel, even though this modification would produce a mixture of uranium and plutonium that would be as
vulnerable to theft or diversion as plutonium alone. For the longer term, the Bush administration champions liquid sodium-cooled fast-neutron
reactors and pyroprocessing, a form of reprocessing that it describes as “proliferation resistant” although it falls far short of any common-sense
definition of this standard. At U.S. urging, 20 other countries, including South Korea, have now joined the United States in signing a GNEP
Statement of Principles that embraces the development and use of reprocessing technology and contains no commitments on the part of its
members to limit the spread of sensitive fuel cycle facilities such as reprocessing plants. In
promoting the development of
pyroprocessing and other experimental separations technologies , the Bush administration says it hopes to
persuade those countries that currently use conventional PUREX reprocessing to switch to these
other technologies eventually, thereby ending the production of pure plutonium. Yet through GNEP, the
administration is promoting reprocessing primarily to countries that do not reprocess at all but rather
store their spent fuel. Spent fuel storage is a far more proliferation-resistant management strategy than
any form of reprocessing. In a February 2008 speech, Dennis Spurgeon, assistant secretary of energy for nuclear energy, argued that
“ closing the fuel cycle is essential for expansion of nuclear power in the U.S. and around the world.” This
assertion is highly questionable because reprocessing is 10 times more costly than spent fuel storage. If nuclear power is
to become more widely competitive, its cost must decrease, not increase. Spurgeon’s view, however,
reflects the belief of GNEP supporters in the need to bypass the political logjams that block permanent
spent fuel storage, which they see as a chief impediment to a major global increase in nuclear power. In the absence of geological
repositories, reprocessing plants provide an alternative destination for the spent fuel accumulating at nuclear power plants. This change in
the U.S. attitude toward reprocessing is at odds with the welcome, recent global trend of countries
abandoning reprocessing because it is costly and complicates waste disposal rather than facilitating it.
The net result of even a partial success of the Bush administration’s policy would be a reversal in the decline in the number of
countries with stockpiles of separated plutonium, thereby undermining the nonproliferation regime .
2nc – xt: solvency
Space disposal is the key alternative to solve the nuclear waste problem—it’s cheap,
efficient, safe, feasible—technology exists now
Coopersmith, 8/22/5—associate professor of history at Texas A&M University, specializes in the history
of technology and the history of Russia (Jonathan, The Space Review, “Nuclear waste in space?”
http://www.thespacereview.com/article/437/1)//twonily
Neither the space shuttle nor conventional rockets are up to this task. Not only are they expensive, but they lack the desired
reliability and safety as insurance rates demonstrate. Instead, we need to develop a new generation of launch systems where
the launcher remains on the ground so the spacecraft is almost all payload, not propellant. As well as being more
efficient, ground-launched systems are inherently safer than rockets because the capsules will not carry liquid fuels, eliminating
the in-flight danger of an explosion. Nor will the capsules have the pumps and other mechanical equipment of rockets, further reducing the chances of something
going wrong. We
need to develop a new generation of launch systems where the launcher remains on the
ground so the spacecraft is almost all payload, not propellant. How would disposal of nuclear wastes in space actually work? In
the simplest approach, a ground-based laser system will launch capsules directly out of the solar system. In a more
complicated scheme, the laser system will place the capsules into a nuclear-safe orbit, at least 1,100 kilometers above the earth, so that they could not reenter for
several hundred years at a minimum. Next, a space tug will attach the capsules to a solar sail for movement to their final destination orbiting around the sun, far, far
from earth. The underlying
concept is simple: the launcher accelerates the capsule to escape velocity. Like a gun,
only the bullet heads toward the target, not the entire gun. Unlike a shuttle or rocket, ground systems are designed for quick
reuse . To continue the analogy, the gun is reloaded and fired again. These systems would send tens or hundreds of kilograms
instead of tons into orbit per launch. Of the three possible technologies—laser, microwave, and electromagnetic railguns—
laser propulsion is the most promising for the next decade . In laser propulsion, a laser beam from the ground
hits the bottom of the capsule. The resultant heat compresses and explodes the air or solid fuel there,
providing lift and guidance. Although sounding like science fiction, the concept is more than just an elegant idea. In October 2000, a 10-kilowatt laser
at White Sands Missile Range in New Mexico boosted a two-ounce (50 gram) lightcraft over 60 meters vertically. These numbers seem small, but prove the
underlying feasibility of the concept. American research, currently at Rensselaer Polytechnic Institute in New York with previous work at the Department of Energy’s
Lawrence Livermore National Laboratory in California, has been funded at low levels by the United States Air Force, NASA, and FINDS, a space development group.
The United States does not have a monopoly in the field. The four International Symposiums on Beamed
Energy Propulsion have attracted researchers from Germany, France, Japan, Russia, South Korea, and other countries. The
long-term benefit of a ground-based system will be much greater if it can ultimately handle people as
well as plutonium. Dartmouth physics professor Arthur R. Kantrowitz, who first proposed laser propulsion in 1972, considers the concept
even more promising today due to more efficient lasers and adaptive optics, the technology used by
astronomers to improve their viewing and the Air Force for its airborne anti-ballistic missile laser. Where should the nuclear
waste ultimately go? Sending the capsules out of the solar system is the simplest option because the
laser can directly launch the capsule on its way. Both Ivan Bekey, the former director of Advanced Programs in NASA's Office of
Spaceflight, and Dr. Jordin T. Kare, the former technical director of the Strategic Defense Initiative Organization's Laser Propulsion Program, which ran from 1987-90, emphasized
solar escape is the most reliable choice because less could go wrong.
Laser Propulsion technology is viable and cost effective
Patel 11—Prachi Patel, Astrobiology Magazine, Internally Qualified, 1/21/2011 ("Laser
Propulsion Could Beam Rockets into Space," Space.com, Accessed online at
http://www.space.com/10658-laser-rocket-propulsion-technology.html, Accessed on
9/6/11)//twonily
Space launches have evoked the same image for decades: bright orange flames exploding beneath a rocket as it lifts, hovers and takes off into
the sky. But an
alternative propulsion system proposed by some researchers could change that vision. Instead of explosive chemical
reactions onboard a rocket, the new concept, called beamed thermal propulsion, involves propelling a rocket by
shining laser light or microwaves at it from the ground. The technology would make possible a reusable
single-stage rocket that has two to five times more payload space than conventional rockets , which
would cut the cost of sending payloads into low-Earth orbit. NASA is now conducting a study to examine the
possibility of using beamed energy propulsion for space launches. The study is expected to conclude by March 2011. In
a traditional chemical rocket propulsion system, fuel and oxidizer are pumped into the combustion chamber under high pressure and burnt,
which creates exhaust gases that are ejected down from a nozzle at high velocity, thrusting the rocket upwards. A
beamed thermal
propulsion system would involve focusing microwave or laser beams on a heat exchanger aboard the rocket. The
heat exchanger would transfer the radiation's energy to the liquid propellant, most likely hydrogen, converting
it into a hot gas that is pushed out of the nozzle. "The basic idea is to build rockets that leave their energy source on the
ground," says Jordin Kare, president of Kare Technical Consulting, who developed the laser thermal launch system concept in 1991. "You
transmit the energy from the ground to the vehicle." With the beam shining on the vehicle continually, it would take 8 to 10
minutes for a laser to put a craft into orbit, while microwaves would do the trick in 3 to 4 minutes. The vehicle would have to be designed
without shiny surfaces that could reflect dangerous beams, and aircraft and satellites would have to be kept out of the beam's path. Any
launch system would be built in high-altitude desert areas, so danger to wildlife shouldn't be a concern, Kare says. Thermal propulsion vehicles would be safer than chemical rockets since they can't
explode and don't drop off pieces as they fly. They are also smaller and lighter because most of the
complexity is on the ground, which makes them easier and cheaper to launch.
"People can launch small
satellites for education, science experiments, engineering tests, etc. whenever they want, instead of having to wait for a chance to share a ride
with a large satellite," Kare says. Another cost advantage comes from larger payload space. While conventional
propulsion systems are limited by the amount of chemical energy in the propellant that's released by combustion, in
beamed systems you can add more energy externally. That means a spacecraft can gain a certain
momentum using less than half the amount of propellant of a conventional system, allowing more room for the
payload. "Usually in a conventional rocket you have to have three stages with a payload fraction of three percent overall," says Kevin Parkin,
leader of the Microwave Thermal Rocket project at the NASA Ames Research Center. "This propulsion system will be single stage with a payload
fraction of five to fifteen percent." Laser propelled spacecraft would be small, simple and expendable with the complicated launch system on
the ground. Having
a higher payload space along with a reusable rocket could make beamed thermal
propulsion a low-cost way to get material into low Earth orbit, Parkin says. Parkin developed the idea of microwave
thermal propulsion in 2001 and described a laboratory prototype in his 2006 Ph.D. thesis. A practical real-world system should
be possible to build now because microwave sources called gyrotrons have transformed in the last
five decades, he says. One-megawatt devices are now on the market for about a million U.S. dollars. "They're going up in power
and down in cost by orders of magnitude over the last few decades," he says. "We've reached a point where you can
combine about a hundred and make a launch system." Meanwhile, the biggest obstacle to using lasers to beam energy
has been the misconception that it would require a very large, expensive laser, Kare says. But you could buy
commercially available lasers that fit on a shipping container and build an array of a few hundred. "Each would have its own telescope and
pointing system," he says. "The array would cover an area about the size of a golf course." The smallest real laser launch system would have 25
to 100 megawatts of power while a microwave system would have 100 to 200 megawatts. Building such an array would be expensive, says
Kare, although similar to or even less expensive than developing and testing a chemical rocket. The
system would make most
economic sense if it was used for at least a few hundred launches a year. In addition, says Parkin, "the main
components of the beam facility should last for well over ten thousand hours of operation, typical of this class
of hardware, so the savings can more than repay the initial cost." In the near term, beamed energy propulsion would be
useful for putting microsatellites into low Earth orbit, for altitude changes or for slowing down spacecraft as they descend to Earth. But the
technology could in the future be used to send missions to the Moon or to other planets and for
space tourism.
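Analyst note: Parkin's payload-fraction claim (3% for a three-stage chemical rocket vs. 5-15% single-stage beamed) follows from the Tsiolkovsky rocket equation — beamed energy lets the vehicle run pure hydrogen propellant at much higher exhaust velocity than combustion allows, so far less of the launch mass must be propellant. A minimal sketch; the delta-v and specific-impulse figures are illustrative assumptions, not from the card:

```python
import math

def propellant_fraction(delta_v, isp, g0=9.81):
    """Tsiolkovsky rocket equation: fraction of initial mass
    that must be propellant to achieve the given delta-v."""
    return 1.0 - math.exp(-delta_v / (isp * g0))

DELTA_V_LEO = 9300.0  # m/s to low Earth orbit incl. losses (assumed round figure)

chem = propellant_fraction(DELTA_V_LEO, isp=450)  # good chemical engine (assumed Isp)
beam = propellant_fraction(DELTA_V_LEO, isp=800)  # beamed-thermal hydrogen (assumed Isp)

print(f"chemical propellant fraction: {chem:.2f}")  # ~0.88
print(f"beamed   propellant fraction: {beam:.2f}")  # ~0.69
```

With these assumed numbers, propellant drops from ~88% to ~69% of launch mass, which is consistent with the card's claim that a beamed vehicle needs "less than half the amount of propellant" and frees mass for payload.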
Overwhelming consensus proves reprocessing causes prolif
Lowe 5 – Emeritus Professor of Science, Technology, and Society @ Griffith University, President of the
Australian Conservation Foundation (Prof. Ian Lowe, September 2005, “Nuclear Power: No Solution to
Climate Change,”
http://www.acfonline.org.au/sites/default/files/resources/Nuclear_Power_No_Solution_to_Climate_Ch
ange.pdf)//twonily
In addition to the potential to use reactor grade plutonium produced in a normal power reactor operating cycle for weapons
production, there is the option of using civil power or research reactors to irradiate uranium for a much shorter
period of time to produce weapon grade plutonium ideally suited to weapons manufacture. Hundreds of tonnes
of plutonium have been produced in power reactors (and to a lesser extent research reactors), hence the importance of
the debate over the use of reactor grade plutonium in weapons. Definitions of plutonium usually refer to the level of the unwanted
plutonium-240 isotope: • Weapon grade plutonium contains less than 7% plutonium-240. • Fuel grade plutonium contains 7-18% plutonium-240. • Reactor grade plutonium contains over 18% plutonium-240.
Plutonium in spent fuel removed from a commercial power reactor typically contains 55-70% plutonium-239, 20-25% plutonium-240 and
smaller quantities of other plutonium isotopes. For weapons, the
ideal plutonium is low burn-up plutonium with a very
high proportion of plutonium-239. As neutron irradiation of uranium-238 proceeds, the greater the quantity of isotopes such as
plutonium-240, plutonium-241, plutonium-242 and americium-241, and the greater the quantity of plutonium-238 formed (indirectly) from
uranium-235. These unwanted isotopes in high burn-up plutonium make it more difficult and dangerous to produce nuclear weapons. The use
of reactor grade plutonium in weapons manufacture poses several additional problems compared to the use of weapon grade plutonium (see
Gorwitz, 1998 for discussion and references). The difficulties associated with the use of reactor grade plutonium are as follows. Spent fuel
from power reactors running on a normal operating cycle will be considerably more radioactive and hotter than low
burn-up spent fuel. Thus high burn-up spent fuel and the separated reactor grade plutonium are more hazardous – though it is not difficult to
envisage scenarios whereby proliferators place little emphasis on worker safety. It may also be more time consuming and expensive to separate
reactor grade plutonium. Weapons with reactor grade plutonium are likely to be inferior in relation to reliability and
yield when compared to weapon grade plutonium. A greater quantity of reactor grade plutonium may be required to produce a weapon of
similar yield, or conversely there will be a lower yield for reactor grade plutonium compared to a similar amount of weapon grade plutonium. A
strong majority of informed opinion holds that reactor grade plutonium can indeed be used for the manufacture of nuclear weapons despite
the above-mentioned problems. A report from the US Department of Energy (1997) puts the following view: “Virtually any combination of plutonium isotopes – the different forms of an element having different numbers of neutrons in their nuclei – can be used to make a nuclear weapon. ... “At the lowest level of sophistication, a potential proliferating state or
subnational group using designs and technologies no more sophisticated than those used in first-generation
nuclear weapons could build a nuclear weapon from reactor-grade plutonium that would have an assured, reliable yield
of one or a few kilotons (and a probable yield significantly higher than that). ... “In short, reactor-grade plutonium is
weapons-usable, whether by unsophisticated proliferators or by advanced nuclear weapon states.” The broad thrust of the US
Department of Energy’s position is supported by, among others: • An expert committee drawn from the major US
nuclear laboratories (Hinton et al., 1996). • Robert Seldon (1976), of the Lawrence Livermore Laboratory. • J. Carson Mark (1993),
former director of the Theoretical Division at Los Alamos National Laboratory. • Matthew Bunn (1997), chair of the US
National Academy of Sciences’ analysis of options for the disposal of plutonium removed from nuclear weapons. • Prof. Marvin
Miller, from the MIT Defense and Arms Control Studies Program (quoted in Dolley, 1997). • The Office of Arms Control and Nonproliferation, US Department of Energy (quoted in Dolley, 1997). • Steve Fetter (1999) from Stanford University’s Centre for
International Security and Cooperation. • the IAEA’s Department of Safeguards (Shea and Chitumbo, 1993). With the exception of plutonium
comprising 80% or more of the isotope plutonium-238, all plutonium is defined by the IAEA as a “direct use” material, that is,
“nuclear material that can be used for the manufacture of nuclear explosives components without
transmutation or further enrichment”, and is subject to equal levels of safeguards. According to Hans Blix, then IAEA Director
General: “On the basis of advice provided to it by its Member States and by the Standing Advisory Group on Safeguards Implementation
(SAGSI), the Agency considers high burn-up reactor-grade plutonium and in general plutonium of any isotopic composition with the exception
of plutonium containing more than 80 percent Pu-238 to be capable of use in a nuclear explosive device. There
is no debate on the
matter in the Agency’s Department of Safeguards.” (Blix, 1990; see also Anon., 1990).
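Analyst note: the grade definitions quoted above reduce to simple thresholds on the Pu-240 fraction, plus the IAEA's Pu-238 exception for "direct use" material. A minimal sketch encoding them (illustrative only — real isotopic classification considers the full vector of isotope fractions):

```python
def plutonium_grade(pu240_fraction):
    """Classify plutonium by Pu-240 content, per the grade
    definitions quoted in the card (fractions, not percent)."""
    if pu240_fraction < 0.07:
        return "weapon grade"
    elif pu240_fraction <= 0.18:
        return "fuel grade"
    else:
        return "reactor grade"

def iaea_direct_use(pu238_fraction):
    """IAEA treats all plutonium as 'direct use' material unless
    it contains 80% or more Pu-238."""
    return pu238_fraction < 0.80

# Typical spent commercial fuel: 20-25% Pu-240 -> reactor grade,
# yet still 'direct use' under IAEA safeguards.
print(plutonium_grade(0.22))   # reactor grade
print(iaea_direct_use(0.02))   # True
```

The point of the card survives the thresholds: even material well into "reactor grade" territory remains weapons-usable and safeguarded as direct-use material.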
Counterplan solves better than burial strategies—prefer comparative evidence
Kare, PhD in Astrophysics, 90 (Jordin T. Kare, “GROUND-TO-ORBIT LASER PROPULSION ADVANCED
APPLICATIONS” www.osti.gov/bridge/servlets/purl/6203669-Uxrfwv/6203669.pdf)//twonily
Unlike weight- and volume-limited conventional systems, a laser launcher could potentially handle unprocessed or
minimally-processed waste. This minimizes both radiation and toxic chemical hazards on the ground, and is
therefore crucial to an economical system. A laser system could even be cheaper than geological
disposal, because there would be less handling (separation, glassification) of waste. Lasers can launch waste
directly to any desirable disposal site -- the Lunar surface, interplanetary space, or deep space (solar escape). The required delta-Vs are roughly 11 to 15 km/s, beyond the capability of any single-stage chemical rocket or proposed cannon launcher. Laser propulsion
could even launch payloads directly into the Sun, at 30 km/s delta-V. The precision guidance and flexible
launch direction of a laser system could allow dumping payloads into, e.g., a selected lunar crater, for future
recovery if desired. Very small laser propulsion payloads could present problems of shielding (to protect both
launch-site workers and possible crash site bystanders) and safe any-angle reentry [11]. However, some problems of laser
propulsion, such as launch delays due to weather, are not important as long as the total mass launched is constant
and the reliability is high.
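Analyst note: Kare's delta-V figures can be roughly cross-checked from orbital mechanics. Escaping the Sun from Earth's orbit needs a hyperbolic excess of about 12 km/s, while a direct sun dive must cancel Earth's full ~30 km/s heliocentric speed. A back-of-envelope sketch (this naive surface-launch estimate ignores gravity and drag losses; it lands near the card's 30 km/s sun-dive figure and slightly above its 11-15 km/s range, which also covers the easier lunar-surface option):

```python
import math

MU_SUN   = 1.327e20   # m^3/s^2, solar gravitational parameter
MU_EARTH = 3.986e14   # m^3/s^2, Earth gravitational parameter
R_AU     = 1.496e11   # m, Earth's orbital radius
R_EARTH  = 6.371e6    # m, Earth's radius

v_orbit   = math.sqrt(MU_SUN / R_AU)           # Earth's heliocentric speed, ~29.8 km/s
v_esc_sun = math.sqrt(2 * MU_SUN / R_AU)       # solar escape speed at 1 AU, ~42.1 km/s
v_esc_e   = math.sqrt(2 * MU_EARTH / R_EARTH)  # Earth surface escape speed, ~11.2 km/s

# Hyperbolic excess needed after leaving Earth's gravity well:
v_inf_escape  = v_esc_sun - v_orbit  # launch prograde, ~12.3 km/s
v_inf_sundive = v_orbit              # cancel Earth's orbital speed, ~29.8 km/s

# Combined ground-launch delta-V (energy addition, losses ignored):
dv_escape  = math.sqrt(v_esc_e**2 + v_inf_escape**2)
dv_sundive = math.sqrt(v_esc_e**2 + v_inf_sundive**2)

print(f"solar escape: {dv_escape/1e3:.1f} km/s")   # ~16.7
print(f"sun dive:     {dv_sundive/1e3:.1f} km/s")  # ~31.8
```

The asymmetry is why both Bekey and Kare prefer solar escape over a sun dive: escape needs roughly half the delta-V.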
proliferation net benefit
Continuation of worldwide reprocessing results in nuclear proliferation, terrorism, and
collapse of the NPT—effective disposal is key to solve
UCS, 4/5/11 (Union of Concerned Scientists, “Nuclear Reprocessing: Dangerous, Dirty, and Expensive”,
http://www.ucsusa.org/nuclear_power/nuclear_power_risk/nuclear_proliferation_and_terrorism/nucle
ar-reprocessing.html)//twonily
Reprocessing is a series of chemical operations that separates plutonium and uranium from other nuclear waste
contained in the used (or “spent”) fuel from nuclear power reactors. The separated plutonium can be used to fuel reactors,
but also to make nuclear weapons. In the late 1970’s, the United States decided on nuclear non-proliferation grounds not
to reprocess spent fuel from U.S. power reactors, but instead to directly dispose of it in a deep underground
geologic repository where it would remain isolated from the environment for at least tens of thousands of years.
While some supporters of a U.S. reprocessing program believe it would help solve the nuclear waste problem,
reprocessing would not reduce the need for storage and disposal of radioactive waste. Worse,
reprocessing would make it easier for terrorists to acquire nuclear weapons materials, and for nations
to develop nuclear weapons programs.
Reprocessing would increase the risk of nuclear terrorism. Less than 20 pounds of
plutonium is needed to make a simple nuclear weapon. If
the plutonium remains bound in large, heavy, and highly radioactive
spent fuel assemblies (the current U.S. practice), it is nearly impossible to steal. In contrast, separated plutonium is
not highly radioactive and is stored in a concentrated powder form. Some claim that new reprocessing technologies
that would leave the plutonium blended with other elements, such as neptunium, would result in a mixture that would be too radioactive to
steal. This is incorrect; neither
neptunium nor the other elements under consideration are radioactive enough
to preclude theft. Most of these other elements are also weapon-usable. Moreover, commercial-scale
reprocessing facilities handle so much of this material that it has proven impossible to keep track of it
accurately in a timely manner, making it feasible that the theft of enough plutonium to build several bombs
could go undetected for years.
A U.S. reprocessing program would add to the worldwide stockpile of separated and vulnerable
civil plutonium that sits in storage today, which totaled roughly 250 metric tons as of the end of 2009—enough for some 30,000 nuclear
weapons. Reprocessing the U.S. spent fuel generated to date would increase this by more than 500 metric tons.
Reprocessing would
increase the ease of nuclear proliferation. U.S. reprocessing would undermine the U.S. goal of halting
the spread of fuel cycle technologies that are permitted under the Nuclear Non-Proliferation Treaty but can be used to make
nuclear weapons materials. The United States cannot credibly persuade other countries to forgo a
technology it has newly embraced for its own use. Although some reprocessing advocates claim that new reprocessing
technologies under development will be "proliferation resistant," they would actually be more difficult for international inspectors to safeguard
because it would be harder to make precise measurements of the weapon-usable materials during and after processing. Moreover, all
reprocessing technologies are far more proliferation-prone than direct disposal.
Reprocessing would hurt U.S. nuclear waste management efforts. First, there is no spent fuel storage crisis that warrants such a drastic change in
course. Hardened interim storage of spent fuel in dry casks is an economically viable and secure option for at least
fifty years. Second, reprocessing does not reduce the need for storage and disposal of radioactive waste, and a
geologic repository would still be required. Plutonium constitutes only about one percent of the spent fuel from U.S. reactors.
After reprocessing, the remaining material will be in several different waste forms, and the total volume of nuclear waste will have been
increased by a factor of twenty or more, including low-level waste and plutonium-contaminated waste. The largest
component of the
remaining material is uranium, which is also a waste product because it is contaminated and undesirable for reuse
in reactors. Even if the uranium is classified as low-level waste, new low-level nuclear waste facilities would have to be built to dispose of it. And
to make a significant reduction in the amount of high-level nuclear waste that would require disposal, the used
fuel would need to be
reprocessed and reused many times with an extremely high degree of efficiency—an extremely difficult
endeavor that would likely take centuries to accomplish. Finally, reprocessing would divert focus and resources
from a U.S. geologic disposal program and hurt—not help—the U.S. nuclear waste management effort. The
licensing requirements for the reprocessing, fuel fabrication, and waste processing plants would dwarf those needed to
license a repository, and provide additional targets for public opposition. What is most needed today is
a renewed focus on secure interim storage of spent fuel and on gaining the scientific and technical consensus
needed to site a geological repository.
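Analyst note: the card's stockpile arithmetic checks out. Combining its quoted figures — ~250 metric tons of separated civil plutonium as of 2009, less than 20 pounds per simple weapon — gives a result consistent with "some 30,000 nuclear weapons." A minimal sketch (the per-weapon figure is the card's stated upper bound, not a precise critical mass):

```python
LB_TO_KG = 0.45359237

pu_per_weapon_kg = 20 * LB_TO_KG   # "less than 20 pounds" -> ~9 kg upper bound
stockpile_kg     = 250 * 1000      # ~250 metric tons separated civil Pu (end of 2009)
us_addition_kg   = 500 * 1000      # >500 t more if all U.S. spent fuel were reprocessed

weapons = stockpile_kg / pu_per_weapon_kg
added   = us_addition_kg / pu_per_weapon_kg

print(f"existing stockpile: ~{weapons:,.0f} weapons' worth")
print(f"U.S. reprocessing would add: ~{added:,.0f} more")
```

Since the 20-pound figure is an upper bound on the plutonium needed, the ~27,600 result is a floor, matching the card's "some 30,000."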
Prolif causes global nuclear war
Sokolski 9 (Henry, Executive Director – Nonproliferation Policy Education Center, “Avoiding a Nuclear
Crowd”, Policy Review, June/July, http://www.hoover.org/publications/policyreview/46390537.html)
Finally, several new nuclear weapons contenders are also likely to emerge in the next two to three decades.
Among these might be Japan, North Korea, South Korea, Taiwan, Iran, Algeria, Brazil (which is developing a
nuclear submarine and the uranium to fuel it), Argentina, and possibly Saudi Arabia (courtesy of weapons leased to it by Pakistan or
China), Egypt, Syria, and Turkey. All of these states have either voiced a desire to acquire nuclear weapons or tried to
do so previously and have one or more of the following: a nuclear power program, a large research reactor, or plans to build a large
power reactor by 2030. With a large reactor program inevitably comes a large number of foreign nuclear experts (who are exceedingly
difficult to track and identify) and extensive training, which is certain to include nuclear fuel making.19 Thus, it will be much more difficult to know when and if a state is
acquiring nuclear weapons (covertly or overtly) and far more dangerous nuclear technology and materials will be available to terrorists than would otherwise. Bottom line: As more states bring
large reactors on line more will become nuclear-weapons-ready — i.e., they could come within months of acquiring nuclear weapons if they chose to do so.20 As for nuclear safeguards
keeping apace, neither the iaea’s nuclear inspection system (even under the most optimal conditions) nor technical trends in nuclear fuel making (e.g., silex laser enrichment, centrifuges, new
South African aps enrichment techniques, filtering technology, and crude radiochemistry plants, which are making successful, small, affordable, covert fuel manufacturing even more
likely)21 afford much cause for optimism. This brave new nuclear world will stir existing security alliance relations more than it will settle them: In the case of states such as Japan, South
Korea, and Turkey, it could prompt key allies to go ballistic or nuclear on their own. Nuclear 1914 At a minimum, such developments will be a departure from whatever stability existed during
the Cold War. After World War II, there was a clear subordination of nations to one or another of the two superpowers’ strong alliance systems — the U.S.-led free world and the Russian-Chinese-led Communist Bloc. The net effect was relative peace with only small, nonindustrial wars. This alliance tension and system, however, no longer exist. Instead, we now have one
superpower, the United States, that is capable of overthrowing small nations unilaterally with conventional arms alone, associated with a relatively weak alliance system (nato) that includes
two European nuclear powers (France and the uk). nato is increasingly integrating its nuclear targeting policies. The U.S. also has retained its security allies in Asia (Japan, Australia, and South
Korea) but has seen the emergence of an increasing number of nuclear or nuclear-weapon-armed or -ready states. So far, the U.S. has tried to cope with independent nuclear powers by
making them “strategic partners” (e.g., India and Russia), nato nuclear allies (France and the uk), “non-nato allies” (e.g., Israel and Pakistan), and strategic stakeholders (China); or by fudging if
a nation actually has attained full nuclear status (e.g., Iran or North Korea, which, we insist, will either not get nuclear weapons or will give them up). In this world, every nuclear power center
(our European nuclear nato allies), the U.S., Russia, China, Israel, India, and Pakistan could have significant diplomatic security relations or ties with one another but none of these ties is viewed
by Washington (and, one hopes, by no one else) as being as important as the ties between Washington and each of these nuclear-armed entities (see Figure 3). There are limits, however, to
what this approach can accomplish. Such a weak alliance system, with its expanding set of loose affiliations, risks becoming analogous to the international system that failed to contain
offensive actions prior to World War I. Unlike 1914, there is no power today that can rival the projection of U.S. conventional forces anywhere on the globe. But in a world with an increasing
number of nuclear-armed or nuclear-ready states, this may not matter as much as we think. In such a world, the actions of just one or two states or groups that might threaten to disrupt or overthrow a nuclear weapons state could check U.S. influence or ignite a war Washington could have difficulty containing. No amount of military science or tactics could assure that the U.S. could disarm or neutralize such threatening or unstable nuclear states.22 Nor could diplomats or our intelligence services be relied upon to keep up to date on what each of
these governments would be likely to do in such a crisis (see graphic below): Combine these proliferation trends with the others noted above
and one
could easily create the perfect nuclear storm: Small differences between nuclear competitors that would
put all actors on edge; an overhang of nuclear materials that could be called upon to break out or significantly ramp up existing nuclear
deployments; and a variety of potential new nuclear actors developing weapons options in the wings. In such a setting, the military and
nuclear rivalries between states could easily be much more intense than before. Certainly each nuclear state’s military
would place an even higher premium than before on being able to weaponize its military and civilian surpluses quickly,
to deploy forces that are survivable, and to have forces that can get to their targets and destroy them with high levels of
probability. The advanced military states will also be even more inclined to develop and deploy enhanced air and missile defenses and long-range, precision guidance munitions, and to develop a variety of preventative and preemptive war options. Certainly, in such a world,
relations between states could become far less stable. Relatively small developments — e.g., Russian support for
sympathetic near-abroad provinces; Pakistani-inspired terrorist strikes in India, such as those experienced recently in Mumbai; new
Indian flanking activities in Iran near Pakistan; Chinese weapons developments or moves regarding Taiwan; state-sponsored
assassination attempts of key figures in the Middle East or South West Asia, etc. — could easily prompt nuclear weapons
deployments with “strategic” consequences (arms races, strategic miscues, and even nuclear war). As Herman Kahn once
noted, in such a world “every quarrel or difference of opinion may lead to violence of a kind quite different from what is
possible today.”23 In short, we may soon see a future that neither the proponents of nuclear abolition, nor their critics, would ever
want. None of this, however, is inevitable.
space colonization net benefit
Space disposal through ground-based laser launch solves best—solves space
colonization
Coopersmith, 9/22/5—associate professor of history at Texas A&M University (Jonathan, The Space
Review, “Nuclear waste in space?,” http://www.thespacereview.com/article/437/1)//twonily
When I fly from Texas to Europe, I pay $3–6 a pound, depending on how well I do buying a ticket. When a satellite or shuttle is launched into
space, the customer (or taxpayer) pays over $10,000 a pound. That is the major challenge of space flight: until the cost of going into space
drastically decreases, the large-scale exploration and exploitation of space will not occur. The world currently sends approximately 200 tons of
payloads, the equivalent of two 747 freighter flights, into space annually. At $50–500 million a launch, very few cargoes can justify their cost.
We have here the classic chicken-and-egg situation. As long as space flight remains very expensive, payloads will be small. As long as payloads
remain small, rockets will be expensive. If annual demand were 5,000 tons instead of 200, the equation would shift. Engineers would have the
incentive to design more efficient launch systems. Large, guaranteed payloads could significantly reduce the cost of reaching orbit, ushering in a
new, affordable era in space for governments, businesses, universities, and, hopefully, individuals. Where would this much new cargo come
from? Fortunately, there is an answer. Unfortunately, it’s not intuitively attractive, at least at first glance: it’s high-level nuclear waste, the
45,000 tons and 380,000 cubic meters of high-level radioactive spent fuel and process waste and detritus (as opposed to the more abundant
but far less dangerous and shorter-lived low-level waste) from six decades of nuclear weapons programs and civilian power plants. There
are three good reasons to send nuclear waste into space. First, it is safe. Second, space disposal is
better than the alternative, underground burial. Third, it may finally open the door to widespread
utilization of space. Because of the obvious and real concern about moving such dangerous material
anywhere, let alone into space, this proposal justly raises the question of safety. Can
nuclear waste be safely launched into
earth orbit? The answer is yes. By keeping the launch system on the ground instead of putting it on the vehicle,
designing and building unbreakable containers, and arranging multiple layers of safety precautions, we can
operate in a judicious and safe manner. The nuclear waste problem The problem of nuclear waste disposal is real, especially for future
generations. Leaving
radioactive wastes on earth creates permanent and tempting targets for terrorism as
well as threatening the environment. We have a moral imperative to solve this problem now so we do
not burden our children and their children. For twenty years, the federal government’s preferred solution to the nuclear waste
problem is underground disposal, specifically, over 11,000 30–80 ton canisters buried in 160 kilometers of tunnels hundreds of meters
underneath Yucca Mountain in northern Nevada. Forty-nine states favor this plan. It’s not hard to guess which state does not. To be fair to
Nevada, any site would draw the same objections from anybody who lost this lottery, yet policymakers remain stuck on the idea of burial.
Nevada’s fears are justified: researchers cannot guarantee complete environmental isolation for the thousands of years needed for these
wastes to decay harmlessly. A recent report by the Government Accountability Office raised nearly 200 technical and managerial concerns
about the site. Even the promise of construction and maintenance jobs has failed to sway a skeptical public. Historically, garbage has been
something to bury or recycle. Consequently, nuclear waste disposal has remained the province of the geologists, who are professionally inclined
to look down, not up. That’s shortsighted. The permanent elimination of high-level radioactive waste demands a reconceptualization of the
problem. We need to look up, not down. Let’s put high-level radioactive waste where it belongs, far out in space where it will not endanger
anyone on earth. The laser launch solution Neither the space shuttle nor conventional rockets are up to this task. Not only are they expensive,
but they lack the desired reliability and safety as insurance rates demonstrate. Instead, we need to develop a new generation of launch systems
where the launcher remains on the ground so the spacecraft is almost all payload, not propellant. As well as being more efficient,
groundlaunched systems are inherently safer than rockets because the capsules will not carry liquid fuels,
eliminating the in-flight danger of an explosion. Nor will the capsules have the pumps and other
mechanical equipment of rockets, further reducing the chances of something going wrong.
How would disposal of nuclear wastes in space actually work? In the simplest approach, a ground-based
laser system will launch capsules directly out of the solar system. In a more complicated scheme, the
laser system will place the capsules into a nuclear-safe orbit, at least 1,100 kilometers above the earth,
so that they could not reenter for several hundred years at a minimum. Next, a space tug will attach the
capsules to a solar sail for movement to their final destination orbiting around the sun, far, far from
earth. The underlying concept is simple: the launcher accelerates the capsule to escape velocity. Like a gun, only the
bullet heads toward the target, not the entire gun. Unlike a shuttle or rocket, ground systems are designed for quick reuse. To continue the
analogy, the gun is reloaded and fired again. These systems would send tens or hundreds of kilograms instead of tons into orbit per launch. Of
the three possible technologies—laser, microwave, and electromagnetic railguns—laser propulsion
is the most promising for
the next decade. In laser propulsion, a laser beam from the ground hits the bottom of the capsule. The
resultant heat compresses and explodes the air or solid fuel there, providing lift and guidance. Although
sounding like science fiction, the concept is more than just an elegant idea. In October 2000, a 10-kilowatt laser at White Sands Missile Range in
New Mexico boosted a two-ounce (50 gram) lightcraft over 60 meters vertically. These numbers seem small, but prove the underlying feasibility
of the concept. American research, currently at Rensselaer Polytechnic Institute in New York with previous work at the Department of Energy’s
Lawrence Livermore National Laboratory in California, has been funded at low levels by the United States Air Force, NASA, and FINDS, a space
development group. The United States does not have a monopoly in the field. The four International Symposiums on Beamed Energy
Propulsion have attracted researchers from Germany, France, Japan, Russia, South Korea, and other countries. The
long-term benefit
of a ground-based system will be much greater if it can ultimately handle people as well as plutonium.
Dartmouth physics professor Arthur R. Kantrowitz, who first proposed laser propulsion in 1972, considers the concept even more promising
today due to more efficient lasers and adaptive optics, the technology used by astronomers to improve their viewing and the Air Force for its
airborne anti-ballistic missile laser. Where should the nuclear waste ultimately go? Sending the capsules out of the solar system is the simplest option because the laser can directly launch the capsule on its way. Both Ivan Bekey, the former director of NASA’s Advanced Programs in the Office of Spaceflight, and Dr. Jordin T. Kare, the former technical
director of the Strategic Defense Initiative Organization’s Laser Propulsion Program, which ran from 1987-90, emphasized solar escape is the
most reliable choice because less could go wrong. A second option, a solar orbit inside Venus, would retain the option of retrieving the
capsules. Future generations might actually find our radioactive wastes valuable, just as old mine tailings are a useful source of precious metals
today. After all, the spent fuel still contains over three-quarters of the original fuel and could be reprocessed. Terrorists or rogue states might
be able to reach these capsules, but if they have that technical capability, stealing nuclear wastes will be among the least of our concerns. This
approach is more complex, demanding a temporary earth orbit and a solar sail to move it into a solar orbit, thus increasing the possibility of
something going wrong.
Addressing safety
The issue of safety has two components. One is the actual engineering of safe operations. This is
demonstrable and testable. The other, equally important, part is the public perception of safety. As University of Missouri nuclear engineering
professor William H. Miller, a specialist on nuclear fuel cycle and fuel management, noted, “The obvious
problem is public
perception. No matter how far you go to show that it is safe, there will always be someone to say ‘what
if’.” John W. Poston, a Texas A&M nuclear engineering professor with a forty-six year career in nuclear health physics, agrees, considering
convincing people of the safety of space-based disposal to be as challenging as, if not more so than, the actual technical questions. Safety should appropriately dominate public discussion of this proposal. To succeed, space disposal must demonstrate lower risk and uncertainty than underground disposal. This project must be completely safe technically, but nonetheless will not succeed unless potential supporters and opponents are thoroughly convinced about its safety and efficiency. Assuring safety is possible. The two
major concerns are launching the capsule and ensuring the integrity of the capsule. Laser launching is safer and more reliable
than rockets. The absence of rocket propellants and their accompanying propulsion systems eliminates the possibility of an explosion. The major problem would be if the laser failed before the capsule reached escape velocity. Because the capsule will be bullet-shaped, its ballistic characteristics are well known. Thus, if a launch failure occurred, the capsule would
land only in known recovery zones. Launch trajectories would be designed to avoid populated areas.
One advantage of a laser launch system is that the safe return from these aborted missions can be
demonstrated by testing with inert capsules. Scores of launches could test every conceivable scenario,
the equivalent of firing a new rifle to understand all its characteristics. This could not be done with a rocket. If
another layer of safety is desired, placing the launch system on an island in the Pacific Ocean will further decrease the chance of an aborted
flight landing in a populated area. Such isolation would also improve security. The capsule itself must protect its radioactive cargo not only from
the demands of a normal launch with its severe atmospheric heating and aerodynamic loading, but also from potential accidents ranging from
reentry into the atmosphere to a seriously flawed launch that would send the capsule into the high pressures of the ocean’s depths or into
land. Summing up the engineering challenges, Bob Carpenter, the program manager for Orbital Sciences’ space nuclear power program,
cautioned, “I’m not saying they are insurmountable, but they are major technical issues to be solved.” Jordin Kare, now an independent
aerospace consultant, was more optimistic. The
laser can accelerate the capsule slowly in the lower atmosphere,
reducing heating. Furthermore, noted NASA nuclear engineer Dr. Robert C. Singleterry, the same aerobraking analyses and technologies
that use a planet’s atmosphere to slow down a visiting spacecraft as the Mars Global Surveyor demonstrated in 1997 can ensure the control of
a capsule leaving the earth’s atmosphere. The
integrity of a capsule can be demonstrated too. The aerospace
industry has accumulated decades of research and experience on how to contain radioactive material in
containers that can maintain their integrity despite atmospheric re-entry, accidents, explosions, and
other potential catastrophes. They are called nuclear warheads. Designing containers for space disposal is well within the state of
the art. Dr. Rowland E. Burns, the engineer who led a NASA study in the mid-1970s on this issue, stated it is feasible to design and construct
containers that can safely withstand the demands of even a catastrophic explosion, claiming, “I won’t say you would have to nuke the container
to break it, but it would take something like that.” Materials
technology has improved since the 1970s, making even
tougher capsules possible. Because launch costs will be relatively inexpensive, engineers can overdesign
for safety instead of trying to create the lightest possible container. Fail-proof capsules can be built, though the ratio
of waste to shielding will be low. Ensuring safety must have an inclusionary component. A broadly based panel of stakeholders, including
skeptics and opponents, should determine the criteria for tests and scenarios that proponents must pass. Computer simulations and controlled
tests, however, will not be enough. Convincing demonstrations such as aborting launches with a mock payload and sending test capsules to
reenter the atmosphere will be necessary to calm fears and prove the veracity of safety calculations. Minimum danger must be demonstrated,
not assumed. Those opponents who unilaterally reject space-based disposal should be asked to propose an alternative. Nuclear waste will not
go away of its own volition.
Expensive and inexpensive
What about the economics? Let’s be honest and upfront in our accounting: Space
disposal will ultimately cost tens of billions of dollars, but the federal government has already spent $8 billion researching underground disposal
and expects the total cost will be $60 billion. The difference is that future generations will not have to worry about the waste and they will have
an infrastructure for reaching space. While technologically impressive, developments in tunnel boring have far less potential. Disposal in any
form will be expensive. Space disposal at least offers a major spinoff, inexpensive access to space. Putting a small surcharge—a fraction of a
cent per kilowatt-hour of electricity—on power generated by nuclear reactors would handle the operational costs. How can a system be both expensive and inexpensive? Judging by the costs of other high technology projects such as the Airbus A380 and Boston’s Big
Dig, developing a laser launch system will require at least $5–10 billion. This is a lot of money, but historically space technologies are expensive:
The Apollo program cost over $150 billion in contemporary dollars. Constructing the actual launch system will require a few billion dollars and
operations will consume billions more. And even if the price of a pound to escape velocity is only $100, 5000 tons is $1 billion. We owe the
future as well as ourselves the opportunity to determine whether space-based disposal is the best way to handle nuclear waste. Accordingly,
over the next few years, NASA and the Department of Energy should establish three research programs. The first will determine the criteria and
acceptance for a demonstration program. The second program will design safe capsules and the third program will test the ground-launched
system. For the price of a new hotel in Las Vegas or a day or two of the defense budget, we will have enough information to decide whether to
commit large resources to space-based disposal. Space disposal may not appear the obvious solution to the high-level nuclear waste problem.
Nor is disposing of nuclear waste the obvious answer to the question of how to reduce the cost of reaching space. But the immense magnitude
of nuclear wastes provides the incentive to develop launch systems that will drastically cut the cost of space exploitation. The result will be
lower operating costs, more infrastructure, and more skilled personnel able to develop other areas of space. Once a ground launcher is
developed and built, constructing additional launchers will be far less costly and risky. The dream of affordable access to space may then come
true, opening up the final frontier in ways that we have not dreamed of since the 1960s. The development of the computer may offer a good
analogy. Government funding, mostly from the military, intelligence community, and NASA, greatly accelerated research, development, and
diffusion of computers since the 1940s. The federal government did this to conduct projects of national significance such as the census, Social
Security, weapons research (especially nuclear explosions), cryptanalysis, and space exploration. Not until the 1970s did the civilian market
grow large enough to seize the technological initiative. Space disposal may prove a similar opportunity. As important, we will be acting ethically, providing our
children a safer earth and inexpensive access to space for people as well as plutonium.
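The card’s dollar figures reduce to simple arithmetic. The sketch below is an illustrative back-of-envelope check using only numbers the article itself quotes ($100 per pound to escape velocity, 5,000 tons of payload, $60 billion projected for underground disposal); the short-ton conversion of 2,000 pounds is our assumption.

```python
# Back-of-envelope check of the article's own cost figures.
# Assumption: "tons" means US short tons (2,000 lb).

LB_PER_TON = 2000

# Underground disposal: $60 billion projected total (per the article).
underground_total = 60e9

# Space disposal operations at the article's target launch price.
price_per_lb = 100      # dollars per pound to escape velocity
tons = 5000             # payload mass the article uses for its estimate

ops_cost = tons * LB_PER_TON * price_per_lb

print(f"launch mass: {tons * LB_PER_TON:,} lb")
print(f"operations at $100/lb: ${ops_cost / 1e9:.0f} billion")        # → $1 billion
print(f"underground projection: ${underground_total / 1e9:.0f} billion")
```

The $1 billion result matches the card’s line that “even if the price of a pound to escape velocity is only $100, 5000 tons is $1 billion.”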
Colonization solves inevitable extinction
Fox News 10 (“Abandon Earth or Face Extinction, Stephen Hawking Warns – Again”, 8-9,
http://www.foxnews.com/scitech/2010/08/09/abandon-earth-face-extinction-warns-stephen-hawking/)
It's time to abandon Earth , warned the world's most famous theoretical physicist. In an interview with website Big Think, Stephen
Hawking warned that the long-term future of the planet is in outer space. "It will be difficult enough to avoid disaster on planet
Earth in the next hundred years, let alone the next thousand, or million. The human race shouldn't have all its
eggs in one basket, or on one planet ," he said. "I see great dangers for the human race," Hawking said. "There have been a number of
times in the past when its survival has been a question of touch and go. The Cuban missile crisis in 1963 was one of these. The frequency of
such occasions is likely to increase in the future." "But I'm an optimist. If we can avoid disaster for the next two centuries, our
species
should be safe, as we spread into space ," he said. That said, getting to another planet will prove a challenge, not to mention
colonizing it for humanity. University of Michigan astrophysicist Katherine Freese told Big Think that "the nearest star [to Earth] is Proxima
Centauri which is 4.2 light years away. That means that, if you were traveling at the speed of light the whole time, it would take 4.2 years to get
there" -- or about 50,000 years using current rocket science. Still, we need to act and act fast, Hawking stated. "It will be difficult enough to avoid disaster in the next hundred years, let alone the next thousand or million. Our only chance of long-term survival is not to remain inward looking on planet Earth but to spread out into space. We have made remarkable progress in the last hundred years. But if we want to continue beyond the next hundred years, our future is in space." This is not the first time Hawking has warned of impending planetary doom. In 2006, the physicist warned that Earth was at an ever increasing risk of being
wiped out. And lately, Hawking has become quite outspoken. In April, he warned of the dangers of communicating with aliens, telling the
Discovery Channel that extra-terrestrials are almost certain to exist -- and humanity should avoid contact with them at all cost. “To my
mathematical brain, the numbers alone make thinking about aliens perfectly rational,” he said. “The real challenge is to work out what aliens
might actually be like.” The answer, he suggests, is that most of alien life will be the equivalent of microbes or simple animals -- the sort of life
that has dominated Earth for most of its history -- and they could pose a serious threat to us. In May Hawking said he believed humans could
travel millions of years into the future and repopulate their devastated planet. If spaceships are built that can fly faster than the speed of light,
a day on board would be equivalent to a year on Earth. That's because -- according to Einstein -- as objects accelerate through space, time slows
down around them. “Time travel was once considered scientific heresy, and I used to avoid talking about it for fear of being labelled a crank,"
he said in Stephen Hawking's Universe. "These days I’m not so cautious.”
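Freese’s travel-time figures can be cross-checked with a one-line calculation. This is an illustrative sketch: the light-year and year lengths are standard constants, and everything else comes from the quote above.

```python
# Implied cruise speed if 4.2 light years takes ~50,000 years,
# as in the Freese quote above.
KM_PER_LY = 9.46e12                 # kilometers in one light year
SEC_PER_YEAR = 365.25 * 24 * 3600   # seconds in a Julian year

distance_km = 4.2 * KM_PER_LY
speed_km_s = distance_km / (50_000 * SEC_PER_YEAR)

print(f"implied speed: {speed_km_s:.0f} km/s")   # → roughly 25 km/s
```

Roughly 25 km/s is indeed the scale of “current rocket science,” so the quoted numbers are internally consistent.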
2nc – xt: solves colonization
Space nuclear waste disposal is a prerequisite to exploration and research
development—spills over to better and cheaper space technology
Coopersmith, 8/22/5—associate professor of history at Texas A&M University, specializes in the history
of technology and the history of Russia (Jonathan, The Space Review, “Nuclear waste in space?”
http://www.thespacereview.com/article/437/1)//twonily
When I fly from Texas to Europe, I pay $3–6 a pound, depending on how well I do buying a ticket. When a satellite or shuttle is
launched into space, the customer (or taxpayer) pays over $10,000 a pound. That is the major challenge of space
flight: until the cost of going into space drastically decreases , the large-scale exploration and
exploitation of space will not occur. The world currently sends approximately 200 tons of payloads, the equivalent of two 747
freighter flights, into space annually. At $50–500 million a launch, very few cargoes can justify their cost. We have
here the classic chicken-and-egg situation. As long as space flight remains very expensive, payloads will be small. As
long as payloads remain small, rockets will be expensive. If annual demand were 5,000 tons instead of 200, the
equation would shift. Engineers would have the incentive to design more efficient launch systems. Large, guaranteed
payloads could significantly reduce the cost of reaching orbit, ushering in a new, affordable era in space for governments,
businesses, universities, and, hopefully, individuals. Where would this much new cargo come from? Fortunately, there is an answer.
Unfortunately, it’s not intuitively attractive, at least at first glance: it’s high-level nuclear waste, the 45,000 tons and 380,000
cubic meters of high-level radioactive spent fuel and process waste and detritus (as opposed to the more abundant but far less dangerous and
shorter-lived low-level waste) from
six decades of nuclear weapons programs and civilian power plants. There are
three good reasons to send nuclear waste into space. First, it is safe. Second, space disposal is better
than the alternative, underground burial. Third, it may finally open the door to widespread utilization of space. Because of the obvious and real concern about moving such dangerous material anywhere, let alone into space, this proposal justly raises the question of safety. Can nuclear waste be safely launched into earth orbit? The answer is yes. By keeping the launch system on the ground instead of putting it on the vehicle, designing and building unbreakable containers, and arranging multiple layers of safety precautions, we can operate in a judicious and safe manner.
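The launch-economics numbers in this card also reduce to quick arithmetic. A minimal sketch, assuming US short tons and using only the figures Coopersmith quotes (circa 2005):

```python
# Quick arithmetic on the card's launch-cost figures.
# Assumption: "tons" means US short tons (2,000 lb).

LB_PER_TON = 2000

current_price_per_lb = 10_000     # dollars per pound to orbit (quoted figure)
annual_payload_tons = 200         # world payload per year, ~two 747 freighter loads

# Implied total annual spending on launch at the quoted price:
annual_spend = annual_payload_tons * LB_PER_TON * current_price_per_lb
print(f"implied annual launch spend: ${annual_spend / 1e9:.0f} billion")   # → $4 billion

# The 5,000-ton demand the card proposes is a 25x increase over today:
print(f"demand multiple: {5000 / annual_payload_tons:.0f}x")               # → 25x
```

That 25-fold jump in guaranteed payload is the shift in the equation the card argues would give engineers the incentive to design more efficient launch systems.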
**grounds pic
1nc – grounds pic
Text: The United States Supreme Court should declare its intent to grant certiorari in the next case
involving preemption of nuclear waste through the NWPA. In an appropriate test case, the United
States Supreme Court should mandate a substantial increase in the Nuclear Regulatory Commission’s
sub-seabed nuclear waste disposal development of the Earth’s oceans by overturning the precedent
established by Pacific Gas & Electric Co. v. State Energy Resources Conservation and Development
Commission on the grounds that the California nuclear development moratorium contradicts
Congressional intent to grant the Nuclear Regulatory Commission field preemption under the Nuclear
Waste Policy Act.
The aff’s ruling decks the nuclear industry –enforcing state sovereignty in nuclear waste issues allows
states to expand moratoria on development and overturns existing federal preemption – this turns
the entire case – only the CP solves
Harper 11 – attorney for the US NRC, J.D. from the George Washington University Law School (Richard S.
Harper, Summer 2011, “Pacific Gas & Electric Revisited: Federal Preemption of State Nuclear
Moratoria,” http://gwujeel.files.wordpress.com/2013/07/2-2-harper.pdf)//twonily
A major obstacle to new nuclear power plants in the United States exists in the form of state moratoria
placed on new nuclear builds. These moratoria, enacted by state legislatures, prohibit any energy utility from constructing a nuclear
power plant until certain conditions have been met.15 In most states the condition is based on the federal government’s
demonstration of a permanent solution to the nuclear waste issue. The Supreme Court upheld these
moratoria in Pacific Gas & Electric Co. v. State Energy Resources Conservation and Development
Commission (“PG&E”). Although the federal government has proposed the Yucca Mountain geologic repository as a solution to the waste issue, ongoing litigation,
compounded by the current Administration’s attempt to withdraw its license application from the Nuclear Regulatory Commission (“NRC”), raises further doubts that
the government will produce a timely, permanent solution as mandated by the moratoria. Additionally, the NRC Chairman
recently directed the NRC staff to stop processing the Yucca Mountain license application.20 Therefore, on the basis of PG&E, certain
states are effectively prohibited from reconsidering nuclear energy as an option in combating climate change
and meeting their energy needs. This Note argues that the Supreme Court erred in PG&E by upholding the California
moratorium . The Court’s rationale that the California law was based on economic considerations overlooked
the fact that, as seen through the Nuclear Waste Policy Act21 (“NWPA”), Congress intended for the NRC to
completely occupy the field of nuclear waste regulation. The Court’s refusal to examine California’s rationale
in passing its moratorium effectively allows states to establish a moratorium by simply stating that their
motives are driven by economic concerns . To correct the problem created by PG&E , this Note proposes that: (1) the
NRC should issue regulations under the NWPA to clarify that it has exclusive authority to regulate in the field of nuclear waste, thereby barring states from making any waste considerations
regarding new nuclear power plants; (2) Congress should amend the NWPA to expressly state that the federal government occupies the field of nuclear waste regulation, thereby leaving no
room for states to regulate; and (3) the Court should reverse this precedent on its own, based on evidence that the assumptions upon which the Court relied have turned out to be erroneous and because the NWPA clearly establishes that Congress intended the NRC to have field preemption in the area of nuclear waste, thereby prohibiting states from using waste considerations as conditions for moratoria. Part I of this Note examines the
preemption standard as established by the Constitution and Supreme Court precedent, distinguishing between the different types of preemption granted to the federal government. Part II
examines the Atomic Energy Act (“AEA”), the NWPA, preemption case law surrounding both statutes, and the Supreme Court’s decision in PG&E. Part III discusses three possible solutions to
the preemption issues of state moratoria.22 Part IV provides a summary of the arguments and a conclusion.23 I. Federal Preemption of State Law The Supremacy Clause in the Constitution of
the United States provides that the “Constitution and the Laws of the United States which shall be made in Pursuance thereof; and all Treaties made, or which shall be made, under the
Authority of the United States, shall be the supreme Law of the Land.”24 The Supreme Court has consistently held this provision to grant federal preemption over state law.25 The most
evident method of preemption is when Congress clearly expresses intent to preempt state law through statutory language. 26 In Jones v. Rath Packing Co.,27 the Supreme Court determined
that
when Congress “unmistakably ordains” that a federal statute is to preempt a state statute, the state
statute must fail. This “ordination” can come through explicit language or can be implied through the “structure and purpose” of the statute. Therefore Congress’
intent to preempt, whether express or implied , is sufficient to establish federal preemption over state law. The
Court’s duty is to determine that intent and, once determined, to establish whether the state law under contention infringes on the federal statute’s intent to regulate. Even in the
absence of express statutory language, congressional intent can demonstrate that an entire field of regulation was
intended to be held under exclusive federal jurisdiction. For example, the federal regulation may be structured in a manner where there is simply
no room left for the states to supplement the regulation. Therefore any state law attempting to regulate in that area must also fail on the basis of federal preemption. A federal law may also
preempt state law even when there is no field preemption. If a court determines that the state law directly conflicts with federal law, to the extent that compliance with both statutes is
impossible, the state statute must fail. States cannot enact legislation that is inconsistent with a federal regulatory scheme. Such laws must fail because they directly interfere or conflict with
congressional purpose. Also, in situations where it is impossible to adhere to both state and federal law, the federal law necessarily preempts the state law.37 A federal agency may also take
action that preempts state law. The Supreme Court has established that a federal agency, “acting within the scope of its congressionally delegated authority,” can preempt state law through
the issuance of regulations. While the determination of congressional intent is left in large part to the Court, it is well established that the federal government has ample authority to regulate in a manner that preempts state law in the same area.
Failure to overturn Pacific Gas & Electric ensures states will exercise preemption authority – turns the
case
Harper 11 – attorney for the US NRC, J.D. from the George Washington University Law School (Richard S.
Harper, Summer 2011, “Pacific Gas & Electric Revisited: Federal Preemption of State Nuclear
Moratoria,” http://gwujeel.files.wordpress.com/2013/07/2-2-harper.pdf)//twonily
When the Court stated that Congress was regulating in an area that had historically been occupied by the
states, it misinterpreted the intent of the California statute because the statute was not focused on
regulating actual nuclear generation, but rather on regulating nuclear waste . As stated above, the Court found that California’s
moratorium was based on the economic concern that unresolved waste issues could drive up costs by prematurely closing a plant. The economic determination was based on a nuclear waste issue. Because the states never had the authority to regulate waste, the Court should have recognized that the California statute was in effect making a determination based on waste concerns, which would have resulted in federal preemption.110 Instead, the Court bowed out of an inspection of California’s motives, stating that a determination of motive is often an “unsatisfactory venture.”111 By failing to make a determination on motive, the Court effectively authorized states to enact any law infringing on the preempted area of safety by simply claiming economic motivation. The Court may be correct that an inquiry into motive may be an “unsatisfactory venture,” but the alternative could lead to the widespread allowance of state regulation in preempted areas of nuclear safety. The Court, however, indicated that Congress, rather than the judicial system,
should determine “whether the state has misused the authority left in its hands.”112 The Court stated that it “should not assume the role
which our system assigns to Congress.”113 In this situation, it would appear that the Court is seeking to impose on Congress its responsibility
over cases and controversies. The judiciary’s power “extend[s] to all cases, in Law and Equity, arising under this Constitution, [and] the Laws of
the United States . . . .”114 The
Court in PG&E, however, seems to suggest that Congress is better suited to
determine whether a state is violating a federal law. The Court is better suited to decide controversies
on a case-by-case basis in order to assure that the laws enacted by Congress are followed. The Court
should not push that responsibility onto another branch of government, and therefore should have decided the
issue in this case. The Court also claimed that the economic fears of prematurely closing a power plant would “largely evaporate” once a
satisfactory disposal technology was found. Since the decision in 1983, the federal government has yet to demonstrate or satisfactorily develop
a technology for waste disposal. Yet, over twenty-five years after the decision in PG&E, no nuclear power plant has been forced to close
because of a lack of storage capacity. Various storage techniques have been developed, including dry cask storage,119 which greatly increases
nuclear power plant capacity to store waste onsite. Because
the economic justification for the Court’s decision has
proven to be erroneous, the Court should overturn its ruling and reverse its precedent.
The CP’s announcement forces a quick test case
Harper 11 – attorney for the US NRC, J.D. from the George Washington University Law School (Richard S.
Harper, Summer 2011, “Pacific Gas & Electric Revisited: Federal Preemption of State Nuclear
Moratoria,” http://gwujeel.files.wordpress.com/2013/07/2-2-harper.pdf)//twonily
While NRC regulations or amendments to the NWPA that establish federal preemption over nuclear waste determinations would likely provide an opportunity for
the Supreme Court to reconsider PG&E, the
Court could also seize upon any litigation involving federal preemption and
the NWPA to revisit its decision. Indeed, such a decision seems inevitable . In dealing with the litigation
surrounding Yucca Mountain, the Ninth Circuit briefly addressed the question of NWPA preemption, but stated
that the Supreme Court had not yet made this determination. Regardless of the course that the litigation takes, the
Supreme Court should grant certiorari in the next case involving preemption of nuclear waste
through the NWPA . Even though the Court’s decision in PG&E has been accepted without much debate,
significant issues remain with the Court’s rationale that leave open the possibility for reversal.
2nc – at: cp controversial
New court and most recent economic data means the CP isn’t controversial
Harper 11 – attorney for the US NRC, J.D. from the George Washington University Law School (Richard S.
Harper, Summer 2011, “Pacific Gas & Electric Revisited: Federal Preemption of State Nuclear
Moratoria,” http://gwujeel.files.wordpress.com/2013/07/2-2-harper.pdf)//twonily
The past twenty-five years have brought a number of changes not only to the nuclear industry, but also to the make-up
of the Court itself. The conservative leanings of today’s Court may suggest a willingness to reconsider
PG&E
because conservatives
are more likely to support nuclear energy than liberals. This inclination may be
moderated, however, by conservatives’ preference for policies that limit federal control over states and oppose broad sweeping regulations
overall. While the composition of the Court could lead to a change in past precedent, it is more likely that recent
economic
developments since the PG&E decision would be more influential. The economics of nuclear power plants have
improved with the passage of time. Although the Court in PG&E was concerned that nuclear power plants would incur increased costs
without a permanent solution to the waste issue, nuclear power plants instead have developed technology and procedures that now allow
them to operate more efficiently. 128 Instead
of being a risky investment, nuclear power has become one of the
cheapest ways to produce energy, running contrary to the concerns expressed by the Court in PG&E.
2nc – climate change nb
The CP solves climate change
Harper 11 – attorney for the US NRC, J.D. from the George Washington University Law School (Richard S.
Harper, Summer 2011, “Pacific Gas & Electric Revisited: Federal Preemption of State Nuclear
Moratoria,” http://gwujeel.files.wordpress.com/2013/07/2-2-harper.pdf)//twonily
The grave concerns with climate change have led the United States to consider many different paths away
from its current coal-dominated energy consumption. Even with the most recent expansion of renewable sources of energy,
coal continues to account for close to fifty percent of U.S. energy generation. The last two administrations have
claimed that the country needs a new generation of nuclear power plants in order to meet increased
energy demand while at the same time combating climate change .133 However, certain obstacles remain to
acting on those declarations. While states have the ultimate say in what energy sources they employ to meet their
power needs, PG&E has allowed states to institute moratoria without the proper basis, thereby eliminating
the option of nuclear power
in many states. Any of the three branches of government can act to amend this situation. The
executive branch, through the NRC, should enact new regulations clarifying its authority over nuclear waste and eliminating a state’s ability to
use nuclear waste determinations as a basis for their moratoria on new nuclear projects. The legislative branch should amend the NWPA to
include language expressly granting the NRC exclusive authority over nuclear waste policy, thereby eliminating any argument of state authority
over nuclear waste. Finally,
the Supreme Court should grant certiorari in the next case involving preemption
and the NWPA in order to correct its faulty reasoning in PG&E. With the pressing concerns of climate change,
Congress or the Obama administration should act to eliminate these unnecessary barriers to a carbon-free
source of electricity: nuclear energy. By clarifying the authority Congress placed in the NRC through the NWPA, the
United States will be better suited to meet rising energy demand without adding to the extreme dangers of climate change.
2nc – at: you only do a third of what Harper wants
One’s enough
Harper 11 – attorney for the US NRC, J.D. from the George Washington University Law School (Richard S.
Harper, Summer 2011, “Pacific Gas & Electric Revisited: Federal Preemption of State Nuclear
Moratoria,” http://gwujeel.files.wordpress.com/2013/07/2-2-harper.pdf)//twonily
One of the main obstacles to overturning PG&E is the availability of a cause of action involving
preemption, nuclear waste, and the NWPA. The best opportunity for the Supreme Court to reconsider PG&E was
through
litigation surrounding the siting studies performed by the U.S. Department of Energy for the Yucca Mountain
geologic repository. These studies generated substantial litigation, which was one reason for the significant delay
in the process.130 Unfortunately, the recent decreases in funding for the Yucca Mountain license application and the Obama administration’s withdrawal of
the application suggest that the Yucca Mountain project could fail without the need for further litigation on federal preemption. While any of the three solutions presented in this Note by itself could correct the issue of states using nuclear waste
determinations as a basis for state moratoria, they are not mutually exclusive. If Congress were to amend the NWPA, that could persuade
the NRC to issue regulations clarifying its authority over nuclear waste. Either of those two actions could lead to litigation regarding the validity of existing state
moratoria under the new regulations or the amended language of the NWPA. Whatever the course of action, it is clear there are solutions to the issues created by PG&E.
2nc – at: perm do both
The perm has zero precedential value
Brownewell 11 – attorney @ Barnes & Thornburg (Monica Renee Brownewell, 2011, “Rethinking the
Restatement View (Again!): Multiple Independent Holdings and the Doctrine of Issue Preclusion,”
Valparaiso University Law Review, Vol. 37, No. 3 [2003], Art. 11)//twonily
The second problem in applying issue preclusion is the requirement that the issue must be "necessarily
decided,"
which will be the focus of this Note. It
is well settled that, if a court finds it is unable to decide a case on the merits due
to a procedural defect, yet does so anyway, the decision is not precluded from relitigation by a proper tribunal. Therefore, dicta or a jury's
special verdict, both of which are not binding to the controlling legal issues, will not be precluded from relitigation by the application of issue
preclusion. Furthermore, if
a court's decision could have been based on narrower grounds than those actually
chosen, the resolution of multiple issues was unnecessary to the judgment and will not bind the
parties in future litigation . Multiple independent holdings further challenge the "necessarily decided"
prong. A case decided on multiple independent holdings differs from a nonessential determination. First, multiple grounds for a decision are not dicta. Second, there is no reason why the judge, the jury, or the parties would believe that the holdings are
less important or worthy of less scrutiny. Finally, all of the determinations are reviewable on appeal, unlike unnecessary
commentary by the court. There are four situations in which multiple independent holdings are possible. First, if a plaintiff asserts and fully
develops more than one legal theory to support a single claim for relief and the court finds for the plaintiff on more than one theory, multiple
independent holdings exist. Second, the plaintiff may plead more than one instance of conduct that gave rise to the claim. Third, certain claims
require the plaintiff to prove multiple elements, and the defendant will prevail if the plaintiff fails to establish one or more of the elements.
Finally, a defendant can deny the allegations of the complaint, as well as plead an affirmative defense or some other defense to prevail, any of
which, in conjunction with a negative finding for the plaintiff, would create multiple independent holdings.
Divorce litigation proves
Virginia Circuit Court 13 – Virginia Circuit Court of the City of Winchester (“Domestic Pretrial Order,”
Civil Action No. 12)//twonily
(e) In all cases in which a fault ground of divorce is claimed, specific findings necessary to support a conclusion that a fault
ground exists with specific reference to each date, time and place of any material incident. However, most divorces are entered on a no fault
grounds. " 'Where
dual or multiple grounds for divorce exist, the trial judge can use his sound discretion
to select the grounds upon which he will grant the divorce .'" Konefal v. Konefal, 18 Va. App. 612, 613- 14, 446 S.E.2d
153, 154, 11 Va. Law Rep. 7 (1994) (quoting Williams v. Williams, 14 Va. App. 217, 220, 415 S.E.2d 252, 254, 8 Va. Law Rep. 2521 (1992)). See
also Fadness v. Fadness, 52 Va. App. 833, 840 (Va. Ct. App. 2008).
More empirics – Oklahoma Worker Compensation Law
Shugart 14 – Staff Writer @ US Workers Comp (James Shugart, 5/5/14, “New Workers Compensation
Law in Oklahoma,” http://www.usworkerscomp.com/workers-compensation-blog/2014/05/05/newworkers-compensation-law-oklahoma/)//twonily
You may be aware that there is a new workers’ compensation law in Oklahoma. If you are an employee who is injured at work in Oklahoma, the
new law may make it even more important for you to have a workers’ compensation lawyer standing with you. In May 2013, the Oklahoma
Legislature passed Senate Bill 1062, and Governor Mary Fallin signed the new workers’ compensation bill into law. The bill is aimed at reforming
Oklahoma’s workers’ compensation into an administrative system. Questions about new law There have been questions regarding the new law
as to how it will be applied and how claims under the old law will be handled. Also, there have been questions concerning the constitutionality
of portions of the new law that was set to go into effect on February 1, 2014. Opponents challenged the new law on multiple grounds. One of
which was that the new law could violate a constitutional prohibition against covering multiple subjects in a single bill. This is a practice that is
commonly known as logrolling. Challenge of opponents The challenge of opponents to the new law led to it being brought before the Oklahoma
Supreme Court. On Monday, December 16, 2013, the
Oklahoma Supreme Court upheld the constitutionality of the
new workers’ compensation law. While the opponents of the new law challenged it on multiple grounds ,
the Supreme
Court’s majority opinion focused on the allegation concerning multiple subjects. The ruling of
the Supreme Court was that the new workers’ compensation law does not violate a constitutional prohibition against covering multiple subjects
in a single bill.
**unfinished
fracking counterplan
Shale storage solves
Koch 3/17 – Staff Writer @ USA Today (Wendy Koch, 3/17/14, “Shale could be long-term home for
nuclear waste,” http://www.usatoday.com/story/news/nation/2014/03/17/shale-nuclear-wastehome/6520759/)//twonily
**we don’t endorse the ableist language this card has been edited to remove
Could shale rock spur another energy bonanza? It's already helped create a surge in U.S. oil and natural gas production, and research today
suggests it could do something else: store radioactive waste from nuclear power plants. These rock formations are
ideal for storing potentially dangerous spent fuel for millennia , because they are nearly impermeable, a U.S.
geologist told a scientific meeting. One of the biggest risks of storing nuclear waste for thousands of years is water
contamination. The development of new U.S. nuclear power plants, all of which are now decades old, has been partly
hobbled [slowed] by the lack of a long-term repository for their waste. In 2009, the U.S. government abandoned
plans for a repository at Yucca Mountain in Nevada, so plants currently store about 77,000 tons of spent nuclear fuel
onsite in above-ground facilities. "Shale has a lot of nice properties . ... We really should consider whether this is
something we should look into," says Chris Neuzil of the U.S. Geological Survey, who presented his findings Monday in Dallas to the annual
meeting of the American Chemical Society. He says experiments show how incredibly watertight shale can be — 100 to 10,000 times less permeable than cement
grout. "Not all shales have the low permeabilities at the scale we desire," but plenty is available in tectonically stable areas that won't be
used for oil and natural gas production, Neuzil says. In recent years, hydraulic fracturing, or "fracking," is being used to break apart these rock deposits and extract
the gas or oil trapped within. Neuzil says current U.S. storage of nuclear waste is problematic because the spent fuel continues to produce heat and harmful
radiation long after a power plant uses it to produce electricity. Plants typically store the waste in steel-lined, concrete pools filled with water or in massive, airtight
steel casks. Neuzil says safe maintenance of above-ground storage depends on stable societies for thousands of years. He
also notes the risks of natural disasters, including Japan's 2011 tsunami that knocked cooling pumps offline at the Fukushima Daiichi nuclear power plant. Several
countries, including France, Switzerland and Belgium, have plans to develop long-term
nuclear waste repositories hundreds of yards
underground in layers of shale and other clay-rich rock. Neuzil is investigating a site using limestone in Ontario with the Canadian Nuclear Waste
Management Organization. "He's bringing up a very sensible idea, but this isn't particularly new," says Mick Apted, a geochemist at Austin-based INTERA, an environmental consulting firm. "The Europeans have taught us this." Apted, who's working with Switzerland and Belgium on
their programs, says France is furthest along in pursuing an underground repository in clay-rich rock, which isn't as hard as shale. He says France, which gets 80%
of its electricity from nuclear power, has identified a site. In Finland and Sweden, he says, companies have submitted a
construction license to build a repository in granite-like rock and are waiting on government approval.
Shale storage solves the whole aff – massive expert consensus
ACS 3/17 – American Chemical Society (3/17/14, “Shale could be long-term home for problematic
nuclear waste,” http://www.acs.org/content/acs/en/pressroom/newsreleases/2014/march/shalecould-be-long-term-home-for-problematic-nuclear-waste.html)//twonily
DALLAS, March 17, 2014 — Shale, the source of the United States’ current natural gas boom, could help solve another
energy problem: what to do with radioactive waste from nuclear power plants. The unique properties of the
sedimentary rock and related clay-rich rocks make it ideal for storing the potentially dangerous spent fuel for
millennia, according to a geologist studying possible storage sites who made a presentation here today. The talk was one of more than
10,000 presentations at the 247th National Meeting & Exposition of the American Chemical Society (ACS), the world’s largest
scientific society, taking place here through Thursday. About 77,000 tons of spent nuclear fuel currently sit in
temporary above-ground storage facilities, said Chris Neuzil, Ph.D., who led the research, and it will remain dangerous
for tens or hundreds of thousands of years or longer. “Surface storage for that length of time requires
maintenance and security,” he said. “Hoping for stable societies that can continue to provide those things for millennia is not a good idea.”
He also pointed out that natural disasters can threaten surface facilities, as in 2011 when a tsunami knocked
cooling pumps in storage pools offline at the Fukushima Daiichi nuclear power plant in Japan. Since the U.S.
government abandoned plans to develop a long-term nuclear-waste storage site at Yucca Mountain in Nevada in 2009, Neuzil said
finding new long-term storage sites must be a priority. It is crucial because nuclear fuel continues to
produce heat and harmful radiation after its useful lifetime. In a nuclear power plant, the heat generated by uranium, plutonium
and other radioactive elements as they decay is used to make steam and generate electricity by spinning turbines. In temporary pool storage,
water absorbs heat and radiation. After spent fuel has been cooled in a pool for several years, it can be moved to dry storage in a sealed metal
cask, where steel and concrete block radiation. This also is a temporary measure. But shale
deep under the Earth’s surface
could be a solution. France, Switzerland and Belgium already have plans to use shale repositories to
store nuclear waste long-term. Neuzil proposes that the U.S. also explore the possibility of storing spent nuclear
fuel hundreds of yards underground in layers of shale and other clay-rich rock. He is with the U.S. Geological Survey and
is currently investigating a site in Ontario with the Canadian Nuclear Waste Management Organization. Neuzil explained that these rock
formations may be uniquely suited for nuclear waste storage because they are nearly impermeable —
barely any water flows through them. Experts consider water contamination by nuclear waste one of the biggest risks of long-term storage.
Unlike shale that one might see where a road cuts into a hillside, the rocks where Neuzil is looking are much more watertight. “Years ago, I
probably would have told you shales below the surface were also fractured,” he said. “But we’re seeing that that’s not necessarily true.”
Experiments show that water moves extremely slowly through these rocks, if at all. Various circumstances have
conspired to create unusual pressure systems in these formations that result from minimal water flow. In one well-known example, glaciers in Wellenberg, Switzerland, squeezed the water from subsurface shale. When they retreated, the shale sprang back to its original shape
faster than water could seep back in, creating a low-pressure pocket. That means that groundwater
now only flows extremely
slowly into the formation rather than through it. Similar examples are also found in North America, Neuzil said. Neuzil added
that future glaciation probably doesn’t pose a serious threat to storage sites, as most of the shale
formations he’s looking at have gone through several glaciations unchanged. “Damage to waste containers,
which will be surrounded by a filler material, is also not seen as a concern,” he said. He noted that one critical criterion for a good
site must be a lack of oil or natural gas that could attract future interest. The American Chemical Society is a nonprofit organization chartered
by the U.S. Congress. With more than 161,000 members, ACS is the world’s largest scientific society and a global leader in providing access to
chemistry-related research through its multiple databases, peer-reviewed journals and scientific conferences. Its main offices are in
Washington, D.C., and Columbus, Ohio.
Shale solves – unique sedimentary processes
NWN 3/17 – Nature World News (NWN, 3/17/14, “Storing Nuclear Waste in Shale Underground Could
be Safe Long-Term Solution, USGS Scientist Says,”
http://www.natureworldnews.com/articles/6366/20140317/storing-nuclear-waste-in-shaleunderground-could-be-safe-long-term-solution-usgs-scientist-says.htm)//twonily
Shale and other clay-rich rocks could be used for the long-term storage of spent nuclear fuel, according to a report
presented Monday at the 247th National Meeting & Exposition of the American Chemical Society in Dallas. Shale is the source of the current
boom in natural gas production in the US, but its unique
sedimentary properties make it an ideal long-term storage
solution for spent nuclear fuel, said research leader Christopher Neuzil, a hydrologist with the US Geological Survey. According to Neuzil, there are nearly 77,000
tons of potentially dangerous spent nuclear fuel currently sitting in storage in temporary above-ground facilities around the
US. This fuel will continue to pose potential threats for generations, up to thousands of years from now. "Surface storage for that length
of time requires maintenance and security. Hoping for stable societies that can continue to provide those things for
millennia is not a good idea," Neuzil said in a statement, adding that natural disasters, such as the 2011 earthquake and tsunami that
struck the Fukushima Dai-ichi nuclear power plant in Japan, can also jeopardize the containment and security of housed nuclear fuel. Finding
a suitable storage solution for spent nuclear fuel is critical, Neuzil said, because even after it's taken out of reactors, it continues
to produce heat and harmful radiation. Neuzil recommends underground shale deposits as a means of storage for spent
nuclear fuel. He contends that water contamination is the greatest threat of nuclear waste and one of the
biggest risks to long-term storage. Shale, Neuzil reported at the Dallas meeting, is nearly impermeable and therefore it
could be a safe place to store nuclear waste. Other countries, such as France, Switzerland and Belgium already have plans to
use shale repositories to store nuclear waste long-term. Rich reserves of hydrocarbons can be found in shale deposits
around Earth. But the difficulties in extracting these fossil fuels from shale are exactly why Neuzil thinks shale is a good storage option. Shale is
nearly impenetrable, so housing spent nuclear fuel in it could be a safe option that's not prone to leaks. Even
if one of the storage
containers were to leak while entombed within shale, because water doesn't penetrate the shale, there is little risk of the
contamination spreading. And the shale is so thick that it would take millions of years for radionuclides to
diffuse through the shale, according to a Bloomberg News report. Neuzil said that future glaciation would not pose a risk to the
storage system. "Damage to waste containers, which will be surrounded by a filler material, is also not seen
as a concern," he said. The sites for shale storage, however, would need to be assuredly removed from any future potential oil or natural
gas extraction operations. Neuzil said there are numerous sites around the US that are suitable for storage.
Fracking storage solves
Radford 14 – Staff Writer @ Climate News Network (Tim Radford, 4/7/14, “Can Fracking Solve The
Nuclear Waste Problem?” http://ecowatch.com/2014/04/07/can-fracking-solve-nuclear-wasteproblem/)//twonily
U.S. scientists are proposing that the source of one controversial energy program could provide a solution
to the problems of another. Nuclear waste—that embarrassing by-product of two generations of uranium-fueled power stations—
could be stored indefinitely in the shale rock that right now provides a highly contentious source of
natural gas for utility companies. An estimated 77,000 tons of spent nuclear fuel is
stored in temporary, above-ground facilities. For decades, governments, anti-nuclear campaigners and nuclear generating
companies have all agreed that such a solution is unsafe in the long-term , and unsatisfactory even in the
short term . Nuclear fuel remains hazardous for tens of thousands of years. Everyone would like to see it safely tucked out of harm’s way.
But for decades, there has been disagreement and uncertainty about what might constitute long-term safety. But Chris
Neuzil of the U.S. Geological Survey told the American Chemical Society annual meeting in Dallas, TX, that the unique properties of the
sedimentary rock and clay-rich strata that make up the shale beds could be ideal. France, Switzerland and Belgium already planned to use shale
repositories as a long-term home. For decades, U.S.
authorities planned to bury American waste under Yucca Mountain
in Nevada, but abandoned the scheme in 2009. Rare Impermeability For more than 60 years, miners and oil and gas companies have used
controversial fracking or hydraulic fracture techniques to create flow channels to release oil and gas
trapped in rock, and the approach has been amplified in the search for otherwise inaccessible natural gas or
methane trapped underground. But fracking is necessary because shale rock is impermeable—hardly any water normally
flows through shale beds—and this impermeability may actually make the rock perfect for long-term nuclear waste
storage. Many shale formations are the product of very high pressures over many millions of years. Shale fractures may
show up where roads cut through a hillside, but conditions deep underground are quite possibly much safer . Experiments
have shown that water
moves through the rocks only very slowly, if at all . “Years ago I would probably have told you
shales below the surface were also fractured,” said Neuzil, who is examining a shale site in Ontario for the Canadian Nuclear Waste
Management Organization. “But we are seeing that that is not necessarily true.” However, one criterion for a safe burial site would have to be
the absence of oil or natural gas or anything else that might attract the interest of a future generation of hydraulic fracture engineers.
agent counterplan
The DOE fails – a new organization is key
BRC 12 – Blue Ribbon Commission (BRC, January 26, 2012, “Blue Ribbon Commission on America’s
Nuclear Future; Report to the Secretary of Energy,”
http://cybercemetery.unt.edu/archive/brc/20120620220235/http://brc.gov/sites/default/files/docume
nts/brc_finalreport_jan2012.pdf)//twonily
The U.S. Department of Energy (DOE) and its predecessor agencies have had primary responsibility for
implementing U.S. nuclear waste policy for more than 50 years. In that time, DOE has achieved some notable
successes, as shown by the WIPP experience and recent improvements in waste cleanup performance at several DOE sites. The overall
record of DOE and of the federal government as a whole, however, has not inspired widespread confidence or trust in our nation's nuclear waste management program. For this and other reasons, the Commission concludes that a new, single-purpose organization is needed to provide the stability, focus, and credibility that are essential to get the waste program back on track. We believe a congressionally chartered federal corporation offers the best model, but
whatever the specific form of the new organization it must possess the attributes, independence, and resources to effectively carry out its
mission. The
central task of the new organization would be to site, license, build, and operate facilities for the safe
and final disposal of spent fuel and high-level nuclear waste at a reasonable cost and within a reasonable
timeframe. In addition, the new organization would be responsible for arranging for the safe transport of waste and spent fuel to or between consolidated storage and disposal facilities, and for undertaking applied research, development, and demonstration (RD&D) activities
directly relevant to its waste management mission (e.g., testing the long-term performance of fuel in dry casks and during subsequent
transportation). For the new organization to succeed, a
substantial degree of implementing authority and assured
access to funds must be paired with rigorous financial, technical, and regulatory oversight by Congress and the
appropriate government agencies. We recommend that the organization be directed by a board nominated by the President, confirmed by the
Senate, and selected to represent a range of expertise and perspectives. Independent scientific and technical oversight of the nuclear
waste management program is essential
and should continue to be provided for out of nuclear waste fee payments. In addition,
the presence of clearly independent, competent regulators is essential; we recommend the existing roles of the U.S.
Environmental Protection Agency in establishing standards and the Nuclear Regulatory Commission (NRC) in licensing and regulating waste
management facilities be preserved but that steps be taken to ensure ongoing cooperation and coordination between these agencies. Late in
our review we heard from several states that host DOE defense waste that they agree with the proposal to establish a new organization to
manage civilian wastes, but believe the government can more effectively meet its commitments if responsibility for defense waste disposal
remains with DOE. Others argued strongly that the current U.S. policy of commingling defense and civilian wastes should be retained. We are
not in a position to comprehensively assess the implications of any actions that might affect DOE’s compliance with its cleanup agreements, and
we did not have the time or the resources necessary to thoroughly evaluate the many factors that must be considered by the Administration
and Congress in making such a determination.4 The Commission therefore urges the Administration to launch an immediate review of the
implications of leaving responsibility for disposal of defense waste and other DOE-owned waste with DOE versus moving it to a new waste
management organization. The implementation of other Commission recommendations, however, should not wait for the commingling issue to
be resolved. Congressional and
Administration efforts to implement our recommendations can and should
proceed as expeditiously as possible.
DOE fails – empirics – new organization is a prerequisite to effective policy
BRC 12 – Blue Ribbon Commission (BRC, January 26, 2012, “Blue Ribbon Commission on America’s
Nuclear Future; Report to the Secretary of Energy,”
http://cybercemetery.unt.edu/archive/brc/20120620220235/http://brc.gov/sites/default/files/docume
nts/brc_finalreport_jan2012.pdf)//twonily
For the last 60 years, the DOE and its predecessor agencies have had primary responsibility, subject to annual
appropriations and policy direction by Congress, for implementing U.S. nuclear waste policy. DOE is a large cabinet-level agency with multiple
competing missions, a budget that is dependent on annual congressional appropriations, and top management that changes with every change
of administration, and sometimes more frequently than that. Clearly, multiple factors have worked against the timely implementation of the
NWPA and responsibility for the difficulties of the past does not belong to DOE alone. Nevertheless, the
record of the last several
decades indicates that the current approach is not well suited to conducting a steady and focused long-term effort, and to building and sustaining the degree of trust and stability necessary to establish one or more disposal
facilities and implement other essential elements of an integrated waste management strategy. These considerations lead the
Commission to agree with a conclusion that has also been reached by many stakeholders and long-time
participants in the nation's nuclear waste management program: that moving responsibility to a single-purpose
organization—outside DOE—at this point offers the best chance for future success. For example, a new organization
dedicated to the safe, secure management and ultimate disposal of high‑level nuclear waste can concentrate on this objective in
a way that is difficult for a larger agency that must balance multiple agendas or policy priorities. A new organization will be in a
better position to develop a strong culture of safety, transparency, consultation, and collaboration.159
And by signaling a clear break with the often troubled history of the U.S. waste management program it can begin
repairing the legacy of distrust left by decades of missed deadlines and failed commitments. Finally, while the Commission
recognizes that it will never be possible or even desirable to fully separate future waste management decisions from politics, we believe a
new organization with greater control over its finances could operate with less influence from short-term
political pressures. We do not propose that a new organization be less accountable for its actions—on the contrary, effective oversight
by Congress and by a strong, independent regulator remains essential. But with greater control over year-to-year budgets and operations, we
believe a new organization could more easily maintain the program-level continuity and mission consistency that has often been lacking at DOE.
From an implementation standpoint, this is clearly among the most difficult recommendations advanced by the Commission. Nevertheless, it is
also one of the most important, since even
the wisest policies are likely to fail without an institutional structure
that is capable of implementing them.
thorium counterplan
Thorium reactors solve
AWF 13 – Alvin Weinberg Foundation, Mark Halper, staff researcher (4/5/13, “How thorium can solve
the nuclear waste problem in conventional reactors,” http://www.the-weinbergfoundation.org/2013/04/05/how-thorium-can-solve-the-nuclear-waste-problem-in-conventionalreactors/)//twonily
Thorium mixed with plutonium and other actinide “waste” could continuously power modified conventional reactors
almost forever in a reusable fuel cycle, according to a discovery at the University of Cambridge in England. The discovery, by PhD candidate Ben Lindley
working under senior lecturer Geoff Parks, suggests that mixed thorium fuel would outperform mixed uranium fuel, which lasts
only for one or two fuel cycles rather than for the "indefinite" duration of the thorium mix. Ideally, the reactors would be "reduced-moderation water" reactors that work on the same solid-fuel, water-cooled principles of conventional reactors but that do not slow down
neutrons as much and thus also offer some of the advantages of fast reactors. Lindley’s finding, made while he was
a master’s candidate in 2011, bodes well for the use of thorium not only as a safe, efficient and clean power source, but
also as one that addresses the vexing problem of what to do with nuclear waste from the 430-some conventional light
water reactors that make up almost all of the commercial power reactors operating in the world today and that run on uranium. By mixing thorium
with “waste” in a solid fuel, the nuclear industry could eliminate the need to bury long-lived plutonium
and other
actinides. Lindley’s work surfaced recently in an article about it in the hard copy edition of Cambridge’s quarterly Engineering Department magazine. An earlier version also appears online. ACCENTUATE THE NEGATIVE I interviewed Lindley and Parks recently after the magazine story
appeared. They explained the crux of Lindley’s discovery: Uranium/plutonium lasts for only a limited period because after one or two cycles, when the actinide portion increases, the mix displays a “positive feedback coefficient.” In the sometimes counter intuitive world of nuclear
engineering, a positive feedback is an undesirable occurrence. To use an unscientific term, the reaction goes haywire. Parks notes that with uranium, “As the amount of actinides in the mixture increases, you get this tipping point where with the uranium mixed with actinide based fuel – a
key feedback coefficient goes from being negative to being positive, at which point the fuel is not safe to use in the reactor." Lindley completes the thought. "The idea is that mixing things with thorium rather than with uranium keeps the feedback coefficient negative," he says. In a mixed
fuel system, reactor operators would allow a batch of fuel rods to stay in a reactor for about five years, roughly the same as with today’s solid uranium fuel. The fuel would then cool for a few years while the shorter-lived fission products decay, and would then be reprocessed over
another year, mixing actinide wastes with more thorium before being put back in a reactor. And just how long could this cycle continue? “You could just keep doing that forever – until the world runs out of thorium,” notes Parks. Ben Lindley Wasting his future. Cambridge PhD candidate
Ben Lindley made the discovery that actinide waste will burn with thorium for an indefinite period, auguring a way to simultaneously generate power and dispose of nuclear waste. Lindley’s proposal is the latest possibility to emerge for using thorium reactors to dispose of waste as well
as generate power. As we wrote here recently, Japan’s Thorium Technology Solution (TTS) is proposing to mix thorium and plutonium in a liquid molten salt reactor. Likewise, Transatomic Power in the U.S. has similar plans, although it is starting first with a liquid mixed uranium fuel
rather than with thorium. Lindley and Parks’ idea differs from TTS and Transatomic in one obvious way: It would allow the nuclear industry to carry on building conventional solid fuel, water-cooled designs. That would be strictly true only in the initial implementation of the technology,
which Lindley and Parks say would entail thorium mixed only with plutonium rather than also with other actinides like neptunium, americium and curium. That’s because plutonium is now available from sources such as the Sellafield nuclear waste site in Britain. The other actinides are
not as readily available, but would become so as it became clear they could be used as part of a mixed thorium fuel, Lindley and Parks believe. GO EASY ON THE WATER Once the other actinides enter the mix, the optimal reactor would be a light water reactor modified to have less water
and thus less moderation of neutrons in the reaction process. That, in turn, would allow more burn up of actinides. Lindley envisions a reactor with about a quarter to half the amount of water as in a conventional LWR – enough to serve as a necessary coolant, but little enough so that the
water could not slow down neutrons to the extent they do in a conventional reactor. “It’s not really a fast reactor, and it’s not really a thermal (conventional) reactor,” notes Lindley. “It’s between the two.” Hitachi, Toshiba, Mitsubishi Heavy Industries and the Japan Atomic Energy Agency
all have reduced-moderation water reactor designs (RMWR), according to the International Atomic Energy Agency. Lindley described them as similar to LWRs but with different fuel assemblies. The development – and regulatory approval – of RMWRs is one of several challenges facing the
deployment of mixed thorium fuel in a water-cooled reactor. Another is the development and cost of reprocessing techniques for thorium and for actinides other than plutonium (for which reprocessing already exists). “Splitting thorium from waste or splitting some of the minor actinides
from waste has not been done on an industrial scale,” notes Lindley. “There are processes that are envisaged that can do that, that have been tested on a laboratory scale, but never on an industrial scale.” Another hurdle: Fabricating fuel that as Lindley notes would be “highly
radioactive” given the amount of waste that would go into it. “That would have to be done behind a shield,” Lindley says. [Diagram caption: Reduced Moderation Water Reactor, JAEA. “Light water Lite.” Lindley’s proposal to mix thorium with plutonium and other actinides would work best in a reduced-moderation water reactor. The diagram above shows a uranium version of an RMWR, from the Japan Atomic Energy Agency.] All of that will require significant research and development funding – more than what Lindley currently has at his disposal, which consists of university research funds and academic scholarships. One possible source for additional funding could be Cambridge Enterprise, a commercial arm of the university. U.S. nuclear company Westinghouse has also been collaborating with Lindley on his thorium research. Lindley hopes to test his fuel at the Halden test reactor in Norway, where
Westinghouse is a partner in Thor Energy’s project to irradiate thorium/plutonium fuel. It will be interesting to see if any of the £15 million that the UK government
recently earmarked for nuclear R&D finds its way to Lindley’s project. It’s possible that Sellafield could at least provide plutonium. FUNDS FROM
DECOMMISSIONING? Given the potential usefulness of thorium as a way of ridding the UK of actinides, it’s not out of the question that funding could also come
from the UK’s Nuclear Decommissioning Authority, which has a 2013-14 budget of £3.2 billion and which is responsible for managing nuclear waste, including
actinides and shorter lived fission products. As Parks notes, an ultimate goal for applying Lindley’s discovery “is to come up with a nuclear fuel cycle where the only
waste you have to dispose of is the fission product waste.” Parks encourages the government to “grasp the nettle” and financially back the thorium research. He
and Lindley note that a multiple-cycle thorium reactor would save money in the long run for, among other reasons: uranium prices, although low now, will rise; and
a mixed thorium/actinide fuel would eliminate costs associated with nuclear waste storage. “There are economic benefits in the future to investing in the
reprocessing and fuel fabrication aspects now,” says Parks. “And you would completely change what nuclear waste means as far as the public is concerned, in terms
of the volume of it and how long it’s radioactive for.” Lindley and Parks say that their technology
could take hold in a commercial
RMWR within 10 to 20 years. For that to happen, they’ll have to find the right mix of collaborators and financial backers.
eis counterplan
The CP competes – normal means for the plan is to operate under GEIS recommendations – EIS is
comparatively better
Lydersen 13 – reporter specializing in energy, environment, labor, public health, and immigration, staff
writer for Midwest Energy News (Kari Lydersen, 11/15/13, “In Illinois, nuclear industry sees no urgency
on waste storage,” http://www.midwestenergynews.com/2013/11/15/in-illinois-nuclear-industry-seesno-urgency-on-waste-storage/)//twonily
The National Environmental Policy Act (NEPA) requires that the environmental impacts be studied and considered
for any project involving the federal government. The U.S. Court of Appeals for the D.C. Circuit ruled in 2012 that the NRC’s
previous environmental impact statement on nuclear waste storage at reactors “did not calculate the
environmental effects of failing to secure permanent storage – a possibility that cannot be ignored.” The court
ordered a more comprehensive environmental impact statement, but allowed the NRC to do a generic
environmental impact statement, or GEIS, which can be applied to various different existing and proposed reactor sites. Paul
Michalak, NRC chief of the decommissioning materials branch, explained that the GEIS means possible environmental impacts
at specific sites “would not be revisited” in deciding whether to renew licenses or grant new licenses, unless the NRC
finds a specific reason the generic findings would be “inappropriate” to that site. Nuclear power critics argue that every reactor is
different, and that one generic impact statement cannot adequately explore the risks posed by the storage of
nuclear waste at closed reactors across the country. S.Y. Chen, a nuclear power expert at the Illinois Institute of Technology, said that a
generic study relies on “a lot of assumptions.” “But every assumption has a lot of uncertainty involved,” Chen
said, noting that climate change is among the factors that could impact nuclear waste safety in the future. “I’m not sure how much this
uncertainty has been analyzed.”
at: EIS is normal means – NWPA disproves
NWPA 82 – Nuclear Waste Policy Act of 1982 (1982, [42 U.S.C. 10101 note])//twonily
(B) No recommendation of a site by the President under this subsection shall require the preparation of an environmental impact statement under section 102(2)(C) of the National Environmental Policy Act of 1969 (42 U.S.C. 4332(2)(C)), or to require any environmental review under subparagraph (E) or (F) of section 102(2) of such Act.