SMRs Neg JDI 2014

SMRs Neg

Community Relations DA

***1NC

SMR siting on bases causes NIMBY battles – kills local community relations

Andres and Breetz 11

[Richard B. Andres is Professor of National Security Strategy at the National War College and a Senior Fellow and Energy and Environmental Security and Policy Chair in the Center for Strategic Research, Institute for National Strategic Studies, at the National Defense University. Hanna L. Breetz is a doctoral candidate in the Department of Political Science at the Massachusetts Institute of Technology. Small Nuclear Reactors for Military Installations: Capabilities, Costs, and Technological Implications, Institute for National Strategic Studies, http://www.dtic.mil/cgibin/GetTRDoc?Location=U2&doc=GetTRDoc.pdf&AD=ADA545712]

Small reactors used on domestic military bases are likely to face a number of additional siting hurdles. As a distributed energy source, they are likely to face substantial "not-in-my-backyard" battles. Moreover, dispersing a large number of reactors leads to questions about long-term nuclear waste disposal.27 Arguably, reactors should be relatively safe on domestic military installations, certainly more secure than, for instance, the reactors situated in developing countries or intended for processing tar sands. Nevertheless, no issue involving nuclear energy is simple. Institutional and technical uncertainties—such as the security of sealed modules, the potential and unintended social and environmental consequences, or the design of reliable safeguards—make dispersing reactors across the country challenging. Some key issues that require consideration include securing sealed modules, determining how terrorists might use captured nuclear materials, carefully considering the social and environmental consequences of dispersing reactors, and determining whether Permissive Action Links technology could be used to safeguard them.

Community relations key to prevent encroachment – undermines training and readiness.

Amanda Boccuti, Lauren Faul, and Lauren Gray, 5/21/2012. Analyst for Marstel-Day, LLC, providing analysis and GIS support for U.S. Marine Corps projects; analyst for Marstel-Day, LLC, specializing in Strategic Communications. Her primary responsibilities entail the development of engagement plans for the U.S. Marine Corps which will provide them a framework to sustain the missions through community outreach and engagement; and researcher at Marstel-Day, LLC, offering research and analysis of environmental issues for encroachment control plans and communications, outreach and engagement strategies for the U.S. Marine Corps. "Establishing Creative Strategies for Effective Engagement Between Military Installations and Communities," Engaging Cities, http://engagingcities.com/article/establishing-creative-strategies-effective-engagement-between-militaryinstallations-communi.

Throughout the Nation's history, military installations and ranges were historically established in undeveloped areas, except for those forts located to defend cities. Local communities developed near the installations for safety and economic reasons resulting in the installation being the up-to-that-point rural community's primary economic engine. Routine communication between the installations and local communities were minimal because the installation was self-supporting and not subject to local laws and regulations. Communications were primarily social. Starting in the post-World War II era and accelerating as the 20th Century came to a close, installation-adjacent communities increased in both density and size – becoming less rural, more suburban or urban, and more economically diverse. Military missions continue to evolve, incorporating new weapon platforms and training over larger areas and at all hours of the day and night. These changes in both surrounding communities and the installation missions have often lead to competing interests with respect to the economy, natural resource management, and land use. Military installations and local communities must, therefore, focus communication efforts on building partnerships to find mutually acceptable paths forward for resolving their competing interests.

Developing collaborative relationships is imperative to turning otherwise conflicting interests into opportunities for mutually beneficial solutions. The nature of those interactions is defined by issue type, installation and community rapport, and available communication channels. The four military services (i.e., Army, Navy, Marine Corps and Air Force) have service-specific community engagement programs to develop partnerships; all four, however, conduct information sharing through the Public Affairs Office (PAO), which handles media and public relations. Three of the services – the Navy, Marine Corps, and Air Force – have established encroachment management policies that outline service responsibilities to establish, maintain, and sustain community relationships in order to reduce encroachment effects. This responsibility is usually assigned to a Community Plans and Liaison Office (CPLO) or an equivalent community planner. The CPLO and PAO work with their installation Commander to act as the military's voice and point of engagement in the community through consistent messaging, establishing an installation presence in community forums, and planning community-engagement events and processes.

Though Department of Defense (DoD) mechanisms exist to develop community partnerships, mediating the different interests and priorities among military installations and their surrounding communities is a complex, nuanced process usually exercised by the services, through their installation leadership. Siting of renewable energy projects, environmental stewardship responsibilities, noise from training events, and other policy- and planning-related matters invoke difficult questions, such as: how can an installation and its surrounding communities concurrently pursue goals and development in a way that lead to mutual gain, obtaining threshold requirements and fair compromise? Finding interest nexuses and fostering an open, strong relationship in which those nexuses can be explored is key.

Readiness key to deter global conflict.

Spencer 0

(Jack, Policy Analyst for Defense and National Security in the Kathryn and Shelby Cullom Davis Institute for International Studies at The Heritage Foundation. "The Facts About Military Readiness," Heritage Foundation Backgrounder, http://www.heritage.org/research/reports/2000/09/bg1394-the-facts-about-military-readiness)

Military readiness is vital because declines in America's military readiness signal to the rest of the world that the United States is not prepared to defend its interests. Therefore, potentially hostile nations will be more likely to lash out against American allies and interests, inevitably leading to U.S. involvement in combat. A high state of military readiness is more likely to deter potentially hostile nations from acting aggressively in regions of vital national interest, thereby preserving peace.

2NC – NIMBY – Turns Case

NIMBYism means the military won’t adopt it

King et al ’11

[Marcus King, Research Analyst and Project Director at CNA Corporation's Center for Naval Analyses, LaVar Huntzinger, Center for Naval Analyses, Thoi Nguyen, Center for Naval Analyses, "Feasibility of Nuclear Power on U.S. Military Installations," March, http://www.cna.org/sites/default/files/research/Nuclear%20Power%20on%20Military%20Installations%20D0023932%20A5.pdf]

DoD will have to take the views of stakeholders such as state and local governments into account when deciding whether to undertake, or participate in a nuclear power project. Governmental views at these levels vary considerably and may be shaped by public opinion.

Public opinion is solicited and taken into consideration at several stages of the NRC licensing process. Although public views toward nuclear power are increasingly favorable, there is significant opposition within some segments of the population. Before undertaking a specific nuclear power project, it would be important for DoD to take public opinion into account and consider it in the context of broader military installation/community relations. While public attitudes are somewhat unknown particularly until a plant is actually proposed for location in a community, it is possible for DoD to make some general determinations about the likelihood of support.

Since none of the small reactor designs have yet been submitted for design certification and licensing, areas where early site permits for large reactors have been submitted might be more generally receptive of nuclear power. An early site permit is an NRC approval of one or more sites for a nuclear power facility, independent of whether companies have submitted an application for a construction permit or combined license. NRC has issued early site permits for projects in Illinois, Mississippi, Virginia, and Georgia, and applications are currently under review in Texas and New Jersey [47].

Accidents DA

1NC Accidents DA

Accelerating SMR development will cause devastating accidents

Wang 12

Ucilia, Forbes, 1-20, “Feds To Finance Small Nuclear Reactor Designs,” http://www.forbes.com/sites/uciliawang/2012/01/20/feds-tofinance-small-nuclear-reactor-designs/

Just because small nuclear reactors promise many economic and environmental benefits (they don't produce dirty air like coal or natural gas power plants do) doesn't mean they can be developed and made more quickly or cheaply, however. Technology companies also will have to prove that their small nuclear reactors can be just as safe if not safer than the conventional, large-scale nuclear reactors today. The Fukushima nuclear power plant disaster in Japan has shown that a misstep in designing and operating a nuclear plant can have a far greater and more devastating impact than a mistake in running other types of power plants. That means nuclear power companies — and the government — will have to do a lot more to prove that nuclear power should remain an important part of the country's energy mix.

Accidents turn case – kills SMR industry

Reynolds ‘10

- Mechanical Engineering Professor WSU Tri-Cities (Roger S., "APPLICABILITY OF THE NRC LIGHT WATER REACTOR LICENSING PROCESS TO SMRs," July 2010, https://smr.inl.gov/Document.ashx?path=DOCS%2fReading+Room%2fPolicy+and+regulation%2fANS+SMR+APPLICABILITY+OF+THE+NRC+LWR+LICENSING+PROCESS+910.pdf)

Small and Medium Sized Reactors (SMRs) of a Light Water design differ in important ways from each other and from the current fleet of operating reactors. These designs incorporate innovative approaches to achieve simplicity, improved operational performance, and enhanced safety. Gas-cooled and liquid metal–cooled reactors represent an even greater departure from current designs and consequently greater challenges to the application of current regulatory guidance. Several of the most challenging issues have been identified and analyzed in recent years. The next section of this paper will discuss this history in some detail. If SMR licensing is to succeed, these issues must be resolved to the satisfaction of the NRC and the public.

2NC – Accidents – Links

Accidents DA – subsidies cause fast and risky nuclear construction without safety upgrades – causes meltdowns that destroy the industry

Koplow, United Nations Environment Programme's Working Group on Economic Instruments, MBA – Harvard, and Vancko, project manager – nuclear/climate @ UCS, '11

(Doug and Ellen, "Nuclear Power: Still Not Viable without Subsidies," Union of Concerned Scientists, February)

Because operating costs account for a much smaller share of levelized costs than do capital costs, they are often ignored. The logic here is somewhat circular: operating costs are low in part because of government subsidies. Most prominently, these subsidies shift the long-term, though uncertain, risks of accidents and nuclear waste management away from plant owners. In unsubsidized industries, these risks would affect current operations through elevated annual insurance costs and high waste management fees. Nuclear power has two additional attributes that make it unattractive to investors. First, the period of risk exposure lasts too long. In most other sectors of the economy, the majority of the risks that investors take on last only several years, or a few decades at most. By contrast, nuclear operations span many decades—longer even than coal plants once post-closure periods prior to decommissioning are included. In particular, highly radioactive and extremely long-lived wastes are not only risky but also require oversight for centuries. Second, a single negative event can wipe out decades of gains. Although the risk of nuclear accidents in the United States is considered quite low, it is not zero.6 Plausible accident scenarios generate catastrophic damages, with corresponding levels of financial loss. This characteristic creates a large disconnect between private interests (which highlight an absence of catastrophic damages thus far) and public interests (which must consider the damage that would be caused in the case of even a moderate accident, as well as the inadequacy of financial assurance mechanisms or insurance-related price signals to address the challenge). Unlike car accidents, where one event generally has no impact on the perceived risk to unrelated drivers or auto companies, risks in the nuclear sector are systemic. An accident anywhere in the world will cause politicians and plant neighbors everywhere to reassess the risks they face and question whether the oversight and financial assurance are sufficient. Generally, the cost implications of such inquiries will be negative for reactor owners. All of these factors, in combination with a poor track record of financial performance on new plant construction, have led investors in nuclear power to demand much higher rates of return, to shift the risks to other parties, or to steer clear of the nuclear power sector entirely.7 These risks are real, and if they were visibly integrated into the nuclear cost structure, the resulting price signals would guide energy investment toward technologies that have more predictable and lower risk profiles.

Focus on industry trades off with safety

Gilinsky – previous NRC commissioner, 8 (Victor, independent consultant--primarily on matters related to nuclear energy. He was a two-term commissioner of the US Nuclear Regulatory Commission from 1975-1984, and before that Head of the Rand Corporation Physical Sciences Department. He holds an Engineering Physics degree from Cornell University and a Ph.D. in Physics from the California Institute of Technology, which granted him its Distinguished Alumni Award. "Pro-industry priorities derail NRC's public-safety mission," Bulletin of the Atomic Scientists, 30 May, http://www.thebulletin.org/web-edition/roundtables/the-future-of-thenuclear-regulatory-commission?order=asc#rt2324)

The Nuclear Regulatory Commission's (NRC) problems lie in the priorities at the top. The overriding priority--evident from commission pronouncements and actions--is to facilitate a major expansion, or "renaissance," of nuclear power. That's okay elsewhere in the federal government, but not at the NRC because it gets in the way of public safety and conducting fair proceedings. This is especially evident in the licensing review of Nevada's Yucca Mountain, the site the Energy Department proposes as the country's high-level nuclear waste repository. The nuclear "renaissance" is said to depend on NRC approval of Yucca Mountain, and, thus far, the NRC has been accommodating. I'd like to provide a couple of examples based on my experience as a Nevada consultant--one example deals with safety and the other with fairness. The most disappointing aspect of NRC's role in Yucca Mountain is that it has agreed to toss overboard what has heretofore been the sine qua non of its safety philosophy--"defense-in-depth." Consequently, Yucca Mountain radiation standards are much more lax than repository standards in other countries. To explain the lack of defense-in-depth at Yucca Mountain requires some background. About 12 years ago, Energy discovered that supposedly dry Yucca Mountain had lots more water than estimated. It was moving a lot faster too, which meant the site wouldn't retain radioactive waste leakage. To keep the project alive despite this, Energy decided to put a near-total reliance on metal containers to retain the waste. Earlier, the NRC had warned against this approach when it approved Energy's geologic site criteria. But when push came to shove, the department kept the site and dropped the troublesome criteria and NRC went along. To keep corrosive water off the containers, Energy dreamed up the "drip shield"--a heavy titanium alloy cover that would run the length of each tunnel containing waste containers. But Energy doesn't plan to actually make or install the enormously expensive shields for 100 years or more, which makes installation a pretty doubtful proposition, even more so because it will be difficult to maintain a remotely operated underground transport system for that long. But without counting the drip shield, Yucca Mountain can't come close to passing federal radiation dose standards. And there's no backup. Naturally, Energy insists the NRC should assume the drip shield will be in place, however implausible that is. Sad to say, the current NRC has been going along with this absurdity despite its 1998 white paper. That paper states: "The defense-in-depth philosophy ensures that safety will not be wholly dependent on any single element of the design, construction, maintenance, or operation of a nuclear facility." The NRC website's current definition of defense-in-depth drops this language. All the while, it insists it's holding fast to design-in-depth.

2NC – Accidents – Turns Case

New disasters kill the nuclear industry- signals lack of reform

New Scientist 3-9-12 ["Fukushima's dirty inheritance," http://www.newscientist.com/article/mg21328552.100-fukushimas-dirtyinheritance.html]

These shutdowns will add to a problem that has been growing, largely ignored, for years: the decommissioning of unwanted or obsolete reactors. The world's 400-plus power reactors are now 27 years old, on average. Dozens are reaching the end of their lives, and at some point they must be dismantled and their contents made safe. This is expensive, time-consuming work that involves the disposal of vast amounts of radioactive concrete, steel and much else. Although these materials are not as "hot" as the spent fuel, which must also be disposed of, the sheer quantities involved are daunting. There is a growing backlog of defunct reactors waiting to be decommissioned. But even the world's biggest nuclear powers, such as the US, do not yet have the trained staff or institutional skills needed to manage the Herculean task of cleaning up all these stations (see "Resilient reactors: Nuclear built to last centuries"). One pragmatic response is to postpone decommissioning, perhaps for decades, on the basis that radioactive decay will eventually reduce the scale of the task. The UK alone has more than 20 nuclear hulks in what is euphemistically termed "care and maintenance". That may be sensible: cleaner waste means less risk for the clean-up crew. But we have no right to simply foist this problem on future generations, who will be ill-equipped to address it if we do not start amassing the necessary expertise and infrastructure now. A forum recently established by the International Atomic Energy Agency should encourage the sharing of knowledge about decommissioning. To be truly effective, this must include the kind of nitty-gritty information that contractors might otherwise regard as commercially confidential. And it is essential that it includes the most taboo subject of all - what happens when things go wrong. If the industry does not demonstrate its willingness to clean up its past messes, public concern may force more countries to rethink their stances on nuclear power, which has big implications for climate change (see "Japan's refusenik farmers tackle nuclear waste"). That is why decommissioning has to be done now - and done right.

2NC – Accidents – AT: All SMRs Safe

No basis for optimism – empirically new technological promises fall short and move at a snail's pace – you should prefer our specific link evidence over their tech optimism

Biello ‘12

- Associate Editor at Scientific American (David, March 27, "Small Reactors Make a Bid to Revive Nuclear Power", http://www.scientificamerican.com/article.cfm?id=small-reactors-bid-to-revive-nuclear-power)

But multiple reactor sites proved problematic at Fukushima Daiichi, where an accident in one rapidly became a crisis for multiple reactors and spent fuel pools. "If you're going to have multiple reactors, are you going to gain in safety or lose in safety?" asks physicist M.V. Ramana of Princeton University. "We don't know." "Early in the discovery of any new technology you have this rosy picture that is formed," Candris admits of Small Modular Reactors. "In the early days of nuclear, there were people out there saying it would be too cheap to meter. We found out otherwise."

Indian Leadership DA

1NC Indian Leadership DA

India is winning the SMR race in the status quo- the US is still behind- the plan’s fast development knocks them out- the counterplan’s slower development gives them time to win

CSIS ‘10

[“India’s Nuclear Push” http://csis.org/blog/india%E2%80%99s-nuclear-push ]

"In India's statement to the 54th General Conference of the International Atomic Energy Agency (IAEA) in Vienna, Indian Atomic Energy Commission chairman Srikumar Banerjee said that Nuclear Power Corporation of India Ltd (NPCIL) is 'ready to offer Indian PHWRs of 220 MWe or 540 MWe for export'. It's happening – second-tier nuclear suppliers from China, South Korea, and now India are waking up to the opportunities that may emerge from intensified interest in nuclear power. India is entering the nuclear supply business at a time when new nuclear states are looking for alternatives to the huge, expensive reactors sold by the French, Russians, Japanese, Canadians, and Americans. Last year, Korea won the plum contract in the Middle East – a $20 billion agreement to build 4 nuclear power reactors in the United Arab Emirates. The UAE plans to construct a total of 10 reactors, using one contractor. China, while busily constructing nuclear power plants at home, will build a few new reactors in Pakistan and reportedly is interested in Turkish and Arab state plans to import. India will be next off the starting block of this export race. There's no way to predict how price-competitive India's export reactors will be. NPCIL is a public enterprise under the control of the government's Department of Atomic Energy. One of the suggested virtues of the U.S.-India nuclear deal was that the Indian nuclear sector would be forced to clean up its act as foreign competition grew in India. One way for the NPCIL to become more self-sustaining is through exports. What will motivate nuclear power newcomers to buy Indian, Korean or Chinese? First, the reactor vendors from the advanced nuclear states are in disarray. AREVA has its much-publicized cost overruns in Olkiluoto; Japanese vendors do not have an export history; and Russian reactors were previously sold only in the Eastern bloc countries or allies. Russia will expand from reactors in India and Iran to potential contracts with Turkey and Vietnam. China, South Korea and India all have smaller reactors to offer. In the United States, while there is interest in small modular reactors, there aren't any licensed. These smaller reactors are more likely to fit the needs of states that are new to nuclear power. Not only do they lack the billions of dollars it takes to build large 1000MWe-1600MWe reactors, but they also lack the extensive transmission grids to accommodate large, centralized electricity generators.

Nuclear market lead key to Indian leadership

K1 Team ’12

(The K1 Criticality Project is a think-tank led by Emlyn Hughes and Dr. Ivana Nikolic Hughes @ Columbia University, Citing Institute for Defence Studies and Analyses, http://k1project.org/energy/fissile-material-indias-investments-in-newnuclear/, July 2012)

With a population of 1.2 billion that is expected to multiply over the next couple of decades, India has taken a keen interest in new nuclear technologies and is fast becoming a key player in the energy arena. The International Energy Agency, an energy research organization, expects that India's energy demand will "more than double by 2030". It is furthermore clear that India will need to expand its power grid in order to reach the significant portion of the country that currently does not have electricity. With pressure coming from the international community to reduce its carbon emissions, India is looking for energy investments that will pay-off in the long-term. As Rajendra K. Pachauri, chairman of the Intergovernmental Panel on Climate Change, astutely stated, "India cannot emulate developed countries. We have to find a path that is distinctly different". Part of the answer lies in India's exploitation of new nuclear technologies.

Currently, uranium can be purchased on the market at a competitive price, which seems to preclude the much needed investment in research and development of new nuclear technologies. However, it is imperative that these newer technologies receive adequate attention because nuclear energy seems to be a likely interim fuel source for the transition from carbon-based fuels to fully-renewable energy sources. In its current state, nuclear energy does not seem to be safe or efficient enough to win wide-spread trust from citizens and policy-makers. Therefore, the need for new nuclear technologies is becoming ever the more pressing. One important technology that India is making inroads on is the thorium-fueled fast breeder nuclear reactor. As India's Department of Atomic Energy clearly recognizes, "We have rather meager reserves of uranium…We, however, have nearly a third of the entire world's thorium…Our strategies for large scale deployment of nuclear energy must be, and are therefore, focused towards utilisation of thorium." India currently has a three-stage nuclear power program that will eventually allow it to make full use of its thorium reserves. In the first stage, the fast neutron reactors that India is developing will burn uranium in pressurized-heavy water reactors to produce plutonium. During the second stage, the fast neutron reactors will burn the plutonium with a uranium and thorium blanket. Thorium itself is a fertile element, and while it has the capacity to fission, it needs a boost from low-enriched uranium or plutonium, which can be sourced from spent fuel or decommissioned nuclear weapons. Thus, using thorium addresses many of the waste disposal, proliferation, and safety hazards that are often associated with conventional, uranium-based nuclear reactors. Investing in thorium-based reactors is cost efficient for India more so than for many other countries for two primary reasons: one being the vast thorium reserves, and the other being its limited reactor base. Both these factors would reduce the comparative cost that India would undertake with this investment. Gradually, as the country approaches the third stage of the nuclear program, the reactors will burn the U-233 from the second stage and the fuel blanket will be primarily composed of thorium. Thus, about two-thirds of the reactor's power will be fueled by thorium. Additionally, thorium fuel bundles can last much longer than conventional uranium fuel bundles. Thus, the spent uranium would eventually be replaced by thorium, eventually creating a fully thorium-fueled reactor. In 2002, construction on a prototype fast breeder reactor at Kalpakkam was approved by the regulatory authority, and it is expected to progress to the second stage of the program by 2013. Six additional fast reactors are slated for construction, with four of them planned for 2020. Within 25 years, India plans to increase its use of nuclear power for electricity generation from 2.8% to 9%. With the passing of the U.S.-India Civil Nuclear Agreement, which allows India even greater independence in the trade of nuclear energy and technologies with other countries, India may eventually be established as a preeminent center for nuclear technologies. There is speculation that India is offering for export the designs of its heavy-water reactors, and this would allow India's considerable investments to become a global energy investment. Dr. S. Banerjee, Chairman of the Atomic Energy Commission, mentioned in a 2010 address to the IAEA that the Nuclear Power Corporation of India Limited is "ready to offer Indian PHWRs of 220 MWe or 540 MWe capacity for export". India's investments will provide India with the electricity capacity that it desperately needs, while simultaneously providing the global energy market with a competitive source of safer and more efficient nuclear energy. The Institute for Defence Studies and Analyses writes that "The time has also come for India to think beyond domestic development of nuclear power reactors and showcase its civilian nuclear capabilities abroad."

Indian leadership solves extinction

Kamdar ‘7

(Mira Kamdar, World Policy Institute, 2007, Planet India: How the fastest growing democracy is transforming America and the world, p. 3-5)

No other country matters more to the future of our planet than India. There is no challenge we face, no opportunity we covet where India does not have critical relevance. From combating global terror to finding cures for dangerous pandemics, from dealing with the energy crisis to averting the worst scenarios of global warming, from rebalancing stark global inequalities to spurring the vital innovation needed to create jobs and improve lives—India is now a pivotal player. The world is undergoing a process of profound recalibration in which the rise of Asia is the most important factor. India holds the key to this new world. India is at once an ancient Asian civilization, a modern nation grounded in Enlightenment values and democratic institutions, and a rising twenty-first-century power. With a population of 1.2 billion, India is the world's largest democracy. It is an open, vibrant society. India's diverse population includes Hindus, Muslims, Sikhs, Christians, Buddhists, Jains, Zoroastrians, Jews, and animists. There are twenty-two official languages in India. Three hundred fifty million Indians speak English. India is the world in microcosm. Its geography encompasses every climate, from snowcapped Himalayas to palm-fringed beaches to deserts where nomads and camels roam. A developing country, India is divided among a tiny affluent minority, a rising middle class, and 800 million people who live on less than $2 per day. India faces all the critical problems of our time—extreme social inequality, employment insecurity, a growing energy crisis, severe water shortages, a degraded environment, global warming, a galloping HIV/AIDS epidemic, terrorist attacks—on a scale that defies the imagination. India's goal is breathtaking in scope: transform a developing country of more than 1 billion people into a developed nation and global leader by 2020, and do this as a democracy in an era of resource scarcity and environmental degradation. The world has to cheer India on. If India fails, there is a real risk that our world will become hostage to political chaos, war over dwindling resources, a poisoned environment, and galloping disease. Wealthy enclaves will employ private companies to supply their needs and private militias to protect them from the poor massing at their gates. But, if India succeeds, it will demonstrate that it is possible to lift hundreds of millions of people out of poverty. It will prove that multiethnic, multireligious democracy is not a luxury for rich societies. It will show us how to save our environment, and how to manage in a fractious, multipolar world. India's gambit is truly the venture of the century.

NNSA DA

1NC NNSA DA

NNSA stemming human capital shortages- plan trades off- no link turns

Aloise, 12

-- GAO Nuclear Security, Safety, and Nonproliferation director (Gene, former GAO Assistant Director for Report and Testimony Quality Control, "Modernizing the Nuclear Security Enterprise: Strategies and Challenges in Sustaining Critical Skills in Federal and Contractor Workforces," Government Accountability Office, GAO-12-468, April 2012, http://www.gao.gov/assets/600/590488.pdf, accessed 9-4-12, mss)

The enterprise's work environments and site locations pose recruiting challenges, and NNSA and its M&O contractors face shortages of qualified candidates, among other challenges. For example, staff must often work in secure areas that prohibit the use of personal cell phones, e-mail, and social media, which is a disadvantage in attracting younger skilled candidates. In addition, many sites are geographically isolated and may offer limited career opportunities for candidates' spouses. Critically skilled positions also require security clearances—and therefore U.S. citizenship—and a large percentage of students graduating from top science, technology, and engineering programs are foreign nationals. The pool of qualified candidates is also attractive to high technology firms in the private sector, which may offer more desirable work environments.

NNSA and its M&O contractors are taking actions to address these challenges where possible, including streamlining hiring and security clearance processes and taking actions to proactively identify new scientists and engineers to build a pipeline of critically skilled candidates. The National Nuclear Security Administration (NNSA)—a separately organized agency within the Department of Energy (DOE)—has primary responsibility for ensuring the safety, security, and reliability of the nation's nuclear weapons stockpile.1 NNSA carries out these activities at eight government-owned, contractor-operated sites, which include three national laboratories, four production plants, and one test site. Collectively, these sites are referred to as the nuclear security enterprise. The enterprise, formerly known as the nuclear weapons complex, has been a significant component of U.S. national security since the 1940s. Contractors operate sites within the enterprise under management and operations (M&O) contracts.2 These contracts provide the contractor with broad discretion in carrying out the mission of the particular contract but grant the government the option to become much more directly involved in day-to-day management and operations. Historically, confidence in the safety and reliability of the nuclear stockpile was derived through a continuous process of designing, testing, and deploying new weapons to replace older weapons. In 1992, at the end of the Cold War, and in response to a congressionally imposed U.S. nuclear test moratorium,3 the United States ceased underground testing of nuclear weapons, and adopted the Stockpile Stewardship Program as an alternative to testing and producing new weapons. The Stockpile Stewardship Program primarily relies on analytical simulations and computer modeling to make expert judgments about the safety, security, and reliability of the nation's nuclear weapons. In addition, NNSA refurbishes weapons in the stockpile to extend their operational lives.

Under current national policy, NNSA may also be called upon to resume underground nuclear testing at the Nevada National Security Site within a 3-year time frame under certain circumstances, including the accumulation of uncertainties about the reliability of the nuclear stockpile. Currently, NNSA's workforce is made up of about 34,000 M&O contractor employees that span the enterprise, and about 2,400 federal employees directly employed by NNSA in its Washington headquarters, at site offices located at each of the eight enterprise sites, and at its Albuquerque, New Mexico, complex.

NNSA's staff provide leadership and program management for the nuclear security enterprise and support and oversee its M&O contractors by providing business, technical, financial, legal, and management advice, including support for contractor workforce planning and restructuring, compensation, benefits, oversight of labor management relations, and the quality of contractor deliverables such as nuclear weapons components. Many workers in the enterprise––both NNSA's staff and its M&O contractors––possess certain critical skills not readily available in the job market. These workers often have advanced degrees in scientific or engineering fields or experience in high-skill, advanced manufacturing techniques. In addition, certain critical skills are unique to the enterprise and, according to NNSA officials, can only be developed within its secure, classified environment. According to these officials, it generally takes a minimum of 3 years of on-the-job training to achieve the skills necessary to succeed in most critical skills positions.

Some nuclear weapons expertise can take even longer to develop and must be gained through several years of mentoring, training, and on-the-job experience. For example, according to officials at Los Alamos National Laboratory, it takes 5 to 10 years to train a scientist or engineer with an advanced degree to be a fully qualified nuclear weaponeer. Over the last 20 years, in an effort to operate more efficiently and at reduced cost, DOE has sharply reduced its enterprise contractor workforce––from approximately 52,000 in 1992 to its current level of about 34,000. This decrease raised concerns about preserving critical skills in the enterprise.

In 1999, a report from a congressionally mandated commission warned that unless DOE acted quickly to recruit and retain its critically skilled staff and M&O contractor employees—and sharpen the expertise already available—the department could have difficulty ensuring the safety, security, and reliability of the nation's nuclear weapons.4 DOE, and later NNSA, took steps to correct these problems, and in February 2005, we reported that these efforts had been generally effective.5 However, in February 2011, in a report assessing the extent to which NNSA has the data necessary to make informed, enterprisewide decisions,6 we found that NNSA did not have comprehensive information on the status of its M&O contractor workforce. In particular, we reported that NNSA did not have data on the critical skills needed to maintain the Stockpile Stewardship Program's capabilities. As a result, we recommended that NNSA establish a plan with time frames and milestones for the development of a comprehensive contractor workforce baseline that includes the identification of critical human capital skills, competencies, and levels needed to maintain the nation's nuclear weapons strategy. NNSA stated that it understood all of our recommendations in that report and believed that it could implement them. As of March 2012, NNSA had completed a draft plan and was incorporating stakeholders' comments. NNSA officials said that they expect to complete the final contractor workforce baseline plan by May 2012. NNSA expressed concerns in its FY 2012 Stockpile Stewardship Management Plan about the state of both its federal and contractor workforces, stating that there was an urgent need to "refresh" both. In particular, NNSA noted that many employees have retired or are expected to retire soon. At the same time, NNSA's mission has become even more dependent on high-level science, computer science, technology, and engineering skills as it has moved from underground testing as a means for assessing the safety and reliability of nuclear weapons to one dependent on advanced computer simulations, analyses, and nonnuclear tests. These changes make it even more important that NNSA and its M&O contractors preserve critical skills in their workforces. Additional concerns about human capital in the enterprise have been raised by the debate over––and eventual ratification of––the New Start Treaty,7 which commits the United States to reduce the size of its strategic nuclear weapon stockpile from a maximum of 2,200 to 1,550 nuclear weapons. Reductions in the number of nuclear weapons make it all the more important that NNSA and contractor staff have the requisite critical skills to maintain the safety, security, and reliability of the remaining weapons. However, as the enterprise has contracted, NNSA officials note that training opportunities have been limited, leaving little or no redundancy in certain critical skills within the contractor workforce.

In this context, you asked us to examine NNSA's human capital planning. Specifically, our objectives were to examine: (1) the strategies NNSA and its M&O contractors use to recruit, develop, and retain the workforces needed to preserve the critical skills in the enterprise; (2) how NNSA assesses the effectiveness of these strategies; and (3) challenges that NNSA and its M&O contractors face in recruiting, retaining, and developing this specialized workforce and their efforts to mitigate these challenges. To address these three objectives, we conducted interviews with human capital planning officials at NNSA headquarters, the Albuquerque complex in New Mexico, and all eight NNSA site offices. We also obtained and reviewed NNSA information about recruiting and retention practices for critically skilled employees, as well as each site's efforts to preserve knowledge needed to sustain critical capabilities. We visited six of the eight sites in the enterprise, including the three national laboratories, Los Alamos National Laboratory and Sandia National Laboratories in New Mexico and Lawrence Livermore National Laboratory in California; two of the production plants, the Pantex Plant in Texas and the Y-12 Plant in Tennessee; and the test site, Nevada National Security Site in Nevada. We conducted telephone interviews with human capital managers at the two other production plants, the Kansas City Plant in Missouri and the Savannah River Site in South Carolina. To examine the strategies NNSA and its M&O contractors use to recruit and retain critically skilled workers, we collected key workforce data from each facility, including NNSA and M&O contractor reports and other documents on the performance and progress made in meeting recruitment and retention targets. To identify challenges in retaining, recruiting, and developing the critical skills workforce, we sent a standardized set of questions about workforce planning efforts and challenges to each M&O contractor and NNSA site office, and analyzed their written responses. We also interviewed NNSA and M&O human capital officials at each site about site-specific workforce challenges and their efforts to address them. We reviewed two NNSA systems for managing human capital data; to assess the reliability of these systems, we interviewed knowledgeable NNSA officials to assess the reliability of these data and determined that they were sufficiently reliable for the purposes of this report. We conducted this performance audit from December 2010 through April 2012, in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives. To ensure the safety, security, and reliability of the nation's nuclear weapons stockpile, NNSA relies on contractors who manage and operate government-owned laboratories, production plants, and a test site. NNSA's eight enterprise sites each perform a different function, all collectively working toward fulfilling NNSA's nuclear weapons-related mission. Figure 1 shows the locations of the sites and describes their functions. To provide support and oversight, NNSA locates between about 30 and 110 NNSA staff in a site office at each facility, and also draws on the resources of NNSA staff in headquarters and the Albuquerque complex. According to NNSA officials, this support and oversight requires that some NNSA staff have critical skills comparable to the contractors they support and oversee. For example, NNSA staff may need technical knowledge and expertise to accept and review deliverables from M&O contracts and, when presented with options, be able to determine how best to proceed to meet contract goals, mission, and objectives. They may also need skills related to the safe operation of sensitive defense nuclear facilities such as expertise in occupational safety and fire safety. For example, according to NNSA officials at the Livermore Site Office, most of the staff in critical skills positions there are focused on ensuring safety at the laboratory's nuclear facilities.

Maintaining critical skills within its workforce is not a challenge unique to NNSA. Every 2 years, we provide Congress with an update on GAO’s high-risk program, under which GAO designates certain government operations as high risk due to their greater vulnerabilities to fraud, waste, abuse, and mismanagement, or their need for transformation to address economy, efficiency, or effectiveness challenges. In 2001, GAO designated strategic human capital management across the entire federal government as a high-risk area, in part because critical skill gaps could undermine agencies’ abilities to accomplish their missions. We have also reported in the past that NNSA and its predecessor organizations’ record of inadequate management and oversight of contractors has left the government vulnerable to fraud, waste, abuse, and mismanagement.

Contract management at DOE has been on GAO's high risk list since 1990, the first year our high-risk list was published.8 Progress has been made, but NNSA and DOE's Office of Environmental Management remain on our high-risk list.9 As of 2011, our most recent update of the high-risk list, significant steps had been taken to address some of the federal government's strategic human capital challenges. Strategic human capital management was designated a high-risk area 10 years earlier governmentwide and remains on the high-risk list because of a need for all federal agencies to address current and emerging critical skills gaps that are or could undermine agencies' abilities to meet their vital missions. Specifically, across the federal government, we reported that resolving remaining high-risk human capital challenges will require three categories of actions:

• Planning. Agencies' workforce plans must define the root causes of skills gaps, identify effective solutions to skills shortages, and provide the steps necessary to implement solutions.

• Implementation. Agencies' recruitment, hiring, and development strategies must be responsive to changing applicant and workforce needs and expectations and also show the capacity to define and implement corrective measures to narrow skill shortages.

• Measurement and evaluation. Agencies need to measure the effects of key initiatives to address critical skills gaps, evaluate the performance of those initiatives, and make appropriate adjustments.

Plan trades off- it's zero-sum

Lorentzen, 8

-- Human Sciences Research Council chief research specialist

(Jo, PhD from the European University Institute in Italy, worked at universities and research institutes in Europe and in the US for a decade during which he taught courses on international business and economic development, and Il-Haam Petersen "Human Capital Dynamics in Three Technology Platforms: Nuclear, Space and Biotechnology," March 2008, https://www.labour.gov.za/downloads/documents/research-documents/Technology%20Platforms.pdf, accessed 9-6-12, mss)

For the new build programme, the time lines are such that construction could feasibly start in 2010 and would last six years, irrespective of location. New build implies a massive human capital effort at the level of artisans, technicians, and engineers. Insofar as the new plants are turn-key projects, it would be the contractor's responsibility to field the required number and quality of welders, electricians, and so forth. But it is also true that in view of the scarcity of these kinds of skills in the country, any upscale of the nuclear workforce would come at the expense of other infrastructure projects, thus resulting in a zero-sum game. In light of this massive market failure, it is unlikely that the solution to the skills constraints could be entirely privatised, i.e. rest with Westinghouse and whoever else makes up its consortium.

NNSA human capital key to solve disease

D'Agostino, 10

– U.S. Under Secretary for Nuclear Security

(Thomas, former Stockpile Stewardship Program director, "NNSA Administrator Addresses Next Generation of Computational Scientists," 6-22-10, www.nnsa.energy.gov/mediaroom/speeches/csgfremarks062210, accessed 9-4-12, mss)

Since I spoke to this group last summer, a lot has changed. I believe that the long-term opportunities to promote our Nation’s nuclear security are greater today than at any point since the end of the Cold War. And I believe that means even more opportunities for you and your generation of nuclear security professionals to make valuable and rewarding contributions to our nation’s security. Take, for example, the Nuclear Posture Review released publicly this past April. While it obviously defines the role of nuclear weapons for our future national security, it also recognizes and explicitly mentions a key theme I have been promoting for a number of years: the importance of recruiting and retaining the “human capital” needed in the NNSA for the nuclear security mission. In order to succeed in our mission, we must have the best and brightest minds working to tackle the toughest challenges.

Without question, our highly specialized work force is our greatest asset.

This Nuclear Posture Review has helped generate renewed interest in nuclear security by elevating these issues to the very top of our national security agenda. I want to share with you a statement from the Directors of Los Alamos, Sandia, and Lawrence Livermore that provides their views on the NPR. The Directors universally state that: "We are reassured that a key component of the NPR is the recognition of the importance of supporting 'a modern physical infrastructure comprised of the national security laboratories and a complex of supporting facilities--and a highly capable workforce…..'" The President has now clearly outlined the importance of nuclear issues for our national security, and of keeping the U.S. nuclear deterrent safe, secure, and effective for the foreseeable future. The Administration's commitment to a clear and long-term plan for managing the stockpile and its comprehensive nuclear security agenda, ensures the scientists and engineers of tomorrow like yourselves will have the opportunity to engage in challenging research and development activities. The mission in NNSA encompasses the nuclear deterrent, nonproliferation, nuclear propulsion, nuclear counterterrorism, emergency management, nuclear forensics and nuclear intelligence analysis. And, we anticipate that those R&D activities will expand far beyond the classical nuclear weapons mission. At the Department of Energy, we are expected to deliver for the Nation in science, energy, and security. The Department will soon issue a new Strategic Plan that reflects an integrated approach to national security activities. We anticipate that our nuclear security facilities will provide significant science, technology, and engineering capabilities that can address non-NNSA issues. Conversely, we anticipate that other DOE programs can provide science, technology, and engineering capabilities to NNSA for our issues. We are looking at a number of areas to move forward: Exa-scale Computing, Energy Systems Simulation, the behavior of Materials in Extreme Environments, and Inertial Fusion Energy – these are some of the cross cutting areas we are looking at as we map out the future strategic vision of the Department. Already, the supercomputing capabilities born of our nation's investment in nuclear security are providing the tools to tackle global challenges like climate change, the spread of pandemic diseases, and even hurricane modeling. As we move to the next generation of supercomputers, we will see even more opportunities for the kind of cutting edge science and research that can engage people like you and your colleagues. Creating computational simulations to provide solutions – in effect, creating a new discipline of predictive sciences – is a technical base we need and is a direction that many of you in this room will help pioneer. Like generations of scientists and researchers before you, we hope you will find the opportunity we provide to develop novel solutions to critical challenges to be irresistible to your career path decisions. And I am confident of our future when I look out at audiences like this and see people like you. The work you do, your interests and your choices will form our future. Don't be bashful about striving for what you want. Your investments now in developing your skills make you best able to contribute towards solving our most complex national problems. From Oppenheimer during the Manhattan Project, to the men and women serving in our national laboratories today, the people who come before you have included some of the greatest names in science and discovery. You are the inheritors of a proud tradition of achievement and advancement. I am confident that legacy is in good hands. Secretary Chu recently stated that the Department of Energy "...must discover and deliver the solutions to advance our national priorities." The NNSA and our Nuclear Security Enterprise are poised to provide those solutions along with the rest of the Department.

Extinction

Keating, 9

-- Foreign Policy web editor

(Joshua, "The End of the World," Foreign Policy, 11-13-9, www.foreignpolicy.com/articles/2009/11/13/the_end_of_the_world?page=full, accessed 9-7-12, mss)

How it could happen: Throughout history, plagues have brought civilizations to their knees. The Black Death killed off more than half of Europe's population in the Middle Ages. In 1918, a flu pandemic killed an estimated 50 million people, nearly 3 percent of the world's population, a far greater impact than the just-concluded World War I. Because of globalization, diseases today spread even faster - witness the rapid worldwide spread of H1N1 currently unfolding. A global outbreak of a disease such as ebola virus -- which has had a 90 percent fatality rate during its flare-ups in rural Africa -- or a mutated drug-resistant form of the flu virus on a global scale could have a devastating, even civilization-ending impact. How likely is it? Treatment of deadly diseases has improved since 1918, but so have the diseases. Modern industrial farming techniques have been blamed for the outbreak of diseases, such as swine flu, and as the world's population grows and humans move into previously unoccupied areas, the risk of exposure to previously unknown pathogens increases. More than 40 new viruses have emerged since the 1970s, including ebola and HIV. Biological weapons experimentation has added a new and just as troubling complication.

NNSA Key to Nuclear

Takes out solvency- nuclear labs are a pre-req

LANL, 8

(Los Alamos National Laboratory, "Advanced Nuclear Energy," 6-15-8, www.lanl.gov/news/factsheets/pdf/AdvancedNuclear.pdf, accessed 9-16-12, mss)

Nuclear energy is an important source of power, supplying 20 percent of the nation’s electricity. More than 100 nuclear power plants are operating in the U.S., and countries around the world are implementing nuclear power as a carbon-free alternative to fossil fuels.

We can maximize the climate and energy security benefits provided by responsible global nuclear energy expansion by developing options to increase the energy extracted from nuclear fuel, improve waste management, and strengthen nuclear nonproliferation controls. To develop viable technical solutions, these interdependent challenges must be addressed through tightly integrated multidisciplinary research and development efforts. Los Alamos National Laboratory is playing a key role in developing these solutions with its core strengths in:

- nuclear fuels development, testing, and characterization
- advanced structural and cladding materials science
- high-accuracy nuclear data measurements
- nuclear nonproliferation
- modeling, simulation, and high-performance computing
- actinide chemistry
- repository science
- reactor design
- licensing support

With these combined strengths, we can improve fuel performance, reduce the long-lived content of radioactive waste, develop new tailored waste forms, understand and predict repository performance, and address the safeguards challenges associated with the future global nuclear fuel cycle. Advanced Nuclear Fuels: Nuclear waste can be greatly reduced if spent uranium fuel is recycled and reprocessed into a new type of "TRU" fuel (named for the TRansUranic elements it would contain) that could be consumed in advanced burner reactors. This process would extract more energy from the fuel and result in less waste needing storage in high-level repositories. It also eases long-term storage requirements because the waste is mostly a short-lived fission product. To implement this advanced method, we must understand how new TRU fuels will react in a fast-neutron reactor. This will require an integration of new materials fabrication, materials testing under new reactor conditions, and modeling and simulation.

Unique Facilities for Fabrication and Testing: Fabrication and testing of new nuclear materials require unique facilities like those at Los Alamos. Los Alamos is using the resources in its Plutonium Facility and Materials Science Laboratory to develop advanced ceramic fuels. The new fuels can be tested at the Materials Test Station (MTS)—a new facility planned for construction at the Los Alamos Neutron Science Center (LANSCE) and expected to open in 2012. The MTS will be powered by LANSCE's 800-million-electronvolt proton beam, and will be the only experimental facility in the U.S. capable of providing the neutron intensity approaching that expected within new fast-neutron reactors. LANSCE and the Lab's Lujan Center also make possible highly accurate measurement of key nuclear data. A new level of accuracy for neutron cross section measurements will be possible with a time projection chamber designed to allow the first-ever 3D visualization of nuclear fission events; these data will improve the design and cost of new reactors. And "hot cells" at the Laboratory's Chemistry and Metallurgy Research facility allow safe and remote research into the development of new fuels and cladding and structural materials. Researchers are currently using this facility to analyze an irradiated fuel duct retrieved from a decommissioned fast reactor, providing valuable data for the future design of fast reactors.

Modeling and Simulation: Designing the nuclear fuel cycle of the future will also require advanced modeling and simulation. Los Alamos has decades of reactor modeling experience and can simulate the entire nuclear energy process from the detailed physics in the reactor's core to the operation of an entire nuclear power plant and the flow and transport of nuclear materials throughout the nuclear fuel cycle. Los Alamos' Monte Carlo N-Particle (MCNP) code, with over 1,100 users in 250 institutions, is the gold standard for predicting nuclear reactions. Fission, the process that creates nuclear power, relies on the behavior of neutrons in nuclear fuels. Since MCNP provides accurate predictions of the movement of neutrons during nuclear reactions, it is a critical tool in the design of advanced fuels and reactors. Los Alamos scientists are now combining MCNP with other computer codes to create one overarching code that can accurately predict the flow of energy in a fast reactor and track other reactor behaviors in addition to neutron movement. Los Alamos also has reactor modeling experience dating back to the 1970's with the pioneering TRAC code—the first computer code capable of realistic reactor safety analysis. TRAC safety evaluations extended the lives of 18 nuclear reactors for more than 20 years. With TRAC, Los Alamos can perform multi-dimensional modeling and simulation of advanced fast-neutron reactors, from microscale investigation of the fuel cladding materials to macroscale modeling of an entire facility.

Key to nuclear power- actinide science- human capital is key

LANL, 7

(Los Alamos National Laboratory, "Preferred Alternative," 12-18-7, www.lanl.gov/news/factsheets/complex_trans.shtml, accessed 9-16-12, mss)

The preferred alternative selection confirms that Los Alamos is first and foremost a science R&D Laboratory. The Laboratory is the nation's choice for materials-centric national security science that relies on effective integration of experiments with exceptional theory, modeling, and high-performance computing. Interdisciplinary excellence in theory, modeling, and simulation with experimental science and nuclear science continue to provide the Laboratory with innovative and responsive solutions to broad national security challenges through the agile, rapid application of key science and technology strengths. For example, for a community, simulation of flu pandemics could help contain a deadly influenza outbreak.

Weapons design & engineering: Los Alamos National Laboratory provides the fundamental science-based understanding of nuclear weapon physics and engineering performance. It is this basic understanding that is the basis for confidence in the nation's nuclear deterrent without the need for further nuclear testing. Los Alamos's design and engineering of both nuclear and nonnuclear weapons components are enabled through small-scale experiments, nonnuclear hydrotests, and subcritical experiments, relying on the full spectrum of scientific excellence across all disciplines, with a focus on materials, high-explosives chemistry, and shock physics.

Plutonium research, development, & manufacturing: Los Alamos has a long and successful history in actinide science and limited plutonium manufacturing that support a credible, sustainable nuclear deterrent. The Laboratory's expertise in the production, handling, and processing of nuclear and nonnuclear materials makes it the best, most logical site for future limited plutonium manufacturing.

Radiation-monitoring systems in Russia and key borders: The Laboratory is the world leader in actinide science—the exploration of the elements from thorium to lawrencium, with particular emphasis on uranium and plutonium, a set of elements on the frontier of scientific inquiry. Los Alamos's scientists publish more than 300 studies a year with a focus on the actinide elements. In 2007, the Laboratory delivered the first war reserve W88 pit in nearly 20 years with small-scale plutonium experiments, legacy test data, groundbreaking materials science, extensive statistical analysis, adapted computer weapons codes, and a refined manufacturing process that results in increased efficiencies and lower costs. LANL's Seaborg Institute for Actinide Science investigates the science that underpins energy security, nuclear power generation, and the production, purification, characterization, analysis, and eventual disposal of actinide elements. The Laboratory also supports actinide research in physics, chemistry, metallurgy, theory, modeling, and experimental technique development. New facilities, such as the Chemistry and Metallurgy Research Replacement building, now under construction, along with materials consolidation, means that the nation's special nuclear materials inventory can be protected to meet the security challenges of the 21st century. Additionally, leading-edge new technologies alongside the latest in best practices and procedures will further enhance the Laboratory's already rigorous approach to worker safety, health, and security.

Research-driven supercomputing: Computer modeling and simulation, supported by experimental data and utilizing some of the world's most powerful supercomputers, are central to understanding weapons performance in the absence of nuclear testing. The Laboratory has a suite of supercomputing assets, led by "Roadrunner," slated to be the first computer in the world to operate at sustained petaflop speeds. Phase 3 of Roadrunner is a unique hybrid petascale system, a very large cluster of nodes linked together at high speeds. Each computer node in this cluster consists of two AMD Opteron™ dual-core processors plus four Cell™ processors used as computational accelerators. The Cell processors used in Roadrunner are a special IBM-developed variant of the Cell processor used in the Sony PlayStation 3®. The Laboratory's supercomputing assets also enable research of broader scientific questions related to complex systems like Earth's weather, disease pandemics, and the security of the U.S. electricity grid. Los Alamos will continue to be at the forefront of high-performance computing, exploring advanced architectures, operating systems, and applications.

Broader national security missions: The Laboratory's capabilities in the areas of weapons design, plutonium research, and research supercomputing as outlined above also support a broader set of national security challenges. As the preferred site, the Laboratory would continue its ability to respond quickly to emerging threats, and support a broad spectrum of mission objectives in stockpile stewardship, nuclear energy research, nuclear forensics, nuclear safeguards, and counterterrorism. Large-scale modeling and simulations with broad experimental science capability allow LANL to address challenges such as biothreats, climate change, and infrastructure security. At the same time, world-class nuclear facilities enable waste minimization and environmental cleanup. Emerging national security challenges also require the Laboratory to advance its scientific user-facility infrastructure and to attract and retain the best talent. Currently in development is a set of research facilities called MaRIE, or Material-Radiation Interaction in Extremes. The purpose of MaRIE is to provide tools that would allow the Laboratory to address the critical materials-related scientific questions relevant to a broad spectrum of current and future missions.

Disease Kills Solvency

Takes out solvency- collapses USFG functions

Greger, 6

-- Humane Society public health director

(Dr. Michael, The Humane Society of the United States Director of Public Health and Animal Agriculture, graduate of the Cornell University School of Agriculture and the Tufts University School of Medicine, Bird Flu, 2006, http://birdflubook.com/a.php?id=37&t=p, accessed 9-16-12, mss)

Business Week's bird flu cover story, "Hot Zone in the Heartland," featured Osterholm contrasting Katrina with the prospect of a pandemic. "The difference between this and a hurricane is that all 50 states will be affected at the same time," said Osterholm. "And this crisis will last a year or more. It will utterly change the world."695 Even those sympathetic to the administration have cast doubt on its abilities to manage the crisis. Colonel Lawrence Wilkerson, for example, Colin Powell's right-hand man at the State Department, recently said, "If something comes along that is truly serious…like a major pandemic, you are going to see the ineptitude of this government in a way that will take you back to the Declaration of Independence."696

Turns case- pandemic causes state collapse and war

Brown, 3

-- RAND science & technology policy analyst

(Jennifer Brown, RAND S&T policy analyst, Ph.D. in public health from Harvard University, Codirected the congressionally mandated Advisory Panel to Assess Domestic Response Capabilities for Terrorism Involving WMD, and Peter Chalk, RAND senior political scientist, Ph.D. in political science from the University of British Columbia, correspondent for Jane's Intelligence Review and associate editor of Studies in Conflict and Terrorism, one of the foremost journals in the international security field, adjunct professor at the Postgraduate Naval School in Monterey, California, and contractor for the Asia Pacific Center for Security Studies in Honolulu,

HI, and the United States Institute of Peace, "The Global Threat of New and Reemerging Infectious Diseases; Reconciling U.S.

National Security and Public Health Policy," www.rand.org/pubs/monograph_reports/MR1602.html, accessed 9-16-12, mss)

The argument that the transnational spread of disease poses a threat to human security rests on the simple proposition that it seriously threatens both the individual and the quality of life that a person is able to attain within a given society, polity or state. Specifically, this occurs in at least six ways. First and most fundamental, disease kills—far surpassing war as a threat to human life. AIDS alone is expected to have killed over 80 million people by the year 2011, while tuberculosis (TB), one of the virus’s main opportunistic diseases, accounts for three million deaths every year, including 100,000 children. 2 1 In general, a staggering 1,500 people die each hour from infectious ailments, the vast bulk of which are caused by just six groups of disease: HIV/AIDS, malaria, measles, pneumonia, TB, and dysentery and other gastrointestinal disorders. 22 Second, if left unchecked, disease can undermine public confidence in the state’s general custodian function, in the process eroding a polity’s overall governing legitimacy as well as undermining the ability of the state itself to function . When large-scale outbreaks occur, such effects can become particularly acute as the ranks of first responders and medical personnel are decimated, making it doubly difficult for an already stressed government to respond adequately. During the initial weeks of the anthrax attacks in fall 2001, the lack of coordination at the federal level, especially with regard to communication, led to a loss of confidence by some citizens, especially postal workers in Washington, D.C. Potentially exposed individuals were given conflicting advice on antibiotic treatment and the efficacy of the anthrax vaccine. The general public, largely because of inconsistent information enunciated by government officials, bought Cipro, the antibiotic approved for the treatment of anthrax, in large numbers. Similarly, in 1996, Japan suffered a severe food poisoning epidemic caused by Escherichia coli

O157. Over the course of two months, eight people died and thousands of others were sickened. The perceived inability of the Tokyo government to enact an appropriate response generated widespread public criticism, compounding popular dissatisfaction with an administration that was still reeling from the effects of the previous year’s Kobe earthquake. As one commentator remarked at the height of the crisis, “The cries against government authorities are growing louder by the day. . . . The impression here [in Japan] is too much talk and not enough action has led to yet another situation that has spun out of control.” 23 Third, disease adversely affects the economic foundation upon which both human and state security depends. The fiscal burden imposed by the HIV/AIDS epidemic provides a case in point. Twenty-five million people are currently HIV-positive in subSaharan Africa, costing already impoverished governments billions of dollars in direct economic costs and loss of productivity. Treating HIV-related illnesses in South Africa, the worst-hit country on the continent, is expected to generate annual increases in healthcare costs in excess of US$500 million by 2009

(see Chapter Three). 2 4 South and Southeast Asia are expected to surpass Africa in terms of infections by the year 2010. If this in fact occurs, demographic upheaval could tax and widely destabilize countries with fragile economies and public health infrastructures.

Economies will be greatly affected by the loss of a stable and productive workforce as well as from a reduction of external capital investment, potentially reducing general gross domestic product (GDP) by as much as 20 percent. 25 Fourth, disease can have a profound, negative impact on a state's social order, functioning, and psyche. In Papua New Guinea, for instance, AIDS has severely distorted the wantok system—which formalizes reciprocal responsibilities, ensuring that those who hit hard times will be taken care of by extended family—because of the fear and stigma attached to the disease. 26 The Ebola outbreak that hit the crowded Ugandan district of Gulu in late 2000 caused people to completely withdraw from contact with the outside world, reducing common societal interactions and functions to a bare minimum. 27 Epidemics may also lead to forms of post-traumatic stress. A number of analyses have been undertaken to assess the long-term psychological effects on those who have been continually subjected to poor sanitary conditions and outbreaks of disease. The studies consistently document the extreme emotional stress suffered by these people and the difficulty of integrating them back into "normal society." 28 Fifth, the spread of infectious diseases can act as a catalyst for regional instability. Epidemics can severely undermine defense force capabilities (just as they distort civilian worker productivity). By galvanizing mass cross-border population flows and fostering economic problems, they can also help create the type of widespread volatility that can quickly translate into heightened tension both within and between states. This combination of military, demographic, and fiscal effects has already been created by the AIDS crisis in Africa. Indeed, the U.S. State Department increasingly speculates that the disease will emerge as one of the most significant "conflict starters" and possibly even "war outcome determinants" during the next decade.

Deterrence Impact

NNSA human capital key to reliable nuclear force- solves now

Aloise, 12

-- GAO Nuclear Security, Safety, and Nonproliferation director

(Gene, "Observations on NNSA’s Management and Oversight of the Nuclear Security Enterprise," GAO-12-473T, 2-16-12, www.gao.gov/assets/590/588648.pdf, accessed 9-4-12, mss)

Thank you for the opportunity to discuss our work on the governance, oversight, and management of the nation's nuclear security enterprise. As you know, the National Nuclear Security Administration (NNSA), a separately organized agency within the Department of Energy (DOE), is responsible for managing its contractors' nuclear weapon- and nonproliferation-related national security activities in research and development laboratories, production plants, and other facilities known collectively as the nuclear security enterprise. 1 Ensuring that the nuclear weapons stockpile remains safe and reliable in the absence of underground nuclear testing is extraordinarily complicated and requires state-of-the-art experimental and computing facilities as well as the skills of top scientists in the field. To its credit, NNSA consistently accomplishes this task, as evidenced by the successful assessment of the safety, reliability, and performance of each weapon type in the nuclear stockpile since such assessments were first conducted in 1995. NNSA's three nuclear weapon design laboratories are heavily involved in this assessment process and, over the past decade, the United States has invested billions of dollars in sustaining the Cold War-era stockpile and upgrading the laboratories. With the moratorium on underground nuclear testing that began in 1992 and the subsequent creation of the Stockpile Stewardship Program, the mission of the nuclear security enterprise changed from designing, building, and testing successive generations of weapons to extending the life of the existing nuclear weapons stockpile through scientific study, computer simulation, and refurbishment.

Nuclear war

Caves 10

(John P, Senior Research Fellow in the Center for the Study of Weapons of Mass Destruction at the National Defense

University, January, Strategic Forum, No. 252, “Avoiding a Crisis of Confidence in the U.S. Nuclear Deterrent,” AD: 1/22/11) jl

Perceptions of a compromised U.S. nuclear deterrent as described above would have profound policy implications, particularly if they emerge at a time when a nuclear-armed great power is pursuing a more aggressive strategy toward U.S. allies and partners in its region in a bid to enhance its regional and global clout. A dangerous period of vulnerability would open for the United States and those nations that depend on U.S. protection while the United States attempted to rectify the problems with its nuclear forces. As it would take more than a decade for the United States to produce new nuclear weapons, ensuing events could preclude a return to anything like the status quo ante. The assertive, nuclear-armed great power, and other major adversaries, could be willing to challenge U.S. interests more directly in the expectation that the United States would be less prepared to threaten or deliver a military response that could lead to direct conflict. They will want to keep the United States from reclaiming its earlier power position. Allies and partners who have relied upon explicit or implicit assurances of U.S. nuclear protection as a foundation of their security could lose faith in those assurances. They could compensate by accommodating U.S. rivals, especially in the short term, or acquiring their own nuclear deterrents, which in most cases could be accomplished only over the mid- to long term. A more nuclear world would likely ensue over a period of years. Important U.S. interests could be compromised or abandoned, or a major war could occur as adversaries and/or the

United States miscalculate new boundaries of deterrence and provocation. At worst, war could lead to state-on-state employment of weapons of mass destruction (WMD) on a scale far more catastrophic than what nuclear-armed terrorists alone could inflict.

Biofuels DA

1NC Biofuels DA

DoD support for biofuels is increasing—that assuages investor fears

Lawrence 12/14/12

—Contributor @ Forbes [Mackinnon Lawrence, "Policy Shifts Signal Growth Ahead for Advanced Biofuels," Forbes, 12/14/2012, http://tinyurl.com/c5j372j]

Over the past year, the U.S. military has emerged as a key torchbearer leading the commercialization of advanced biofuels. Spearheaded by the Navy, which signed a Memorandum of Understanding (MOU) with the U.S. Department of Agriculture (USDA) and Department of Energy (DOE) to develop cost-competitive advanced biofuels, the DoD has been a lone bright spot for an industry that has suffered from press blowback and investor retrenchment in recent years.

Only $84 Billion to Go

Prior to the Hagan amendment, the Senate approved another amendment, offered by Senator Mark Udall of Colorado, to repeal section 313 of the annual Defense appropriations bill. Offered by Republican Senator James Inhofe of Oklahoma, Section 313 would have prohibited the DoD from procuring alternative fuels if they cost more than their conventional counterparts. The section was introduced in response to the U.S. Navy's highly criticized purchase of advanced biofuels from firms like Solazyme and Dynamic Fuels for its "Great Green Fleet" exercises off the coast of Hawaii, at an estimated price-tag of $15 per gallon.

These bills are expected to facilitate public-private partnerships and funnel much-needed capital to support advanced biorefinery construction within the United States. In our Industrial Biorefineries report, Pike Research forecasts that at least 13 billion gallons of advanced biorefinery production capacity will come online over the next decade in the United States. Although that falls short of the 21 billion gallons of advanced biofuels carved out under the EPA's Renewable Fuel Standard (RFS), more than $60 billion will be invested over that same period.

With the minimum cost of scale-up to meet the advanced biofuel production mandate estimated at $84 billion, the industry still has significant ground to make up. Although continued federal support will help assuage investor fears, uncertainties around feedstock supply and production profitability persist, translating into high levels of risk for investors.

Advanced biofuels, which address these concerns at least in part, have enjoyed a rising tide of policy support in recent months from Washington. In August, Congress allocated $170 million to support the development of military biofuels and other defense initiatives, voted to extend key tax credits for advanced biofuel producers, and granted algae producers tax credit parity with other feedstock pathways. Meanwhile, the recent commissioning of first-of-kind facilities from advanced biofuel producers KiOR and INEOS Bio are strong indicators of a maturing cellulosic biofuels industry.

They force a tradeoff with the fuel budget

Eoyang 12

—National Security Director @ Third Way [Mieke Eoyang, Julie Zelnick (Policy Advisor for National Security @ Third Way), & Ryan Fitzpatrick (Senior Policy Advisor for the Third Way Clean Energy Program), "Fuel Costs Squeeze Defense Budget," Third Way Digest, May 2012, pg. 1]

In 2011, Congress passed the Budget Control Act, which put long-term limits on defense spending as part of a broader effort to curb the $15.7 trillion federal budget deficit. Though DOD's budget will grow over the next 10 years, it will rise at a smaller rate than previously projected. This means DOD's topline budget going forward will be more flat. Rising costs in one area will come at the expense of others.1

Given such constraints, DOD must carefully scrutinize every cost and find efficiencies where it can. One of those costs is fuel—a critical component of military operations, especially for ground vehicles, ships, and aircraft. DOD spends about $16 billion on fuel each year—more than double what UPS, FedEx, and DHL spend on global shipping operations, combined.3

Biofuels will lose out

Erwin 12

—Editor of National Defense Magazine [Sandra I. Erwin, "'Policy Uncertainty' Could Choke Development of Military Biofuels," National Defense, 7/26/2012, http://tinyurl.com/d82e34n]

To outsiders, the NDAA debate is just one more partisan battle in Washington's larger political wars. But anti-biofuel sentiments on Capitol Hill are raising serious alarm bells within the alternative-fuel industry and stirring concerns among Pentagon officials who support green energy because of the chilling effect that the political divide could have on private investment.

"If there is a lot of uncertainty, we are going to lose private capital," said Phyllis Cuttino, director of the Pew Project on National Security, Energy, and Climate.

The Defense Department's plan to become a consumer of alternative fuels is predicated on the ability of the private sector to scale up production and on commercial airlines transitioning to biofuels so prices become more competitive. All that requires substantial private investments that might be at risk if venture capitalists decide that the politics of biofuels pose too big a financial risk.

Assistant Secretary of Defense for Operational Energy Plans and Programs Sharon Burke said she does have concerns that legislative restrictions could jeopardize the Defense Department's goals to diversify its sources of energy.

"For the future, our military will need alternatives to petroleum to keep our supplies diverse, especially for our legacy fleet of ships and planes, which will be with us for decades to come," Burke said in a statement to National Defense. "The private sector will be the leaders in developing a commercially viable alternative fuels industry, and we have concerns that restrictions on the department's ability to obtain the milspec fuel we need to achieve our mission may reduce the development and availability of these alternatives over the long term."

The Defense Department began to step up its pursuit of alternative fuels in 2007, and over the past two years the Navy and the Air Force have made headlines for their embrace of aviation biofuels as a future hedge against rising oil prices and unreliable foreign oil suppliers.

In the wake of the House and Senate NDAA amendments, Pew has mobilized biofuels supporters and released a letter this week that was signed by more than 350 veterans, including retired generals and admirals, as well as former Senate and House Armed Services Committee chairmen Sen. John Warner and Rep. Ike Skelton, urging the president and Congress to support the Pentagon's initiatives to diversify its energy sources. The letter echoes biofuel producers' belief that the military is needed as an essential anchor customer.

Lawmakers in the House and Senate have argued that biofuels are cost prohibitive at a time when the military's budget is stretched. The Navy's "great green fleet" effort was particularly criticized by members of the House Armed Services Committee as an example of misplaced priorities when the Navy is cutting back on new ship buys and other modernization programs.

The Senate Armed Services Committee agreed to add anti-biofuel provisions to the NDAA. Biofuel supporters’ best hope now lies with Sens. Jeanne Shaheen, D-N.H., and Susan Collins, R-Maine, who vowed in a recent op-ed article that they would fight to protect the Defense Department’s biofuel funds, including a Navy commitment of more than $200 million as part of joint $500 million effort with the Departments of Energy and Agriculture.

Cuttino said the green-energy community has been taken aback by the partisan tenor of an issue that has national security implications.

"We've been dismayed by the politicization of these [military biofuel] efforts," Cuttino said July 24 during a conference call with reporters. "These issues should not be politicized," she said. "To have these innovations singled out is unfortunate."

The Pentagon's financial commitment is being blown out of proportion, she said. Biofuel expenditures are a tiny fraction of what the Defense Department spends on fuel each year, Cuttino said. The Pentagon's annual energy bill is about $15 billion, three-quarters of which is spent on liquid fuels. Pew estimated that Defense Department biofuel expenditures last year were $1.2 billion, up from $400 million two years ago. A Pew study projects military biofuel purchases will reach $10 billion annually by 2030.

When Congress was fighting a year ago over the nation's debt ceiling, investors were alarmed. The battle over biofuels creates a similar cloud of policy uncertainty that could be damaging to an industry that is just getting off the ground, Cuttino said.

The trends in private investment in alternative energy in G-20 countries are cause for concern, she said, as they indicate that investors tend to flee when they see policy indecision. "What we know from all our research over several years is that if there is a question of uncertainty when it comes to policy, private investment will move on to another country where there is more policy certainty."

The United States currently is a world leader in attracting private capital to alternative energy, she said. The European economic crisis might keep the United States in the lead for some time, but venture capitalists also may be souring on U.S. biofuels investments, according to analysts.

Interest in capital-intensive industries such as energy is fading, said a July report by Dow Jones VentureSource. Investors are raising red flags about biofuel investment because of the large amounts of capital needed to build infrastructure. “The second quarter is the worst for investment in energy and utilities start-ups since the first quarter of 2009,” said VentureSource.

The Commercial Aviation Alternative Fuels Initiative — a coalition of airlines, aircraft and engine manufacturers, energy producers and U.S. government agencies — cautions that project financing is still the "biggest remaining challenge to the deployment of alternative aviation fuels." Nevertheless, CAAFI is "confident that environmentally friendly alternative jet fuel derived from several feedstocks will be available in the next two to five years," the group said in a statement on its website. The barrier to deployment, said CAAFI, is the availability of capital, as production plants cost on the order of $100,000 per barrel per day.

FlightGlobal.com reported that, since 2007, more than 1,500 passenger flights have been made using biofuels produced from feedstocks such as household waste and algae. “The major challenge now is to work out how to produce large quantities of sustainable biofuel at a cost that is commercially competitive to airlines,” FlightGlobal noted.

Lufthansa, one of the world’s largest airlines, has projected that renewable jet fuel will replace up to 5 percent of the market in the next five to seven years.

In the United States, the biofuel industry needs the military to commit to long-term purchases so it can secure investors, Pew said in a statement. "The military's leadership, cooperation with the private sector, and early adoption have been critical to the commercialization of many technologies such as semiconductors, nuclear energy, the Internet, and the Global Positioning System," Pew noted. "Maintaining energy innovation, inside and outside the Defense Department, is critical to our national security."

Biofuels will end oil wars.

Ventura 12

—Essayist and cultural critic @ Austin Chronicle [Michael Ventura, "Letters at 3AM: A Big Picture and a Long Game," Austin Chronicle, Fri., Oct. 19, 2012, pg. http://tinyurl.com/col9hvh]

It's like Alice watching the Queen of Hearts play cards and croquet: "Three times so far this year, the Joint Chiefs of Staff and the regional war-fighting commanders have assembled at [Marine Corps Base Quantico, Va.], where a giant map of the world, larger than a basketball court, was laid out on the ground. ... The generals and admirals walked the world and worked their way through a series of potential national security crises. ... 'Strategic seminar' is the name Gen. Martin E. Dempsey, chairman of the Joint Chiefs of Staff, has chosen for these daylong sessions" (The New York Times online, Sept. 12).

Let's walk this immense map. We'll stroll roughly 5,500 miles from the Strait of Gibraltar eastward to the Afghan-Pakistani border.

Then let's amble another 7,000 miles from Kazakhstan in Asia to Angola in Africa. In the area we've walked, alliances overlap and contradict one another—and are further complicated by trade routes, oil fields, rebels, pirates, and terrorists—and the United States has positioned itself in such a way that its chain can be yanked from almost any direction.

Focus on oil.

According to the U.S. Energy Information Administration (www.eia.gov), in 2011, 69% of U.S. oil originated in five countries, listed by volume: Canada, Saudi Arabia, Mexico, Venezuela, and Nigeria. Of the next 10 largest sources, six are in the area we've walked: three in the Persian Gulf—Iraq, Kuwait, and Oman; three in Africa—Angola, Algeria, and Chad.

Imagine some general scenarios:

A destabilized Tunisia impacts bordering Algeria. A destabilized Libya impacts bordering Algeria and Chad. Chad, destabilized by a destabilized Libya, in turn destabilizes Nigeria.

Move west from Africa. A destabilized Yemen impacts neighboring Saudi Arabia and Oman. A belligerent Iran impacts Iraq, Kuwait, Saudi Arabia, and Oman.

Draw lines of possible crises this way and that, and the generals, admirals, and war commanders walking the big map must be bumping into one another with alarming frequency any way they turn.

All for imported oil.

Oil dependence has put the United States in a strategically vulnerable and ultimately untenable position. There's no way we can cover all that turf indefinitely. We've neither the money nor the manpower.

One issue is clear: The cessation of our participation in Iraq and Afghanistan won't affect the overall situation.

"Large numbers of MRAPs [armored troop carriers] ... in Iraq and Afghanistan [will be] stored in Italy, where they could be transported for contingencies across Africa" (The New York Times online, July 27). "Contingencies" is a neutral word for war.

In 2008, President George W. Bush authorized "the newest regional headquarters, Africa Command" (The New York Times, Oct. 5,

2008, p.8). "Africom" is based in Stuttgart, Germany, "owing to local [African] sensitivities." Its commander, Gen. William E. Ward,

"rejected criticisms that Africa Command would result in a militarization of foreign policy, and he said it was specifically structured for cooperative efforts," though he didn't define what that meant.

Whatever it meant, President Obama has appointed a new commander. Gen. David M. Rodriguez is an officer of "extensive combat experience. ... [He] served two tours in Iraq and two tours in Afghanistan ... and later [was] deputy commander of allied forces there with responsibility for day-to-day management of the war. ... [Rodriguez] was one of the architects" of Obama's Afghan surge (The

New York Times online, Sept. 19).

Sounds like the Pentagon and the White House anticipate action in Africa.

The July 27 report cited above added that "MRAPs would be sent to warehouses in the western Pacific" and "significant numbers are stored in Southwest Asia."

The U.S. is building a base in Darwin, on the northwest tip of Australia, "as a new center of operations in Asia as it seeks to ... grapple with China's rise" (The New York Times, Nov. 15, 2011, p.6).

Recently, Secretary of State Hillary Rodham Clinton and Secretary of Defense Leon E. Panetta crisscrossed the western Pacific from

China to New Zealand assuring everybody that we're not trying to "contain" China; we're merely, in Panetta's words, continuing "to be what we have been now for seven decades: the pivotal military power in the Asia-Pacific region" (The New York Times online, Sept.

13).

But something is true today that has not been true for most of those seven decades. According to the Central Intelligence Agency

(www.cia.gov), China is the No. 1 trading partner of Australia, Japan, South Korea, Malaysia, the Philippines, the Solomon Islands,

Taiwan, and Thailand. And China is a major commercial player with everybody else in the region.

We're defending these Pacific countries against their major trading partner?

"'What worries us is having to choose [between the U.S. and China]—we don't want to be in that position,' said the foreign minister of

Indonesia" (The New York Times online, June 1). You bet they don't.

China, Japan, and others are jockeying for some seemingly worthless (even uninhabited) islands in the South and East China seas. "Quarrels over these hunks of volcanic rock wouldn't matter much except that China, Vietnam, and the Philippines are running into one another in the race for oil" (The New York Times, Nov. 13, 2011, p.SR4).

It's about offshore drilling, that report says. "The South China Sea alone is estimated to have 61 billion barrels of petroleum—oil and gas—plus 54 billion yet to be discovered." Oil again.

In the long game, who wins influence over the area? The United States or China? Put it another way: Who wins? The depleted, financially struggling, politically deadlocked nation many thousands of miles away or the money- and manpower-rich rising nation playing in its own pool? (After all, the disputed areas are called the South and East China Seas.)

Again, the U.S. is setting itself up in a strategically untenable position.

Navy Secretary Ray Mabus said, "We buy too much fossil fuels from potentially or actually volatile places on earth" (NPR online, Sept. 26, 2011).

But the unexpected always happens, and that NPR report reveals something most unexpected: Of all U.S. federal institutions, the Navy and Air Force lead in seeking a nonviolent, eco-friendly path out of America's strategic morass.

They "have been busy testing their aircraft ... on jet biofuel. ... [T]he Navy has launched a project to invest up to half a billion dollars in biofuel refineries. Mabus says he is committed to getting 50 percent of the Navy's fuel for aircraft and surface ships from renewable sources by 2020 because dependence on foreign oil makes the U.S. military vulnerable."

Predictably, "the biofuel program has struck a nerve among Republicans," who are trying to limit military biofuel use by law (The New York Times online, Aug. 27). Their Big Oil donors know that if a military market makes biofuels cheap, then America's airlines, railways, and truckers will want it too, and other oil-dependent nations will follow our lead.

Mostly for the sake of oil, the Obama administration's strategies extend U.S. military reach beyond practical limits—limits that Mitt Romney, if elected, plans to strain still further. But the military has come up with an elegant solution: Strategically and environmentally, a U.S. military powered by biofuels could be a 21st century game-changer that ends the oil wars and drains Big Oil's political dominance.

That is a real possibility. It is also possible that, walking a map bigger than a basketball court, our commanders will bump into one another indefinitely, attempting to defend an indefensible strategy.

AND, it solves warming

Alic 12

[Jen Alic, "4 Biofuels That Don't Take Food Off People's Tables," Oilprice.com, Published: Wednesday, 12 Sep 2012 | 3:53 PM ET, pg. http://tinyurl.com/d4pmjqm]

Algae: Growing on Us

Algae produces some carbon dioxide when burned, but it takes the same carbon dioxide in to grow. So when algae farms grow massive quantities to be turned into biofuels, the end result is that they actually suck greenhouse gas out of the air.

It also has other advantages over biofuels from corn or soybeans, in that it does not require soil or fresh water to grow. It also has the potential to produce more energy per hectare than any land crop.

1NR—Overview

Competition for oil will be hot and dangerous—the coming wars will involve

China and Russia

Meacher 08

—Labour MP for Oldham West and Royton, was environment minister 1997-2003. [Michael Meacher, “The era of oil wars,” guardian.co.uk, Sunday 29 June 2008, pg. http://www.guardian.co.uk/commentisfree/2008/jun/29/oil.oilandgascompanies

The US maintains 737 military bases in 130 countries under cover of the "war on terror" to defend American economic interests, particularly access to oil.

The principal objective for the continued existence and expansion of Nato post-cold war is the encirclement of Russia and the pre-emption of China dominating access to oil and gas in the Caspian Sea and Middle East regions.

It is only the beginning of the unannounced titanic global resource struggle between the US and China, the world's largest importers of oil (China overtook Japan in 2003). Islam has been dragged into this tussle because it is in the Islamic world where most of these resources lie, but Islam is only a secondary player. In the case of Russia, the recent pronounced stepping up of western attacks on Putin and claims he is undermining democracy are ultimately aimed at securing a pro-western government there, and access to Russian oil and gas when Russia has more of these two hydrocarbons together than any other country in the world.

The struggle has also spilled over into West Africa, reckoned to hold some 66 billion barrels of oil typically low in sulphur and thus ideal for refining. In 2005 the US imported more oil from the Gulf of Guinea than from Saudi and Kuwait combined, and is expected over the next 10 years to import more oil from Africa than from the Middle East. In step with this, the Pentagon is setting up a new unified military command for the continent named Africom. Conversely, Angola is now China's main supplier of crude oil, overtaking Saudi Arabia last year. There is no doubt that Africom, which will greatly increase the US military presence in Africa, is aimed at the growing conflict with China over oil supplies.

As Joe Lieberman, former US presidential candidate, put it, efforts by the US and China to use imports to meet growing demand "may escalate competition for oil to something as hot and dangerous as the nuclear arms race between the US and the Soviet Union".

War with Russia is an existential risk

Krieger & Starr 12

—President of the Nuclear Age Peace Foundation & Senior Scientist for Physicians for Social Responsibility. [David Krieger & Steven Starr, "A Nuclear Nightmare in the Making: NATO, Missile Defense and Russian Insecurity," Nuclear Age Peace Foundation, January 03, 2012, http://www.wagingpeace.org/articles/db_article.php?article_id=321]

This is a dangerous scenario, no matter which NATO we are talking about, the real one or the hypothetical one. Continued US indifference to Russian security concerns could have dire consequences: a breakdown in US-Russian relations; regression to a new nuclear-armed standoff in Europe; Russian withdrawal from New START; a new nuclear arms race between the two countries; a breakdown of the Nuclear Non-Proliferation Treaty leading to new nuclear weapon states; and a higher probability of nuclear weapons use by accident or design. This is a scenario for nuclear disaster, and it is being provoked by US hubris in pursuing missile defenses, a technology that is unlikely ever to be effective, but which Russian leaders must view in terms of a worst-case scenario.

In the event of increased US-Russian tensions, the worst-case scenario from the Russian perspective would be a US first-strike nuclear attack on Russia, taking out most of the Russian nuclear retaliatory capability. The Russians believe the US would be emboldened to make a first-strike attack by having the US-NATO missile defense installations located near the Russian border, which the US could believe capable of shooting down any Russian missiles that survived its first-strike attack.

The path to a US-Russian nuclear war could also begin with a conventional military confrontation via NATO. The expansion of NATO to the borders of Russia has created the potential for a local military conflict with Russia to quickly escalate into a nuclear war. It is now Russian policy to respond with tactical nuclear weapons if faced with overwhelmingly superior conventional forces, such as those of NATO. In the event of war, the "nuclear umbrella" of NATO guarantees that NATO members will be protected by US nuclear weapons that are already forward-based in Europe.

Biofuels are key to military readiness.

Gardner 12

—Junior Fellow @ American Security Project [Robert Gardner, "Budgeting for Biofuels: The Military's Dependence on Petroleum Must be Mitigated," American Security Project, June 21, 2012, http://americansecurityproject.org/blog/2012/budgeting-for-biofuelsthe-militarys-dependence-on-petroleum-must-be-mitigated/]

Petroleum is currently used to satisfy 80% of the US military's energy needs and is relied upon as the single source of liquid fuel for transportation, operations, and training. The volatile price of oil has incurred huge unbudgeted costs for the military, causing national security risks for the military's operations.

In light of national security risks it has become widely agreed upon that the Department of Defense should be hedging its bets against petroleum use. The Navy is seeking to move away from petroleum dependence by investing in biofuels, the primary alternative to petroleum fuels.

However, both the House and Senate Armed Services Committees have moved to block the Navy's plans to purchase biofuels for testing and to directly invest in domestic biofuels producers. This action undermines the military's efforts to mitigate the long term strategic risks posed by its dependence on petroleum. Biofuel research and development needs to be on the table as the military reduces its dependence on petroleum.

Why does the military need to shift away from petroleum fuel?

Currently the military is dependent upon volatile petroleum prices set on the global market. These prices are largely determined by the unpredictable politics of foreign countries. Even if the military does not import oil directly from Iran or the Middle East, the price paid for petroleum is largely set by market conditions in the region.

Price instability has caused budgeting dilemmas for the military in recent years. A June 2012 Congressional Research Service report found that the cost of buying fuel has increased faster than any other major DoD budget category. Despite the DoD's cutting back 4% on petroleum use from FY2005 to FY2011, its spending on petroleum ballooned 381% in real (i.e., inflation-adjusted) terms during this time period.

Along with rising prices, the short term volatility of oil prices poses substantial risks for DoD budgeting and operations. Secretary of the Navy Ray Mabus has stated that every dollar increase in the price of a barrel of petroleum costs the Navy about $31 million of unbudgeted funding annually.

DoD reports have found that a 10% increase from the FY2011 price of fuel would cost the DoD as a whole an additional $1.7 billion a year.

Former Defense Secretary Robert Gates asserted that unbudgeted fuel costs could force operational cuts in Air Force flying hours, Navy steaming days, and training for home-stationed Army troops. These cuts pose serious security risks for military operations. While testifying on military budgeting for 2013 Secretary Mabus stated that "we would be irresponsible if we did not reduce our dependence on foreign oil."

Steps Forward

Steep increases and fluctuations in petroleum spending emphasize the need for the DoD to hedge its bets against rising petroleum prices. The Navy and Air Force have set forth 2020 goals to reduce their oil usage by 50%, by using alternative fuels. Secretary Mabus and others have stated that efforts toward biofuel development will increase the security of the energy supplies and reduce the service's vulnerability to price shocks.

We must decrease the amount of CO2 already in the atmosphere to prevent extinction—they can’t access this impact.

EarthTalk 12

[“Atmospheric CO2—Is it Too Late Anyway?” E Magazine, Thursday, August 23rd, 2012, pg. http://globalwarmingisreal.com/2012/08/23/earthtalk-atmospheric-co2-is-it-too-late-anyway/]

Actually the amount of carbon dioxide (CO2) in the atmosphere today is roughly 390 parts per million (ppm). And that's not good news. "Experts agree that this level cannot be sustained for many decades without potentially catastrophic consequences," reports the Geos Institute, an Oregon-based non-profit and consulting firm that uses science to help people predict, reduce and prepare for climate change.

While we're unlikely to get atmospheric CO2 concentrations down as low as they were (275 ppm) before we started pumping pollution skyward during the Industrial Revolution, climate scientists and green leaders agree that 350 ppm would be a tolerable upper limit. Prior to 2007 scientists weren't sure what emissions reduction goal to shoot for, but new evidence led researchers to reach consensus on 350 ppm if we wished to have a planet, in the words of NASA climatologist James Hansen, "similar to the one on which civilization developed and to which life on earth is adapted."

1NR—Warming

Algae consumes CO2

Bosselman 11

—Professor of Law Emeritus @ Chicago-Kent College of Law [Fred Bosselman, "GREEN DIESEL: FINDING A PLACE FOR ALGAE OIL," CHICAGO-KENT LAW REVIEW, Vol 86:1, 2011]

Under natural growing conditions, algae grow by using photosynthesis, a process that uses carbon dioxide (CO2) from the air as a nutrient. But although the amount of carbon dioxide in the air is growing, it is a small percentage, and far too small to support mass production of algae for oil. Therefore, scientists assume that supplemental carbon dioxide would be needed, which would be likely to make the process prohibitively expensive if the carbon dioxide had to be purchased on the open market.106

This has led to extensive exploration of the possibility that algae production facilities might be fed with the exhaust gases from coal or gas fired power plants, cement plants, breweries, fertilizer plants, or steel mills.107 If ponds are located in the vicinity of a coal-fired power plant or other industrial facility that can provide flue gas that is high in CO2, the growth rate of the algae might be increased substantially.108

The opportunity to grow algae using waste CO2 from power plants or industrial facilities has already led to a number of prototype projects.109 For example, Inventure Chemical and Seambiotic have announced that they have formed a joint venture to construct a pilot commercial biofuel plant with algae created from CO2 emissions as a feedstock. The plant will use algae strains that Seambiotic has developed coupled with conversion processes developed by Inventure to create ethanol, biodiesel and other chemicals. 110

1NR—Funding Now

The BCA makes the budget zero-sum

Garamone 12

[Jim Garamone, "Panetta, Dempsey Say Pentagon Feels Sequestration's Shadow," American Forces Press Service, April 16, 2012, http://tinyurl.com/6q94et2]

"In the end, it's up to Congress," Panetta said. "In the coming weeks, they will begin considering the defense authorization and appropriations bills. Our hope is that Congress will carefully consider the new defense strategy and the budget decisions that resulted from that strategy."

Any changes the Congress contemplates will affect other sections of the budget, because it is a zero-sum game, the secretary noted. Because of the Budget Control Act, he added, any change in any one area of the budget and force structure will inevitably require offsetting changes elsewhere.

This means the plan, which is out of the blue, must be offset

Serbu 12

[Jared Serbu, “Panetta to Congress: Don't mess with my budget,” Federal News Radio, 5/11/2012, http://www.federalnewsradio.com/394/2861074/Panetta-to-Congress-Dont-mess-with-my-budget]

Panetta warned lawmakers that tinkering with DoD's budget plan is a recipe for stalemate with the Senate and will have negative consequences for national security.

"The Department of Defense is not going to support additional funds that come at the expense of critical national security priorities," he said. "If members of Congress try to restore their favorite programs without regard to an overall strategy, the cuts will have to come from areas that impact overall readiness. There's no free lunch here."

The remarks at a Pentagon news conference came hours after the House panel approved its version of the 2013 Defense authorization bill. The panel's chairman, Buck McKeon (R-Calif.), has been extremely critical of DoD's plan to reduce spending by $487 billion over the next 10 years despite having voted in favor of the 2011 Budget Control Act that mandated the spending reductions.

Every extra dollar must have an offset

Given the parameters of the deficit-cutting legislation lawmakers passed last year, the military must cut $487 billion from national security programs one way or another, Panetta argued.

"Every dollar that is added by Congress will have to be offset somewhere. And if for some reason they don't want to comply with the Budget Control Act, they'd certainly be adding to the deficit, which certainly puts our national security even further at risk," he said.

Budgets are tight but biofuels are winning – proves the brink.

Peterka 1/22/2013 [Amanda Peterka, E&E reporter, "Airlines piggyback on DOD's test flights, push for expanded production," http://www.eenews.net/Greenwire/2013/01/22/archive/5?terms=biofuels]

The military also depends on Congress for funding to test and purchase biofuels, said John Heimlich, vice president and chief economist at Airlines for America, a consortium of 11 airlines that has entered a strategic alliance with the Navy to advance aviation biofuels.

"That's one thing that makes the military effective," Heimlich said. "It's not just their know-how and commitment. It's their balance sheet."

But although the Pentagon could guarantee a market for aviation biofuels, the effort could be toppled by Washington budget battles.

So far, though, news from Washington has been encouraging for biofuel promoters. President Obama signed a defense authorization act last month that included funding for the military's biofuel programs.

And early this month, Obama signed a "fiscal cliff" package that extended tax incentives for the cellulosic biofuel and biodiesel industries.

To keep momentum going in the industry, Holland said, the military needs to be aggressive about putting those biofuel programs in place.

The commercial aviation industry also needs to get off the ground, he said.

There is no new spending.

Brannen 1/22/2013 [Kate Brannen, "Wary Defense Department slows spending," Politico, http://dyn.politico.com/printstory.cfm?uuid=33904F89-38B8-46ED-97A8-ADF4E02C829C]

One precautionary measure raising questions in the defense world is Deputy Defense Secretary Ashton Carter's order not to award any research and development, production contracts and contract modifications that obligate more than $500 million without first clearing them with Frank Kendall, the undersecretary of defense for acquisition, technology and logistics.

"I saw this as the critical line in Carter's memo," said David Berteau, director of the international security program at the Center for Strategic and International Studies.

In a follow-up memo dated Jan. 15 and obtained by POLITICO, Kendall explained that by "obligation," the Pentagon meant not only the amount of the specific contract action but also the total potential obligation of the contract. In other words, even if an agency wanted to award a contract that by itself was worth less than $500 million, but that was part of a total agreement worth more, it would have to get clearance.

Before Kendall signs off on the contract, he wants to see a page-long explanation of the contract, its proposed dollar value, the appropriation and the year of funding, its purpose and an assessment of why it cannot be delayed. The requests are to be submitted to Kendall through Richard Ginman, the Pentagon's director of defense procurement and acquisition policy.

Of all the precautionary steps the Pentagon is taking, this one has the industry the most worried. "This requirement will have a chilling effect on any contract, because adding on another layer of review will deter the services from moving forward on big awards," said Loren Thompson, chief operating officer of the Lexington Institute and a consultant for several of the biggest defense companies.

One congressional source read Carter's directive to mean the military won't be signing any big contracts anytime soon. At the very least, adding this layer of review will very likely slow the process by which contracts are awarded — and that is probably the point.

1NR—Sequestration

Congressional support increasing—however, Pentagon is not out of the woods. New political realities will renew opposition

Daly 12/23/12 [John Daly, "U.S. Military Biofuels Survives Republican Congressional Euthanasia Attempt," OilPrice.com, Sun, 23 December 2012 00:00, pg. http://tinyurl.com/cl34qu3]

During the heated U.S. presidential debate last month, Republicans lined up the U.S. military's interest in renewable fuels in their gunsights, with both House and Senate Republicans introducing legislation to prohibit the Pentagon from buying any fuels with a price tag greater than those generated from traditional fossil fuels. Those efforts have apparently fallen by the wayside, as unofficial reports indicate that biofuels provisions have survived a House-Senate conference over the upcoming National Defense Authorization Act legislation.

According to Capitol Hill sources, speaking on condition of anonymity, original House of Representatives text prohibiting Department of Defense spending on biofuels has been removed and replaced with a requirement that DOD funding be matched by the Department of Energy and the Department of Agriculture. Giving heart to biofuel proponents, the USDA has already committed funds, while DOE funding is contingent on appropriations.

Pentagon interest in biofuels is not a recent event, but has been gridlocked by Washington power plays. The 2007 Energy Independence and Security Act mandated that the country's fuel supply include 36 billion gallons of biofuel by 2020; three years later, in 2010, the USDA reported that to meet the mandate, 527 new bio-refineries would be required at a cost of $168 billion to meet demand.

Shortly before his inauguration in January 2008 President-elect Obama promised to invest $150 billion over the next decade to develop biofuels, plug-in hybrid vehicles, renewable energy production and a skilled work force for clean technologies.

Obama made clean energy a centerpiece of his administration's policy from the outset. In recognition of the potential of the US bioeconomy, in July 2010 the Obama Administration issued an Executive Memorandum called 'Science and Technology Priorities for the FY2012 Budget' (M-10-30), which mandated a priority for federal agencies to "support research to establish the foundations for a 21st century bio-economy."

The following year, during his State of the Union address on 25 January 2011 Obama said, "This is our generation's Sputnik moment. Two years ago, I said that we needed to reach a level of research and development we haven't seen since the height of the Space Race. And in a few weeks, I will be sending a budget to Congress that helps us meet that goal. We'll invest in biomedical research, information technology, and especially clean energy technology—(applause)—an investment that will strengthen our security, protect our planet, and create countless new jobs for our people. Already, we're seeing the promise of renewable energy."

Obama's initiatives gathered substantial support. Enter the Pentagon.

In January 2010, USDA Secretary Tom Vilsack and Secretary of the Navy Ray Mabus signed a Memorandum of Understanding to develop advanced biofuels and other renewable energy systems for commercial and military transportation needs. Two years later USDA Under Secretary Dallas Tonsager signed an agreement with the Airlines for America on a "Farm to Fly" project, investigating feedstock and infrastructure needs for the development of a U.S. aviation biofuels industry.

In October 2010 the Navy purchased 20,055 gallons of algae biofuel at an eye-watering cost of $424/gallon. Nevertheless, the contract was one of the biggest U.S. purchases of a non-corn ethanol biofuel up to that time. A year later, the Navy reportedly spent $12 million for 450,000 gallons of biofuel. The bad news was that the biofuel's cost worked out to around $26.67 per gallon, roughly six times the current cost of traditional gas.

In January 2011, bringing together three different federal agencies, Secretaries Vilsack, Mabus and Department of Energy Secretary Steven Chu signed an agreement to work with private industry to develop drop-in biofuels for military and commercial use (drop-in biofuels are direct replacements for existing gasoline, diesel, and jet fuels that do not require changes to existing fuel distribution networks or engines).

Building on that momentum, the White House in its 'Blueprint for a Secure Energy Future', released on 30 March 2011, again emphasized its commitment to developing the US biofuel sector with a USD800 million commitment for advanced biofuel projects. After noting that "the Administration is investing in the research and deployment of alternative fuels that can be safely used in the aviation sector", the document continued: "Competitively-priced drop-in biofuels could help meet the fuel needs of the Navy, as well as the commercial aviation and shipping sectors."

But all of this eventually bogged down in bipartisan gridlock. Last autumn, U.S. House of Representatives along with Senate Republicans introduced legislation to ban the military from purchasing or developing biofuels if they cost more than traditional fossil fuels.

Given the new political realities, both Congressional initiatives have fallen by the wayside, with both the House and Senate having been forced to harmonize their variant versions of the annual National Defense Authorization Act (NDAA) appropriations bill. With the "fiscal cliff" approaching, the final version, earlier this week, removes attempts to block the DoD's biofuels program.

But renewable fuels advocates are hardly out of the woods yet — they have 12 months to deliver before military appropriations issues reemerge, and only the most ardent optimist at this point can assume that the 2012 DoD appropriations will include fiscal largesse for all military "guns and butter"—err, vegetable oil biofuel—needs.

No restrictions now—Pentagon budget battles will renew the fight

Colman 12/18/12 [Zack Colman, "Defense bill preserves military biofuels program," E2 Wire, 12/18/12 05:39 PM ET, pg. http://tinyurl.com/bbxcmgr]

A House-Senate deal on defense legislation omits a GOP-backed plan to thwart military purchases of biofuels. The Senate already had stripped restrictive language from its version of the defense authorization bill last month, making it differ from the House. House and Senate negotiators took cues from the Senate's version.

"There is no limiting language in there. It looks favorable at this point and I commend the administration for the hard line it took," Michael McAdams, president of the Advanced Biofuels Association, told The Hill on Tuesday.

A House-Senate negotiating group unveiled the compromise bill Tuesday afternoon. House Armed Services Chairman Buck McKeon (R-Calif.) said the bill is scheduled for a Thursday House vote, is expected to pass the Senate and will hit President Obama's desk Friday.

Republicans in the Senate and the House had previously added amendments to the authorization bills that blocked the military from spending on biofuels. They argued the fuels were too expensive with sequestration set to shave $500 billion from the Pentagon's budget through the next 10 years. And others, such as Sen. James Inhofe (R-Okla.), said the Energy Department — not Defense (DoD) — should be investing in such fuels.

Coal DA

1NC Chinese Coal

U.S. coal exports to China are low, but downward pressure on domestic demand expands them massively

Bryan Walsh 12, Senior Editor at TIME, May 31, 2012, "Drawing Battle Lines Over American Coal Exports to Asia," online: http://science.time.com/2012/05/31/drawing-battle-lines-over-american-coal-exports-to-asia/

But across the Pacific Ocean, the demand for coal has never been hotter, with China burning 4.1 billion tons in 2010 alone, far more than any other country in the world. That insatiable demand forced China in 2009 to become a net coal importer for the first time, in part because congested rail infrastructure raised the cost of transporting coal from the mines of the country's northwest to its booming southern cities. In April, Chinese coal imports nearly doubled from a year earlier. Right now Australia and Indonesia supply much of China's foreign coal.

U.S. coal from the Powder River Basin could be a perfect addition to the Chinese market. Montana and Wyoming are just short train trips to ports on the Pacific Northwest coast, and from there it's a container ship away from Asian megacities where coal doesn't have to compete with cheap natural gas and air-pollution regulations are far weaker than in the U.S. To a wounded Big Coal, China is a potential savior. As I write in the new edition of TIME, there's just one problem: right now, ports on the West Coast lack the infrastructure needed to transfer coal from railcars into container ships. (Just 7 million of the 107 million tons of U.S.-exported coal left the country via Pacific Ocean ports last year.)

That's why coal companies like Peabody and Ambre Energy are ready to spend millions to build coal-export facilities at a handful of ports in Washington and Oregon. If all those plans go forward, as much as 150 million tons of coal could be exported from the Northwest annually—nearly all of it coming from the Powder River Basin and headed to Asia. Even if the U.S. kept burning less and less coal at home, it would have a reason to keep mining it.

SMRs cause coal plant retirements

Marcus King et al 11, Associate Director of Research, Associate Research Professor of International Affairs, Elliott School of International Affairs, The George Washington University, et al., March 2011, "Feasibility of Nuclear Power on U.S. Military Installations," http://www.cna.org/sites/default/files/research/Nuclear%20Power%20on%20Military%20Installations%20D0023932%20A5.pdf

SMRs have potential advantages over larger plants because they provide owners more flexibility in financing, siting, sizing, and end-use applications. SMRs can reduce an owner's initial capital outlay or investment because of the lower plant capital cost. Modular components and factory fabrication can reduce construction costs and schedule duration. Additional modules can be added incrementally as demand for power increases. SMRs can provide power for applications where large plants are not needed or may not have the necessary infrastructure to support a large unit such as smaller electrical markets, isolated areas, smaller grids, or restricted water or acreage sites.

Several domestic utilities have expressed considerable interest in SMRs as potential replacements for aging fossil plants to increase their fraction of non-carbon-emitting generators. Approximately 80 percent of the 1174 total operating U.S. coal plants have power outputs of less than 500 MWe; 100 percent of coal plants that are more than 50 years old have capacities below 500 MWe [3]. SMRs would be a viable replacement option for these plants.

U.S. exports lock in expanded Chinese coal capacity---causes warming over the tipping point---it’s unique because absent U.S. exports the rising cost of coal will cause a shift to renewables

Thomas M. Power 12, Research Professor and Professor Emeritus, Department of Economics, University of Montana; Principal, Power Consulting; February 2012, "The Greenhouse Gas Impact of Exporting Coal from the West Coast: An Economic Analysis," http://www.sightline.org/wp-content/uploads/downloads/2012/02/Coal-Power-White-Paper.pdf

The cumulative impact of these coal port proposals on coal consumption in Asia could be much larger than even that implied by the two pending proposals. If Arch, Peabody, and other western U.S. coal producers' projections of the competitiveness of western coal in Asia are correct, facilitating the opening of the development of West Coast coal ports could have a very large impact on the supply of coal to China and the rest of Asia.

6.4 The Long-term Implications of Fueling Additional Coal-Fired Electric Generation

Although the economic life of coal-fired generators is often given as 30 or 35 years, a permitted, operating, electric generator is kept on line a lot longer than that, as long as 50 or more years, through ongoing renovations and upgrades. Because of that long operating life, the impact of the lower Asian coal prices and costs triggered by PRB coal competing with other coal sources cannot be measured by the number of tons of coal exported each year. Those lower coal costs will lead to commitments to more coal being burned for a half-century going forward.

That time-frame is very important. During exactly this time frame, the next half-century, the nations of the world will have to get their greenhouse gas emissions stabilized and then reduced or the concentrations of greenhouse gases in the atmosphere may pass a point that will make it very difficult to avoid massive, ongoing, negative climate impacts.

Taking actions now that encourage fifty years of more coal consumption around the world is not a minor matter. Put more positively, allowing coal prices to rise (and more closely approximate their full cost, including "external" costs) will encourage extensive investments in improving the efficiency with which coal is used and the shift to cleaner sources of energy. This will lead to long-term reductions in greenhouse gas emissions that will also last well into the next half-century.

Extinction

Flournoy 12 – Citing Feng Hsu, PhD, NASA Scientist @ the Goddard Space Flight Center; Don Flournoy, PhD and MA from UT, former Dean of the University College @ Ohio University, former Associate Dean at SUNY and Case Institute of Technology, Former Manager for University/Industry Experiments for the NASA ACTS Satellite, currently Professor of Telecommunications @ Scripps College of Communications, Ohio University, "Solar Power Satellites," January 2012, Springer Briefs in Space Development, p. 10-11

In the Online Journal of Space Communication, Dr. Feng Hsu, a NASA scientist at Goddard Space Flight Center, a research center in the forefront of science of space and Earth, writes, "The evidence of global warming is alarming," noting the potential for a catastrophic planetary climate change is real and troubling (Hsu 2010). Hsu and his NASA colleagues were engaged in monitoring and analyzing climate changes on a global scale, through which they received first-hand scientific information and data relating to global warming issues, including the dynamics of polar ice cap melting.

After discussing this research with colleagues who were world experts on the subject, he wrote: I now have no doubt global temperatures are rising, and that global warming is a serious problem confronting all of humanity. No matter whether these trends are due to human interference or to the cosmic cycling of our solar system, there are two basic facts that are crystal clear: (a) there is overwhelming scientific evidence showing positive correlations between the level of CO2 concentrations in Earth's atmosphere with respect to the historical fluctuations of global temperature changes; and (b) the overwhelming majority of the world's scientific community is in agreement about the risks of a potential catastrophic global climate change. That is, if we humans continue to ignore this problem and do nothing, if we continue dumping huge quantities of greenhouse gases into Earth's biosphere, humanity will be at dire risk (Hsu 2010). As a technology risk assessment expert, Hsu says he can show with some confidence that the planet will face more risk doing nothing to curb its fossil-based energy addictions than it will in making a fundamental shift in its energy supply. "This," he writes, "is because the risks of a catastrophic anthropogenic climate change can be potentially the extinction of human species, a risk that is simply too high for us to take any chances" (Hsu 2010).

Chinese emissions are sufficient to cause extinction

John Copeland Nagle 11, the John N. Matthews Professor, Notre Dame Law School, Spring 2011, "How Much Should China Pollute?," Vermont Journal of Environmental Law, 12 Vt. J. Envtl. L. 591

Third, the rest of the world suffers because of the inability of China and the United States to agree on a method for reducing their greenhouse gas emissions. Even if the rest of the world were to reach such an agreement, the failure to include China and the United States would doom the project from the start. Together, China and the United States account for forty-one percent of the world's greenhouse gas emissions. [FN19] Left unchecked, China's emissions alone could result in many of the harms associated with climate change. [FN20] That is why many observers believe that "[t]he decisions taken in Beijing, more than anywhere else, [will] determine whether humanity thrive[s] or perishe[s]."

2NC Overview

Warming magnifies all impacts and makes global conflicts inevitable – and turns water shortages

Ginsborg et al. 12 – Mikkel Funder, Signe Marie Cold-Ravnkilde and Ida Peters Ginsborg, in collaboration with Nanna Callisen Bang, Danish Institute for International Studies, 2012, "ADDRESSING CLIMATE CHANGE AND CONFLICT IN DEVELOPMENT COOPERATION: EXPERIENCES FROM NATURAL RESOURCE MANAGEMENT," www.diis.dk/graphics/Publications/Reports2012/RP2012-04-Addressing-climate-change_web.jpg.pdf

2.2 Climate change as a conflict multiplier

Climate change is therefore best seen as a conflict multiplier, rather than as a major direct cause of conflict in itself. Climate change may aggravate and extend the scope of existing conflicts, or trigger underlying and latent conflicts to break out into the open. Previous studies have identified a number of areas in which climate change may contribute to a worsening of conflicts (Brown & Crawford 2009). These include:

• Land and water access. Access and use rights to land are a key feature in most situations where climate change has contributed to natural resource conflicts so far. Climate change can intensify existing conflicts over land, as land becomes less fertile or is flooded, or if existing resource sharing arrangements between different users and land use practices are disrupted. In some parts of Africa, climate change may lead to a decline in available water resources of some 10–20% by the end of the century (op cit.). This may intensify existing competition for access to water at intra-state and/or subnational levels.

• Food security. Reduced rainfall and rising sea levels may lead to a decline in agricultural production and a substantial loss of arable land in some parts of Africa. Reduced yields for own consumption and increasing domestic food prices may in some cases lead to civil unrest, and competition over access to land may intensify.

• Migration and displacement. In some cases, increased scarcity of and competition over access to water and arable land may contribute to internal or regional migration, and disasters such as floods may lead to temporary or long-term local displacement. This may in turn strengthen conflicts between host societies/communities and migrants looking for access to new land and resources.

• Increasing inequality and injustice. Through processes such as the above, some population groups may be particularly hard hit, leading to increased inequality and a sense of injustice. This may intensify existing grievances and disputes between natural resource users and/or between resource users and outside actors such as governments – thereby increasing the risk and intensity of conflict.

Even 1% risk justifies action - the consequences are too big

Podesta and Ogden 7 – *President of the Center for American Progress and **Senior National Security Analyst at the Center for American Progress (John and Peter, The Security Implications of Climate Change, The Washington Quarterly 31.1, Winter 2007)

Consequently, even though the IPCC projects that temperature increases at higher latitudes will be approximately twice the global average, it will be the developing nations in the earth's low latitudinal bands, as well as sub-Saharan African countries, that will be most adversely affected by climate change. In the developing world, even a relatively small climatic shift can trigger or exacerbate food shortages, water scarcity, destructive weather events, the spread of disease, human migration, and natural resource competition. These crises are all the more dangerous because they are interwoven and self-perpetuating: water shortages can lead to food shortages, which can lead to conflict over remaining resources, which can drive human migration, which can create new food shortages in new regions. Once underway, this chain reaction becomes increasingly difficult to stop. It is therefore critical that policymakers do all they can to prevent the domino of the first major climate change consequence, whether it be food scarcity or the outbreak of disease, from toppling. The most threatening first dominos, where they are situated, and their cascading geopolitical implications are identified in this essay.

Climate change collapses hegemony – ultimately causing extinction

Matthew, 2008, University of California, Irvine [Richard A., "Global Climate Change National Security Implications," May, http://www.strategicstudiesinstitute.army.mil/pdffiles/PUB862.pdf]

Against this background, climate change and security can be linked in a number of ways. Where climate changes abruptly, security problems will be immediate and extensive and perhaps even existential. We can easily envision threats on this scale in Bangladesh or other poor low-lying countries, but even here a significant number of Americans would be affected by a sudden barrage of massive flooding, Katrina-sized hurricanes, and tropical disease epidemics—perhaps enough to make climate change a national security issue. Another possible threat that we should take seriously is that of the gradual erosion of American power as endless demands are placed on it due to abrupt changes elsewhere. These are likely to arise as we face humanitarian disasters, as drought intensifies throughout Africa, and as South Asia collapses into conflict over things like fresh water. The greater our sense of interdependence, the greater our sense that national security depends on the welfare of things beyond our borders, and the more likely it is that the climate change will be a real security threat. This poses a big problem today. To what extent should we intervene to assist abroad? When should we use our resources and when should we show restraint? It is going to be difficult to make these decisions. We are playing with a lot of uncertainty. We do not know how other actors in the world will behave.

Warming causes nuclear Indo-Pak conflict and Arctic resource wars.

Burke, Senior Fellow & Director – Energy Security Project at the Center for a New American Security, 2008 [Sharon, "Catastrophic Climate Change over the Next Hundred Years," in Climatic Cataclysm, p. 162-3]

At the same time, the probability of conflict between nations will rise. Although global interstate resource wars are generally unlikely, simmering conflicts between nations, such as that between India and Pakistan, are likely to boil over, particularly if both nations are failing. Both India and Pakistan, of course, have nuclear weapons, and a nuclear exchange is possible, perhaps likely, either by failing central governments or by extremist and ethnic groups that seize control of nuclear weapons.

There will also be competition for the Arctic region, where natural resources, including oil and arable land, will be increasingly accessible and borders are ill defined. It is possible that agreements over Arctic territories will be worked out among Russia, Canada, Norway, the United States, Iceland, and Denmark in the next two decades, before the truly catastrophic climate effects manifest themselves in those nations. If not, there is a strong probability of conflict over the Arctic, possibly even armed conflict. In general, though, nations will be preoccupied with maintaining internal stability and will have difficulty mustering the resources for war. Indeed, the greater danger is that states will fail to muster the resources for interstate cooperation.

Global warming will engulf the Middle East in conflict

Duchene 2008

research assistant at Penn State [Lisa, “Probing Question: Are water wars in our future?” http://www.physorg.com/news131901803.html]

With rapid population growth, wasteful practices, and impending climate change, the situation is likely to get worse. Water resources in semi-arid regions are expected to be especially hard-hit, warned the Intergovernmental Panel on Climate Change in its 2007 summary report. By some estimates, two-thirds of the world's population will be water-stressed by 2025. During a year when many states across the U.S. are suffering some of the worst droughts ever, water is a topic on people's minds. Will the prospect of a diminishing water supply result in serious geopolitical conflict?

"Freshwater resources are unevenly distributed around the globe," says Robert B. Packer, lecturer in political science at Penn State, who studies international political economy and the causes of war. "While freshwater is relatively abundant in Europe and much of North America, other regions of the globe, such as the Middle East, Central Asia, and parts of West and Eastern Africa, face increasingly severe shortages." According to the BBC, the number of 'water-scarce' countries in the Middle East grew from three in 1955 to eight in 1990, with another seven expected to be added within 20 years. "Of particular concern," said Packer, "are certain riparian basins that could explode into conflict as sources of freshwater diminish. Conflict is more likely to occur where water can be seized and controlled in addition to being scarce." Among Middle East countries, where every major river crosses at least one international border, up to 50 percent of water needs of any specific state finds its source in another state, Packer noted. "Hydro-politics already play a central role among states in riparian basins, such as the Tigris-Euphrates, the Nile, the Jordan, as well as those sharing the underground aquifers of the West Bank."

Conflicts are likely to emerge as competition intensifies to control river waters for hydroelectricity, agricultural use, and human consumption, he added. "Farms and cities downstream are vulnerable to the actions and decisions of upstream countries that they have little control over. This is exemplified in the tensions over the Tigris-Euphrates, where Turkey commenced construction of a system of hydroelectric dams. Iraq and Syria have protested, citing the project would reduce the rivers' flow downstream. Turkey's response to the Arab states has been 'we don't control their oil, they don't control our water.'" To the west, the Nile has been the lifeline for Egyptian civilization dating back to antiquity. Nearly all of Egypt's 80 million people live on the three percent of Egyptian territory that is the river's valley and delta. "For Egypt the Nile is life, and its government has voiced to upstream countries that any reduction of Nile waters would be taken as national security threat that could trigger a military response," says Packer. "Nearly all freshwater in the Israeli-occupied West Bank comes from underground aquifers," he added. "Water access has become a major issue between Israelis and Palestinians."

"Perhaps the greatest of all modern Middle East conflicts, the Six Day War of 1967, began as a dispute over water access," Packer noted. Israel built a National Water Carrier to transport freshwater from the Jordan and the Sea of Galilee to the country's farming and urban centers. (The Carrier now supplies half the drinking water in Israel.) In 1965, Israeli forces attacked a Syrian water diversion project that would have cut the Carrier's supply, and prolonged violence led to war. "For Israelis, control of the Golan Heights is important strategically in terms of controlling the headwaters of the Jordan River," Packer noted. The effects of global warming and desertification also have impacted hydro-politics around the world. In West Africa, rainfall has declined 30 percent over the last four decades and the Sahara is advancing more than one mile per year. Senegal and Mauritania engaged in militarized conflict in 1989 across the Senegal River that divides them, in part over changing access to arable land.

2NC AT: Regs

Voyles evidence – politician obviously biased – making a predictive claim – doesn’t say regs have passed

Coal jobs are up---if Obama’s fighting a war on coal he sucks at it

Daniel J. Weiss 12, Senior Fellow and Director of Climate Strategy at the Center for American Progress, May 25, 2012, "The 'War On Coal' Is A Myth," online: http://thinkprogress.org/climate/2012/05/25/490444/war-on-coal-myth/

Big polluters and their Congressional allies have created a new straw man to knock down with the invention of the so-called "War on Coal." It is a multi-million dollar disinformation campaign funded by Big Coal polluters to protect their profits and distract Americans from the deadly effects of air pollution on public health. However, with the number of coal jobs in key coal states actually on the rise since 2009, it's more like peacetime prosperity than war in coal country. The War on Coal is nothing more than a new shiny object, designed by big polluters to distract Americans from the real war – the polluters' attacks on their health – and the truth.

Coal companies and dirty utilities claim that long overdue requirements to reduce mercury, arsenic, smog, acid rain, and carbon pollution from power plants will kill jobs. In West Virginia, however, coal mining employment was higher in 2011 than at any time over the last 17 years. Federal jobs statistics also show modest coal mining job growth in coal states like Virginia and Pennsylvania.

In West Virginia, a recent report from the non-partisan West Virginia Center for Budget and Policy showed coal mining jobs are actually rising, with 1,500 new coal jobs added since 2009. In Pennsylvania, Energy Information Administration (EIA) data shows a 2.3% increase in coal related jobs. And in Virginia, EIA data shows a 6.7% increase in coal mining employment from 2009 to 2010.

EPA mercury and carbon regs are net-positive for jobs---no adverse net impact on the industry

Daniel J. Weiss 12, Senior Fellow and Director of Climate Strategy at the Center for American Progress, May 25, 2012, "The 'War On Coal' Is A Myth," online: http://thinkprogress.org/climate/2012/05/25/490444/war-on-coal-myth/

The Environmental Protection Agency (EPA) has promulgated or proposed new clean air standards for smog, acid rain, mercury, air toxics, and carbon pollution that will save lives, create jobs and protect public health. For example, the Mercury and Air Toxics Standard alone could prevent up to 11,000 premature deaths, 130,000 asthma incidents, and 540,000 lost work days every year. This would provide at least $59 billion in economic benefits.

The Economic Policy Institute projects that the mercury standard will actually have a "positive net impact on overall employment – likely leading to the net creation of 84,500 jobs between now and 2015." The jobs created by the standard, however, would not just be limited to certain industrial sectors. EPI's study projects that "8,000 jobs would be gained in the utility industry itself," along with the over 80,500 jobs that would be created to build pollution control equipment.

While dirty coal companies claim that the mercury standard will cause massive unemployment, EPI notes that "only 10,600 jobs would be displaced due to higher energy costs." Richard Morgenstern, a former Reagan and Clinton EPA official, predicts that the new standard will have "no net impact" on employment.

EPA predicts that its proposed carbon pollution standard for new power plants will have no impact on employment or existing coal plants. In fact, the standard simply complements existing market factors, as the EPA points out: Because this standard is in line with current industry investment patterns, this proposed standard is not expected to have notable costs and is not projected to impact electricity prices or reliability.

Independent studies conclude compliance costs are even less than EPA predicted

Alex Chamberlain 11, ERA Environmental Consulting, 2011, "EPA Utility MACT Regulations Face Similar Criticism as Boiler MACTs," online: http://info.era-environmental.com/blog/bid/40758/EPA-Utility-MACT-Regulations-Face-Similar-Criticismas-Boiler-MACTs

Industry groups are critical of the new Utility MACTs. Some have even projected that the costs to industry will actually amount to $110 billion - ten times the estimated price tag cited by EPA. Independent studies, however, have shown that the new EPA regulations for the energy sector will actually have less of an economic impact than the EPA itself had predicted.

They also fear that the court-imposed short deadline imposed on EPA's final publication will result in a repeat of the Boiler MACT situation. On March 21, 2011, EPA published its final Boiler MACT regulations in the federal register, only to immediately announce it was officially reconsidering many aspects of the final rule and indefinitely delaying the regulation enforcement date. The resulting uncertainty has created unrest and confusion across the manufacturing industry and the political sphere which the energy sector would rather avoid.

Natural gas not key – coal is recovering in the US

Reuters 12-7 – Reuters, December 7th, 2012, "Coal prices to rise on increased Chinese, U.S. demand -Deutsche," www.reuters.com/article/2012/12/07/energy-coal-prices-idUSL5E8N76W120121207

But Deutsche Bank said that higher gas demand in the U.S. would push American gas prices up, leading to a reduction of U.S. coal exports, while Chinese demand for coal imports would rise, further supporting coal prices. "Therefore, while the outlook in the next month is ambiguous, the second half of 2013 provides clearer signals for an improvement in thermal coal fundamentals next year," the bank said.

U.S. coal exports to China are low, but downward pressure on domestic demand expands them massively

Bryan Walsh 12, Senior Editor at TIME, May 31, 2012, "Drawing Battle Lines Over American Coal Exports to Asia," online: http://science.time.com/2012/05/31/drawing-battle-lines-over-american-coal-exports-to-asia/

But across the Pacific Ocean, the demand for coal has never been hotter, with China burning 4.1 billion tons in 2010 alone, far more than any other country in the world. That insatiable demand forced China in 2009 to become a net coal importer for the first time, in part because congested rail infrastructure raised the cost of transporting coal from the mines of the country's northwest to its booming southern cities. In April, Chinese coal imports nearly doubled from a year earlier. Right now Australia and Indonesia supply much of China's foreign coal.

U.S. coal from the Powder River Basin could be a perfect addition to the Chinese market. Montana and Wyoming are just short train trips to ports on the Pacific Northwest coast, and from there it's a container ship away from Asian megacities where coal doesn't have to compete with cheap natural gas and air-pollution regulations are far weaker than in the U.S. To a wounded Big Coal, China is a potential savior. As I write in the new edition of TIME, there's just one problem: right now, ports on the West Coast lack the infrastructure needed to transfer coal from railcars into container ships. (Just 7 million of the 107 million tons of U.S.-exported coal left the country via Pacific Ocean ports last year.)

That's why coal companies like Peabody and Ambre Energy are ready to spend millions to build coal-export facilities at a handful of ports in Washington and Oregon. If all those plans go forward, as much as 150 million tons of coal could be exported from the Northwest annually—nearly all of it coming from the Powder River Basin and headed to Asia. Even if the U.S. kept burning less and less coal at home, it would have a reason to keep mining it.

2NC UQ

Domestic coal demand is increasing now

Platts 12

[Bob Matyi, “Alliance says is regaining coal customers as US gas prices rise,” December 4, http://www.platts.com/RSSFeedDetailedNews/RSSFeed/Coal/6870003]

Rising US natural gas prices are translating into additional coal sales business for Alliance Resource Partners, a company official said Tuesday. "We're seeing some of our customers coming back to us this year and asking for additional deliveries of coal," Brian Cantrell, the chief financial officer of the Tulsa, Oklahoma-based company, told the Wells Fargo Pipeline, MLP and Energy Symposium in New York. In recent months, gas prices have been trending upward from historically low levels early this year, Cantrell said. Analysts say that when gas hits about $3.50/MMBtu, coal becomes more competitive, encouraging electric utilities that moved to gas months ago to switch back to coal. NYMEX January gas futures settled at $3.539/MMBtu Tuesday.

While Alliance, the third-largest coal producer in the eastern US, is feeling good these days about its prospects, Cantrell said the outlook for some coal producers may be more cloudy. Utilities, he said, are still choked with huge inventories, totaling as much as 185 million st to 195 million st, thanks in part to the mild winter of 2011-12.

"We think it will work its way through the system while demand picks up" in 2013, he said. However, much of the increased demand will be filled by existing inventory. "In our case, given our contract book, we should be just fine," he said. "But if you're open for the market, 2013 will continue to be a challenge."

2NC UQ- Gas Version

Domestic coal demand is increasing now due to rising gas prices

Platts 12

[Bob Matyi, “Alliance says is regaining coal customers as US gas prices rise,” December 4, http://www.platts.com/RSSFeedDetailedNews/RSSFeed/Coal/6870003]

Rising US natural gas prices are translating into additional coal sales business for Alliance Resource Partners, a company official said Tuesday. "We're seeing some of our customers coming back to us this year and asking for additional deliveries of coal," Brian Cantrell, the chief financial officer of the Tulsa, Oklahoma-based company, told the Wells Fargo Pipeline, MLP and Energy Symposium in New York. In recent months, gas prices have been trending upward from historically low levels early this year, Cantrell said. Analysts say that when gas hits about $3.50/MMBtu, coal becomes more competitive, encouraging electric utilities that moved to gas months ago to switch back to coal. NYMEX January gas futures settled at $3.539/MMBtu Tuesday.

While Alliance, the third-largest coal producer in the eastern US, is feeling good these days about its prospects, Cantrell said the outlook for some coal producers may be more cloudy. Utilities, he said, are still choked with huge inventories, totaling as much as 185 million st to 195 million st, thanks in part to the mild winter of 2011-12.

"We think it will work its way through the system while demand picks up" in 2013, he said. However, much of the increased demand will be filled by existing inventory. "In our case, given our contract book, we should be just fine," he said. "But if you're open for the market, 2013 will continue to be a challenge."

2NC SMRs Link

SMRs cause coal plant retirements

Marcus King et al 11, Associate Director of Research, Associate Research Professor of International Affairs, Elliott School of International Affairs, The George Washington University, et al., March 2011, "Feasibility of Nuclear Power on U.S. Military Installations," http://www.cna.org/sites/default/files/research/Nuclear%20Power%20on%20Military%20Installations%20D0023932%20A5.pdf

SMRs have potential advantages over larger plants because they provide owners more flexibility in financing, siting, sizing, and end-use applications. SMRs can reduce an owner's initial capital outlay or investment because of the lower plant capital cost. Modular components and factory fabrication can reduce construction costs and schedule duration. Additional modules can be added incrementally as demand for power increases. SMRs can provide power for applications where large plants are not needed or may not have the necessary infrastructure to support a large unit such as smaller electrical markets, isolated areas, smaller grids, or restricted water or acreage sites.

Several domestic utilities have expressed considerable interest in SMRs as potential replacements for aging fossil plants to increase their fraction of non-carbon-emitting generators. Approximately 80 percent of the 1174 total operating U.S. coal plants have power outputs of less than 500 MWe; 100 percent of coal plants that are more than 50 years old have capacities below 500 MWe [3]. SMRs would be a viable replacement option for these plants.

2NC Gas Link

Further gas price drops crush the domestic coal market

Reuters 12

[“More US coal plants to retire due to green rules-study,” October 8, http://www.reuters.com/article/2012/10/08/utilities-brattle-coal-idUSL1E8L851620121008]

The economists said natural gas prices would play a major factor in determining the number of coal plants to retire. Retirements would drop to between 21,000 and 35,000 MW if natural gas prices increased by just $1 per million British thermal units (mmBtu) relative to April 2012 forward prices. If gas prices fell by $1, the economists projected coal retirements would increase to between 115,000 and 141,000 MW.

Natural gas prices in April bottomed at $1.90 per mmBtu. Over the past decade, natural gas has traded in a wide range from less than $2 to more than $15, averaging about $6.

The current spot cost is $3.35

Decreasing domestic demand shifts coal to an export industry

Tristan Brown, Lawyer and professor of graduate-level courses on the law and policy, economics, and global issues surrounding the biorenewables sector, 12/12/12 ["'NIMBYism' Is Unlikely To Derail U.S. Coal Exports," Seeking Alpha, http://seekingalpha.com/article/999191-nimbyism-is-unlikely-to-derail-u-s-coal-exports]

The first response of any natural resource industry to a decrease in domestic consumption is to increase exports, particularly when global consumption of the commodity is increasing. These exports must also be restricted if carbon leakage is to be avoided. Treaty obligations and international relations prevent the Obama administration from directly restricting U.S. coal exports, leaving it the alternative of indirectly restricting exports by imposing restrictions on trade infrastructure bottlenecks. The U.S. is not an island nation, however, and is obliged by treaty not to restrict trade with the country that it also happens to share one of the longer land borders in the world with: Canada. Barring a complete rejection of globalization and the closure of America's borders, the Obama administration will find that indirectly imposing restrictions on the export of coal via one route just causes it to follow another route. Global demand for the commodity is growing too rapidly to prevent it from being utilized.

2NC AT: China Not Switching

U.S. exports lock in expanded Chinese coal capacity---causes warming over the tipping point---it’s unique because absent U.S. exports the rising cost of coal will cause a shift to renewables

Thomas M. Power 12, Research Professor and Professor Emeritus, Department of Economics, University of Montana; Principal, Power Consulting; February 2012, "The Greenhouse Gas Impact of Exporting Coal from the West Coast: An Economic Analysis," http://www.sightline.org/wp-content/uploads/downloads/2012/02/Coal-Power-White-Paper.pdf

The cumulative impact of these coal port proposals on coal consumption in Asia could be much larger than even that implied by the two pending proposals. If Arch, Peabody, and other western U.S. coal producers' projections of the competitiveness of western coal in Asia are correct, facilitating the opening of the development of West Coast coal ports could have a very large impact on the supply of coal to China and the rest of Asia.

6.4 The Long-term Implications of Fueling Additional Coal-Fired Electric Generation

Although the economic life of coal-fired generators is often given as 30 or 35 years, a permitted, operating, electric generator is kept on line a lot longer than that, as long as 50 or more years, through ongoing renovations and upgrades. Because of that long operating life, the impact of the lower Asian coal prices and costs triggered by PRB coal competing with other coal sources cannot be measured by the number of tons of coal exported each year. Those lower coal costs will lead to commitments to more coal being burned for a half-century going forward.

That time-frame is very important. During exactly this time frame, the next half-century, the nations of the world will have to get their greenhouse gas emissions stabilized and then reduced or the concentrations of greenhouse gases in the atmosphere may pass a point that will make it very difficult to avoid massive, ongoing, negative climate impacts.

Taking actions now that encourage fifty years of more coal consumption around the world is not a minor matter. Put more positively, allowing coal prices to rise (and more closely approximate their full cost, including "external" costs) will encourage extensive investments in improving the efficiency with which coal is used and the shift to cleaner sources of energy. This will lead to long-term reductions in greenhouse gas emissions that will also last well into the next half-century.

Their evidence is terrible – from a blogger

China’s transitioning to clean tech now---political commitment from CCP leadership

Luke Schoen 10-19, Associate in the Climate & Energy Program at WRI Insights, 10/19/12, "Policy Experts Provide Insights Into China's Leadership Transition," http://insights.wri.org/news/2012/10/policy-experts-provide-insights-chinas-leadership-transition

Deborah Seligsohn, a climate and energy advisor to WRI, rounded out the call by highlighting that China's economic restructuring can be compatible with environmental protection, including around action to address climate change. China's efforts to control emissions will be "good for climate change, the planet, and other environmental issues that they have to grapple with," Seligsohn said.

She discussed the main drivers behind China's energy and climate actions, including the country's desire to: restructure its economy; increase innovation and development of new technologies; move toward greater environmental protections; and meet its targets in the 12th five-year plan.

Seligsohn concluded that "there is strong agreement [among Chinese officials] that part of development is being both cleaner and more technologically sophisticated and having a more diverse economy."

China's Energy Future

The discussions held during the call point to one key takeaway: Together, these underlying factors may indeed push China toward a lower-carbon energy future. These changes are unlikely to occur quickly, but we'll all be watching closely to see if China's new leadership is able to manage a transition to clean energy while ensuring the country stays on a solid growth pathway.

AT: North Korea

Seriously, lols

Short-term coal price declines cause fast investment in new coal generation that will be locked in for 50 years

Thomas M. Power 12, Research Professor and Professor Emeritus, Department of Economics, University of Montana; Principal, Power Consulting; February 2012, "The Greenhouse Gas Impact of Exporting Coal from the West Coast: An Economic Analysis," http://www.sightline.org/wp-content/uploads/downloads/2012/02/Coal-Power-White-Paper.pdf

Prices now determine energy use for decades. Lower coal prices reduce the incentives to retire older, inefficient, coal-using production processes and discourage additional investments in the energy efficiency of new and existing coal-using enterprises. As those lower prices flow through to consumers, it also reduces the incentives to shift to more energy efficient appliances. Furthermore, lower coal costs will encourage investments in new coal-burning facilities in Asia — which in turn create a 30- to 50-year demand for coal.

• China responds to higher prices by improving efficiency. Concerns over rising energy costs have led the Chinese government to develop tighter energy efficiency standards throughout the economy. The rise in world oil prices, for example, led the Chinese government to announce strict five-year energy conservation goals including limiting the growth of coal consumption to about 4 percent per year, far below the expected expansion of the economy.

• Potential for energy efficiency remains largely untapped. Energy usage per unit of GDP across the Chinese economy is almost four times that in the United States and almost eight times that in Japan. The Chinese government and the large state-owned enterprises that produce, distribute, and use larger amounts of energy are well aware of the burden that high and rising energy cost can impose on the economy. The energy policies embodied in the last several five-year plans have focused heavily on improving overall energy efficiency in order to effectively control energy costs. Lowering coal costs to China would undermine these valuable energy efficiency efforts.

U.S. coal exports change the long-term risks that planners account for in energy investment decisions---locks in new coal plants that'll operate for 50 years

Thomas M. Power 12, Research Professor and Professor Emeritus, Department of Economics, University of Montana; Principal, Power Consulting; February 2012, "The Greenhouse Gas Impact of Exporting Coal from the West Coast: An Economic Analysis," http://www.sightline.org/wp-content/uploads/downloads/2012/02/Coal-Power-White-Paper.pdf

The conclusion I draw from this analysis is that the PRB coal exports facilitated by the proposed coal ports will reduce the price of coal to Asian markets, the cost of using coal there, and the long-term price and supply risks that planners take into account when making long-term energy infrastructure investment decisions. Coal export will encourage the continued, rapid expansion of coal-fired electric generation capacity. Consequently, as I discuss in Section 6 below, the impacts of coal export will be much larger than the annual capacity of the port facilities would suggest, because it will encourage investments in new coal-burning facilities in Asia and their associated 30- to 50-year combustion of coal.

2NC China Emission

Extinction

AT: Impact D

Prefer scientific consensus – warming skeptics are paid off by fuel companies and cherry-pick data

Monbiot 8 – visiting fellowships or professorships at the universities of Oxford (environmental policy), Bristol (philosophy), Keele (politics), Oxford Brookes (planning) and East London (environmental science). He has honorary doctorates from the University of St Andrews and the University of Essex and an Honorary Fellowship from Cardiff University [George, "Big oil's big lie," June 23, http://www.guardian.co.uk/commentisfree/2008/jun/23/climatechange.carbonemissions]

Of course, it's not a crime, and it's hard to see how, in a free society, it could or should become one. But the culpability of the energy firms the climate scientist James Hansen will indict in his testimony to Congress today is clear. If we fail to stop runaway climate change, it will be largely because of campaigning by oil, coal and electricity companies, and the network of lobbyists, fake experts and thinktanks they have sponsored. The operation sprang directly from Big Tobacco's war against science. It has used the same fake experts, the same public relations companies and the same tactics: as I showed in my book Heat, the campaign against action on climate change was partly launched by the tobacco company Philip Morris. But while the tobacco companies' professional liars were smoked out by a massive class action in the US, the sponsored climate change deniers still have massive influence over public perception. A survey published yesterday by the Observer shows that six out of ten people in Britain agreed that "many scientific experts still question if humans are contributing to climate change." This is an inaccurate perception, which results from Big Energy's lobbying.

Almost without exception, the scientists who claim to doubt that manmade climate change is taking place fall into two categories: either they are not qualified in the branch of science they are discussing or they have received money from fossil fuel companies. Of all the self-professed climate "sceptics", I have been able to find only one – Dr John Christy of the University of Alabama – who has relevant qualifications and who does not appear to have received fees from lobby groups or thinktanks sponsored by the energy companies. But even he has had to admit that the figures on which he based his claims were the results of "errors in the … data". The others are the very opposite of sceptics. Many of them are paid to start with a conclusion – that climate change isn't happening or isn't important – then to find data and arguments to support it. In most cases, they cherrypick scientific findings; in a few cases, like the fake scientific paper attached to the celebrated Oregon petition, they make them up altogether. But people who don't understand the difference between a peer-reviewed paper and a pamphlet are taken in.

The energy companies' propaganda campaign is amplified by scientific illiterates in the media, such as Melanie Phillips, Christopher Booker, Nigel Lawson, Alexander Cockburn and the television producer (who made Channel 4's documentary The Great Global Warming Swindle) Martin Durkin. I don't believe that the energy companies should be prosecuted for commissioning the truckload of trash their sponsored experts publish. But their campaign of disinformation must be exposed again and again. Like the tobacco lobbyists, they are not only delaying essential public action; they also create the impression that science is for sale to the highest bidder. The awful truth is that sometimes it is.

We haven’t reached the tipping point yet

Hansen et al 7

(James, Director @ NASA Goddard Institute for Space Studies and Adjunct Prof. Earth and Env. Sci. @

Columbia U. Earth Institute), and others, Atmospheric Chemistry and Physics, “Dangerous human-made interference with climate: a

GISS model study”, 7:2287-2312, http://pubs.giss.nasa.gov/docs/2007/2007_Hansen_etal_1.pdf)

Have we already passed a "tipping point" such that it is now impossible to avoid "dangerous" climate change (Lovelock, 2006)? In our estimation, we must be close to such a point, but we may not have passed it yet. It is still feasible to achieve a scenario that keeps additional global warming under 1°C, yielding a degree of climate change that is quantitatively and qualitatively different than under BAU scenarios. The "alternative" scenario, designed to keep warming less than 1°C, has a significantly smaller forcing than any of the IPCC scenarios. In recent years net growth of all real world greenhouse gases has run just slightly ahead of the alternative scenario, with the excess due to continued growth of CO2 emissions at about 2%/year. CO2 emissions would need to level out soon and decline before mid-century to approximate the alternative scenario. Moderate changes of emissions growth rate have a marked effect after decades, as shown by comparison to BAU scenarios. Early decreases in emissions growth are the most effective.

Warming’s on track to be catastrophic---action now solves

Nuccitelli 9/1

Dana, environmental scientist at a private environmental consulting firm in Sacramento and has a Bachelor's

Degree in astrophysics from the University of California at Berkeley, and a Master's Degree in physics from the University of

California at Davis, 2012, “Realistically What Might The Future Climate Look Like?”, http://thinkprogress.org/climate/2012/09/01/784931/realistically-what-might-the-future-climate-look-like/

This is Why Reducing Emissions is Critical

We're not yet committed to surpassing 2°C global warming, but as Watson noted, we are quickly running out of time to realistically give ourselves a chance to stay below that 'danger limit'. However, 2°C is not a do-or-die threshold. Every bit of CO2 emissions we can reduce means that much avoided future warming, which means that much avoided climate change impacts. As Lonnie Thompson noted, the more global warming we manage to mitigate, the less adaption and suffering we will be forced to cope with in the future.

Realistically, based on the current political climate (which we will explore in another post next week), limiting global warming to 2°C is probably the best we can do. However, there is a big difference between 2°C and 3°C, between 3°C and 4°C, and anything greater than 4°C can probably accurately be described as catastrophic, since various tipping points are expected to be triggered at this level. Right now, we are on track for the catastrophic consequences (widespread coral mortality, mass extinctions, hundreds of millions of people adversely impacted by droughts, floods, heat waves, etc.). But we're not stuck on that track just yet, and we need to move ourselves as far off of it as possible by reducing our greenhouse gas emissions as soon and as much as possible. There are of course many people who believe that the planet will not warm as much, or that the impacts of the associated climate change will not be as bad as the body of scientific evidence suggests. That is certainly a possibility, and we very much hope that their optimistic view is correct. However, what we have presented here is the best summary of scientific evidence available, and it paints a very bleak picture if we fail to rapidly reduce our greenhouse gas emissions.

If we continue forward on our current path, catastrophe is not just a possible outcome, it is the most probable outcome. And an intelligent risk management approach would involve taking steps to prevent a catastrophic scenario if it were a mere possibility, let alone the most probable outcome. This is especially true since the most important component of the solution – carbon pricing – can be implemented at a relatively low cost, and a far lower cost than trying to adapt to the climate change consequences we have discussed here (Figure 4). Climate contrarians will often mock 'CAGW' (catastrophic anthropogenic global warming), but the sad reality is that CAGW is looking more and more likely every day. But it's critical that we don't give up, that we keep doing everything we can do to reduce our emissions as much as possible in order to avoid as many catastrophic consequences as possible, for the sake of future generations and all species on Earth. The future climate will probably be much more challenging for life on Earth than today's, but we still can and must limit the damage.

Extinction

Flournoy 12

(Citing Dr. Feng Hsu, a NASA scientist at the Goddard Space Flight Center, in 2012, Don Flournoy, PhD and MA from the University of Texas, Former Dean of the University College @ Ohio University, Former Associate Dean @ State University of New York and

Case Institute of Technology, Project Manager for University/Industry Experiments for the NASA ACTS Satellite, Currently

Professor of Telecommunications @ Scripps College of Communications @ Ohio University, Citing Dr. "Solar Power Satellites,"

Chapter 2: What Are the Principal Sunsat Services and Markets?, January, Springer Briefs in Space Development, Book)

In the Online Journal of Space Communication, Dr. Feng Hsu, a NASA scientist at Goddard Space Flight Center, a research center in the forefront of science of space and Earth, writes, "The evidence of global warming is alarming," noting the potential for a catastrophic planetary climate change is real and troubling (Hsu 2010). Hsu and his NASA colleagues were engaged in monitoring and analyzing climate changes on a global scale, through which they received first-hand scientific information and data relating to global warming issues, including the dynamics of polar ice cap melting. After discussing this research with colleagues who were world experts on the subject, he wrote: I now have no doubt global temperatures are rising, and that global warming is a serious problem confronting all of humanity. No matter whether these trends are due to human interference or to the cosmic cycling of our solar system, there are two basic facts that are crystal clear: (a) there is overwhelming scientific evidence showing positive correlations between the level of CO2 concentrations in Earth's atmosphere with respect to the historical fluctuations of global temperature changes; and (b) the overwhelming majority of the world's scientific community is in agreement about the risks of a potential catastrophic global climate change. That is, if we humans continue to ignore this problem and do nothing, if we continue dumping huge quantities of greenhouse gases into Earth's biosphere, humanity will be at dire risk (Hsu 2010). As a technology risk assessment expert, Hsu says he can show with some confidence that the planet will face more risk doing nothing to curb its fossil-based energy addictions than it will in making a fundamental shift in its energy supply. "This," he writes, "is because the risks of a catastrophic anthropogenic climate change can be potentially the extinction of human species, a risk that is simply too high for us to take any chances" (Hsu 2010). It was this NASA scientist's conclusion that humankind must now embark on the next era of "sustainable energy consumption and re-supply, the most obvious source of which is the mighty energy resource of our Sun" (Hsu 2010) (Fig. 2.1).

Venture Capital DA

1NC

Venture capital shifting away from renewables to grid modernization now

NBC 12

[Dinah Wisenberg Brin, award-winning writer with a strong background producing financial, healthcare, government news, “Clean Tech Investing Shifts, With Lower-Cost Ventures Gaining Favor” March 1, http://www.cnbc.com/id/46222448/Clean_Tech_Investing_Shifts_With_Lower_Cost_Ventures_Gaining_Favor]

For many investors, that change means shifting funds from capital-intensive alternative-energy technologies, such as solar panels, to lower-cost ventures focused on energy efficiency and "smart grid" technologies that automate electric utility operations.

"We continue to be very optimistic about things like the smart grid and the infusion of information technologies and software services" into old lines like electricity, agriculture and the built environment," says Steve Vassallo, general partner in Foundation Capital. "We're very bullish on what I would consider the nexus of information technology and clean tech."

Foundation, based in Menlo Park, Calif., reflects this in investments such as Sentient Energy Inc., a smart-grid monitoring company that allows utilities to remotely find power outages, and Silver Spring Networks, which provides utilities a wireless network for advanced metering and remote service connection. Another holding, EnerNOC, a demand-response business with technology to turn off noncritical power loads during peak periods, went public in 2007. EMeter, a one-time Foundation investment, was recently acquired by Siemens Industry.

To be sure, investors have not abandoned costlier technologies with longer-term horizons, but many — put off, in part, by last year's bankruptcy and shutdown of solar power firm Solyndra — now favor smaller infusions in businesses with a quicker potential payoff. Rob Day, partner in Boston-based Black Coral Capital, says his cleantech investment firm maintains some solar holdings, but he sees a shift from an emphasis on those types of plays to more "intelligence-driven, software-driven, web-driven businesses." These technologies can be used to improve existing businesses, he says. One Black Coral smart-technology investment is Digital Lumens of Boston, which makes high-efficiency, low-cost LED lighting for warehouses and factories. Software and controls are embedded in the fixtures, which can cut lighting bills by 90 percent, providing customers a two-year payback, says Day.

U.S. venture capital investment in cleantech companies hit $4.9 billion last year, down 4.5 percent in dollar terms but flat in the number of transactions, according to Ernst & Young LLP, which analyzed data from Dow Jones VentureSource. Cleantech companies raised 29 percent more capital last year than in 2009, E&Y said recently. Most of that decline, however, came from less investment in sectors that were once hot. Investment in energy and electric generation, including solar businesses, fell 5 percent to $1.5 billion, while that of industry products and services companies plunged 34 percent to $1 billion, according to E&Y's analysis of equity investments from venture capital firms, corporations and individuals. The energy efficiency category leads the diverse industry in deals with 78 transactions worth $646.9 million. Energy-storage companies raised $932.6 million, a 250 percent increase and 47 percent deal increase.

Plan reverses that trend—causes capital diversion

De Rugy 12

Veronica, Testimony Before the House Committee on Oversight and Government Reform. Dr.de Rugy received her MA in economics from the University of Paris IX-Dauphine and her PhD in economics from the University of Paris 1Pantheon-Sorbonne.

She is a senior research fellow at the Mercatus Center at George Mason University. Her primary research interests include the U.S. economy, federal budget, homeland security, taxation, tax competition, and financial privacy issues. Her popular weekly charts, published by the Mercatus Center, address economic issues ranging from lessons on creating sustainable economic growth to the implications of government tax and fiscal policies. http://mercatus.org/publication/assessing-department-energy-loan-guarantee-program

3. Mal-investments

Loan guarantee programs can also have an impact on the economy beyond their cost to taxpayers. Malinvestment—the misallocation of capital and labor—may result from these loan guarantee programs. In theory, banks lend money to the projects with the highest probability of being repaid. These projects are often the ones likely to produce larger profits and, in turn, more economic growth. However, considering that there isn't an infinite amount of capital available at a given interest rate, loan guarantee programs could displace resources from non-politically motivated projects to politically motivated ones. Think about it this way: When the government reduces a lender's exposure to fund a project it wouldn't have funded otherwise, it reduces the amount of money available for projects that would have been viable without subsidies.

This government involvement can distort the market signals further. For instance, the data shows that private investors tend to congregate toward government guarantee projects, independently of the merits of the projects, taking capital away from unsubsidized projects that have a better probability of success without subsidy and a more viable business plan. As the Government Accountability Office noted, "Guarantees would make projects [the federal government] assists financially more attractive to private capital than conservation projects not backed by federal guarantees. Thus both its loans and its guarantees will siphon private capital away."[26] This reallocation of resources by private investors away from viable projects may even take place within the same industry—that is, one green energy project might trade off with another, more viable green energy project.

More importantly, once the government subsidizes a portion of the market, the object of the subsidy becomes a safe asset. Safety in the market, however, often means low return on investments, which is likely to turn venture capitalists away. As a result, capital investments will likely dry out and innovation rates will go down.[27] In fact, the data show that in cases in which the federal government introduced few distortions, private investors were more than happy to take risks and invest their money even in projects that required high initial capital requirements. The Alaska pipeline project, for instance, was privately financed at the cost of $35 billion, making it one of the most expensive energy projects undertaken by private enterprise.[28] The project was ultimately abandoned in 2011 because of weak customer demand and the development of shale gas resources outside Alaska.[29] However, this proves that the private sector invests money even when there is a chance that it could lose it. Private investment in U.S. clean energy totaled $34 billion in 2010, up 51 percent from the previous year.[30]

Finally, when the government picks winners and losers in the form of a technology or a company, it often fails. First, the government does not have perfect or even better information or technology advantage over private agents. In addition, decision-makers are insulated from market signals and won't learn important and necessary lessons about the technology or what customers want. Second, the resources that the government offers are so addictive that companies may reorient themselves away from producing what customers want, toward pleasing the government officials.

Solves competitiveness, economic collapse, and giant blackouts

Stephen Chu, Nobel Prize in Physics, 12 ["America's Competitiveness Depends on a 21st Century Grid," May 30, Energy.Gov, http://energy.gov/articles/america-s-competitiveness-depends-21st-century-grid] PMA=Power Marketing Administrations

Upgrades are Key to American Competitiveness

The leadership of the PMAs is critically important because America's continued global competitiveness in the 21st century will be significantly affected by whether we can efficiently produce and distribute electricity to our businesses and consumers, seamlessly integrating new technologies and new sources of power. Other countries are moving rapidly to capitalize on cost-saving new smart grid and transmission technologies -- and we will find ourselves at a competitive disadvantage unless we do the same. Blackouts and brownouts already cost our economy tens of billions of dollars a year, and we risk ever more serious consequences if we continue to rely on outdated and inflexible infrastructure. For example, across the country, most of the transmission lines and power transformers we depend upon are decades old and in many cases nearing or exceeding their expected lifespan.

Lessons of the September 2011 Blackout

One recent example of the challenges we face occurred in September 2011, when a relatively minor loss of a single transmission line triggered a series of cascading failures that ultimately left 2.7 million electric customers in Arizona, Southern California, and Baja California, Mexico without power, some for up to 12 hours. The customers of five utilities -- San Diego Gas and Electric (SDG&E), Imperial Irrigation District (IID), Western Area Power Administration-Lower Colorado (WALC), Arizona Public Service (APS), and Comision Federal de Electricidad (CFE) -- lost power, some for multiple hours extending into the next day. Put simply, this disruption to the electric system could have been avoided. The investigation into the blackout conducted by the Federal Energy Regulatory Commission and the North American Electric Reliability Council concluded the system failure stemmed primarily from weaknesses in two broad areas: 1) operations planning and 2) real-time situational awareness. Without these two critical elements, system operators are unable to ensure reliable operations or prevent cascading outages in the event of losing a single component on the grid. As our system ages, these situations threaten to become more frequent and even more costly.

The Role of the PMAs in Accelerating the U.S. Transition to a 21st Century Grid

Most of our nation's electric transmission system is privately owned. However, the federal government directly owns and controls significant portions of the electric transmission system through its four PMAs, created to market and distribute hydroelectric power from federally owned dams. The PMAs, part of the Energy Department, are responsible for more than 33,000 miles of transmission that overlay the transmission systems of utilities in 20 states, which represent about 42% of the continental United States. The PMAs provide the federal government the ability to lead by example in modernizing and securing our nation's power grid, or risk putting the entire system -- and America's economy -- at risk. The benefits of action, as well as the risks and consequences of inaction, could directly or indirectly affect nearly every electricity consumer and every business in the United States. This is why my March 16th memo set forth foundational goals that DOE is considering for the PMAs. This is part of a much broader effort to transition to a more flexible and resilient electric grid and establish much greater coordination among system operators.

Competitiveness decline triggers great power wars

Baru 9

(Sanjaya, Visiting Professor at the Lee Kuan Yew School of Public Policy in Singapore Geopolitical Implications of the

Current Global Financial Crisis, Strategic Analysis, Volume 33, Issue 2 March 2009 , pages 163 – 168)

The management of the economy, and of the treasury, has been a vital aspect of statecraft from time immemorial. Kautilya's Arthashastra says, 'From the strength of the treasury the army is born. …men without wealth do not attain their objectives even after hundreds of trials… Only through wealth can material gains be acquired, as elephants (wild) can be captured only by elephants (tamed)… A state with depleted resources, even if acquired, becomes only a liability.'4 Hence, economic policies and performance do have strategic consequences.5 In the modern era, the idea that strong economic performance is the foundation of power was argued most persuasively by historian Paul Kennedy. 'Victory (in war),' Kennedy claimed, 'has repeatedly gone to the side with more flourishing productive base.'6 Drawing attention to the interrelationships between economic wealth, technological innovation, and the ability of states to efficiently mobilize economic and technological resources for power projection and national defence, Kennedy argued that nations that were able to better combine military and economic strength scored over others. 'The fact remains,' Kennedy argued, 'that all of the major shifts in the world's military-power balance have followed alterations in the productive balances; and further, that the rising and falling of the various empires and states in the international system has been confirmed by the outcomes of the major Great Power wars, where victory has always gone to the side with the greatest material resources.'7

2NC UQ

Venture capital is pouring into renewable tech – causes massive tech innovation and full adoption

Charles

Fletcher

, Associate at an intellectual property-based law firm,

11

[“VCs and the cleantech funding divide,” AltAssets,

November 3, http://old.altassets.net/index.php/private-equity-features/by-author-name/article/nz18489.html]

This article discusses how, whilst venture capital funds have historically preferred to invest in the latter due to the fact they are capital efficient and easier to exit, investment

in the former could be set for an increase, as novel methods emerge to invest in capital intensive cleantech

and companies get wise to more creative fundraising techniques.

Strategies for funding capital intensive projects

While many VCs have shied away from capital-intensive cleantech

for obvious reasons, there is increasingly a realisation that the capital-intensive sub sector presents opportunities for value realisation

. As the cleantech market matures a little, many investors are taking a second look at some of the technologies that might previously have been deemed to be too capital intensive

.

This is partly in order to diversify their portfolio

, but partly because there is a perception that there are great opportunities to be had.

At the same time, cleantech companies themselves are looking very closely at ways of addressing the investor concerns traditionally associated with these sectors.

Some of the external financing strategies which capital intensive cleantech companies should evaluate in determining their business plan include the following:

Licensing

There are many ways in which licensing techniques can be used to expedite a company's path to profitability, and those in capital-intensive sectors should be sure they are alive to licensing opportunities accordingly

. Recently, there has been particularly strong evidence of clean technologies being developed in-house and then manufactured cheaply abroad under license

. The appeal of this is obvious in terms of the time and capital costs involved in setting up an in-house manufacturing operation.

Not only should a licensing model be considered for the core business of a company: if it holds non-core IP or applications which it does not wish to prioritise, due to shortage of capital or other reasons, then it should consider extracting some value from these assets by licensing them to an interested partner.

Corporate venture investors

While traditionally earlier stage external financing strategies would have been funded by risk-sharing syndicates of institutional VCs, venture arms of corporate venturers

(including major utilities in particular) are increasingly seeking to back promising development programs at all stages of development using these techniques

, whether alone or as part of a syndicate. Corporate venturers are not bound by any prescribed investment criteria, or by the demands of their LPs in the same way as institutional VC funds – major corporations making cleantech venture investments as a way of meeting their own strategic objectives include BASF, Boeing, GE, Honda, Intel, Norsk Hydro,

Mitsubishi, Motorola, Royal Dutch Shell, Siemens and Unilever. The corporates are happy to address the cleantech funding gap if the technology excites them

.

Traditionally, corporate involvement has tended to be seen most often in the later stages of development, when there is a measure of comfort that the product is likely to reach the market

. However, corporate venturers are increasingly making investments at an earlier stage, and this can be attributed at least in part to their aims of aiding a culture of innovation, stimulating a "knowledge ecosystem" within high priority areas for development, and producing a much more focused pipeline of new technologies to complement their existing operations

.

Corporate spin-out

This is a technique which can be used where companies hold a variety of underfunded intellectual property assets, or where they are not pursuing alternative applications/ markets for their technology due to funding constraints.

Corporate spin-outs enable companies to extract value from under-funded intellectual property assets by moving them across into a separate autonomous company into which external investors may invest venture capital

.

So from a development perspective, a capital-intensive cleantech company can set up a separate self-contained development company into which it will contribute one or more of its products in return for a lump sum payment. This provides the capital-intensive cleantech company with a modest cash return, but typically it will also retain a minority stake in the new company to enable it to benefit in any future upside, and it may also negotiate future opt-in rights to increase its holding in the future.

It will always be worthwhile for technology companies in all sectors to analyse and appraise whether they are making the best use of all their IP: if the answer is no, then it may be appropriate to break the IP down into constituent parts to be commercialised separately.

Venture development

In “venture development”, a funding technique honed in the biotech sector, the holder of an under-funded capital-intensive cleantech development opportunity could create a development company with external capital, which will often be VC-backed

.

Such venture development is distinct from development spin-outs, in that ownership of development

programs will be traded for an option to repurchase at a later stage (i.e. after proof of concept) on pre-negotiated terms.

The capital-intensive cleantech company could retain day-to-day control of these development programs, albeit that their ownership will effectively have been "pawned

".

Royalty financing

Like venture development, royalty financing is a feature of the biotech sector which may be applicable to capital-intensive cleantech.

Royalty financing involves an external investor paying a lump sum up-front for the right to receive current and/or future revenue on a product

.

Smaller biotech companies, having successfully navigated their way to the later stages of development for any given drug, will often take royalties in exchange for distribution rights from a big pharma partner. This can work for them because they have no direct routes to market themselves and it will enable them to use the precious cash revenue from the royalty stream to develop other candidates in their portfolio. By the same token, capital-intensive cleantech companies, having proven their concept, could take royalties in exchange for distribution rights from a major utility.

Royalty deals in the biotech sector tend to take place at later stages of development because it is easier to quantify the royalty streams with relative certainty. However, the lack of revenues is not an insurmountable barrier to royalty financing: in biotech, “direct investment” techniques can be used whereby the initial lump sum investment takes place where there are no royalty revenues in place yet, but rather is by reference to “synthetic royalties”.

Public funding

While not the focus of this article, any discussion of the cleantech funding gap would be incomplete without mention of state backed sources of finance.

In the US, the US

Department of Energy, the US Department of Agriculture and the 2008 Farm Bill have been instrumental in subsidising emerging clean technology sectors, including solar, wind and biofuels

.

In the UK, the Carbon Trust has its own dedicated cleantech venture funds, which is one of the UK’s leading co-investors in clean technology. These funds are not just about profit – their focus is on reducing carbon emissions as well as earning financial return.

Thus, the Carbon Trust can be a valuable source of financing for companies looking to bridge the cleantech funding gap.

Capital intensive cleantech companies should explore all options for obtaining favourable state finance

(including grants, subsidies and direct investment opportunities): private investors increasingly expect companies to have done so as a matter of routine.

Summary The dynamic of the venture capital model, which requires large multiple returns on investment, place restrictions on the ability of venture capital firms alone to fund some capital intensive clean technology companies. In addition, the recent economic environment has dramatically impacted private company valuations and their ability to raise equity.

Companies in the cleantech sector which can demonstrate that they have the most potential to deliver a quick return on investment will receive most interest from VC investors

– indeed this is true in any sector. In contrast, more ambitious, capital intensive projects are less likely to generate the same levels of appeal to VCs. This has meant that, with VC investment levels down, many capital intensive new technologies offering incremental improvements or facilitative methods have found it extremely difficult to generate the levels of funding they require

. By contrast, clean technologies which improve existing infrastructures, rather than consign them to the past completely

(e.g. smart grid technology), have been particularly well placed.

However, while there is little doubt that a funding gap exists in the capital-intensive cleantech sphere, the cleantech sector remains a relatively new playing field of investment. As the market for clean technologies matures and new investment models become recognised, this funding gap will become increasingly surmountable for high quality opportunities

. Drawing from licensing and partnering techniques seen in sectors such as biotech, and focused state support, there is an undoubted future for capital intensive cleantech, and increasingly creative approaches are being looked at to address this funding gap.

2NC Investment High

Prefer newest research and trends

Subnet 11/28 (SUBNET is a software products company dedicated to serving the needs of the electric utility industry. SUBNET provides innovative interoperability solutions that combine the latest SUBstation technologies with modern day NETworking and computing technologies enabling electrical utilities to build a smarter, more effective electricity grid, 11/28/2012, "Smart grid VC crackling back to life", www.subnet.com/news-events/smart-grid/smart-grid-vc-crackling-back-to-life.aspx)

After some experts feared a slowdown in the amount of mergers and acquisitions in the smart grid components industry, new research shows that M&A could be resurging in the sector with additional venture capital funding. According to a new report from Mercom Capital Group, VC funding in the third quarter of 2012 hit $238 million in 12 deals, which was supported heavily by Alarm.com, a security and home automation company that raised $136 million in VC funding. "The Alarm.com funding deal and the acquisition of Vivint for $2.2 billion by Blackstone Group is part of a growing trend where home security companies have expanded into home automation," said Raj Prabhu, Managing Partner at Mercom Capital Group. "We expect to see more transactions in this niche where security, cable and telecom companies expand their offerings to cover the whole 'connected or digital' home services which would include everything from communication and automation services to solar installations." The need for more investment in smart grid technologies has been said by experts to be one of the biggest remaining hurdles on the route to major smart grid technology proliferation. SUBNET is also doing its part to modernize the aging North American electric grid by offering innovative interoperable solutions that will also be crucial for widespread adoption of smart grid systems.

Prefer recency

Gammons 12/26 (Brad Gammons is General Manager, IBM Global Energy and Utilities Industry., 12/26/2012, "The Smart Grid in

2013: Charged for Growth", energy.aol.com/2012/12/26/the-smart-grid-in-2013-charged-for-growth/)

So what will 2013 bring? This year's extremes all point in the same direction: towards the growth of the smart grid. A new stage is opening - where the public was once ambivalent about the smart grid, consumers are now starting to demand these improvements, spurred by the need to improve reliability, participation and the resiliency to recover from large scale grid events. This shift has been years in the making, starting in 2009 when the U.S. invested in adding intelligence to the electrical system. Going into the New Year, pressure to rebuild the northeast's grid with more resilience will further boost trends that point towards investment in these smart technologies to continue to expand by more than 10 percent per year over the next five years. And while efforts to date have focused on improving the grid's heavy-duty backbone, a look ahead suggests that coming smart grid efforts will reach more directly into everyday life. Here's what's in store for 2013: Renewables and Smart Grid will reinforce the growth of one another. It's no secret the costs of wind power and solar systems are falling fast. Per unit of capacity, today's solar systems are a third of the price from a decade ago. Less well known is that a portion of the decline comes from falling "soft" costs, such as the price to install, inspect, connect and operate the PV panels. More intelligent interconnection with the grid, using new smart meters, makes it easier not just to install these systems, but to track and manage their output, as well as the savings they deliver to your energy bill. There's plenty of room for more savings, too. A recent study of solar prices in Germany suggests that streamlining these soft costs could reduce prices for PV systems in the US by half again. Just as the smart grid is helping to spur solar, the benefits are reciprocal. Solar panels reach their maximum output on hot, sunny days when demand for air conditioning can sometimes overwhelm the supply of power. Utilities are finding that the addition of solar capacity can help provide a critical buffer of extra supply, reducing the risk of blackouts. Additionally, plug-in electric vehicles can be recharged synchronously with the availability of renewables like solar and wind, which may have excess energy in off peak hours. Watch for solar to continue to grow, as prices to keeping falling, while utilities push ahead with grid modernization efforts to make the most of new renewable resources. Distributed generation will go mainstream. Another reason for the growth in solar is its promise to generate power even when weather events take down the traditional grid. Yet, having solar panels alone doesn't guarantee they'll operate during blackouts-in some areas, grid rules require that solar panels shut down during black outs.

To keep homes lit up, solar panels must be paired with intelligent meters under grid rules amended to permit the home or business to island "behind" the meter, even when the wider grid is down. Extreme weather events have revealed other weak links in our energy infrastructure too, for example, without electric power, gas stations cannot pump the fuel they have on hand. There are easy solutions to this problem that the smart grid can help deliver. For example, stations outfitted with batteries, backup generators and/or canopied with solar panels are able to keep pumping, and lessen the stress on communities during power failures. Watch for more homeowners and companies to invest in distributed generation technologies , including gas-powered alternative in geographies benefiting from abundant new natural gas resources. At the same time, regulators will face pressure to modify rules that make it easier for grids to handle two-way power flows, and for customers to generate power independently. Social networks will cement their status as power restoration and crisis communication tools. Utilities are learning to take advantage of the expansion of social networks, such as Facebook and Twitter. During many of this year's grid-damaging weather events, Twitter feeds from utilities often proved to be the most up-to-date sources of information to monitor storm impacts. In the aftermath, utilities'

Facebook pages regularly became a sort of virtual village square where restoration efforts are publicized, and where the public can post problems or share thanks. To make the most of efforts to communicate via social networks, utilities will need to improve the sophistication of integrating smart meters as outage sensors and the related data systems to communicate with customers. In an era driven by social media, it's now important they also track customers' opinions and concerns online, using them to speed up response time to customer requests that come in through these non-traditional channels. Watch for energy companies to formalize these efforts combining advanced metering with smarter customer services data systems, as they recognize that keeping the public apprised of grid developments, repairs and outages is no longer a nicety, but a necessity. Smarter analytics will be necessary to deliver these new

services. As grid enhancements mature and multiply, each is demanding more computational horsepower to handle the big, new flows of data they generate. Utilities are already working to develop advanced analytics to orchestrate the complex operation of their grids, repair crews and customer communications in near real time. Let's say a storm is inbound. Analytics can enable the utility to model risk of wind and flood damage in key areas, giving the utility a head start to pre-position repair crews. With customer relationship systems, the utility can be proactive, notifying at risk customers in advance, via texts, tweets, or voicemails. And if bad weather does knock out power, smart meters can signal the utility precisely which houses are affected, and notify customers of the initial outage, as well as keep them informed of ongoing restoration work. Watch for utilities to double down on the resources to optimize their operations through smarter information processing and management. These advances are primed to take off in the coming year, thanks to a combination of growing public demand, rising regulatory urgency, and ready-today technology. IBM Smarter Energy is helping to spur this transformation by optimizing grid management and preventing blackouts, by streamlining the ways customers interact with utilities, and by applying data analytics to help speed recovery and to predict where extreme weather will hit the grid hardest. Every

New Year brings with it wishes for prosperity and security. The smart grid is poised to deliver that and more in 2013.

2NC Coal Link

The plan trades off with renewables – saps capital

EJLFCC 8

Environmental Justice Leadership Forum on Climate Change, The Fallacy of Clean Coal, http://www.jtalliance.org/docs/Fallacy_of_Clean_Coal.pdf

The impact that government financial support has on the development and adoption of wide-scale energy technology cannot be understated. As with any government spending, the money that goes toward coal limits the resources available for other energy R&D. The continued absorption of coal's financial costs by the federal government through investment in CCS technology will cause investment in renewable energy and efficiency to suffer.37 In addition, government investment in CCS restricts financial investments in energy subsidies, green jobs, and efficiency programs that target low-income communities. This unintended consequence is particularly unacceptable for community groups working to position the new "green economy" as a way to bring jobs and resources to un- and underemployed populations. For these groups and others working to improve environmental, public health, and economic equality, a massive shift in government investments is needed to make alternative energy sources viable. Continuing to invest billions in non-renewable energy sources like CCS diverts funds away from new clean technologies and delays full-scale climate change mitigation strategies.

2NC Decentralized Solar Link

Utilities are upgrading to smart grids–moving to decentralized generation first takes capital out of their hands–regulations are key

Clareo

Partners, business consulting firm, 8/10/

2012

(http://www.clareopartners.com/pages/2012/08/10/transformation-challenge-electric-utility-industry/)

Utilities today face a host of significant challenges. Among them are environmental regulations; fuel price uncertainty; and fresh capital needs for plant upgrades, baseload generation investments, and transmission investments. One of the largest disruptors, however, may be the erosion of the utility business model itself. For almost a century, the utility business model has been built upon electric demand growth, or load growth. Reliable load growth fueled infrastructure investments in power generation and the grid that, in turn, allowed utilities to earn a capped return. The balance of grid-load growth = ROI is now under attack on both sides of the equation, and with it, the entire utility business model. Load growth has been flattening over time, raising the question whether we are, or will soon, be witnessing ‘peak electric load’ in the United States. Utility ROI’s are being redefined as the industry is moving from traditional expansionary investments to largely environmentally motivated upgrades or retrofits of existing generation and grid capital bases . Whether and how much of these different types of utility investments will be returned to shareholders is subject to heated debate and negotiation between many utilities and their regulators right now. However, the focus here will be to assess the left side of the equation, in particular, the outlook for grid-load growth. Load growth in the U.S. has been flattening considerably over time. Electric retail sales to end customers peaked in 2007. The 2008 recession and subsequent slow recovery initially played an important role. More recently utilities have pointed to ‘unfavorable weather’ in their quarterly reports in an attempt to explain continued demand stagnation. Is ‘weather’ masking a more fundamental underlying trend? Macro Impacts on Load Growth Stepping back and looking at longer-term historic drivers, two macro impacts generally affect developed economies in mature stages: (1) flattening or stagnating economic growth and (2) reduced energy intensity. Following the decade long growth stagnation or decline in

Western Europe and Japan, the U.S. is now facing relatively anemic GDP growth of 1.5% to 2.5% in future years3. Gone are the heydays of 3% or more growth per year. Long-term structural realignment in the housing, financial, manufacturing and other sectors is the cause. As the U.S. continues to mature from resource and manufacturing sectors to service, finance, and other less energyintensive sectors, electric intensity per GDP continues to drop accordingly. A muted economic growth outlook combined with declining energy and electric intensity point to a sub-1.5% or possibly sub-1% load growth outlook for utilities based on macro drivers—a material departure from the traditionally more accepted 1.5% to 2.5% organic growth potential assumed for utilities. Macro drivers alone, however, are no longer sufficient to determine load growth for utilities. Several technology drivers are gaining momentum and could effectively reverse any remaining load growth on the grid. Technology Driving Load Growth Three technology drivers have real potential to further reduce load growth on the utility grid: • energy efficiency gains, • smart grids, • and decentralized generation and storage. Power consumption efficiency gains derived from CFL or LED lights to appliances and air conditioners are making their way into more homes. Generally high IRRs are pushing commercial and industrial users to adopt large-scale building and facility retrofits. While their impact today may still be small, projections see electric consumption curbed by 4% to 5% in 2020 and around 8% by 2030 owing to efficiency gains alone. Driving these gains are improved energy efficiency labeling; new metrics; green product launches at lower prices; and active merchandising by major retailers that have accelerated mindset changes and adoptions by consumers. A n enhanced, more optimized grid—or smart grid

—will do its part to mute load growth on the grid. Peak-shaving and valley-filling technologies, such as: • demand-side management, • real-time metering, • congestion management, • tiered or spot pricing, • and other processes or tools, will allow for better distribution and utilization of existing energy on the grid . Smart grids will also improve integration and leverage new intermittent sources such as wind and solar. The third and most underestimated load growth disruptor is decentralized self-generation and new storage technologies. Decentralized or end-use generation and storage takes load away from the grid and replaces it on the edge of the grid close to the end-user. This trade-off in the generation footprint captures business from traditional utilities that control the grid and transfers it into a nascent grid-edge market with new players and different business models.

2NC Community Solar Link

PACE solar funding draws in VC investment

St. John, 12

(7/16, Staff Writer-Green Tech Media, http://www.greentechmedia.com/articles/read/a-pace-rebirth-sacramento-and-ygrene-try-tounlock-green-homes

A PACE Rebirth? Sacramento and Ygrene Try to Unlock Green Homes)

Ygrene is one of many PACE-based businesses, including VC-backed startups like Renewable Funding, that are hoping for a favorable outcome from all this legal and regulatory turmoil. Since the FHFA’s 2010 decisions, all but a handful of city/county residential PACE programs have closed down (Sonoma and New Babylon, N.Y. are two exceptions), and startups in the space have left the field , including GreenDoor and recent Tendril acquisition Recurve.

2NC Nuclear Link

Nuclear trades off and collapses the smart grid

Antony Froggatt, Senior Research Fellow at Chatham House, where he specializes in issues relating to climate change, EU energy policy and nuclear power, and Mycle Schneider works as an independent international consultant on energy and nuclear policy and advisor to German Environmental Agency, 10 ["Systems for Change: Nuclear Power vs. Energy Efficiency + Renewables?" Heinrich Böll Foundation, March, pdf]

Global experience of nuclear construction shows a tendency of cost overruns and delays. The history of the world's two largest construction programs, that of the United States and France, shows a five and threefold increase in construction costs respectively. This cannot be put down to first of a kind costs or teething problems, but systemic problems associated with such large, political and complicated projects. Recent experience, in Olkiluoto in Finland and the Flamanville project in France, highlight the fact that this remains a problem. The increased costs and delays with nuclear construction not only absorb greater and greater amounts of investment, but the delays increase the emissions from the sector. From a systemic point of view the nuclear and energy efficiency+renewable energy approaches clearly mutually exclude each other, not only in investment terms. This is becoming increasingly transparent in countries or regions where renewable energy is taking a large share of electricity generation, i.e., in Germany and Spain. The main reasons are as follows.

Competition for limited investment funds. A euro, dollar or yuan can only be spent once and it should be spent for the options that provide the largest emission reductions the fastest. Nuclear power is not only one of the most expensive but also the slowest option.

Overcapacity kills efficiency incentives. Centralized, large, power-generation units tend to lead to structural overcapacities. Overcapacities leave no room for efficiency.

Flexible complementary capacity needed. Increasing levels of renewable electricity sources will need flexible, medium-load complementary facilities and not inflexible, large, baseload power plants.

Future grids go both ways. Smart metering and smart grids are on their way. The logic is an entirely redesigned system where the user gets also a generation and storage function. This is radically different from the top-down centralized approach.

For future planning purposes, in particular for developing countries, it is crucial that the contradictory systemic characteristics of the nuclear versus the energy efficiency+renewable energy strategies are clearly identified. There are numerous system effects that have so far been insufficiently documented or even understood. Future research and analysis in this area is urgently needed. This is particularly important at the current time because the next decade will be vital in determining the sustainability, security and financial viability of the energy sector for at least a generation.

2NC Grid Solves Wind/Solar

Smart grid leads to solar/wind development

Douglass 12/4 (Elizabeth Douglass, Energy Reporter at InsideClimate News, and Maria Gallucci, Staff Reporter at InsideClimate

News, 12/4/2012, "A Smart Grid Primer: Complex and Costly, but Vital to a Warming World", insideclimatenews.org/news/20121204/smart-grid-superstorm-sandy-climate-change-global-warming-electrical-grid-smart-metersobama-doe-stimulus-dollars?page=show)

Most experts agree that smart grids will pave the way for more renewable power and distributed generation like small-scale rooftop solar arrays. Unlike fossil fuel plants, which provide a steady flow of electricity to the grid, solar and wind energy systems deliver power to the grid intermittently, when the sun shines or the wind blows. The swings in power production (when a cloud temporarily shades a solar system, for example) are hard to manage on today's grid. And without proper controls, a region with a lot of solar production can overwhelm the system on a sunny day. Because most of today's power grids don't have smart controls, regulators severely limit the amount of renewable power that can be connected to the grid. Current grids also automatically shut down renewables when the grid is under duress to protect workers from being injured by uncontrollable inflows of power. That engineering safeguard rendered thousands of solar panels useless in New Jersey—the nation's No. 2 solar state—after Sandy ravaged the region. Smart grid technologies, by contrast, tip off operators to any potential disturbances so they can keep the flow of electricity balanced by adjusting and rerouting power or by changing the location where power is being added to the grid.

2NC Pre-req to Aff

Pre-req to the aff

MIT Tech Review 9

[David Talbot, Tech Review Head, “Lifeline for Renewable Power,” Jan/Feb 2009, http://www.technologyreview.com/featured-story/411423/lifeline-for-renewable-power/]

Without a radically expanded and smarter electrical grid, wind and solar will remain niche power sources

. Push through a bulletproof revolving door in a nondescript building in a dreary patch of the former East Berlin and you enter the control center for Vattenfall Europe Transmission, the company that controls northeastern Germany's electrical grid. A monitor displaying a diagram of that grid takes up most of one wall. A series of smaller screens show the real-time output of regional wind turbines and the output that had been predicted the previous day. Germany is the world's largest user of wind energy, with enough turbines to produce 22,250 megawatts of electricity. That's roughly the equivalent of the output from 22 coal plants--enough to meet about 6 percent of Germany's needs. And because Vattenfall's service area produces 41 percent of German wind energy, the control room is a critical proving ground for the grid's ability to handle renewable power. Like all electrical grids, the one that

Vattenfall manages must continually match power production to demand from homes, offices, and factories. The challenge is to maintain a stable power supply while incorporating elec­tricity from a source as erratic as wind. If there's too little wind-generated power, the company's engineers might have to start up fossil-fueled power plants on short notice, an inefficient process. If there's too much, it could overload the system, causing blackouts or forcing plants to shut down. Advertisement The engineers have few options, however. The grid has a limited ability to shunt extra power to other regions, and it has no energy-storage capacity beyond a handful of small facilities that pump water into uphill reservoirs and then release it through turbines during periods of peak demand. So each morning, as offices and factories switch their power on, the engineers must use wind predictions to help decide how much electricity conventional plants should start producing. But those predictions are far from perfect. As more and more wind turbines pop up in

Germany, so do overloads and shortages caused by unexpected changes in wind level. In 2007, ­Vattenfall's engineers had to scrap their daily scheduling plans roughly every other day to reconfigure electricity supplies on the fly; in early 2008, such changes became necessary every day. Power plants had to cycle on and off inefficiently, and the company had to make emergency electricity purchases at high prices. Days of very high wind and low demand even forced the Vattenfall workers to quickly shut the wind farms down.

Vattenfall's problems are a preview of the immense challenges ahead as power from renewable sources, mainly wind and solar, starts to play a bigger role around the world. To make use of this clean energy, we'll need more transmission lines that can transport power from one region to another and connect energy-hungry cities with the remote areas where much of our renewable power is likely to be generated. We'll also need far smarter controls throughout the distribution system--technologies that can store extra electricity from wind farms in the batteries of plug-in hybrid cars, for example, or remotely turn power-hungry appliances on and off as the energy supply rises and falls.

If these grid upgrades don't happen, new renewable-power projects could be stalled, because they would place unacceptable stresses on existing electrical systems

. According to a recent study funded by the

European Commission, growing electricity production from wind (new facilities slated for the North and Baltic Seas could add another 25,000 megawatts to Germany's grid by 2030) could at times cause massive overloads.

In the United States, the North American Electric Reliability Corporation, a nongovernmental organization set up to regulate the industry after a huge 1965 blackout, made a similar warning in November. "We are already operating the system closer to the edge than in the past," says the group's president, Rick Sergel. "We simply do not have the transmission capacity available to properly integrate new renewable resources." The challenge facing the United States is particularly striking. Whereas

Germany already gets 14 percent of its electricity from renewable sources, the United States gets only about 1 percent of its electricity from wind, solar, and geothermal power combined. But more than half the states have set ambitious goals for increasing the use of renewables, and president-elect Barack Obama wants 10 percent of the nation's electricity to come from renewable sources by the end of his first term, rising to 25 percent by 2025. Yet unlike Germany, which has begun planning for new transmission lines and passing new laws meant to accelerate their construction, the United States has no national effort under way to modernize its system. "

A failure to improve our grid will be a significant burden for the development of new renewable technologies

," says Vinod Khosla, founder of Khosla Ventures, a venture capital firm in Menlo Park, CA, that has invested heavily in energy technologies. Gridlock When its construction began in the late 19th century, the U.S. electrical grid was meant to bring the cheapest power to the most ­people. Over the past century, regional monopolies and government agencies have built power plants--mostly fossil-fueled--as close to popu­lation centers as possible. They've also built transmission and distribution networks designed to serve each region's elec­tricity consumers. A patchwork system has developed, and what connections exist between local networks are meant mainly as backstops against power outages. Today, the United States' grid encompasses 164,000 miles of highvoltage transmission lines--those familiar rows of steel towers that carry electricity from power plants to substations--and more than

5,000 local distribution networks. But while its size and complexity have grown immensely, the grid's basic structure has changed little since Thomas ­Edison switched on a distribution system serving 59 customers in lower Manhattan in 1882. "If Edison would wake up today, and he looked at the grid, he would say, 'That is where I left it,'" says Guido ­Bartels, general manager of the IBM

Global Energy and Utilities Industry group. While this structure has served remarkably well to deliver cheap power to a broad population, it's not particularly well suited to fluctuating power sources like solar and wind. First of all, the transmission lines aren't in the right places. The gusty plains of the Midwest and the sun-baked deserts of the Southwest--areas that could theoretically provide the entire nation with wind and solar power--are at tail ends of the grid, isolated from the fat arteries that supply power to, say,

Chicago or Los Angeles. Second, the grid lacks the storage capacity to handle variability--to turn a source like solar power, which generates no energy at night and little during cloudy days, into a consistent source of electricity. And finally, the grid is, for the most part, a "dumb" one-way system. Consider that when power goes out on your street, the utility probably won't know about it unless you or one of your neighbors picks up the phone. That's not the kind of system that could monitor and manage the fluctuating output of rooftop solar panels or distributed wind turbines. The U.S. grid's regulatory structure is just as antiquated. While the Federal Energy

Regulatory Commission (FERC) can approve utilities' requests for electricity rates and license transmission across state lines, individual states retain control over whether and where major transmission lines actually get built. In the 1990s, many states revised their regulations in an attempt to introduce competition into the energy marketplace. Utilities had to open up their transmission lines to other power producers. One effect of these regulatory moves was that companies had less incentive to invest in the grid than in new power plants, and no one had a clear responsibility for expanding the transmission infrastructure. At the same time, the more open market meant that producers began trying to sell power to regions farther away, placing new burdens on existing connections between networks. The result has been a national transmission shortage. These problems may now be the biggest obstacle to wider use of renewable energy, which otherwise looks increasingly viable. Researchers at the National Renewable Energy Laboratory in Golden,

CO, have concluded that there's no technical or economic reason why the United States couldn't get 20 percent of its elec­tricity from wind turbines by 2030. The researchers calculate, however, that reaching this goal would require a $60 billion investment in 12,650 miles of new transmission lines to plug wind farms into the grid and help balance their output with that of other electricity sources and with consumer demand.

The inadequate grid infrastructure "is by far the number one issue with regard to expanding wind," says Steve Specker, president of the Electric Power Research Institute (EPRI) in Palo Alto, CA, the industry's research facility. "It's already starting to restrict some of the potential growth of wind in some parts of the West." The Midwest Independent Transmission System Operator, which manages the grid in a region covering portions of 15 states from Pennsylvania to Montana, has received hundreds of applications for grid connections from would-be energy developers whose proposed wind projects would collectively generate 67,000 megawatts of power. That's more than 14 times as much wind power as the region produces now, and much more than it could consume on its own; it would represent about 6 percent of total U.S. electricity consumption. But the existing transmission system doesn't have the capacity to get that much electricity to the parts of the country that need it.

In many of the states in the region, there's no particular urgency to move things along, since each has all the power it needs. So most of the applications for grid connections are simply waiting in line, some stymied by the lack of infrastructure and others by bureaucratic and regulatory delays. Lisa Daniels, for example, waited three years for a grid connection for a planned development of 9 to 12 turbines on her land in Kenyon, MN, 60 miles south of

Minneapolis. The installation would be capable of producing 18 megawatts of power. Its site--only a mile and a half from a substation--is "bulldozer ready," says Daniels, who is also executive director of a regional nonprofit that aims to encourage local wind projects. "The system should be plug-and-play, but it's not," she says. Utilities, however, are reluctant to build new transmission capacity until they know that the power output of remote wind and solar farms will justify it. At the same time, renewable-energy investors are reluctant to build new wind or solar farms until they know they can get their power to market. Most often, they choose to wait for new transmission capacity before bothering to make proposals, says Suedeen Kelly, a FERC commissioner. "It is a chickenand-egg type of thing," she says.

Prereq to solar/wind development – California proves

Hamilton ’12
9/4 (Katherine Hamilton, Policy Director for the Electricity Storage Association, 9/4/2012, "Energy Storage Could Be Required for Future Renewable Energy Projects", www.sustainablebusiness.com/index.cfm/go/news.display/id/24036)

Energy storage technology might be moving from a nice-to-have addition to solar and wind installations to a component that's necessary for project approval, if developments in California are any indication. Solar developer BrightSource Energy and its utility partner Southern California Edison (SCE) may face rejection for two of five proposed power purchase agreements because they are too expensive and don't include a plan to store the power, reports GigaOM. The California Public Utilities Commission (CPUC) cites energy storage in its recommendation for a "no" vote for BrightSource's planned Rio Mesa project, but three other projects that have energy storage are likely to get the greenlight when they are voted on this month. "These projects incorporate molten salt storage capacity which will allow SCE to optimize generation from these facilities based on changing system requirements. This unique attribute decreases renewable integration risk and provides more value for ratepayers," says CPUC staff in its recommendations report. The intermittent nature of solar and wind is challenging, because it means these projects can't send electricity to the grid as consistently or reliably as fossil-fuel or nuclear sources. Energy storage technology levels the playing field by helping generators store power so that demand can be better balanced.

2NC Pre-Req to Solar

Lack of grid integration crushes solvency—prerequisite to the aff

Outka, visiting scholar – Energy and Land Use Law @ FSU, ‘10 (Uma, 37 Ecology L.Q. 1041)

Most of the siting considerations for centralized energy projects are irrelevant for rooftop solar. Plainly, no terrestrial power generation can top it in terms of land use efficiency, as the panels are incorporated into existing structures on already developed land, with few geographic restrictions. The systems do not require water, so proximity to water resources is unnecessary. They produce no air pollutants or waste when generating electricity, making it safe to site rooftop solar in and around populated areas. n234 Despite the regulatory inconsistencies and barriers discussed above, with no environmental review or land use change needed, siting rooftop solar takes little time by comparison to centralized energy projects.

As system installations increase nationwide, however, grid integration looms as a siting barrier for rooftop PV. This is not so much a site-by-site barrier, like interconnection can be, but a barrier to how much DG the grid can support. Solar (and wind) energy is variable, meaning these resources produce power only intermittently. In 2007, the U.S. Department of Energy launched a renewable systems interconnection study, noting, "concerns about potential impacts on operation and stability of the electricity grid may create barriers to further expansion." n235 The study found that grid-related barriers are likely to inhibit distributed generation sooner than previously expected, based [*1082] on market and policy advancements in support of onsite solar energy. n236 Existing distribution systems were designed for centralized power transmission and have limited capacity for reverse flows of electricity from distributed sites. n237 The study concluded that it is "clearly time to begin planning for the integration of significant quantities of distributed renewable energy onto the electricity grid." n238 It is still uncertain just how much variability existing infrastructure can absorb, but there seems to be general agreement that significant potential for solar energy cannot be realized without modernizing the grid. n239

Grid capacity is not yet a barrier to rooftop solar in Florida, but it is a technical problem that will become a policy problem and siting barrier if grid limitations begin to impede new system installations. According to the Florida Solar Energy Research and Education

Foundation, solar electric and water heater system installations together increased more than 40 percent in response to a state rebate program, n240 but there is still significant room to grow: approximately 27 percent of residential and 60 percent of commercial roof space is considered "available" for PV installations. n241 It may well be that grid capacity will keep pace with solar growth, but it is too early to know. Grid integration research is ongoing at the federal level, and Florida universities have received a federal grant to support a five-year research plan directly concerning integration of solar energy into the grid. n242

Even if solar is technically feasible – utilities block it

Umberger, JD candidate – Golden Gate University School of Law, ‘12 (Allyson, 6 Golden Gate U. Envtl. L.J. 183, Fall)

Even with DG's many benefits, the technology cannot reach its full potential without the support of infrastructure and pro-DG policy. n70 Unfortunately, the technology is unappealing to those regulating the energy sector and the utility companies that provide nearly all Americans with their electricity. n71 Consequently, energy policy is severely lacking in support for DG, and its benefits for the ratepayer remain untapped. n72

IV. Distributed Generation Maximization and What is Holding It Back

When maximized in an urban setting, DG can provide many [*192] benefits to the community and to the environment. n73 At the individual level, DG offers an individual the opportunity to become an energy entrepreneur who can attract capital and equity into an investment that benefits the community at large. n74 By producing local energy that is cheaper (based on mandated, fixed rates), more reliable, and more secure, distributed generation systems have enormous potential to pay for themselves with a quick rate of return. n75 By adding upfront financial incentives and energy or financial credits for contributing electricity to the smart grid, DG systems could pay for themselves even sooner. n76

On a larger scale, DG can drive employment and generate tax revenue at virtually no cost to the government. n77 Diverting the cost can be accomplished by making solar panel and wind turbine manufacturers responsible for one hundred percent of distribution grid

(D-grid) upgrade costs without any need for reimbursement from the government. n78 For example, when a solar panel manufacturer improves its technology, it could be held responsible for replacing its customers' panels with the new panels at no cost to the customer and without financial support from the government. n79 New and localized jobs will be created for the design, manufacture, installation, and connection of solar panels and other renewable technologies and for the smart grid, all of which offers the great potential of strengthening local economies while, at the same time, bringing domestic energy production to the forefront of our energy infrastructure.

In addition to economic benefits, the use of DG and a smart grid, which manages DG-contributions virtually, can enable local systems to reduce their peak loads (i.e., high periods of demand, such as early morning and dinnertime) by having consumers meet their own demand. n80 This method, known as "demand response," is highly favored in energy procurement planning because it prevents utilities from providing more energy than is demanded at a given time, which reduces the amount of [*193] wasted energy. n81

Demand response systems can thrive with DG because consumers can manage their own periods of high demand without utility oversight. n82 DG also allows consumers to provide ancillary services such as reactive power and voltage support, and the technology improves overall power quality and reliability for consumers connected to the smart grid. n83 With urbanization on the rise, this type of smart infrastructure is needed to support massive populations. DG provides energy security when traditional, vulnerable grids crash; price stability immune to utility manipulation; less demand for utility-scale energy; and fewer or zero emissions coming from the renewable energy sources for distributed generation. n84
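As a rough illustration of the peak-shaving logic the card describes (the hourly loads, flexible share, and threshold below are hypothetical, not taken from the evidence), a demand-response scheme defers a flexible slice of load out of peak hours and returns it in off-peak hours:

# Toy demand-response / peak-shaving sketch; all load numbers are made up.

def shave_peaks(hourly_load_mw, flexible_fraction, peak_threshold_mw):
    """Defer a flexible share of above-threshold load and return it in the lowest-load hours."""
    shaved = [float(x) for x in hourly_load_mw]
    deferred = 0.0
    for i, load in enumerate(shaved):
        if load > peak_threshold_mw:
            movable = min(load * flexible_fraction, load - peak_threshold_mw)
            shaved[i] -= movable
            deferred += movable
    while deferred > 0:                       # hand the deferred energy back, 1 MW at a time
        idx = shaved.index(min(shaved))
        step = min(1.0, deferred)
        shaved[idx] += step
        deferred -= step
    return shaved

if __name__ == "__main__":
    load = [60, 55, 50, 52, 70, 95, 120, 110, 90, 80, 75, 100]   # MW for each hour
    print(shave_peaks(load, flexible_fraction=0.2, peak_threshold_mw=90))

The flattened profile is what lets consumers meet their own peaks without the utility over-procuring supply, which is the "demand response" benefit the card claims.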

All of these potential benefits raise the question of why our energy system is so behind in employing this option. First and foremost, the current regulatory scheme is extremely unfavorable to DG. n85 This is because investor-owned-utilities (IOUs) such as

California's Pacific Gas & Electric (PG&E), Southern California Edison (SCE), and San Diego Gas & Electric (SDG&E) do not profit from DG programs. n86 It is no coincidence that their reasons for being against DG are the same reasons why consumers would profit from DG because when consumers begin to meet their own demand, they gain control over production, and the IOUs lose control. n87

Another cause for IOU concern is the fact that utilities are relatively unfamiliar with DG technologies, or at least they pretend to be, which creates an air of uncertainty and risk that make it unattractive to utility companies. n88 Between uncertain risks, a lack of experience with DG, and the prospect of having to abandon their profitable business models, utility companies have generated little to no data, models, or analytical tools for evaluating DG systems. n89 In turn, this lack of data makes utilities even more wary of DG. n90 This self-fulfilling prophecy has led utilities away from DG, even though state commissions like California's Public Utilities Commission (CPUC) attempt to promote [*194] DG's potential for helping our energy crisis. n91 Unfortunately, under the structure of California's current system, the IOUs have so much bargaining power in the legislative process that nothing short of the Governor issuing a declaratory order will force them to fully implement DG systems into urban smart grids. n92

2NC Yes Zero-Sum

It’s zero-sum – VC investors will choose between smart grid tech and renewables – government funds determine which strategy wins

Martin 12

Glenn Martin, Credit Suisse Journalist, 6/8/12, VC Investors Unfazed by Cleantech's Growing Pains, https://mobile.creditsuisse.com/index.cfm?fuseaction=mOpenArticle&coid=280991&aoid=360007&lang=en

Three years ago, cleantech – particularly energy-related cleantech – was all the rage, attracting both lavish government support and abundant private capital. Despite this backing, a number of companies folded. Others, including solar developer BrightSource Energy, canceled its initial public offerings because of the tough market conditions facing the sector. The California company scrapped its 182 million US dollar IPO despite having secured a 1.2 billion US dollar loan guarantee from the Department of Energy to finance construction of its game-changing, 370-megawatt Ivanpah solar power project.

Uncertainty in Production Tax Credit Drives Slowdown in Generated Power VC Investment The end of key government funding programs, such as the loan guarantee that helped finance Ivanpah, convinced some VC funds to rein in their cleantech investments. "We've seen a slowdown in wind (generated power) venture capital investment in particular", observes Allen Burchett, North American senior vice president of ABB Group, the Swiss-based developer of power and automation technologies. "In the main, that seems driven by uncertainty in the production tax credit".

ABB, along with several venture funds, has invested in a number of cleantech companies, many developing energy management and smart-grid solutions. "Venture capitalists have, say, a four-to seven-year investment horizon – they want to get in and get out in that period. Beyond that, their tolerance wanes", explains Burchett. "The problem is that the technologies we're working on typically involve much longer time lines. This has made some investors nervous".

Cyclical Factors Also Contribute to the Slowdown Other, more cyclical factors have made some VCs wary of investing in alternative energy. For example, natural gas production has boomed following improvements in fracking technologies. As natural gas becomes cheaper and more abundant, interest inevitably slackens in fuels and power sources that are expensive and often unproven.

Ramping Up Efforts on Sustainable Energy But if cleantech is wobbling on the VC ropes, it is by no means about to hit the canvas. The same macro factors – climate change, concerns over peak oil and the rise of China and India and their seemingly insatiable appetite for energy – continue to support the long-term viability of the sector. So too does government fiat, especially at the state level, despite growing calls in Washington for less government spending.

"Renewable portfolio standards are now required in 26 states", observes Nancy Pfund, a managing partner of DBL Investors, a San

Francisco venture capital firm specializing in cleantech projects. Pfund is referring to regulations that require increased energy production from renewable sources. "In California, for example, 33 percent of the state's electricity must come from sustainable sources by 2020", Pfund says. "And we only see that trend getting stronger, regardless of the political dialogue. The reality of geophysics and geopolitics demands it. That's why virtually every country on the planet is ramping up efforts on sustainable energy, and that's why we (DBL) are still very bullish and investing heavily".

Downturn More of a Blip Than a Trend Pfund believes the decline in venture investments is a temporary bump on the road. Making the transition from one energy source to another is challenging, both in terms of the technology and funding. "Change is never a smooth ride", she observes. "There were dislocations when we moved from whale oil to coal, and from coal to petroleum and natural gas". This view – that the current downturn is more of a blip than a trend – may be on the mark. PricewaterhouseCoopers recently released a report showing that investment in the sector remains robust, with 4.6 billion US dollars devoted to 342 deals in 2011, up more than 17 percent from the 3.9 billion US dollars invested in 307 deals in 2010. Clean technology as a percentage of total VC investment has remained stable in the past year, accounting for 15 percent in 2011 compared with 16 percent in 2010. In the first quarter of 2012, 950 million US dollars flowed into

73 cleantech deals.

Sector Holds its Own All in all, the sector is more than holding its own. Rather than pulling their money out of cleantech, VCs are diverting funds away from solar, wind and other capital-intensive energy generation projects into investments that are more in line with their core Internet and information technology expertise. As such , more funds are backing companies developing energy management software services or smart-grid technologies .

2NC Water Impact

Solves water scarcity

Muys et al 11

[Jerome C. Muys, Jr., Jeffrey M. Karp, and Van P. Hilderbrand, Jr. Sullivan & Worcester LLP, “The Intersection Between Water Scarcity And Renewable Energy” April, http://www.sandw.com/assets/htmldocuments/Intersection%20Between%20Water%20Scarcity%20and%20Renewable%20Energy%20-%20Muys%20Karp%20Hilderbrand%20W0230759.PDF]

The starting point for any discussion of the intersection between water scarcity and renewable energy is the now generally-accepted correlation between climate change and water resource impacts, which is creating further imperatives for both reduction of GHG emissions and water conservation. Most projections conclude that the water resource impacts of climate change will almost certainly be both diverse and wide-ranging, necessitating the implementation of new protocols for allocating water resources such as the Model Interstate Water Compact. However, a less obvious impact of predicted water shortages will be on the future ability to site new renewable energy facilities and, perhaps more importantly, on which types of renewable energy gain prominence in the future. Consequently, water reuse and reclamation facilities are increasingly being co-located with renewable energy projects, and, indeed, technological development in the two areas has begun to converge in ways that were completely unforeseen twenty years ago.

Extinction

Reilly ‘2
(Kristie, Editor for In These Times, a nonprofit, independent, national magazine published in Chicago. We’ve been around since 1976, fighting for corporate accountability and progressive government. In other words, a better world, “NOT A DROP TO DRINK,” http://www.inthesetimes.com/issue/26/25/culture1.shtml)
*Cites environmental thinker and activist Vandana Shiva, Maude Barlow and Tony Clarke—probably North America’s foremost water experts

The two books provide a chilling, in-depth examination of a rapidly emerging global crisis. “Quite simply,” Barlow and Clarke write, “unless we dramatically change our ways, between one-half and two-thirds of humanity will be living with severe fresh water shortages within the next quarter-century. … The hard news is this: Humanity is depleting, diverting and polluting the planet’s fresh water resources so quickly and relentlessly that every species on earth—including our own—is in mortal danger.” The crisis is so great, the three authors agree, that the world’s next great wars will be over water. The Middle East, parts of Africa, China, Russia, parts of the United States and several other areas are already struggling to equitably share water resources. Many conflicts over water are not even recognized as such: Shiva blames the Israeli-Palestinian conflict in part on the severe scarcity of water in settlement areas. As available fresh water on the planet decreases, today’s low-level conflicts can only increase in intensity.

2NC Warming Impact

Solves warming

Coughlin 11

[Sierra Coughlin, member of IEEE's Society on Social Implications of Technology, “Smart Grid: A Smart Idea For America?” November 27, 2011 is last date cited, http://smartgrid.ieee.org/highlighted-papers/493-smart-grid-a-smart-idea-foramerica]

The natural environment is by far the most important resource mankind relies on. Society is intricately built about the foundations of bountiful resource and operates on the belief these resources are endless. As climate change continues to take effect and resources are contributing to dwindle, the guarantee of endless possibilities is running out.

Without the resource of the natural environment, there would be no way to sustain human life and societal development

.

Because these resources are facing an increasing demand and record climate change, the human population is required to adapt and respond to the increasing challenges

the planet faces.

Smart Grid technologies operate closely with this understanding and the need to aid the natural environment

. Through the process of designing such technologies, innovators work alongside scientists and environmental experts in order to design technologies that don’t consume more resource than necessary. Although there is initial resource that goes into creating the foundations of these technologies, the overall goal of Smart Grid systems is to lessen the impact on the natural environment, and greatly reduce the reliance on non-renewable natural resources

.

Environmental challenges not only consist of limited resource and resource generation, but often surround the issues of pollution and carbon emissions

. Understanding that pollutant levels now reach poisonous rates, fuels the desire to reduce emissions in every way possible. While there is no way to fix the damage that has been done to the ozone layer of the planet, there are ways in which mitigation can occur.

Reducing carbon emissions is a step forward in this process. Understanding the ways that Smart Grid technologies work inside this equation is fundamental.

While there are many ways in which Smart Grid technologies function within the natural environment, certain processes make a greater impact than others

. Not only is the impact significant, but often aids society in other ways. Through education and awareness, it is more likely a collective effort will be made in the response to climate change in hope that personal responsibility will be taken into account.

Paired along with education, Smart Grid technologies create new levels of understanding and environmental mitigation. These processes ensure a solid relationship between natural processes and the understanding of how these processes work by the people who must interact with them. Smart Grid technologies play a fundamental role in building this relationship and often act as a catalyst for future research in regards to climate change.

The introduction of communication through using real time technologies is the link between mitigation and understanding

. Using Smart Grid technologies to educate is a vital tool to utilize in the fight against climate change. One may even argue the greatest influence Smart Grid technologies can have on the environment is the education of society as a whole as a collective way to reduce poisonous emissions and work to repair what is possible

.

According to data gathered by the Electric Power Research Institute, there are two main ways in which Smart Grid technologies work to reduce carbon emissions outside of pure energy savings. While there are many ways in which Smart Grid technologies work to mitigate environmental issues, the focus of most study surround the notion of carbon emissions. Because carbon emissions are such a great threat to human health and environmental sustainability, it is often the center of much research and analysis in regards to renewable energy development.

The first of these strategies consists of a process known as integration of intermittent renewables (EPRI 51). "Deployment of a Smart Grid infrastructure combined with electric storage and discharge options will help reduce the variability in renewable power sources by decoupling generation from demand." The basis of this process relies on the need to store energy that is not currently being used. Paired with other renewable energy sources such as wind and solar technologies, the impact on carbon emission levels is significant. Having these resources available to the public encourages the use of renewable energies and allows easier access to Smart Grid based technologies. To promote this understanding, Smart Grid technologies increase the rate at which the public can integrate personal generation technologies such as home solar panels (EPRI 55). This connection is meant to integrate Smart Grid technologies on a private level, encouraging the idea of personal responsibility and awareness.

The Electrical Power Research institute claims the facilitation of Plug-In hybrid vehicles is the second way in which the Smart Grid helps to reduce carbon emissions . “A joint study conducted in 2007 by EPRI and the Natural Resource Defense Council concluded that PHEVs will lead to a reduction of 3.4 to 10.3 billion metric tons of greenhouse gases by 2050” (EPRI 54). The benefits of using electric based technologies are shown through the projected environmental impacts from the EPRI. When one compares the usage of

non-renewable sources in a projected forecast, the outcome is quite dismal.

Because vehicles produce the highest amounts of carbon emissions, continuing to produce similar systems will only increase the problems associated with high volumes of standard emissions. Restricting the amount of greenhouse gas that is accumulated has significant impacts when one calculates the future forecast in regards to pollutants and ozone depletion.

The development of PHEVs relies heavily on the production of electricity by Smart Grid technologies

.

The basis of the product itself works intricately with electric production and systems commonly associated. It is said the Smart Grid is vital for utilities, entailing the information is sent to consumers determining when is best to charge the batteries in their vehicles. This often correlates with on and off peak electrical generation and can strongly influence the demand for services associated with PHEV use. "Alternatively, PHEVs can potentially be used to store electrical energy in their onboard batteries for peak-shaving or powerquality applications, offering potentially powerful synergies to complement the electric power grid" (EPRI 55).

Hybrid vehicles are often said to be the direct outcome of Smart Grid technologies in that they often mirror the processes that traditionally associate with renewable processes.

In order to influence the natural environment in a positive way, renewable energies operate on many systems and are tightly integrated within in small processes, which occur every day in the general public.

Accessing "greener" technologies begins with understanding resource consumption. Because electrical vehicles have become so popular within the past decade, the need for electricity has increased as a result

.

Electricity generated by nonrenewable sources that pollute the environment with carbon emissions does little to reduce the problems society currently faces

. Because the resource of electricity is projected to increase in demand as more technologies rely on it, clean generation is needed.

All of these processes rely heavily on Smart Grid generation systems and storage. Without the use of Smart Grid technologies, the production of the energy needed will simply fail.

Supporting systems, which rely heavily on extraction further damages the natural environment.

The fiscal, environmental and health costs are far greater as the demand for electricity increases.

Extinction

Flournoy 12

(Citing Dr. Feng Hsu, a NASA scientist at the Goddard Space Flight Center and a technology risk assessment expert, Don Flournoy, PhD and MA from the University of Texas, Former Dean of the University College @ Ohio University, Former Associate Dean @ State University of New York and Case Institute of Technology, Project Manager for University/Industry Experiments for the NASA ACTS Satellite, Currently Professor of Telecommunications @ Scripps College of Communications @ Ohio University, Citing Dr. "Solar Power Satellites," Chapter 2: What Are the Principal Sunsat Services and Markets?, January, Springer Briefs in Space Development, Book)

In the Online Journal of Space Communication, Dr. Feng Hsu, a NASA scientist at Goddard Space Flight Center, a research center in the forefront of science of space and Earth, writes, “The evidence of global warming is alarming,” noting the potential for a catastrophic planetary climate change is real and troubling (Hsu 2010). Hsu and his NASA colleagues were engaged in monitoring and analyzing climate changes on a global scale, through which they received first-hand scientific information and data relating to global warming issues, including the dynamics of polar ice cap melting. After discussing this research with colleagues who were world experts on the subject, he wrote: I now have no doubt global temperatures are rising, and that global warming is a serious problem confronting all of humanity. No matter whether these trends are due to human interference or to the cosmic cycling of our solar system, there are two basic facts that are crystal clear: (a) there is overwhelming scientific evidence showing positive correlations between the level of CO2 concentrations in Earth’s atmosphere with respect to the historical fluctuations of global temperature changes; and (b) the overwhelming majority of the world’s scientific community is in agreement about the risks of a potential catastrophic global climate change. That is, if we humans continue to ignore this problem and do nothing, if we continue dumping huge quantities of greenhouse gases into Earth’s biosphere, humanity will be at dire risk (Hsu 2010).

As a technology risk assessment expert, Hsu says he can show with some confidence that the planet will face more risk doing nothing to curb its fossil-based energy addictions than it will in making a fundamental shift in its energy supply. “This,” he writes, “is because the risks of a catastrophic anthropogenic climate change can be potentially the extinction of human species, a risk that is simply too high for us to take any chances” (Hsu 2010). It was this NASA scientist’s conclusion that humankind must now embark on the next era of “sustainable energy consumption and re-supply, the most obvious source of which is the mighty energy resource of our Sun” (Hsu 2010) (Fig. 2.1).

AT No spillover

China gets onboard but we need to develop the grid

Elizabeth Balkan, advises private and public stakeholders on energy and climate policy, and cleantech investment strategies in China. She has over ten years of China experience, 09
[“China's Smart Grid Ambitions Could Open Door to US-China Cooperation,” June 5, New Energy and Environment Digest, http://needigest.com/2009/06/05/chinas-smart-grid-ambitions-could-open-door-to-uschina-cooperation/]

China’s largest electric transmission company has announced an ambitious plan to develop a national smart grid by 2020 that would help utilities and their customers transport and use energy more efficiently.

The sheer size of the project raises some intriguing questions. First, about whether China has the capital and technology for such an extensive upgrade. And second, whether the project could provide an opening for U.S.-

China cooperation on technological improvements that could benefit both.

There’s little question that the grid upgrade is becoming a necessity for State Grid Corporation of China, which is responsible for delivering power to 80 percent of the population.

Repeated blackouts in China’s coastal metropolises caused by power shortages in recent years, plus pressure to expand electrification to the rural inland

and the growth of wind farms, have prompted considerable government investment in supply-side electricity improvements

. Underscoring the current pressure on the Chinese government to address the issue of power equity, Ryan Hodum, a senior associate for clean energy consulting firm David Gardiner & Associates told NEEDigest that while it “will be critical to develop a ‘clean energy backbone’ across China to deliver electrons derived from clean and efficienct sources,” the Chinese government also “need[s] to focus on rural electrification so that the country not only develops a Smart Grid but…also raises the level of access to energy.” Private firms and provincial governments across northern and eastern China are already commissioning several

10-gigawatt wind and solar generation bases that will depend on an advanced grid to help China reach its target of 15% energy from renewables by 2020, not to mention the 35% goal SGCC expects to meet. SGCC general manager Liu Zhenya said the smart grid project would get started this year with the development of technical standards. Part of the physical foundation is already in the works, such the “West-East Electricity Transfer Project,” an initiative to build three East-West corridors totaling 20 GW in transmission capability.

Since January, the State Grid has also been operating a 400-mile-long, 1,000 kilovolt ultra-high voltage AC demonstration project, which allows heavy electricity flow with lower transmission loss. It plans to break ground on three more UHV AC lines this year, and build roughly 11,000 miles of UHV AC lines by 2012. More financial and technical details of the smart grid plan are expected to emerge over the next few weeks.

Meanwhile, questions remain about the source of the needed technology . China’s Localization Push Poses a Challenge for Technology Though the pilot UHV line was developed entirely in China, building out a smart grid in

China will depend on importing key technologies, a fact that was not lost on the over 40 utility data management-related exhibitor s that turned out to the third annual MeteringChina Conference, held in Beijing just days after the SGCC announcement. Nevertheless, China may be as intent to develop domestic smart growth technologies as it has been in promoting a homegrown wind industry by mandating at least 70% domestically produced components in the construction of wind power plants. China Strategies president Louis Schwartz is confident that as with “just about every industry, [China’s] ultimate goal is localization.” But localization will not come cheap. Lu Qiang, an academic at the Chinese Academy of

Sciences and professor at Tsinghua University, estimates that China will need to spend at least

$147 billion yuan to build an international-level quality smart grid

. Bloomberg reports suggested capital costs of $10 billion annually from 2011 to 2020, with a total project cost of $590 billion.

Partners in Smart Grid Exploration:

China and the U.S.

For these and other reasons, it makes sense that the U.S. and China – which are simultaneously venturing into smart grid planning – should be considering smart grid cooperation. China and the US face many similar challenges in their power delivery infrastructure. Like the U.S., China must transmit power across great distances. Moreover, compared with some countries’ systems, which boast only 2-3% annual loss of generated power, China averages roughly 7% loss per year and U.S. losses have reached 9% in recent years. The U.S. and China’s shared need to address geographically similar demand and excessive power loss explain why smart grid deployment has emerged as a key area for bilateral collaboration.
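The loss percentages in the card imply a concrete gap in delivered energy; a quick back-of-the-envelope check (the annual generation total is a placeholder for illustration, not a figure from the evidence):

# Delivered energy under different transmission/distribution loss rates.
# The 2-3%, ~7%, and ~9% rates are the card's figures; the 1,000 TWh total is a placeholder.

generated_twh = 1_000

for label, loss_rate in [("best-practice grid (~2.5%)", 0.025),
                         ("China (~7%)", 0.07),
                         ("United States (~9%)", 0.09)]:
    delivered = generated_twh * (1 - loss_rate)
    print(f"{label}: {delivered:.0f} TWh delivered, {generated_twh - delivered:.0f} TWh lost")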

Four days after SGCC announced their smart grid plans, U.S. House Speaker Nancy Pelosi (D-Calif.) and Senate Foreign Relations Committee Chairman

John Kerry (D-Mass.) held a Clean Energy Forum in Beijing. The group reached an agreement on joint clean energy action that included smart grid as one of the key areas for collaboration.

Specific to smart grid planning, the agreement called for knowledge and technology exchange between the two countries, and a sharing of demand side management tools

. On the topic of eliminating barriers, the framework called for a relaxation of import barriers in both

countries on clean energy goods and services; lifting U.S. export controls on clean energy technologies, software and services stifling more robust joint research and development; and instituting a joint intellectual property protection program, backed by the full faith and credit of each government.

While the State Grid announcement and subsequent efforts to promote

U.S.-China smart grid coordination are a promising first step towards building a smart grid in

China, only time will tell whether China, and the U.S., will be able to sort out the extremely complicated nuts and bolts involved in such an initiative.

2NC Meltdowns Impact

Smart grid solves blackouts —makes the grid resilient

Barbara Vergetis Lundin, Energy Analyst, 12
[“Could U.S. utilities be the next to say "if only?" FierceSmartGrid, August 6, http://www.fiercesmartgrid.com/story/could-us-utilities-be-next-say-if-only/2012-08-06?page=0,3]

Growing Demands Mean Growing Smart Grid A modern grid is critical in the U.S. and globally. Growing energy issues demand viable solutions even today, as well as the scalability for tomorrow. A 22-year-old native of India (specifically, Jaipur) and graduate of the University of California could have the answer to India's (and the United States') power reliability issues, according to The Times of India. Yashraj Khaitan's philosophy lies at the very heart of the smart grid -- an 'eyes and ears connected to a brain' which monitors consumption, generates a demand-supply response, and eliminates losses. Making its debut two months ago, Khaitan launched Gram Power with the strategy of setting up solar power plants at the village level and linking them to the start-ups' smart grid system.

U.S. utilities have what it takes to prevent outages -- with smarter networks, focused maintenance, and a better understanding of and greater load control. Technologies exist today to anticipate and prevent issues before they occur. The reality is that this is not always possible. When unplanned outages do occur, grid outage management systems can reroute power to minimize the outage, analyze needed repairs and dispatch crews more effectively to get the job done faster and more efficiently.

The U.S. has traditionally had enough reserve capacity of both transmission and generation to support its needs. Losing sight of looming generation and transmission issues, and neglecting to compensate for retired power plants, could place the U.S. in the same daunting situation as India. "As we ramp down baseload generation and ramp up variable generation, the supply situation, at times, will get closer to what India faces on a daily basis," Houseman contends. Smart grid technologies will serve as an increased buffer between supply and demand to minimize some of the issues facing India and, potentially, the United States.

Blackouts cause nuclear meltdowns

Cappiello ‘11

(Dina, reporter for the AP March 29, 2011 “AP IMPACT: Long Blackouts Pose Risk to US Reactors” The Post and Courier http://www.postandcourier.com/news/2011/mar/29/ap-impact-long-blackouts-pose-risk-us-reactors/?print)

Long before the nuclear emergency in Japan, U.S. regulators knew that a power failure lasting for days at an American nuclear plant, whatever the cause, could lead to a radioactive leak. Even so, they have only required the nation’s 104 nuclear reactors to develop plans for dealing with much shorter blackouts on the assumption that power would be restored quickly.

In one nightmare simulation presented by the Nuclear Regulatory Commission in 2009, it would take less than a day for radiation to escape from a reactor

at a Pennsylvania nuclear power plant after an earthquake, flood or fire knocked out

all electrical power and there was no way to keep the reactors cool

after backup battery power ran out. That plant, the Peach

Bottom Atomic Power Station outside Lancaster, has reactors of the same older make and model as those releasing radiation at

Japan’s Fukushima Dai-ichi plant, which is using other means to try to cool the reactors. And like Fukushima Dai-ichi, the Peach

Bottom plant has enough battery power on site to power emergency cooling systems for eight hours . In Japan, that wasn’t enough time for power to be restored. According to the International Atomic Energy Agency and the

Nuclear Energy Institute trade association, three of the six reactors at the plant still can’t get power to operate the emergency cooling systems. Two were shut down at the time. In the sixth, the fuel was removed completely and put in the spent fuel pool when it was shut down for maintenance at the time of the disaster. A week after the March 11 earthquake, diesel generators started supplying power to two other two reactors, Units 5 and 6, the groups said.

The risk of a blackout leading to core damage,

while extremely remote, exists at all U.S. nuclear power plants

, and some are more susceptible than others, according to an

Associated Press investigation. While regulators say they have confidence that measures adopted in the U.S. will prevent or significantly delay a core from melting and threatening a radioactive release, the events in Japan raise questions about whether U.S. power plants are as prepared as they could and should be

. A top Nuclear Regulatory

Commission official said Tuesday that the agency will review station blackouts and whether the nation’s 104 nuclear reactors are capable of coping with them. As part of a review requested by President Barack Obama in the wake of the Japan crisis, the NRC will examine “what conditions and capabilities exist at all 104 reactors to see if we need to strengthen the regulatory requirement,” said

Bill Borchardt, the agency’s executive director for operations. Borchardt said an obvious question that should be answered is whether

nuclear plants need enhanced battery supplies, or ones that can last longer. “There is a robust capability that exists already, but given what happened in Japan there’s obviously a question that presents itself: Do we need to make it even more robust?” He said the NRC would do a site-by-site review of the nation’s nuclear reactors to assess the blackout risk. “We didn’t address a tsunami and an earthquake, but clearly we have known for some time that one of the weak links that makes accidents a little more likely is losing power,” said Alan Kolaczkowski, a retired nuclear engineer who worked on a federal risk analysis of Peach Bottom released in 1990 and is familiar with the updated risk analysis. Risk analyses conducted by the plants in 1991-94 and published by the commission in

2003 show that the chances of such an event striking a U.S. power plant are remote, even at the plant where the risk is the highest, the

Beaver Valley Power Station in Pennsylvania.

These long odds are among the reasons why the United States since the late 1980s has only required nuclear power plants to cope with blackouts for four or eight hours . That’s about how much time batteries would last. After that, it is assumed that power would be restored

. And so far, that’s been the case.

Equipment put in place after the Sept. 11, 2001, terrorist attacks could buy more time. Otherwise, the reactor’s radioactive core could begin to melt unless alternative cooling methods were employed

. In Japan, the utility has tried using portable generators and dumped tons of seawater, among other things, on the reactors in an attempt to keep them cool.

A 2003 federal analysis looking at how to estimate the risk of containment failure said that should power be knocked out by an earthquake or tornado it “would be unlikely that power will be recovered in the time frame to prevent core meltdown.” In Japan, it was a one-two punch: first the earthquake, then the tsunami.

Extinction

Lendman ‘11

(Stephen, Research Associate of the Center for Research on Globalization, “Nuclear Meltdown in Japan,” http://www.opednews.com/articles/Nuclear-Meltdown-in-Japan-by-Stephen-Lendman-110313-843.html)

Fukushima Daiichi "nuclear power plant in Okuma, Japan, appears to have caused a reactor meltdown." Stratfor downplayed its seriousness, adding that such an event "does not necessarily mean a nuclear disaster," that already may have happened - the ultimate nightmare short of nuclear winter. According to Stratfor, "(A)s long as the reactor core, which is specifically designed to contain high levels of heat, pressure and radiation, remains intact, the melted fuel can be dealt with. If the (core's) breached but the containment facility built around (it) remains intact, the melted fuel can be....entombed within specialized concrete" as at Chernobyl in 1986. In fact, that disaster killed nearly one million people worldwide from nuclear radiation exposure. In their book titled, "Chernobyl: Consequences of the Catastrophe for People and the Environment," Alexey Yablokov, Vassily Nesterenko and Alexey Nesterenko said: "For the past 23 years, it has been clear that there is a danger greater than nuclear weapons concealed within nuclear power. Emissions from this one reactor exceeded a hundredfold the radioactive contamination of the bombs dropped on Hiroshima and Nagasaki." "No citizen of any country can be assured that he or she can be protected from radioactive contamination. One nuclear reactor can pollute half the globe. Chernobyl fallout covers the entire Northern Hemisphere." Stratfor explained that if Fukushima's floor cracked, "it is highly likely that the melting fuel will burn through (its) containment system and enter the ground. This has never happened before," at least not reported. If now occurring, "containment goes from being merely dangerous, time consuming and expensive to nearly impossible," making the quake, aftershocks, and tsunamis seem mild by comparison. Potentially, millions of lives will be jeopardized.

Other DA Links

Reprocessing Politics Links

Financial support for reprocessing is uniquely unpopular in the current climate – perceived as Obama trying to replay past fights –

Oelrich 12
[Ivan, Ph.D. is the Senior Fellow for the Strategic Security Program at the Federation of American Scientists, “Prospects for a Plutonium Economy in the United States” in the report: The Future of Nuclear Power in the United States -- Federation of American Scientists -- February -- http://www.fas.org/pubs/_docs/Nuclear_Energy_Report-lowres.pdf]

The United States began a demonstration fast reactor at Clinch River, Tennessee, near Oak Ridge. When the cost exploded several fold, Congress cancelled the program in 1983. But only in the United States was the parallel reprocessing program also cancelled. Presidents Ford and Carter actually made opposition to reprocessing a government policy, primarily because of fears that widespread reprocessing would increase the risks of nuclear weapon proliferation. President Reagan rescinded the ban, allowing commercial reprocessing. But Congress did not reinstate government financial support, and industry showed no interest in restarting reprocessing.

Waste disposal initiatives like the plan create gridlock

Sands ’12

(Derek, “Before the US can store its nuclear waste, the Senate fights about who's in charge of that decision”, Platts, 6-8-2012, http://www.platts.com/weblog/oilblog/2012/06/08/before_the_us_c.html)

However, the two committees may focus on different priorities as part of potential nuclear waste legislation. Bingaman said

Wednesday that the bill he is crafting would focus on creating an entity outside DOE to deal with nuclear waste, while senators on the environment and public works committee aimed their questions to Blue Ribbon Commission members on the mechanics of gaining state and local consent for a repository. Whether the committees can work together or will come to legislative blows is still unclear.

But if one thing has been proven so far, it is that escaping gridlock and delay when debating nuclear waste disposal is impossible

. "It was over 30 years ago when the Congress realized the importance of finding a permanent solution for the disposal of our spent fuel and high level waste," Carper said. "In response,

Congress passed the Nuclear Waste Policy Act of 1982, moving this country forward towards deep mine geologic nuclear waste repositories. After years of study and debate, we find ourselves 30 years later in what's really a dead end."

Plan is perceived as not Yucca mountain—angers Congressional republicans

Wald ’12

(Matthew L., “Moving From Square One on Nuclear Waste”, New York Times, 6-7-2012, http://green.blogs.nytimes.com/2012/06/07/moving-from-square-one-on-nuclear-waste/)

The idea that the proposed Yucca Mountain nuclear waste repository in Nevada is dead has not gone down well in Congress, where some Republicans are trying to allocate new money to the Nuclear Regulatory Commission so it can revive its evaluation of the site’s suitability. But at a Senate subcommittee hearing on Thursday, some supporters of the civilian power industry said it was time to move on.

It’s perceived as going against Congressional objectives

NTI ’12

(“U.S. Reluctant to Permit South Korean Fuel Reprocessing, Envoy Says”, Global Security Newswire, 3-8-2012, http://www.nti.org/gsn/article/south-korea-not-expecting-us-permit-fuel-reprocessingenvoy-says/)

Were the Obama administration to allow pyroprocessing rights in a new atomic trade deal, Congress is not likely to ratify the pact as it would go against U.S. efforts to curb the spread of technologies that can be used in nuclear weapons development, the envoy said.

Costs political capital – risks, startup cost, and public safety, their evidence doesn’t assume changing congressional concerns

Alex Trembath, Policy Fellow in AEL’s New Energy Leaders Project, 11
[“Nuclear Power and the Future of Post-Partisan Energy Policy,” Lead Energy, Feb 4, http://leadenergy.org/2011/02/the-nuclear-option-in-a-post-partisan-approach-on-energy/]

Nuclear power is unique among clean energy technologies in that Democrats tend to be more hesitant towards its production than Republicans. Indeed, it has a reputation for its appeal to conservatives - Senators Kerry, Graham and Lieberman included provisions for nuclear technology in their ultimately unsuccessful American Power Act (APA) with the ostensible goal of courting Republican support. The urgency with which Democrats feel we must spark an energy revolution may find a perfect partner with Republicans who support nuclear power. But is there anything more than speculative political evidence towards its bipartisan viability?

If there is one field of the energy sector for which certainty of political will and government policy is essential, it is nuclear power. High up front costs for the private industry, extreme regulatory oversight and public wariness necessitate a committed government partner for private firms investing in nuclear technology.

In a new report on the potential for a “nuclear renaissance,” Third Way references the failed cap-and-trade bill, delaying tactics in the House vis-a-vis EPA regulations on CO ₂ , and the recent election results to emphasize the difficult current political environment for advancing new nuclear policy

. The report,

“The Future of Nuclear Energy,” makes the case for political certainty: “ It is difficult for energy producers and users to estimate the relative price for nuclear-generated energy compared to

fossil fuel alternatives

(e.g. natural gas)– an essential consideration in making the major capital investment decision necessary for new energy production

that will be in place for decades.” Are our politicians willing to match the level of certainty that the nuclear industry demands? Lacking a suitable price on carbon that may have been achieved by a cap-and-trade bill removes one primary policy instrument for making nuclear power more cost-competitive with fossil fuels. The impetus on Congress, therefore, will be to shift from demand-side “pull” energy policies (that increase demand for clean tech by raising the price of dirty energy) to supply-side “push” policies, or industrial and innovation policies. Fortunately, there are signals from political and thought leaders that a package of policies may emerge to incentivize alternative energy sources that include nuclear power.

One place to start is the recently deceased

American Power Act

, addressed above, authored originally by Senators Kerry, Graham and Lieberman.

Before its final and disappointing incarnation, the bill included provisions to increase loan guarantees for nuclear power plant construction in addition to other tax incentives.

Loan guarantees are probably the most important method of government

involvement in new plant construction, given the high capital costs of development.

One wonders what the fate of the bill, or a less ambitious set of its provisions, would have been had

Republican

Senator

Graham not abdicated and removed

any hope of Republican co-sponsorship

.

But

that was last year.

The changing of the guard in Congress makes this a whole different game, and the once feasible support for nuclear technology on either side of the aisle must be reevaluated

. A New York Times piece in the aftermath of the elections forecast a difficult road ahead for nuclear energy policy,

but did note

Republican support for programs like a waste disposal site and loan guarantees.

Republican support for nuclear energy has roots in the most significant recent energy legislation, the Energy Policy Act of 2005, which passed provisions for nuclear power with wide bipartisan support. Reaching out to Republicans on policies they have supported in the past should be a goal of Democrats who wish to form a foundational debate on moving the policy forward. There are also signals that key Republicans, notably Lindsey Graham and

Richard Lugar, would throw their support behind a clean energy standard that includes nuclear and CCS.

Republicans in Congress will find intellectual support from a group that AEL’s Teryn Norris coined “innovation hawks,” among them Steven Hayward, David

Brooks and George Will. Will has been particularly outspoken in support of nuclear energy, writing in 2010 that “it is a travesty that the nation that first harnessed nuclear energy has neglected it so long because fads about supposed ‘green energy’ and superstitions about nuclear power’s dangers.” The extreme reluctance of Republicans to cooperate with Democrats over the last two years is only the first step, as any legislation will have to overcome Democrats’ traditional opposition to nuclear energy

. However, here again there is reason for optimism. Barbara Boxer and John

Kerry bucked their party’s long-time aversion to nuclear in a precursor bill to APA, and Kerry continued working on the issue during

2010. Jeff Bingaman, in a speech earlier this week, reversed his position on the issue by calling for the inclusion of nuclear energy provisions in a clean energy standard. The Huffington Post reports that “the White House reached out to his committee [Senate

Energy] to help develop the clean energy plan through legislation.” This development in itself potentially mitigates two of the largest obstacle standing in the way of progress on comprehensive energy legislation: lack of a bill, and lack of high profile sponsors.

Democrats can also direct Section 48C of the American Recovery and Reinvestment Act of 2009 towards nuclear technology, which provides a tax credit for companies that engage in clean tech manufacturing.

Democrats should not give up on their policy goals simply because they no longer enjoy broad majorities in both Houses, and Republicans should not spend all their time holding symbolic repeal votes on the Obama Administration’s accomplishments. The lame-duck votes in December on “Don’t Ask, Don’t

Tell,” the tax cut deal and START indicate that at least a few Republicans are willing to work together with Democrats in a divided

Congress, and that is precisely what nuclear energy

needs moving forward. It will require an aggressive push from

the White House, and a concerted effort from both parties’ leadership, but the road for forging bipartisan legislation is not an impassable one.

Obama would get drawn in

Trembath 2/4/11

(Alex, Policy Fellow in AEL’s New Energy Leaders Project, “Nuclear Power and the Future of Post-Partisan Energy Policy,” http://leadenergy.org/2011/02/the-nuclear-option-in-a-post-partisan-approach-on-energy/)

The politician with perhaps the single greatest leverage over the future of nuclear energy is President Obama, and his rhetoric matches the challenge posed by our aging and poisonous energy infrastructure. “This is our generation’s Sputnik moment,” announced Obama recently. Echoing the calls of presidents past, the President used his State of the Union podium to signal a newly invigorated industrialism in the United States. He advocated broadly for renewed investment in infrastructure, education, and technological innovation. And he did so in a room with many more members of the opposition party than at any point during the first half of his term. The eagerness of the President to combine left and right agendas can hopefully match the hyper-partisan bitterness that dominates our political culture, and nuclear power may be one sector of our economy to benefit from his political leadership.

Congress has historically voted against increasing incentives

Gale 9

(Kelley Michael Gale, Finance Department Chair of Latham and Watkins San Diego, Global Co-Chair for the firm’s Climate Change and Cleantech Practice Groups, “Financing the Nuclear Renaissance: The Benefits and Potential Pitfalls of Federal and State Government Subsidies and the Future of Nuclear Power in California,” Energy Law Journal, Vol. 30.497)

President Barack Obama has stated “[n]uclear power represents more than 70 percent of our noncarbon generated electricity . . . It is unlikely that we can meet our aggressive climate goals if we eliminate nuclear power as an option. However, before an expansion of nuclear power can be considered, key issues must be addressed including: security of nuclear fuel and waste, waste storage, and proliferation.” Larry West, Election 2008: Barack Obama on Nuclear Energy, ABOUT.COM, http://environment.about.com/od/environmentallawpolicy/a/obama_nuclear.htm (last visited Sept. 1, 2009). Secretary of Energy Steven Chu stated in a speech to the Western Governors’ Association that “[n]uclear has to be part of the mix. . . . It’s clean, baseload power.” He also noted the problems of waste and safety, but stated that such problems can be solved. Patty Henetz, Energy Secretary Sees Nuclear Power In America’s Future, SALT LAKE TRIBUNE, June 16, 2009, available at http://www.sltrib.com/news/ci_12595919. Exemplifying policymakers’ hesitance to support nuclear energy as part of a green energy policy, proposed amendments to increase incentives for nuclear power were voted down by the Senate Energy committee, including an amendment proposing to include nuclear power as renewable energy for purposes of renewable portfolio standards. Ayesha Rascoe, U.S. Lawmakers Seek More Nuclear Power in Bill, REUTERS, June 5, 2009, available at http://www.energybusinessreview.com/news/us_lawmakers_seek_more_nuclear_power_in_bill_090605. Also, despite significant funding for and emphasis placed on renewable energy in the 2009 American Recovery and Reinvestment Act, funding for nuclear power was entirely eliminated. See “Nuclear Pork” Cut Out of Final Recovery and Reinvestment Package, ENVTL. NEWS SERV., Feb. 12, 2009, available at http://www.ens-newswire.com/ens/feb2009/2009-02-12-094.asp (quoting Kevin Kamps of Beyond Nuclear who helped lead the campaign on Capitol Hill to cut nuclear money from the stimulus as saying “nuclear energy cannot solve the climate crisis and fattening the nuclear calf has deprived real energy solutions like renewable energy and energy efficiency programs from essential support for decades”). Even many policymakers who express support for nuclear energy also express reservations about its use. See, e.g., Kent Garber, Gauging the Prospects for Nuclear Power in the Obama Era, US NEWS AND WORLD REPORT, Mar. 27, 2009, available at http://www.usnews.com/articles/news/energy/2009/03/27/gauging-the-prospects-for-nuclear-power-in-the-obama-era.html (suggesting that even within the Obama administration which states it supports nuclear energy, there are doubts about whether these statements are actually meant and “[e]ven Democrats are arguing among themselves over how much to support nuclear energy”).

NRC Licensing Turn

Lowering restrictions on licenses undermines the NRC model – expert consensus

Physicians for Social Responsibility, quotes a wreck of qualified people, 6/23/2010

(“Experts warn proposed climate/energy legislation would deregulate new nuclear reactors in much the same way that oil drilling oversight was ‘streamlined’ before BP spill,” http://www.psr.org/nuclear-bailout/resources/062310-release.pdf)

As the industry’s proponents in Congress tout the nuclear regulatory structure as superior to that used for oil drilling and even as a possible model for oversight of the petrochemical industry, the same individuals are quietly working behind the scenes to push through BP-like regulatory rollbacks that would dramatically reshape safety and environmental requirements for new reactors. These provisions might be incorporated into a climate bill or a narrower “energy-only” bill that could be voted on by the U.S. Senate as early as next month. Leading experts are worried that these little-discussed provisions in proposed climate/energy legislation would further undermine Nuclear Regulatory Commission (NRC) safety reviews for new reactors by truncating the licensing process for new reactors, scaling back environmental impact reviews, and limiting public involvement in reactor licensing decisions. These measures would relax the healthy pressure that nuclear reactor neighbors can put on regulators to serve the public’s interest first, rather than that of the industry. (See details below on the proposed legislative provisions that set the stage for the BP-style deregulation of the nuclear industry.)

Extinction

Shultz, former U.S. Secretary of State and PhD in industrial economics, et al, 10/2/2012

(George, and Sidney Drell – PhD in physics, arms control specialist and senior fellow at the Hoover Institution at Stanford University and a professor of theoretical physics emeritus at Stanford’s SLAC National Accelerator Laboratory, and Steven P. Andreasen – lecturer at the Humphrey School of Public Affairs at the University of Minnesota, “Reducing the Global Nuclear Risk,” October 2, 2012, Policy Review, No. 175, Hoover Institution)

We begin with the most reassuring outcome of our deliberations: It’s the sense generally held that the U.S. nuclear enterprise currently meets very high standards in its commitment to safety and security. That has not always been the case in all aspects of the U.S. nuclear enterprise. But safety begins at home, and while the U.S. will need to remain focused to guard against nuclear risks, the picture here looks relatively good. Our greatest concern is that the same cannot be said of the nuclear enterprise globally. Governments, international organizations, industry, and media must recognize and address the nuclear challenges and mounting risks posed by a rapidly changing world. The biggest concerns with nuclear safety and security are in countries relatively new to the nuclear enterprise, and the potential loss of control to terrorist or criminal gangs of the fissile material that exists in such abundance around the world. In a number of countries, confidence in civil nuclear energy production was severely shaken in the spring of 2011 by the Fukushima nuclear reactor plant disaster. And in the military sphere, the doctrine of deterrence that remains primarily dependent on nuclear weapons is seen in decline due to the importance of nonstate actors such as al-Qaeda and terrorist affiliates that seek destruction for destruction’s sake. We have two nuclear tigers by the tail. When risks and consequences are unknown, undervalued, or ignored, our nation and the world are dangerously vulnerable. Nowhere is this risk/consequence calculation more relevant than with respect to the nucleus of the atom.

From Hiroshima to Fukushima

The nuclear enterprise was introduced to the world by the shock of the devastation produced by two atomic bombs hitting Hiroshima and Nagasaki. Modern nuclear weapons are far more powerful than those early bombs, which presented their own hazards. Early research depended on a program of atmospheric testing of nuclear weapons. In the early years following World War II, the impact and the amount of radioactive fallout in the atmosphere generated by above-ground nuclear explosions was not fully appreciated. During those years, the United States and the Soviet Union conducted several hundred tests in the atmosphere that created fallout.

A serious regulatory weak point from that time still exists in many places today, as the Fukushima disaster clearly indicates. The U.S. Atomic Energy Commission (AEC) was initially assigned conflicting responsibilities: to create an arsenal of nuclear weapons for the United States to confront a growing nuclear-armed Soviet threat; and, at the same time, to ensure public safety from the effects of radioactive fallout. The AEC was faced with the same conundrum with regard to civilian nuclear power generation. It was charged with promoting civilian nuclear power and simultaneously protecting the public.

Progress came in 1963 with the negotiation and signing of the Limited Test Ban Treaty (LTBT) banning all nuclear explosive testing in the atmosphere (initially by the United States, the Soviet Union, and the United Kingdom). With the successful safety record of the U.S. nuclear weapons program, domestic anxiety about nuclear weapons receded somewhat. Meanwhile, public attitudes toward nuclear weapons reflected recognition of their key role in establishing a more stable nuclear deterrent posture in the confrontation with the Soviet Union.

The positive record on safety of the nuclear weapons enterprise in the United States — there have been accidents involving nuclear weapons, but none that led to the release of nuclear energy — was the result of a strong effort and continuing commitment to include safety as a primary criterion in new weapons designs, as well as careful production, handling, and deployment procedures. The key to the health of today’s nuclear weapons enterprise is confidence in the safety of its operations and in the protection of special nuclear materials against theft. One can imagine how different the situation would be today if there had been a recognized theft of material sufficient for a bomb, or if one of the two four-megaton bombs that fell from a disabled B-52 Strategic Air Command bomber overflying Goldsboro, North Carolina, in 1961 had detonated. In that event, a single switch in the arming sequence of one of the bombs, by remaining in its “off” position while the aircraft was disintegrating, was all that prevented a full-yield nuclear explosion. A close call indeed.

In the 26 years since the meltdown of the nuclear reactor at Chernobyl in Soviet-era Ukraine, the nuclear power industry has strengthened its safety practices. Over the past decade, growing concerns about global warming and energy independence have actually strengthened support for nuclear energy in the United States and many nations around the world. Yet despite these trends, the civil nuclear enterprise remains fragile. Following Fukushima, opinion polls gave stark evidence of the public’s deep fears of the invisible force of nuclear radiation, shown by public opposition to the construction of new nuclear power plants in close proximity. It is not simply a matter of getting better information to the public but of actually educating the public about the true nature of nuclear radiation and its risks. Of course, the immediate task of the nuclear power component of the enterprise is to strive for the best possible safety record. The overriding objective could not be more clear: no more Fukushimas.

Another issue that must be resolved involves the continued effectiveness of a policy of deterrence that remains primarily dependent on nuclear weapons, and the hazards these weapons pose due to the spread of nuclear technology and material. There is growing apprehension about the determination of terrorists to get their hands on weapons or, for that matter, on the special nuclear material — plutonium and highly enriched uranium — that fuels them in the most challenging step toward developing a weapon.

The global effects of a regional war between nuclear-armed adversaries such as India and Pakistan would also wield an enormous impact, potentially involving radioactive fallout at large distances caused by a limited number of nuclear explosions. This is true as well for nuclear radiation from a reactor explosion — fallout at large distances would have a serious societal impact on the nuclear enterprise. There is little understanding of the reality and potential danger of consequences if such an event were to occur halfway around the world. An effort should be made to prepare the public by providing information on how to respond to such an event.

An active nuclear diplomacy has grown out of the Cold War efforts to regulate testing and reduce superpower nuclear arsenals. There is now a welcome focus on rolling back nuclear weapons proliferation. Additional important measures include the Nunn-Lugar program, started in 1991 to reduce the nuclear arsenal of the former Soviet Union. Such initiatives have led to greater investment by the United States and other governments in better security for nuclear weapons and material globally, including billions of dollars through the G8 Global Partnership Against the Spread of Weapons and Materials of Mass Destruction. The commitment to improving security of all dangerous nuclear material on the globe within four years was made by 47 world leaders who met with President Obama in Washington, D.C., in April 2010; this commitment was reconfirmed in March 2012 at the Nuclear Security Summit in Seoul, South Korea. Many specific commitments made in 2010 relating to the removal of nuclear materials and conversion of nuclear research reactors from highly enriched uranium to low-enriched uranium fuel have already been accomplished, along with increasing levels of voluntary commitments from a diverse set of states, improving prospects for achieving the four-year goal.

Three principles

It is evident that globally, the nuclear enterprise faces new and increasingly difficult challenges. Successful leadership in national security policy will require a continuous, diligent, and multinational assessment of these newly emerging risks and consequences. In view of the seriousness of the potentially deadly consequences associated with nuclear weapons and nuclear power, we emphasize the importance of three guiding principles for efforts to reduce those risks globally:

First, the calculations used to assess nuclear risks in both the military and the civil sectors are fallible. Accurately analyzing events where we have little data, identifying every variable associated with risk, and the possibility of a single variable that goes dangerously wrong are all factors that complicate risk calculations. Governments, industry, and concerned citizens must constantly reexamine the assumptions on which safety and security measures, emergency preparations, and nuclear energy production are based. When dealing with very low-probability and high-consequence operations, we typically have little data as a basis for making quantitative analyses. It is therefore difficult to assess the risk of a nuclear accident and what would contribute to it, and to identify effective steps to reduce that risk. In this context, it is possible that a single variable could exceed expectations, go dangerously wrong, and simply overwhelm safety systems and the risk assessments on which those systems were built. This is what happened in 2011 when an earthquake, followed by a tsunami — both of which exceeded expectations based on history — overwhelmed the Fukushima complex, breaching a number of safeguards that had been built into the plant and triggering reactor core meltdowns and radiation leaks. This in turn exposed the human factor, which is hard to assess and can dramatically change the risk equation. Cultural habits and regulatory inadequacy inhibited rapid decision-making and crisis management in the Fukushima disaster. A more nefarious example of the human factor would be a determined nuclear terrorist attack specifically targeting either the military or civilian component of the nuclear enterprise.

Second, risks associated with nuclear weapons and nuclear power will likely grow substantially as nuclear weapons and civilian nuclear energy production technology spread in unstable regions of the world where the potential for conflict is high. States that are new to the nuclear enterprise may not have effective nuclear safeguards to secure nuclear weapons and materials — including a developed fabric of early warning systems and nuclear confidence-building measures that could increase warning and decision time for leaders in a crisis — or the capability to safely manage and regulate the construction and operation of new civilian reactors. Hence there is a growing risk of accidents, mistakes, or miscalculations involving nuclear weapons, and of regional wars or nuclear terrorism. The consequences would be horrific: A Hiroshima-size nuclear bomb detonated in a major city could kill a half-million people and result in $1 trillion in direct economic damage.

On the civil side of the nuclear ledger, the sobering paradox is this: While an accident would be considerably less devastating than the detonation of a nuclear weapon, the risk of an accident occurring is probably higher. Currently, 1.4 billion people live without electricity, and by 2030 the global demand for energy is projected to rise by about 25 percent. With the added need to minimize carbon emissions, nuclear power reactors will become increasingly attractive alternative sources for electric power, especially for developing nations. These countries, in turn, will need to meet the challenge of developing appropriate governmental institutions and the infrastructure, expertise, and experience to support nuclear power efforts with a suitably high standard of safety. As the world witnessed in Fukushima, a nuclear power plant accident can lead to the spread of dangerous radiation, massive civil dislocations, and billions of dollars in cleanup costs. Such an event can also fuel widespread public skepticism about nuclear institutions and technology. Some developed nations — notably Germany — have interpreted the Fukushima accident as proof that they should abandon nuclear power altogether, primarily by prolonging the life of existing nuclear reactors while phasing out nuclear-produced electricity and developing alternative energy sources.

Third, we need to understand that no nation is immune from risks involving nuclear weapons and nuclear power within their borders. There were 32 so-called “Broken Arrow” accidents — nuclear accidents that do not pose a danger of an outbreak of nuclear war — involving U.S. weapons between 1950 and 1980, mostly involving U.S. Strategic Air Command bombers and earlier bomb designs not yet incorporating modern nuclear detonation safety designs. The U.S. no longer maintains a nuclear-armed, in-air strategic bomber force, and the record of incidents is greatly reduced. In several cases, accidents such as the North Carolina bomber incident came dangerously close to triggering catastrophes, with disaster averted simply by luck. The United States has had an admirable safety record in the area of civil nuclear power since the 1979 Three Mile Island accident in Pennsylvania, yet safety concerns persist. One of the critical assumptions in the design of the Fukushima reactor complex was that, if electrical power was lost at the plant and back-up generators failed, power could be restored within a few hours. The combined one-two punch of the earthquake and tsunami, however, made the necessary repairs impossible. In the United States today, some nuclear power reactors are designed with a comparably short window for restoring power. After Fukushima, this is an issue that deserves action — especially in light of our own Hurricane Katrina experience, which rendered many affected areas inaccessible for days in 2005, and the August 2011 East Coast earthquake that shook the North Anna nuclear power plant in Mineral, Virginia, beyond expectations based on previous geological activity.

Reducing risks

To reduce these nuclear risks, we offer four related recommendations that should be adopted by the nuclear enterprise, both military and civilian, in the United States and abroad. First, the reduction of nuclear risks requires every level of the nuclear enterprise and related military and civilian organizations to embrace the importance of safety and security as an overarching operating rule. This is not as easy as it sounds. To a war fighter, more safety and control can mean less reliability and availability and greater costs. For a company or utility involved in the construction or operation of a nuclear power plant, more safety and security can mean greater regulation and higher costs. But the absence of a culture of safety and security, in which priorities and meaningful standards are set and rigorous discipline and accountability are enforced, is perhaps the most reliable indicator of an impending disaster. In August 2007, after a B-52 bomber loaded with six nuclear-tipped cruise missiles flew from North Dakota to Louisiana without anyone realizing there were live weapons on board, then Secretary of Defense Robert Gates fired both the military and civilian heads of the U.S. Air Force. His action was an example of setting the right priorities and enforcing accountability, but the reality of the incident shows that greater incorporation of a safety and security culture is needed.

Second, independent regulation of the nuclear enterprise is crucial to setting and enforcing the safety and security rule. In the United States today, the nuclear regulatory system — in particular, the Nuclear Regulatory Commission (NRC) — is credited with setting a uniquely high standard for independent regulation of the civil nuclear power sector. This is one of the keys to a successful and safe nuclear program. Effective regulation is even more crucial when there are strong incentives to keep operating costs down and keep an aging nuclear reactor fleet in operation, a combination that could create conditions for a catastrophic nuclear power plant failure. Careful attention is required to protect the NRC from regulatory capture by vested interests in government and industry, the latter of which funds a high percentage of the NRC’s budget. In too many countries, strong, independent regulatory agencies are not the norm. The independent watchdog organization advising the Japanese government was working with Japanese utilities to influence public opinion in favor of nuclear power. Strengthening the International Atomic Energy Agency (IAEA) so that it can play a greater role in civil nuclear safety and security would also help reduce risks, and will require substantially greater authorities to address both safety and security, and most importantly, more resources for an agency whose budget is only €333 million, with only one-tenth of that total devoted to nuclear safety and security. In addition, exporting “best practices” of the U.S. Nuclear Regulatory Commission — that is, lessons of nuclear regulation, oversight, and safety learned over many decades — to other countries would pay a huge safety dividend.

IAEA Turn

Plan collapses IAEA effectiveness and credibility

Trevor Findlay, Senior Fellow at Centre for International Governance Innovation and Director of the Canadian Centre for Treaty Compliance; Professor at the Norman Paterson School of International Affairs, June 2012, UNLEASHING THE NUCLEAR WATCHDOG: strengthening and reform of the IAEA, http://www.cigionline.org/sites/default/files/IAEA_final_0.pdf

If the much-heralded nuclear energy revival ever comes to fruition, increased numbers of research and power reactors, additional nuclear trade and transport and moves by more states to acquire the full nuclear fuel cycle will require increased IAEA safeguards capacity and spending (Findlay, 2010a). In respect of existing types of facilities, this will simply require more Agency resources and personnel. With regard to new types of reactors and facilities, it will require new safeguards approaches. The Agency has already been encouraging plant designers to consider “safeguards by design” and in 2010, interacted with Canada, Finland and Sweden on these issues (IAEA, 2011cc: 5). The Agency is also preparing for safeguarding new types of non-reactor facilities such as geological repositories for spent fuel and nuclear waste, pyro-processing plants (currently under consideration by South Korea) and laser enrichment facilities (IAEA, 2011f: 5). It may also awaken a “sleeper” issue that has long exercised the sharpest critics of safeguards: the fact that the current system cannot provide sufficient timely warning of non-diversion of fissionable material from bulk-handling facilities, such as those involved in uranium enrichment, plutonium reprocessing and fuel fabrication (discussed above). If a nuclear energy revival permits increasing numbers of NNWS to acquire such facilities, the safeguards system risks losing its credibility. Through its International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO), and in cooperation with the NEA’s GIF, the Agency is helping assess the proliferation resistance of different nuclear energy systems. Following the success of the INPRO Collaborative Project on Proliferation Resistance: Acquisition/Diversion Pathways Analysis, which concluded in 2010, a new project on Proliferation Resistance and Safeguardability Assessment Tools — or PROSA — was launched by INPRO in 2012 to develop a coordinated set of methodologies using both the INPRO and GIF experiences (IAEA, 2012b).

Causes nuclear accidents—takes out aff solvency

Allison and Sreenivasan 8

Graham Allison, Harvard Belfer Center for International Affairs and Science Director, JFK Government Professor, and T.P. Sreenivasan, Former UN and Vienna Ambassador, Brookings Visiting Fellow, May 2008, IAEA Commissioned Independent Report prepared by a panel of 22 nonproliferation experts, Allison and Sreenivasan Executive Directors, Reinforcing the Global Nuclear Order for Peace and Prosperity: The Role of the IAEA to 2020 and Beyond, http://belfercenter.ksg.harvard.edu/publication/18333/reinforcing_the_global_nuclear_order_for_peace_and_prosperity.html?breadcrumb=%2Fproject%2F3%2Fmanaging_the_atom

Looking ahead, if the number of nuclear power plants around the world is to grow substantially without increasing the total risk of a nuclear accident, the risk of an accident at any given reactor must continue to be reduced. As additional countries build nuclear power plants, it is essential that they establish strong safety measures, including competent, effective, and independent national regulators. The world is still a long way from a regime of mandatory, effective global safety standards and comprehensive reviews of performance in meeting them. The IAEA roles in maintaining and continuously improving the global safety regime that emerged after Chernobyl are particularly critical, and must continue to be strengthened and expanded to ensure nuclear safety and protection from radio-toxicity. For example, IAEA’s comprehensive reviews of performance in meeting safety standards should be expanded so as to cover all the world’s operating reactors and nuclear installations, including research reactors and fuel-cycle facilities.

Extinction

Stephen Lendman, The Peoples Voice, 3/12/11, Nuclear Meltdown in Japan, www.thepeoplesvoice.org/TPV3/Voices.php/2011/03/13/nuclear-meltdown-in-japan

Reuters said the 1995 Kobe quake caused $100 billion in damage, up to then the most costly ever natural disaster. This time, from quake and tsunami damage alone, that figure will be dwarfed. Moreover, under a worst case core meltdown, all bets are off as the entire region and beyond will be threatened with permanent contamination, making the most affected areas unsafe to live in. On March 12, Stratfor Global Intelligence issued a "Red Alert: Nuclear Meltdown at Quake-Damaged Japanese Plant," saying: Fukushima Daiichi "nuclear power plant in Okuma, Japan, appears to have caused a reactor meltdown." Stratfor downplayed its seriousness, adding that such an event "does not necessarily mean a nuclear disaster," that already may have happened - the ultimate nightmare short of nuclear winter. According to Stratfor, "(A)s long as the reactor core, which is specifically designed to contain high levels of heat, pressure and radiation, remains intact, the melted fuel can be dealt with. If the (core's) breached but the containment facility built around (it) remains intact, the melted fuel can be....entombed within specialized concrete" as at Chernobyl in 1986. In fact, that disaster killed nearly one million people worldwide from nuclear radiation exposure. In their book titled, "Chernobyl: Consequences of the Catastrophe for People and the Environment," Alexey Yablokov, Vassily Nesterenko and Alexey Nesterenko said: "For the past 23 years, it has been clear that there is a danger greater than nuclear weapons concealed within nuclear power. Emissions from this one reactor exceeded a hundred-fold the radioactive contamination of the bombs dropped on Hiroshima and Nagasaki." "No citizen of any country can be assured that he or she can be protected from radioactive contamination. One nuclear reactor can pollute half the globe. Chernobyl fallout covers the entire Northern Hemisphere." Stratfor explained that if Fukushima's floor cracked, "it is highly likely that the melting fuel will burn through (its) containment system and enter the ground. This has never happened before," at least not reported. If now occurring, "containment goes from being merely dangerous, time consuming and expensive to nearly impossible," making the quake, aftershocks, and tsunamis seem mild by comparison. Potentially, millions of lives will be jeopardized. Japanese officials said Fukushima's reactor container wasn't breached. Stratfor and others said it was, making the potential calamity far worse than reported. Japan's Nuclear and Industrial Safety Agency (NISA) said the explosion at Fukushima's Saiichi No. 1 facility could only have been caused by a core meltdown. In fact, 3 or more reactors are affected or at risk. Events are fluid and developing, but remain very serious. The possibility of an extreme catastrophe can't be discounted. Moreover, independent nuclear safety analyst John Large told Al Jazeera that by venting radioactive steam from the inner reactor to the outer dome, a reaction may have occurred, causing the explosion. "When I look at the size of the explosion," he said, "it is my opinion that there could be a very large leak (because) fuel continues to generate heat." Already, Fukushima way exceeds Three Mile Island that experienced a partial core meltdown in Unit 2. Finally it was brought under control, but coverup and denial concealed full details until much later. According to anti-nuclear activist Harvey Wasserman, Japan's quake fallout may cause nuclear disaster, saying: "This is a very serious situation. If the cooling system fails (apparently it has at two or more plants), the super-heated radioactive fuel rods will melt, and (if so) you could conceivably have an explosion," that, in fact, occurred. As a result, massive radiation releases may follow, impacting the entire region. "It could be, literally, an apocalyptic event. The reactor could blow." If so, Russia, China, Korea and most parts of Western Asia will be affected. Many thousands will die, potentially millions under a worse case scenario, including far outside East Asia. Moreover, at least five reactors are at risk. Already, a 20-mile wide radius was evacuated. What happened in Japan can occur anywhere. Yet Obama's proposed budget includes $36 billion for new reactors, a shocking disregard for global safety. Calling Fukushima an "apocalyptic event," Wasserman said "(t)hese nuclear plants have to be shut," let alone budget billions for new ones. It's unthinkable, he said. If a similar disaster struck California, nuclear fallout would affect all America, Canada, Mexico, Central America, and parts of South America.

Nuclear Power: A Technology from Hell

Nuclear expert Helen Caldicott agrees, telling this writer by phone that a potential regional catastrophe is unfolding. Over 30 years ago, she warned of its inevitability. Her 2006 book titled, "Nuclear Power is Not the Answer" explained that contrary to government and industry propaganda, even during normal operations, nuclear power generation causes significant discharges of greenhouse gas emissions, as well as hundreds of thousands of curies of deadly radioactive gases and other radioactive elements into the environment every year. Moreover, nuclear plants are atom bomb factories. A 1000 megawatt reactor produces 500 pounds of plutonium annually. Only 10 are needed for a bomb able to devastate a large city, besides causing permanent radiation contamination.

Nuclear Power not Cleaner and Greener

Just the opposite, in fact. Although a nuclear power plant releases no carbon dioxide (CO2), the primary greenhouse gas, a vast infrastructure is required. Called the nuclear fuel cycle, it uses large amounts of fossil fuels. Each cycle stage exacerbates the problem, starting with the enormous cost of mining and milling uranium, needing fossil fuel to do it. How then to dispose of mill tailings, produced in the extraction process. It requires great amounts of greenhouse emitting fuels to remediate. Moreover, other nuclear cycle steps also use fossil fuels, including converting uranium to hexafluoride gas prior to enrichment, the enrichment process itself, and conversion of enriched uranium hexafluoride gas to fuel pellets. In addition, nuclear power plant construction, dismantling and cleanup at the end of their useful life require large amounts of energy. There's more, including contaminated cooling water, nuclear waste, its handling, transportation and disposal/storage, problems so far unresolved. Moreover, nuclear power costs and risks are so enormous that the industry couldn't exist without billions of government subsidized funding annually.

The Unaddressed Human Toll from Normal Operations

Affected are uranium miners, industry workers, and potentially everyone living close to nuclear reactors that routinely emit harmful radioactive releases daily, harming human health over time, causing illness and early death. The link between radiation exposure and disease is irrefutable, depending only on the amount of cumulative exposure over time, Caldicott saying: "If a regulatory gene is biochemically altered by radiation exposure, the cell will begin to incubate cancer, during a 'latent period of carcinogenesis,' lasting from two to sixty years." In fact, a single gene mutation can prove fatal. No amount of radiation exposure is safe. Moreover, when combined with about 80,000 commonly used toxic chemicals and contaminated GMO foods and ingredients, it causes 80% of known cancers, putting everyone at risk everywhere. Further, the combined effects of allowable radiation exposure, uranium mining, milling operations, enrichment, and fuel fabrication can be devastating to those exposed. Besides the insoluble waste storage/disposal problem, nuclear accidents happen and catastrophic ones are inevitable.

Inevitable Meltdowns

Caldicott and other experts agree they're certain in one or more of the hundreds of reactors operating globally, many years after their scheduled shutdown dates unsafely. Combined with human error, imprudently minimizing operating costs, internal sabotage, or the effects of a high-magnitude quake and/or tsunami, an eventual catastrophe is certain. Aging plants alone, like Japan's Fukushima facility, pose unacceptable risks based on their record of near-misses and meltdowns, resulting from human error, old equipment, shoddy maintenance, and poor regulatory oversight. However, under optimum operating conditions, all nuclear plants are unsafe. Like any machine or facility, they're vulnerable to breakdowns, that if serious enough can cause enormous, possibly catastrophic, harm. Add nuclear war to the mix, also potentially inevitable according to some experts, by accident or intent, including Steven Starr saying: "Only a single failure of nuclear deterrence is required to start a nuclear war," the consequences of which "would be profound, potentially killing "tens of millions of people, and caus(ing) long-term, catastrophic disruptions of the global climate and massive destruction of Earth's protective ozone layer. The result would be a global nuclear famine that could kill up to one billion people." Worse still is nuclear winter, the ultimate nightmare, able to end all life if it happens. It's nuclear proliferation's unacceptable risk, a clear and present danger as long as nuclear weapons and commercial dependency exist.

2NC IAEA Link

SMRs destroy the IAEA’s resources and effectiveness

Lyman 11

(Edwin, Senior Scientist, Global Security Program, Union of Concerned Scientists, June 7, 2011, Testimony on S. 512, “The Nuclear Power 2021 Act” and S. 1067, “The Nuclear Energy Research Initiative Improvement Act of 2011,” Before the Committee on Energy and Natural Resources, U.S. Senate, http://www.ucsusa.org/assets/documents/nuclear_power/lyman-testimony-06-07-2011.pdf)

The distributed deployment of small reactors would put great strains on licensing and inspection resources. Nuclear reactors are qualitatively different from other types of generating facilities, not least because they require a much more intensive safety and security inspection regime. Similarly, deployment of individual small reactors at widely distributed and remote sites around the world would strain the resources of the International Atomic Energy Agency (IAEA) and its ability to adequately safeguard reactors to guard against proliferation, since IAEA inspectors would need to visit many more locations per installed megawatt around the world. Maintaining robust oversight over vast networks of SMRs around the world would be difficult, if even feasible.

Nuke Leadership Bad

1NC Prolif Turn

Maintaining trade through weak nuclear agreements solves prolif—shift to restrictive agreements scuttles everything

NEI 12

Nuclear Energy Institute, May 2012, Issues in Focus: Nuclear Energy Exports and Nonproliferation, www.nei.org/resourcesandstats/documentlibrary/newplants/whitepaper/issues-in-focus-nuclear-energy-exports-and-nonproliferation

These imperatives are inextricably linked. To maintain U.S. influence over global nonproliferation policy and international nuclear safety, the U.S. commercial nuclear energy sector must participate in the rapidly expanding global market for nuclear energy technologies (439 commercial nuclear reactors in operation around the world, 65 under construction, 162 planned or on order). Without U.S. commercial engagement, the United States would have substantially diminished influence over other nations’ nonproliferation policies and practices. U.S. technology and U.S. industry are a critical engine that drives U.S. nonproliferation policies. A successful nuclear trade and export policy must be a partnership between government and industry. A Section 123 Agreement is a prerequisite for U.S. commercial nuclear exports. It also promotes U.S. nonproliferation interests. Section 123 Agreements already include provisions governing enrichment and reprocessing of U.S.-controlled nuclear material, including a prohibition on enrichment or reprocessing without prior U.S. consent. Any effort in U.S. 123 agreements to impose additional restrictions on enrichment and/or reprocessing of nuclear material controlled by other countries is seen by many countries as an overreach by the United States. It would be counterproductive to require other nations to forswear enrichment and reprocessing in order to execute a Section 123 agreement with the United States. Most nations would refuse to do so, and would simply turn to other commercial nuclear suppliers – France, Russia and others that do not impose such requirements. Without a Section 123 agreement, the United States cannot engage in commercial nuclear trade, and thus has substantially diminished influence over nonproliferation. Unilateral requirements, imposed in the name of nonproliferation, could have the perverse effect of undermining U.S. influence over nonproliferation policy.

1NC Vietnam DA

Kills the Vietnam agreement

NEI 12

Nuclear Energy Institute, June 2012, H.R. 1280: A Misguided Attempt to Control Enrichment and Reprocessing Technologies, http://www.nei.org/resourcesandstats/documentlibrary/newplants/whitepaper/white-paper--hr-1280-a-misguided-attempt-to-controlenrichment-and-reprocessing-technologies

The H.R. 1280 report states that there is “no evidence to support the concern” that U.S. suppliers would be disadvantaged by the requirement for countries to forswear E&R as a condition for U.S. nuclear cooperation. But the cases of Vietnam and Jordan suggest otherwise: it is not clear that these states will accept the same restrictions found in the U.S.-UAE agreement. With negotiations for U.S. cooperation long stalled over E&R concerns, both countries’ nuclear energy programs have moved ahead in partnership with non-U.S. suppliers. Unlike UAE, Jordan possesses sizeable uranium reserves—around 200,000 tons—and has expressed an interest in eventually enriching fuel for export to international markets. And while Jordan had reportedly considered some E&R limits in 2011 negotiations with the U.S., Dr. Khaled Toukan, head of the Jordanian Atomic Energy Commission, has now publicly stated opposition to “restrictions outside of the NPT on a regional basis or a country-by-country basis.”6 He has also criticized the UAE commitment, saying that country “has relinquished all of its NPT rights to sensitive nuclear technology indefinitely. Why should we give up our rights?”7 It is unclear whether Jordan will ultimately accept E&R restrictions in exchange for a U.S. nuclear cooperation agreement; but the country is clearly not waiting for such cooperation to move ahead with its civil nuclear aspirations. Jordan has nearly finished the technology selection process for its first nuclear power plant, a 1,000-megawatt reactor due in service in 2019. A Japanese-French consortium, as well as Russian and Canadian groups, are seeking to win that bid, while South Korea has loaned Jordan $70 million to help fund a 5-megawatt nuclear research reactor worth $130 million. Meanwhile, Jordan has granted France's AREVA exclusive rights over the next 25 years to mine uranium in the country’s central region. Although Vietnam has indicated that it has no plans to develop E&R capabilities, sources close to its negotiations with the U.S. say Vietnam has so far chosen not to renounce E&R rights in exchange for a U.S. nuclear agreement. As with Jordan, Vietnam has sought alternatives to U.S. cooperation, including a $5.6-billion deal with Russia in late 2010 to build two 1,000-MW VVER reactors. Russia will also supply the fuel for the reactors and handle its removal and reprocessing. In 2011, Vietnam concluded a separate reactor deal with Japan for the supply of two additional reactors.

123 agreement key to strategic cooperation—key to check China flare-ups in the South China Sea

Jha 10

Saurav Jha, The Diplomat and World Politics Review contributor, 9/15/10, Why a US-Vietnam Nuclear Deal?, thediplomat.com/2010/09/15/why-a-us-vietnam-nuclear-deal/?all=true

But the engagement with Vietnam that the visit also demonstrated goes deeper than just this show of force—Washington is looking to move beyond symbolism to engage in a genuine strategic partnership, the cornerstone of which will be the US-Vietnam 123 nuclear cooperation agreement. Unsurprisingly, the deal has already riled China and non-proliferation proponents alike, who note that the deal being offered to Vietnam is devoid of the standard strings that have characterised other deals with emerging nuclear nations, including the United Arab Emirates. Most notably, Vietnam won’t have to abandon having the option to carry out nuclear fuel cycle activities on its territory as the UAE had to. This means that Vietnam, can, at least hypothetically, establish enrichment and reprocessing (ENR) facilities in its territory. Of course, the agreement doesn’t mean that the United States is about to transfer any ENR technology to Vietnam—or that the latter is in any hurry to set up its own such facilities. As Vuong Huu Tan, president of the government-affiliated Vietnam Atomic Energy Institute, has noted: ‘Vietnam doesn’t intend to enrich as of now because of expensive and very sensitive technology.’ ENR technology is anyway a closely guarded secret that only a handful of countries have the capacity to exploit on an industrial scale. But while any country with a nuclear energy programme would typically like to retain a certain degree of independence—and the NPT actually entitles all of its members to engage in full nuclear co-operation—the reality for many is that commercial and proliferation sensitivities have prompted various restrictions and regimes to be put in place denying them any such technology. In addition, such activities are simply prohibitively expensive for small and mid-sized nuclear estates. Yet while the UAE’s willingness to forsake fuel cycle activity on its own soil seemed to provide a gold standard Washington could use for its nuclear dealings, the nature of the Vietnam deal implies that a broader technological relationship could yet be crafted between Hanoi and Washington. With its industrial activity in the north of the country expanding rapidly, Vietnam has been prompted to explore nuclear power as a ‘clean’ way of meeting its growing electricity demands. But a 123 agreement with the United States is unlikely to stop at nuclear co-operation. As US Secretary of State, Hillary Clinton, said during her visit to Hanoi in July, ‘Ties between the two countries will be taken to the next level.’ What could this mean? Certainly US firms can be expected to play an increasing role in Vietnam’s industrial development, something that would likely necessitate a much broader deal than Washington has arranged with other countries. For example, a number of instrumentation technologies are classified as dual use by the US State Department, but will be required if Hanoi wishes to exploit its offshore hydrocarbon resources. Unlike in the Middle East, US oil majors aren’t already entrenched in Vietnam’s fossil fuel sector, and an excessively restrictive deal would adversely affect their ability to compete. Such differences mean that the Vietnam arrangement is more akin to the India nuclear deal than the one with the UAE, a point no more evident than at the strategic level. Indeed, although it’s on a quite different scale, the philosophy and rationale underpinning a US-Vietnam 123 are remarkably similar to the Indo-US nuclear deal. So why is the United States so interested in making an India-like exception to its nuclear arrangements with Vietnam? China. As another wary neighbour of China, Vietnam is a potentially sympathetic US partner in any attempts to keep expansionist Chinese ambitions in the South China Sea in check. With a long maritime tradition and a knack for military upsets (the Vietnamese have managed to defeat the French, Americans and Chinese on different occasions), combined with its very sizeable armed forces, Vietnam is potentially an indispensable ally in any possible regional flare-up.

sympathetic US partner in any attempts to keep expansionist Chinese ambitions in the S outh C hina S ea in check . With a long maritime tradition and a knack for military upsets (the Vietnamese have managed to defeat the French, Americans and Chinese on different occasions), combined with its very sizeable armed forces, Vietnam is potentially an indispensable ally in any possible regional flare-up .

Extinction

Wittner 11

(Lawrence S. Wittner, Emeritus Professor of History at the State University of New York/Albany, Wittner is the author of eight books, the editor or co-editor of another four, and the author of over 250 published articles and book reviews. From 1984 to 1987, he edited Peace & Change, a journal of peace research, 11/28/2011, "Is a Nuclear War With China Possible?", www.huntingtonnews.net/14446)

While nuclear weapons exist, there remains a danger that they will be used. After all, for centuries national conflicts have led to wars, with nations employing their deadliest weapons. The current deterioration of U.S. relations with China might end up providing us with yet another example of this phenomenon. The gathering tension between the United States and China is clear enough. Disturbed by China’s growing economic and military strength, the U.S. government recently challenged China’s claims in the South China Sea, increased the U.S. military presence in Australia, and deepened U.S. military ties with other nations in the Pacific region. According to Secretary of State Hillary Clinton, the United States was “asserting our own position as a Pacific power.” But need this lead to nuclear war? Not necessarily. And yet, there are signs that it could. After all, both the United States and China possess large numbers of nuclear weapons. The U.S. government threatened to attack China with nuclear weapons during the Korean War and, later, during the conflict over the future of China’s offshore islands, Quemoy and Matsu. In the midst of the latter confrontation, President Dwight Eisenhower declared publicly, and chillingly, that U.S. nuclear weapons would “be used just exactly as you would use a bullet or anything else.” Of course, China didn’t have nuclear weapons then. Now that it does, perhaps the behavior of national leaders will be more temperate. But the loose nuclear threats of U.S. and Soviet government officials during the Cold War, when both nations had vast nuclear arsenals, should convince us that, even as the military ante is raised, nuclear saber-rattling persists. Some pundits argue that nuclear weapons prevent wars between nuclear-armed nations; and, admittedly, there haven’t been very many—at least not yet. But the Kargil War of 1999, between nuclear-armed India and nuclear-armed Pakistan, should convince us that such wars can occur. Indeed, in that case, the conflict almost slipped into a nuclear war. Pakistan’s foreign secretary threatened that, if the war escalated, his country felt free to use “any weapon” in its arsenal. During the conflict, Pakistan did move nuclear weapons toward its border, while India, it is claimed, readied its own nuclear missiles for an attack on Pakistan. At the least, though, don’t nuclear weapons deter a nuclear attack? Do they? Obviously, NATO leaders didn’t feel deterred, for, throughout the Cold War, NATO’s strategy was to respond to a Soviet conventional military attack on Western Europe by launching a Western nuclear attack on the nuclear-armed Soviet Union. Furthermore, if U.S. government officials really believed that nuclear deterrence worked, they would not have resorted to championing “Star Wars” and its modern variant, national missile defense. Why are these vastly expensive—and probably unworkable—military defense systems needed if other nuclear powers are deterred from attacking by U.S. nuclear might? Of course, the bottom line for those Americans convinced that nuclear weapons safeguard them from a Chinese nuclear attack might be that the U.S. nuclear arsenal is far greater than its Chinese counterpart. Today, it is estimated that the U.S. government possesses over five thousand nuclear warheads, while the Chinese government has a total inventory of roughly three hundred. Moreover, only about forty of these Chinese nuclear weapons can reach the United States. Surely the United States would “win” any nuclear war with China. But what would that “victory” entail? A nuclear attack by China would immediately slaughter at least 10 million Americans in a great storm of blast and fire, while leaving many more dying horribly of sickness and radiation poisoning. The Chinese death toll in a nuclear war would be far higher. Both nations would be reduced to smoldering, radioactive wastelands. Also, radioactive debris sent aloft by the nuclear explosions would blot out the sun and bring on a “nuclear winter” around the globe—destroying agriculture, creating worldwide famine, and generating chaos and destruction. Moreover, in another decade the extent of this catastrophe would be far worse. The Chinese government is currently expanding its nuclear arsenal, and by the year 2020 it is expected to more than double its number of nuclear weapons that can hit the United States. The U.S. government, in turn, has plans to spend hundreds of billions of dollars “modernizing” its nuclear weapons and nuclear production facilities over the next decade. To avert the enormous disaster of a U.S.-China nuclear war, there are two obvious actions that can be taken. The first is to get rid of nuclear weapons, as the nuclear powers have agreed to do but thus far have resisted doing. The second, conducted while the nuclear disarmament process is occurring, is to improve U.S.-China relations. If the American and Chinese people are interested in ensuring their survival and that of the world, they should be working to encourage these policies.

1NC Russia DA

Kills the Russia agreement – impact is relations and Iran prolif

Young et al 10 – research associate at the Center for Nonproliferation Studies (Thomas, Cole Harvey, Ferenc Dalnoki-Veress, 12/21, "It's not just New START: Two other U.S.-Russian Nuclear Agreements Boost U.S.-Russian Reset," http://cns.miis.edu/stories/101221_nuclear_agreements.htm)

The nuclear cooperation agreement is perhaps as important as New START in carrying out the Obama administration's goal of resetting U.S.-Russian relations on a more cooperative path. Russia, still suffering from memories of the Chernobyl disaster and smarting from past tensions with Washington, gains a long-desired seal of approval from the United States as a responsible nuclear energy producer. It also "creates new commercial opportunities for Russian and American industry," as Secretary Poneman highlighted.[1] The U.S.-Russia agreement lifts a number of historical limitations on commercial nuclear trade and can be seen as an acknowledgement of Russia's more cooperative stance towards sanctions on Iran (Russian assistance to Iran's nuclear program has been the main obstacle to U.S.-Russian civil nuclear cooperation in the past). By creating new opportunities for Russia's powerful nuclear energy industry, the U.S. government hopes the agreement will help further that industry's shift toward the nonproliferation mainstream and away from more marginal customers such as Iran.[2] The nuclear cooperation agreement was originally submitted to Congress for review by the George W. Bush administration in May 2008, in part as an effort to bolster bilateral cooperation between the United States and Russia on issues beyond the agreement's scope (for example, on the question of Iran's nuclear program). However, the administration withdrew the agreement from Congress following Russia's war with Georgia in August of that year, as the U.S.-Russian relationship hit a post-Cold War low. President Obama resubmitted the agreement to Congress in May 2010 as part of his 'reset' policy with Moscow. The reset aimed to move the U.S.-Russian relationship out of the shadow of the Georgia conflict in order to gain Russia's cooperation on Iran, the war in Afghanistan, and other regional issues. At summit meetings between Obama and Russian President Dmitry Medvedev in April and July 2009, the two presidents committed to bring the 123 Agreement into force.[3]

U.S.-Russia relations solve accidental extinction

ROJANSKY AND COLLINS 10 (James F. Collins – Director, Russia and Eurasia Program at the Carnegie Endowment and an ex-US ambassador to the Russian Federation, Matthew Rojansky – the deputy director of the Russia and Eurasia Program at the Carnegie Endowment, August 18, 2010, "Why Russia Matters", http://www.foreignpolicy.com/articles/2010/08/18/why_Russia_matters)

Russia's nukes are still an existential threat. Twenty years after the fall of the Berlin Wall, Russia has thousands of nuclear weapons in stockpile and hundreds still on hair-trigger alert aimed at U.S. cities. This threat will not go away on its own; cutting down the arsenal will require direct, bilateral arms control talks between Russia and the United States. New START, the strategic nuclear weapons treaty now up for debate in the Senate, is the latest in a long line of bilateral arms control agreements between the countries dating back to the height of the Cold War. To this day, it remains the only mechanism granting U.S. inspectors access to secret Russian nuclear sites. The original START agreement was essential for reining in the runaway Cold War nuclear buildup, and New START promises to cut deployed strategic arsenals by a further 30 percent from a current limit of 2,200 to 1,550 on each side. Even more, President Obama and his Russian counterpart, Dmitry Medvedev, have agreed to a long-term goal of eliminating nuclear weapons entirely. But they can only do that by working together.

Iran prolif causes accidental nuclear war

Ward 12 – studying for a Masters in International Relations from Durham University, cites Krieger, Tepperman and Waltz (Alex, 03/02, "Iran's Nuclear Programme and the Stability of the Middle East," http://www.e-ir.info/2012/03/02/irans-nuclear-programme-and-the-stability-of-the-middle-east/)

The mechanisms of nuclear deterrence theory presuppose both actor rationality and effective inter-state communication structures that enable agents to precisely interpret and convey intent (Brown, 2008) (Huth, 1999). However, according to Morgan (1977: 78), the circumstance of threat can undermine the psychological capacity of key decision makers to act rationally, especially in the case of Iran, wherein perennial regional instability, 'Axis of Evil' rhetoric and an increasingly restless nuclear Israel have served to magnify threats to Iran's national security. Another major critique of nuclear deterrence is Sagan's (1994) organisation theory that emphasises the salience of "misinformation, misunderstanding, or misconstruing information" (Krieger, 2000: npn), in conjuncture with leaders' "use of simplifying mechanisms" (Sagan, 1994: 71) to comprehend complex political situations. Absent an effective communication infrastructure, actors will act "on the basis of misunderstandings" (McNamara, 1962: npn) and, accordingly, will "not function predictably in accordance with bargaining and game-theory assumptions" (Russel, 2004: 106), undermining the stabilizing effect of nuclear warheads upon the regional stability. In the Middle East, the limitations to deterrence theory are intensified as "there exists no institutionalized process for adversaries to ensure structured communications on a routine basis" (Russel, 2004: 105), rendering interstate communication distinctly problematic, especially in light of the relatively large role of the media in shaping inter-state perceptions. At an internal level, as Iran is an embryonic nuclear state, the command-and-control problems therein will be inevitably more severe and thus, according to Powell (2003: 102), "the risk of accidental or inadvertent war will be higher". Ultimately though, it is of crucial importance to acknowledge the "critical distinction between a theory and predictions derived from it" (Powell, 2003: 1000) and to avoid extrapolating from outdated Cold War theory, as it is not applicable to the radically different Middle Eastern strategic order.

1NC Peak Uranium

No impact to deterrence

Kober 10, research fellow, foreign policy studies – Cato, 6/13/10 (Stanley, "The deterrence illusion," http://www.guardian.co.uk/commentisfree/cifamerica/2010/jun/10/deterrence-war-peace)

The world at the beginning of the 21st century bears an eerie – and disquieting – resemblance to Europe at the beginning of the last century. That was also an era of globalisation. New technologies for transportation and communication were transforming the world. Europeans had lived so long in peace that war seemed irrational. And they were right, up to a point. The first world war was the product of a mode of rational thinking that went badly off course. The peace of Europe was based on security assurances. Germany was the protector of Austria-Hungary, and Russia was the protector of Serbia. The prospect of escalation was supposed to prevent war, and it did – until, finally, it didn't. The Russians, who should have been deterred – they had suffered a terrible defeat at the hands of Japan just a few years before – decided they had to come to the support of their fellow Slavs. As countries honoured their commitments, a system that was designed to prevent war instead widened it. We have also been living in an age of globalisation, especially since the end of the cold war, but it too is increasingly being challenged. And just like the situation at the beginning of the last century, deterrence is not working. Much is made, for example, of the North Atlantic Treaty Organisation (Nato) invoking Article V – the famous "three musketeers" pledge that an attack on one member is to be considered as an attack on all – following the terrorist attacks of September 11. But the United States is the most powerful member of Nato by far. Indeed, in 2001, it was widely considered to be a hegemon, a hyperpower. Other countries wanted to be in Nato because they felt an American guarantee would provide security. And yet it was the US that was attacked. This failure of deterrence has not received the attention it deserves. It is, after all, not unique. The North Vietnamese were not deterred by the American guarantee to South Vietnam. Similarly, Hezbollah was not deterred in Lebanon in the 1980s, and American forces were assaulted in Somalia. What has been going wrong? The successful deterrence of the superpowers during the cold war led to the belief that if such powerful countries could be deterred, then lesser powers should fall into line when confronted with an overwhelmingly powerful adversary. It is plausible, but it may be too rational. For all their ideological differences, the US and the Soviet Union observed red lines during the cold war. There were crises – Berlin, Cuba, to name a couple – but these did not touch on emotional issues or vital interests, so that compromise and retreat were possible. Indeed, what we may have missed in the west is the importance of retreat in Soviet ideology. "Victory is impossible unless [the revolutionary parties] have learned both how to attack and how to retreat properly," Lenin wrote in "Left-Wing" Communism: An Infantile Disorder. When the Soviets retreated, the US took the credit. Deterrence worked. But what if retreat was part of the plan all along? What if, in other words, the Soviet Union was the exception rather than the rule? That question is more urgent because, in the post-cold war world, the US has expanded its security guarantees, even as its enemies show they are not impressed. The Iraqi insurgents were not intimidated by President Bush's challenge to "bring 'em on". The Taliban have made an extraordinary comeback from oblivion and show no respect for American power. North Korea is demonstrating increasing belligerence. And yet the US keeps emphasising security through alliances. "We believe that there are certain commitments, as we saw in a bipartisan basis to Nato, that need to be embedded in the DNA of American foreign policy," secretary of state Hillary Clinton affirmed in introducing the new National Security Strategy. But that was the reason the US was in Vietnam. It had a bipartisan commitment to South Vietnam under the Southeast Asia Treaty Organisation, reaffirmed through the Tonkin Gulf Resolution, which passed Congress with only two dissenting votes. It didn't work, and found its commitments were not embedded in its DNA. Americans turned against the war, Secretary Clinton among them. The great powers could not guarantee peace in Europe a century ago, and the US could not guarantee it in Asia a half-century ago.

No impact to CBWs

Easterbrook 3 (Gregg Easterbrook, senior fellow at The New Republic, July 2003, Wired, "We're All Gonna Die!" http://www.wired.com/wired/archive/11.07/doomsday.html?pg=2&topic=&topic_set=)

3. Germ warfare! Like chemical agents, biological weapons have never lived up to their billing in popular culture. Consider the 1995 medical thriller Outbreak, in which a highly contagious virus takes out entire towns. The reality is quite different. Weaponized smallpox escaped from a Soviet laboratory in Aralsk, Kazakhstan, in 1971; three people died, no epidemic followed. In 1979, weapons-grade anthrax got out of a Soviet facility in Sverdlovsk (now called Ekaterinburg); 68 died, no epidemic. The loss of life was tragic, but no greater than could have been caused by a single conventional bomb. In 1989, workers at a US government facility near Washington were accidentally exposed to Ebola virus. They walked around the community and hung out with family and friends for several days before the mistake was discovered. No one died. The fact is, evolution has spent millions of years conditioning mammals to resist germs. Consider the Black Plague. It was the worst known pathogen in history, loose in a Middle Ages society of poor public health, awful sanitation, and no antibiotics. Yet it didn't kill off humanity. Most people who were caught in the epidemic survived. Any superbug introduced into today's Western world would encounter top-notch public health, excellent sanitation, and an array of medicines specifically engineered to kill bioagents. Perhaps one day some aspiring Dr. Evil will invent a bug that bypasses the immune system. Because it is possible some novel superdisease could be invented, or that existing pathogens like smallpox could be genetically altered to make them more virulent (two-thirds of those who contract natural smallpox survive), biological agents are a legitimate concern. They may turn increasingly troublesome as time passes and knowledge of biotechnology becomes harder to control, allowing individuals or small groups to cook up nasty germs as readily as they can buy guns today. But no superplague has ever come close to wiping out humanity before, and it seems unlikely to happen in the future.

No challengers

Kaplan 11, senior fellow – Center for a New American Security, and Kaplan, frmr. vice chairman – National Intelligence Council (Robert D and Stephen S, "America Primed," The National Interest, March/April)

But in spite of the seemingly inevitable and rapid diminution of U.S. eminence, to write America's great-power obituary is beyond premature. The United States remains a highly capable power. Iraq and Afghanistan, as horrendous as they have proved to be—in a broad historical sense—are still relatively minor events that America can easily overcome. The eventual demise of empires like those of Ming China and late-medieval Venice was brought about by far more pivotal blunders. Think of the Indian Mutiny against the British in 1857 and 1858. Iraq in particular—ever so frequently touted as our turning point on the road to destruction—looks to some extent eerily similar. At the time, orientalists and other pragmatists in the British power structure (who wanted to leave traditional India as it was) lost some sway to evangelical and utilitarian reformers (who wanted to modernize and Christianize India—to make it more like England). But the attempt to bring the fruits of Western civilization to the Asian subcontinent was met with a violent revolt against imperial authority. Delhi, Lucknow and other Indian cities were besieged and captured before being retaken by colonial forces. Yet, the debacle did not signal the end of the British Empire at all, which continued on and even expanded for another century. Instead, it signaled the transition from more of an ad hoc imperium fired by a proselytizing lust to impose its values on others to a calmer and more pragmatic empire built on international trade and technology.1 There is no reason to believe that the fate of America need follow a more doomed course. Yes, the mistakes made in Iraq and Afghanistan have been the United States' own, but, though destructive, they are not fatal. If we withdraw sooner rather than later, the cost to American power can be stemmed. Leaving a stable Afghanistan behind of course requires a helpful Pakistan, but with more pressure Washington might increase Islamabad's cooperation in relatively short order. In terms of acute threats, Iran is the only state that has exported terrorism and insurgency toward a strategic purpose, yet the country is economically fragile and politically unstable, with behind-the-scenes infighting that would make Washington partisans blanch. Even assuming Iran acquires a few nuclear devices—of uncertain quality with uncertain delivery systems—the long-term outlook for the clerical regime is itself unclear. The administration must only avoid a war with the Islamic Republic. To be sure, America may be in decline in relative terms compared to some other powers, as well as to many countries of the former third world, but in absolute terms, particularly military ones, the United States can easily be the first among equals for decades hence. China, India and Russia are the only major Eurasian states prepared to wield military power of consequence on their peripheries. And each, in turn, faces its own obstacles on the road to some degree of dominance. The Chinese will have a great navy (assuming their economy does not implode) and that will enforce a certain level of bipolarity in the world system. But Beijing will lack the alliance network Washington has, even as China and Russia will always be—because of geography—inherently distrustful of one another. China has much influence, but no credible military allies beyond possibly North Korea, and its authoritarian regime lives in fear of internal disruption if its economic growth rate falters. Furthermore, Chinese naval planners look out from their coastline and see South Korea and a string of islands—Japan, Taiwan and Australia—that are American allies, as are, to a lesser degree, the Philippines, Vietnam and Thailand. To balance a rising China, Washington must only preserve its naval and air assets at their current levels. India, which has its own internal insurgency, is bedeviled by semifailed states on its borders that critically sap energy and attention from its security establishment, and especially from its land forces; in any case, India has become a de facto ally of the United States whose very rise, in and of itself, helps to balance China. Russia will be occupied for years regaining influence in its post-Soviet near abroad, particularly in Ukraine, whose feisty independence constitutes a fundamental challenge to the very idea of the Russian state. China checks Russia in Central Asia, as do Turkey, Iran and the West in the Caucasus. This is to say nothing of Russia's diminishing population and overwhelming reliance on energy exports. Given the problems of these other states, America remains fortunate indeed. The United States is poised to tread the path of post-mutiny Britain. America might not be an empire in the formal sense, but its obligations and constellation of military bases worldwide put it in an imperial-like situation, particularly because its air and naval deployments will continue in a post-Iraq and post-Afghanistan world. No country is in such an enviable position to keep the relative peace in Eurasia as is the United States—especially if it can recover the level of enduring competence in national-security policy last seen during the administration of George H. W. Bush. This is no small point. America has strategic advantages and can enhance its power while extricating itself from war. But this requires leadership—not great and inspiring leadership which comes along rarely even in the healthiest of societies—but plodding competence, occasionally steely nerved and always free of illusion.

No prolif impact – multiple checks prevent use

Cha 1 (Victor, Associate Professor of Government and School of Foreign Service @ Georgetown, "The second nuclear age: Proliferation pessimism versus sober optimism in South Asia and East Asia," Journal of Strategic Studies, InformaWorld)

Proliferation pessimists do not deny the existence of the nuclear taboo; they do, nevertheless, see this taboo as shared only by First World proliferators. Is this a fair assessment? As Tannenwald argues, a taboo takes effect when the agent realizes (1) the exceptionalist nature of the weapon (i.e., in terms of its destructive power); (2) the absence of effective defenses (i.e., vulnerability); (3) and fears the political and social consequences of taking such an action. All of these conditions readily hold for new nuclear powers. Moreover, the revulsion against nuclear weapons use (first-use) has become so institutionalized in an array of international agreements and practices such that new NWS states operate in an environment that severely circumscribes the realm of legitimate nuclear use.90 Proliferation pessimists therefore underestimate the transformative effects of nuclear weapons on these new proliferators. They assume that the interests for aspiring nuclear powers remain constant in the pre- and postacquisition phases. They do not consider that once states cross the nuclear threshold, they become acutely aware of the dangers and responsibilities that come with these new awesome capabilities. The likelihood of such a learning process occurring is even higher if nuclear weapons are valued for their political currency. As noted above, while security needs certainly drive proliferation in Asia, a predominant factor that cannot be disentangled from this dynamic is the striving for prestige and international recognition as an NWS state. Moreover, if the taboo equates the use of nuclear weapons with an 'uncivilized' or 'barbarian' state," then those states that are status-conscious will be that much more attuned to the taboo. The effects of the taboo on Asian proliferators are therefore both regulative and constitutive. In the former sense, as these states further embed themselves in the international community (discussed below), this change heightens the costs of breaking any rules regarding nuclear use. The taboo's constitutive effects also are evident in that any use would undermine one of the primary purposes for which the capabilities were sought (e.g., prestige, badge of modernity). Although it is still relatively early in the game, there is some evidence that the acquisition of nuclear capabilities has been accompanied by a change in preferences about what is acceptable behavior. While India has rejected any notions that it might roll back its newfound capability, it had readily admitted that as an incipient nuclear weapons state, it now has certain responsibilities that include a no-first-use policy and not sharing nuclear weapons technology with other irresponsible states.92 Similarly, Pakistan previously placed little value and even resented nonproliferation norms as these were seen as inhibiting and degrading to the national character.93 Otherwise, they might have been swayed by the benefits of not responding to the Indian tests as a shining example of a country adhering to nuclear nonproliferation norms. Arguably it is only after becoming an incipient nuclear weapons state that such arguments about nonproliferation gain value. Nowhere is this perverse dynamic more evident than in both sides' views of the CTBT. Previously perceived as an instrument intended to preempt nuclear spread beyond the first age, the CTBT is now arguably seen by India and Pakistan in less antagonistic terms, and even among some, as a responsibility to be borne as a nuclear state.

Uranium stocks are fine—we’ll have enough for the next century

MIT 11 ["The Future of the Nuclear Fuel Cycle", 2011, http://web.mit.edu/mitei/research/studies/documents/nuclear-fuel-cycle/The_Nuclear_Fuel_Cycle-all.pdf]

We developed a price elasticity model to estimate the future costs of uranium as a function of the cumulative mined uranium. The details of this model are in the appendix. The primary input is the model of uranium reserves as a function of ore grade [14] developed in the late 1970s by Deffeyes. The results of this model are shown in Figure 3.2. For uranium ores of practical interest, the supply increases about 2% for every 1% decrease in average grade mined down to an ore grade of ~1000 ppm. His work extended models previously applied to individual mined deposits (e.g., by Krige for gold) [15] to the worldwide ensemble of deposits of uranium. The region of interest in the figure is on the left-hand side, above about 100 ppm uranium, below which grade the energy expended to extract the uranium will approach a significant fraction of that recoverable by irradiation of fuel in LWRs. The resources of uranium increase significantly if one is willing to mine lower-grade resources. An important factor not accounted for here in prediction of uranium resources is the recovery of uranium as a co-product or by-product of other mining operations. The most important category here is phosphate deposits. A recent CEA assessment [8] projects 22 million MT from this source: by itself enough for 1000 one-GWe reactors for 100 years, subject to the caveat that co-production is fully pursued. Finally, several authors have noted that Deffeyes' assessment was completed before the rich ore deposits in Canada, at grades in excess of 3% (30,000 ppm) were discovered. This could imply that the projected cost escalation based on his results would, in effect, be postponed for a period. Our model included three other features in addition to uranium supply versus ore grade elasticity: (1) Learning curve. In all industries there is a learning curve where production costs go down with cumulative experience by the industry. (2) Economics of scale. There are classical economics of scale associated with mining operations. (3) Probabilistic assessment. Extrapolation into an ill-defined future is not properly a deterministic undertaking—we can not know the exact answer. Hence, following the lead in a similar effort in 1980 by Starr and Braun of EPRI, a probabilistic approach was adopted [16] in our models. The results of our model are shown in Figure 3.3 where the relative cost of uranium is shown versus the cumulative electricity produced by LWRs of the current type. The unit of electricity is gigawatt-years of electricity generation assuming that 200 metric tons of uranium are required to produce a gigawatt-year of electricity—the amount of uranium used by a typical light water reactor. The horizontal axis shows three values of cumulative electricity production: G1 = 100 years at today's rate of uranium consumption and nuclear electric generation rate; G5 = 100 years at 5 times today's uranium consumption and nuclear electricity generation rate; G10 = 100 years at 10 times today's uranium consumption and nuclear electricity generation rate. Three lines are shown based on the probabilistic assessment described in the appendix of Chapter 3. The top line is to be interpreted as an 85% probability that the cost relative to the baseline cost will be less than the value on the trace plotted as a function of the cumulative electricity production using today's LWR once-through fuel cycle. The three lines meet at the far left where the baseline cost of uranium is taken as 100 $/kg, and the baseline total cumulative nuclear electricity production is (somewhat arbitrarily) taken as 10^4 GWe-yr using 2005 as the reference year. The other lines correspond to 50% and 15% probabilities. As one example at 10 GWe-yr cumulative production, there is an 85% probability that uranium will cost less than double 2005 costs (i.e., less than $200/kg), a 50% probability that it will cost less than 30% greater than 2005 costs, and a 15% probability that it will be 20% or lower in cost. As another example, if there were five times as many nuclear plants (G5) and they each operated for 100 years, we would expect (at 50% probability) uranium costs to increase by less than 40%. Because uranium is ~4% of the production cost of electricity, an increase to 6% of the production costs would not have a large impact on nuclear power economics. The two points plotted on Figure 3.3 correspond to 2007 Red Book values for identified (RBI) and identified-plus-undiscovered (RBU) resources at under 130 $/kg: 5.5 and 13.0 million metric tons. These benchmarks support the expectation that uranium production costs should be tolerable for the remainder of the 21st century – long enough to develop and smoothly transition to a more sustainable nuclear energy economy.
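
To make the card's arithmetic easy to check, here is a minimal, hedged sketch (not the MIT study's actual model) of two of the quoted relationships: the Deffeyes-style elasticity (supply rises roughly 2% for each 1% drop in ore grade) and the claim that uranium is only ~4% of nuclear production cost, so even a large price rise barely moves total generation cost. The function names and the exact elasticity value of 2.0 are assumptions for illustration.

# Illustrative back-of-envelope sketch of the MIT card's numbers; not the study's model.

def relative_supply(grade_ratio, elasticity=2.0):
    """Approximate relative uranium supply if the average mined ore grade falls
    to grade_ratio of today's grade (supply rises ~2% per 1% grade decrease)."""
    return grade_ratio ** (-elasticity)

def generation_cost_increase(uranium_price_multiplier, uranium_share=0.04):
    """Fractional rise in total nuclear generation cost when uranium prices rise,
    assuming uranium is ~4% of production cost, as the card states."""
    return uranium_share * (uranium_price_multiplier - 1.0)

print(relative_supply(0.5))           # halving ore grade roughly quadruples recoverable supply
print(generation_cost_increase(1.4))  # G5 example: <40% uranium price rise -> ~1.6% higher cost
print(generation_cost_increase(2.0))  # even a doubling of uranium prices -> only ~4% higher cost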

Plan would only save 20% of uranium ore—that’s marginally better than the squo

Garwin 09 [Richard L. Garwin, IBM Fellow Emeritus at the Thomas J. Watson Research Center in Yorktown Heights, New York. He has contributed to the design of nuclear weapons, instruments and electronics for research in nuclear and low-temperature physics, and superconducting devices. His work for the U.S. government includes studies on antisubmarine warfare, military and civil aircraft, and satellite systems. In 1998, he served as a member of the nine-person Rumsfeld Commission to assess the ballistic missile threat to the United States. He received the Presidential National Medal of Science in 2003, "Reprocessing isn't the answer", Bulletin of the Atomic Scientists, 8-6-2009, http://www.thebulletin.org/web-edition/op-eds/reprocessing-isnt-the-answer]

Reprocessing of LWR fuel also fails to save uranium, a common argument in favor of recycle. Although 1 percent of the fuel is plutonium and can be burned as MOX; recycling all LWR fuel, including reuse of uranium, would save at most 20 percent of the necessary supply of raw uranium ore. Analysis shows this isn't worth doing unless the cost of natural uranium rose to something like $750-$1,000 per kilogram. Its current price, however, is much lower, on the order of $70 per kilogram. Even at a price of $750 per kilogram, reprocessing would only be marginally preferable.
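
A rough, hedged sketch of why Garwin's price threshold matters: assuming roughly 200 metric tons of natural uranium per gigawatt-year (the figure used in the MIT card above) and a maximum 20 percent savings from full recycle, the dollar value of the avoided uranium is small at today's prices. The figures below are assumptions drawn from the cards, not Garwin's own calculation.

# Hedged sketch of the break-even logic, using figures quoted in the cards above.

def avoided_uranium_value(price_per_kg, tonnes_u_per_gwe_yr=200, fraction_saved=0.20):
    """Dollar value of natural uranium avoided per GWe-yr if recycle saves ~20% of ore needs."""
    return price_per_kg * tonnes_u_per_gwe_yr * 1000 * fraction_saved

print(avoided_uranium_value(70))   # ~$2.8 million per GWe-yr at today's ~$70/kg uranium
print(avoided_uranium_value(750))  # ~$30 million per GWe-yr at Garwin's ~$750/kg break-even price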

The plan only increases uranium conservation by 1% over the once-through cycle

Makhijani and Ledwidge 12 [Arjun Makhijani and Lisa Ledwidge, Ph.Ds, "Reprocessing: Mythology versus Reality", Science for Democratic Action, Vol. 16, No. 2, February 2012, http://ieer.org/wp/wp-content/uploads/2012/02/16-2.pdf]

Statements that imply that the French have somehow figured out how to use 90 percent of the uranium resource are wrong. The French use only about 0.7 percent of the original uranium resource to create fission energy – and most of that happens before any reprocessing is done. The rest – 99.3 percent of the original uranium – is mainly depleted uranium (DU). This DU is piling up as reprocessed uranium that is not being used, or is uranium left in spent fuel of various kinds (including mixed-oxide [MOX] spent fuel). This figure cannot be increased significantly even with repeated reprocessing, use of all the plutonium, and re-enrichment of the uranium so long as the fuel is used in a light water reactor system. Figure 1 shows the flow of materials in a light water reactor plus reprocessing system. It is not hard to see that using more than one percent of the uranium resource in a light water reactor system is technically impossible even with reprocessing and re-enrichment. In light water reactor systems, almost all the uranium resource winds up as depleted uranium or in spent fuel. Yet France continues to be at the center of reprocessing mythmaking. For instance, Bill Magwood, now a commissioner in the U.S. Nuclear Regulatory Commission, and Mark Ribbing of the Progressive Policy Institute, wrote as follows to President Obama in 2009: While looking to France for inspiration may or may not play well with domestic audiences, it is one of the first places to look for ideas on how to handle nuclear waste. Actually, the French…do not really think of it as waste….…After a three-year cooling-down period, 96 percent or 97 percent of that material is potentially reusable uranium or plutonium; only the remaining 3 percent or 4 percent is genuinely useless "waste." France "reprocesses" that leftover uranium and plutonium into useable energy…. 3 As shown above, any statement or implication that France is recycling 96 or 97 percent or any similar high percentage of spent fuel is wrong. Once-through fuel use without reprocessing converts about 4.7 percent of the fuel's mass into fission products – that is, just 4.7 percent of the fuel produces energy. France increases this by roughly 1 percent by reprocessing. Even if all the recovered uranium were reused, the amount of the fuel actually fissioned would be 6 percent. And repeated reprocessing and reuse presents a huge number of technical and economic difficulties. Finally, as noted above, light water reactors use less than one percent of the original uranium resources, since over 86 percent of it is depleted uranium before the fuel is made. The decision to continue reprocessing in France was not about economics, technical suitability, waste management, or significantly increasing the use of the uranium resource in the fresh fuel. Rather, it was driven mainly by the momentum of a system that was government-owned and had already invested a great deal of money and institutional prestige in the technology. Reprocessing in France continues today due largely to two factors: the inertia of primarily-government-owned electricity generation and reprocessing corporations (EDF and AREVA respectively), and the political and economic dislocations that closing an established large industrial operation would cause in a largely rural area in Normandy that has scarcely any other industries.
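
The card's percentages can be reproduced with simple multiplication. The sketch below assumes, per the card, that over 86 percent of mined natural uranium becomes enrichment tails (so under ~14 percent ends up in fresh LWR fuel), that once-through use fissions about 4.7 percent of the fuel mass, and that reprocessing with full uranium reuse raises that to about 6 percent; the variable names are illustrative only.

# Back-of-envelope check of the IEER card's fuel-utilization percentages.

fuel_share_of_mined_uranium = 0.14   # <14% of natural uranium ends up in fresh fuel (86%+ is DU)
fissioned_once_through = 0.047       # ~4.7% of fuel mass fissions in once-through use
fissioned_with_reprocessing = 0.06   # ~6% upper bound with reprocessing and full uranium reuse

print(fuel_share_of_mined_uranium * fissioned_once_through)       # ~0.0066 -> ~0.7% of mined uranium
print(fuel_share_of_mined_uranium * fissioned_with_reprocessing)  # ~0.0084 -> still under 1%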

2NC A2: Russia Relations Add-on

Russian Relations Bad

Anti-Americanism and Russia’s approach to cooperation make relations completely ineffective

Cohen 12 – Senior Research Fellow in Russian and Eurasian Studies and International Energy Policy in the Douglas and Sarah Allison Center for Foreign Policy Studies (Ariel, 03/15, "How the U.S. Should Respond to Russia's Unhelpful Role in the Middle East," http://www.heritage.org/research/reports/2012/03/how-the-us-should-respond-to-russias-unhelpful-role-in-the-middle-east)

The anti-American tilt of Russian foreign policy prevents diplomatic cooperation because the U.S. and Russia lack a shared threat assessment and mutual understanding in dealing with the changing dynamics of the Middle East. Despite clear statements to the contrary by Prime Minister Putin and Foreign Minister Lavrov, the Obama Administration has repeatedly declared that the U.S. is not competing with Russia for regional influence. Regrettably, the Kremlin has not received this memo. Instead, Russian attempts to constrain U.S. policy have provoked little or no response from Washington. Lavrov habitually invokes a "polycentric" or multipolar model of the world, with Russia working with her partners toward a future in which U.S. power is so diminished that it cannot act without Moscow's permission. Russia's vision of the Middle East is a case in point.[68] Moscow's concept of multipolarity entails not just an uncontested Russian sphere of influence in the Commonwealth of Independent States, but also together with Iran wielding much greater clout in the Middle East. Moscow clearly wants to retain ties with Iran, which it regards as the rising great power in the Gulf and Middle East. However, the Obama Administration has been deluding itself that Russia would be a genuine partner in restraining Iran. Notwithstanding Washington's and Riyadh's irritation, Russia defends the Assad regime despite its bloody repression of its own citizens. Even though the regime is teetering on collapse, Russia has signed an agreement with Syria to refurbish Soviet naval bases in Latakiyah and Tartus and has increased sales of sophisticated weapons. Thus, Russia is obstructing U.N. resolutions censuring Syria, while allowing its relationship with the Obama Administration to wilt.[69] Moscow's suspicions of the U.S. and the prevailing anti-American mindset lead it to persist in playing a zero-sum game in the Middle East and elsewhere. The intense competition, in turn, tends to work to the advantage of third countries, such as Iran and China, and of terrorist groups, such as Hamas and Hezbollah.[70] For instance, although Iran and nonstate or state-sponsored Islamist radicals present long-term dangers to both states, Russia tends to ignore the Iranian threat. U.S. interests lie in a more democratic and pro-Western environment that fosters civil society and economic opportunity. However, the Obama Administration's myopic laissez-faire attitude toward Islamists seems to have moved this goal further away than before the Arab upheavals erupted.[71] International energy companies also need security for capital-intensive energy projects, which often require investments of the tens of billions of dollars. Russia's zero-sum policy is preventing Washington and Moscow from identifying and exploring areas in which U.S. and Russian interests in the region converge, such as anti-terrorism and disrupting funding of globally active radical Islamists. The areas in which the two states are pursuing diverging foreign policy goals, such as Russia's trade in arms and nuclear reactors, will require special attention and, where necessary, consistent pushback. Russia's interests in the region—including energy and weapons trade, supporting a nuclear Iran, and attempting to selectively legitimize anti-Israel radical Islamist organizations while fighting similar ones at home—contradict U.S. interests. In addition, Russia is pursuing a diplomatic strategy of developing an ad hoc Sino–Russian axis to undermine U.S. priorities around the world, particularly in the Middle East.

The U.S. goes all-in on Medvedev, but he holds no political power

Cohen 11 – Senior Research Fellow in Russian and Eurasian Studies and International Energy Policy in the Douglas and Sarah Allison Center for Foreign Policy Studies (Ariel, 06/15, "Reset Regret: U.S. Should Rethink Relations with Russian Leaders," http://www.heritage.org/research/reports/2011/06/reset-regret-us-should-rethink-relations-with-russian-leaders)

For the past two years, the Obama Administration has touted its Russia "reset policy" as one of its great diplomatic achievements. The President spent an inordinate amount of time cultivating Russian President Dmitry Medvedev and making him his principal diplomatic interlocutor—despite the fact that Medvedev is Prime Minister Vladimir Putin's appointed protégé with no political base of his own. To uphold the "reset," the Administration agreed to cut U.S. strategic nuclear forces under New START, abandoned missile defense deployment in Poland and the Czech Republic, engaged Russia in missile defense talks, pursued a policy of geopolitical neglect in the former Soviet Union, and toned down criticism of political freedom violations in Russia. However, Putin remains Russia's "national leader" and the real power behind—and on—the throne. Top White House and State Department officials now privately recognize that they bet on the wrong horse, as it is unlikely that Medvedev will wield any real power beyond the spring of 2012. However, the Administration cannot publicly admit that this bet failed, as it would undermine the very notion of this over-personalized "reset." Yet the reality that Medvedev has a limited capacity to deliver and is unlikely to continue in office means that the U.S. should rethink its strategy for engaging with Russia's leadership.

That means relations can’t solve anything

Cohen 11 – Senior Research Fellow in Russian and Eurasian Studies and International Energy Policy in the Douglas and Sarah Allison Center for Foreign Policy Studies (Ariel, 06/15, "Reset Regret: U.S. Should Rethink Relations with Russian Leaders," http://www.heritage.org/research/reports/2011/06/reset-regret-us-should-rethink-relations-with-russian-leaders)

U.S.–Russian relations include issues such as human rights and Islamist extremism in Russia, the energy and sovereignty concerns of U.S. friends and allies, Iran, and nuclear nonproliferation. The Obama Administration cannot address these issues by pretending that Medvedev and his narrow circle of supporters wield the real power. In fact, it is the Putin group—which includes the key energy, military and security services officials, businessmen, and the leadership of the United Russia ruling party—that exercises the ultimate power. Now Putin, no great friend of America, is likely to move back from the Prime Minister's office to the Kremlin in the spring of 2012, raising tough questions about Obama's Russian policy. Putin publicly disagreed with Medvedev, his handpicked successor, on a number of key policy issues, many of them vital to U.S. interests. These included the role of freedom in the country, the legacy of Joseph Stalin (Putin called him "an effective manager"), and the collapse of the Soviet Union. The two also argued on modernization, Libya, and persecution of the former oil magnate Mikhail Khodorkovsky. Putin also supports "friendship" with China and Venezuela and good relations with Iran. At various points Putin accused the U.S. of supporting Islamist terrorists in North Caucasus in order to dismantle Russia, illegally intervening in Iraq, being responsible for the global economic recession, and toppling regimes in the Middle East through promotion of social media. Putin views modernization as primarily boosting military technology, pays lip service to the fight against corruption, and directly intervenes in prominent court cases. Putin formed his worldview in the KGB and by reading Russian nationalist philosophers. He famously considers the collapse of the Soviet Union "the greatest geopolitical catastrophe of the 20th century." He also does not like or trust the United States.

The U.S. can’t broaden engagement beyond arms control

Fly 10 – executive director of the Foreign Policy Initiative (Jamie M., 06/25, "President Obama's Failed 'Reset' with Russia," http://www.nationalreview.com/corner/232469/president-obamas-failed-reset-russia-jamie-m-fly)

This is a concept that the Obama administration has shown itself unable or unwilling to grasp as it has rushed to grant every possible concession to Moscow in an effort to obtain a new arms-control agreement. Despite the visit to a burger joint, talk about Russian economic modernization, and supposed civil-society cooperation, President Obama's relationship with President Medvedev has been defined by one thing: arms control. If Russia were truly ready for a "reset," President Obama would be able to express concerns about political repression, the rule of law, and Russia's policies towards its neighbors without risking the collapse of the relationship. President Obama has shown no willingness to broaden his engagement with Moscow to include such issues. Unlike President Reagan's engagement with Moscow on arms control, which was coupled with criticism of the Soviet Union's repression of its citizens, this administration has stood by while the situation in Russia deteriorates.

Russia won’t cooperate on the issues their impact evidence is talking about

Brookes 12 – Senior Fellow, National Security Affairs and Chung Ju-Yung Fellow for Policy Studies (Peter, 06/19, "US-Russia: From 'Reset' to Regret," http://www.heritage.org/research/commentary/2012/06/us-russia-from-reset-to-regret)

There's little doubt from the reporting of the lackluster meeting between President Obama and Russian President Vladimir Putin at the G-20 in Mexico yesterday that the White House's Russia policy is moving from "reset" to "regret." Of course, we've seen this coming for awhile — despite lots of wishful thinking on the administration's part. Team Obama's hope over the last three-plus years has been that Russia would become a partner of the United States on a range of international issues if ties could only be "reset," pruning away thorny tensions that have grown in the relationship. In other words, if we could just get relations chummy enough, the Kremlin and the White House would become a dynamic duo, tackling a growing list of world problems. So much for that plan. One key focus of the "reset" policy was getting Russia to help stop Iran's expanding nuclear (weapons) program. While supporting some added pressure on Tehran, Moscow hasn't really come on board. This week's P5+1 meeting (the latest in a seemingly endless series) in Moscow on Iranian nukes probably won't change that. In fact, after the supposed "reset," Russia finished building Iran's first nuclear reactor and provided fuel for it. If Tehran doesn't return the fuel rods, it could reprocess them for plutonium, providing another avenue for making nukes. The Russians have been continually cranky about US-led missile defense in Europe, too, seeing it as being aimed at their nuclear deterrent rather than at the growing Iranian missile threat. The griping didn't stop even after Obama unilaterally abrogated the deal to put antimissile sites in Poland and the Czech Republic. Not long ago, a senior Russian general rattled a Soviet-like saber, threatening a pre-emptive strike on US-NATO missile defenses in Eastern Europe, if necessary. Another flashpoint is Syria. Secretary of State Hillary Clinton last week blasted the Russians with both barrels for sending weapons to their longtime friend, the Syrian regime. Even with a few words of support for a democratic transition in Syria at the G-20, Moscow has frustrated Washington's UN efforts to punish Syria's Basher, er, Bashar Assad. (The Kremlin is reportedly crabby about Libya, believing the mission crept beyond its original UN mandate.) Russia continues to befriend countries of concern, too: Venezuela is a rapacious buyer of Russian arms; Moscow held its first-ever naval exercises with Beijing in April in the Yellow Sea, waters China considers "sensitive" to US military operations. Team Obama has tried to lump Russia in with its claimed foreign-policy successes, citing the New Strategic Arms Reduction Treaty and Moscow's provision of supply and withdrawal routes in and out of Afghanistan as proof positive of better relations. Of course, many see New START as having advantaged the Russians, since a majority of the cuts came from us. The Russia-Afghanistan road is worrisome because it's much more expensive than through Pakistan (currently closed) and gives Moscow leverage over us, especially when we pull out. Critics say we've also given ground on a Russian sphere of influence in some parts of the former Soviet Union's stompin' grounds — and Obama's certainly been pretty much mum on political and social liberty in Russia. Yet the president may still be hoping for a "reset" redo in a second term. Who can forget his open-mike moment this spring, offering now-former Russian President Dmitry Medvedev more flexibility on missile defense after the US election? Isn't that comforting? The fact is Putin wants the US to get out of the way of Russia's reemergence — but is willing to cooperate on issues that benefit him politically or Moscow in general. That's really nothing new. So it's probably a good time to forget the reset — and instead embrace a pragmatic policy that sees Russia for what it is, not what Team Obama hopes against hope it will be.

Empirics go neg

Bendikova and Cohen 12 (Michaela, Research Assistant for Missile Defense and Foreign Policy in the Douglas and Sarah Allison Center for Foreign Policy, Ariel, Senior Research Fellow in Russian and Eurasian Studies and International Energy Policy in the Douglas and Sarah Allison Center for Foreign Policy Studies, 08/04, "Who Are the Real Cold War Monsters?" http://blog.heritage.org/2011/08/04/who-are-the-real-cold-war-monsters/)

It has been over two years since the United States launched the "reset" policy. Where is it heading in view of Russian rhetoric and threats? President Obama called the "reset" his "great achievement" only days after Putin's "parasite" outburst. Maybe he was encouraged by Russia's issuing a series of postage stamps to commemorate his 50th birthday. If history is any guide: The United States tried a policy of détente with the Soviet Union in the 1970s, culminating in the kiss between President Jimmy Carter and Soviet Leader Leonid Brezhnev at the SALT II Treaty signing in Vienna, Austria. The U.S. reward for its more "constructive" stance, however, was the Soviet invasion in Afghanistan.

Domestic politics in both countries prevent relations from solving anything

Rojansky 9/5 – deputy director of the Russia and Eurasia Program at the Carnegie Endowment (Matthew, "U.S.-Russian Cooperation Beyond 2012," http://www.carnegieendowment.org/2012/09/05/u.s.-russia-cooperation-beyond-2012/drem)

The relationship between the United States and Russia is on hold in 2012. The intensity of domestic political debate in Russia following disputed national elections and months of public protest, and in the United States leading up to November's presidential contest, leaves little room for bold initiatives or high-profile summit diplomacy. So for now, don't expect much progress—the best case will be if there is no backsliding, and that outcome is by no means guaranteed.

Outweighs on magnitude and probability

Blank 2k – Research Professor of National Security Affairs at the U.S. Army War College (Stephen J., June, "U.S. MILITARY ENGAGEMENT WITH TRANSCAUCASIA AND CENTRAL ASIA," www.bits.de/NRANEU/docs/Blank2000.pdf)

However, Washington's well-known ambivalence about committing force to Third World ethnopolitical conflicts suggests that U.S. military power will not be easily committed to saving its economic investment. But this ambivalence about committing forces and the dangerous situation, where Turkey is allied to Azerbaijan and Armenia is bound to Russia, create the potential for wider and more protracted regional conflicts among local forces. In that connection, Azerbaijan and Georgia's growing efforts to secure NATO's lasting involvement in the region, coupled with Russia's determination to exclude other rivals, foster a polarization along very traditional lines.71 In 1993 Moscow even threatened World War III to deter Turkish intervention on behalf of Azerbaijan. Yet the new Russo-Armenian Treaty and Azeri-Turkish treaty suggest that Russia and Turkey could be dragged into a confrontation to rescue their allies from defeat.72 Thus many of the conditions for conventional war or protracted ethnic conflict in which third parties intervene are present in the Transcaucasus. For example, many Third World conflicts generated by local structural factors have a great potential for unintended escalation. Big powers often feel obliged to rescue their lesser proteges and proxies. One or another big power may fail to grasp the other side's stakes since interests here are not as clear as in Europe. Hence commitments involving the use of nuclear weapons to prevent a client's defeat are not as well established or apparent. Clarity about the nature of the threat could prevent the kind of rapid and almost uncontrolled escalation we saw in 1993 when Turkish noises about intervening on behalf of Azerbaijan led Russian leaders to threaten a nuclear war in that case.73 Precisely because Turkey is a NATO ally, Russian nuclear threats could trigger a potential nuclear blow (not a small possibility given the erratic nature of Russia's declared nuclear strategies). The real threat of a Russian nuclear strike against Turkey to defend Moscow's interests and forces in the Transcaucasus makes the danger of major war there higher than almost everywhere else. As Richard Betts has observed, The greatest danger lies in areas where (1) the potential for serious instability is high; (2) both superpowers perceive vital interests; (3) neither recognizes that the other's perceived interest or commitment is as great as its own; (4) both have the capability to inject conventional forces; and, (5) neither has willing proxies capable of settling the situation.74 Russian perceptions of the Transcaspian's criticality to its interests is tied to its continuing efforts to perpetuate and extend the vast disproportion in power it possesses relative to other CIS states. This power and resource disproportion between Russia and the smaller states of the Transcaspian region means that no natural equilibrium is possible there. Russia neither can be restrained nor will it accept restraint by any local institution or power in its pursuit of unilateral advantage and reintegration.75

We control the fastest extinction impact – relations kill U.S. BMDs – causes nuclear war in 2013

Bendikova 12 – Research Assistant for Missile Defense and Foreign Policy in the Douglas and Sarah Allison Center for Foreign Policy (Michaela, 06/01, "Limiting Defenses to Placate Russia Is Dangerous," http://blog.heritage.org/2012/06/01/limiting-defenses-to-placate-russia-is-dangerous/)

In his recent article former Deputy Assistant Secretary of Defense Keith Payne offers a unique perspective on the U.S. missile defense program and exploits the rationale for taking a more aggressive approach to U.S. defensive measures. The issue of missile defense in particular has recently occupied a prominent position in the debate about U.S. national security and the future of U.S.–Russian relations due to Russia's adamant opposition to the European Phased Adaptive Approach. While the Russians continue to threaten U.S. allies with nuclear attacks on their soil if they accept U.S. missile defense installations on their territory, President Obama seems to believe that Russia's opinions are more important than making sure that the U.S. and allies are less vulnerable to a ballistic missile attack. The U.S. should not cave in to Russian demands to restrict its missile defense system, because any limitations would ultimately make the U.S. and its allies vulnerable to a ballistic missile attack. Yet President Obama recently demonstrated his willingness to be more "flexible" regarding Russian demands after the November election. President Obama's commitment to missile defense cannot be trusted past November. Vulnerability is not inevitable, but it is a consequence of government's policy choices. The Cold War notion that missile defenses and other passive defense measures are "destabilizing" (meaning incentivizing the opponent to strike first) and not worth pursuing (because they would not save a significant majority of the population) is no longer applicable in the post–Cold War environment, writes Payne. Today, the U.S. faces new types of threats from many different sources: terrorists armed with weapons of mass destruction, electromagnetic pulse attacks, and ballistic missile attacks from Iran or North Korea. As Payne concludes, "The Cold War is over, and U.S. officials need not accept its legacy of uncontested vulnerability. The price of continuing adherence to that old, dubious tenet of the balance of terror is now too high.

Turns nuclear deterrence

Andersen 11 – Senior Digital Communications Associate at The Heritage Foundation (Ericka, 10/26, "Morning Bell: The Serious Risks of the Russian Reset," http://blog.heritage.org/2011/10/26/morning-bell-the-serious-risks-of-the-russian-reset/)

President Obama may believe that America's "reset" policy with Russia is the correct move to cover important foreign policy bases, but the policy is deeply flawed. It puts the United States at a disadvantage we can't afford and forces us to lay aside fundamental American principles of human liberty. The "reset" concessions are simply not worth the exchange of empty promises from Russian President Dmitry Medvedev, who is merely a talking head for Prime Minister Vladimir Putin. As Heritage's Ariel Cohen & Kim Holmes wrote recently in a memo on U.S.–Russia Relations, Putin would like nothing less than a "Soviet-like superpower prestige and status through forced nuclear equality with Washington." The large "reset" payoff requires America put it all on the line by cutting U.S. strategic nuclear forces and engaging in missile defense talks with Russia, in addition to abandoning missile defense deployment in Poland and the Czech Republic and keeping quiet about political freedom violations running rampant throughout Russia. America may never have won the Cold War 22 years ago with policies such as these. It is imperative that America lead with the cause of freedom and justice when dealing with Russia, or any other nation for that matter. In Heritage's Understanding America series, Matthew Spalding explains that the United States was founded and thrives on "universal principles that appeal to a higher standard." Such universal principles of freedom should be the foundation of America's foreign policy strategy—not an afterthought. Yesterday at a Heritage Foundation conference focused on the "reset" policy, House Speaker John Boehner (R–OH) recalled the international leadership that prompted America to victory in the Cold War not so long ago. He applauded President Ronald Reagan and British Prime Minister Margaret Thatcher as two who "quite simply, loved freedom…[and] made their feelings well-known, contagious, as if no one or no force could stand in their way." Boehner urged America not forget what life was like for the Soviets before these two warriors of freedom refused to stand for it. As Boehner said, "freedom most inspires those who remember life without it." While the Obama Administration may believe the "reset" policy as it stands is necessary, the deal raises a lot of red flags. In his paper, Cohen urges that America must not tolerate Russian mischief or fail to make its priorities of freedom loud and clear. As Boehner said, instead of negotiating with Russia, Washington should call its bluff—"Publicly, forcefully, frequently." As the leader of the free world, America has a responsibility to remain in control and end the idea that it is "leading from behind" when it comes to Russia. In a recent memo, Cohen explained why the Obama Administration must stop its policy of "please Moscow" and push Russia to "reset" its own policies. He writes: Moscow has continuously promoted in word and deed the idea that there is or should be a multipolar world order that constrains U.S. foreign policies. A "reset" policy that ignores Russia's global efforts to undermine the U.S. recalls the ill-fated détente of the 1970s. As experts at yesterday's conference attested, the risks involved in America's "reset" relations with Russia are many. Foreign policy dealings with any nation—especially Russia—must be guided by America's Founding values first and foremost. The consequences of doing otherwise will be great.

U.S.-Russia relations cause Iran to launch nuclear attacks on Europe within 3 years

Gardner 12 – former writer for the Homeland Security NewsWire (Chris, 07/20, "Russian Demands Are Still Not a Reason to Abandon U.S. Missile Defense," http://blog.heritage.org/2012/07/20/russian-demands-are-still-not-a-reason-to-abandon-u-s-missiledefense/)

Continuing its longstanding opposition to U.S. missile defense, Russia has demanded to know the parameters of the proposed missile defense shield in Europe and ways to verify that the system will not target Russian intercontinental ballistic missiles (ICBMs). According to Russia's acting envoy to NATO, Nikolai Korchunov, creating a joint partnership between the U.S. and Russia for any missile defense system placed in Europe could most easily accomplish this; for the U.S., to comply with this demand would be lunacy. For years, the Russian Federation has opposed many of the foreign policy goals of the U.S. with respect to Iran, including helping Iran build its ballistic missile force and blocking new sanctions designed to get the Iranian regime to give up its nuclear ambitions. This has allowed Iran to continue to develop its program and harden key assets in underground bunkers. Russia even helped build Iran's first nuclear power reactor and is considering building a second unit. If not for Russian help, the deployment of missile defenses in Europe to counter an Iranian threat may not be such an urgent matter. Yesterday, the Fars News Agency reported that the Russian central bank is preparing to make due payments and issue letters of credit to circumvent sanctions placed on the Iranian regime. Why, then, would the U.S. grant what is effectively veto power to the Russians when it comes to protecting our allies in Europe by allowing Russia to co-manage the system? Further, is it much of a stretch to believe the Russians would provide our missile defense technology to rogue states such as Iran and North Korea? Russian opposition to missile defense has existed since President Reagan announced the Strategic Defense Initiative (SDI), and it has been so strong that Mikhail Gorbachev was willing to give up all strategic offensive nuclear weapons at the Reykjavik summit in exchange for Reagan killing SDI. However, an irrational Russian fear is hardly a reason to open our allies up to the very real possibility of attack by Iran or other rogue state. If anything, threats of the use of nuclear force against missile sites by Russian generals only proves the need for missile defense and should steel the resolve of the United States. According to a report by the Department of Defense in April, Iran is continuing to develop ballistic missiles that can reach into Europe and increasing their lethality through accuracy improvements and new submunition payloads. Estimates also project that Iran will be technically capable of launching an ICBM within three years.

Global nuclear war

Glaser, Assistant Prof @ Chicago, 93 (Charles, International Security, Summer)

However, although the lack of an imminent Soviet threat eliminates the most obvious danger, U.S. security has not been entirely separated from the future of Western Europe. The ending of the Cold War has brought many benefits, but has not eliminated the possibility of major power war, especially since such a war could grow out of a smaller conflict in the East. And, although nuclear weapons have greatly reduced the threat that a European hegemon would pose to U.S. security, a sound case nevertheless remains that a major European war could threaten U.S. security. The United States could be drawn into such a war, even if strict security considerations suggested it should stay out. A major power war could escalate to a nuclear war that, especially if the United States joins, could include attacks against the American homeland. Thus, the United States should not be unconcerned about Europe's future.

Desal Bad

1NC

Status quo solves water scarcity and nuclear desalination is ineffective

Gar Smith 11, Editor Emeritus of Earth Island Journal, a former editor of Common Ground magazine, a Project Censored Award-winning journalist, and co-founder of Environmentalists Against War, "NUCLEAR ROULETTE: THE CASE AGAINST A NUCLEAR RENAISSANCE," June, International Forum on Globalization series focused on False Solutions, http://ifg.org/pdf/Nuclear_Roulette_book.pdf

By 2025, 3.5 billion people will face severe fresh-water shortages. Nuclear proponents groping for justifications to expand nuclear power have argued that the waste heat from power plants can provide a "cheap and clean" solution to the inherently costly process of removing salt from seawater. Desalination plants (there are 13,080 worldwide, mostly oil- and gas-fired and mostly in wealthy desert nations) already produce more than 12 billion gallons of drinkable water a day. 153 The first nuclear desalinator was installed in Japan in the late 1970s and scores of reactor-heated desalination plants are operating around the world today. But nuclear desalination is another False Solution. The problem with atomic water-purifiers is that using heat to treat seawater is an obsolete 20th-century technology. Thermal desalination has given way to new reverse osmosis systems that are less energy intensive and 33 times cheaper to operate. 154 Nuclear desalination advocates claim that wind, solar, and wave power aren't up to the task while new low-temperature evaporation technology may be able to produce high purity water at temperatures as low as 122° Fahrenheit. 155 Promoting reactors as a solution to the world's water shortage is especially ludicrous since nuclear power plants consume more water than any other energy source. 156 Even proponents admit there is a potential risk that running seawater through a radioactive environment might contaminate the drinking water produced. 157 Undeterred, scientists in Russia and India have proposed anchoring small atom-powered water-plants offshore near densely populated coastal cities. But this would provide no relief for the billions of people living inland in water-starved regions of North Africa and Asia. Desalination is merely a way of giving a marginal new purpose to existing reactors whose balance sheets would be improved if they were retrofitted with desalination chambers. As with power generation, so with desalination: efficiency in water use (better irrigation technology, crop selection, eliminating transit losses, etc.) beats new production. A real solution to the growing global water shortage needs to address the increasing amount of water diverted to wasteful agricultural and industrial practices and concentrate on preventing the water from being contaminated in the first place—by, among other things, capping the size of local populations to match locally available water supplies.

Nuclear desalination collapses global water sources

Smith 11 (Gar Smith, Editor Emeritus of Earth Island Journal, a former editor of Common Ground magazine, a Project Censored Award-winning journalist, and co-founder of Environmentalists Against War, June 2011, NUCLEAR ROULETTE, http://ifg.org/pdf/Nuclear_Roulette_book.pdf)

The nuclear fuel-cycle contaminates our water as well as our air. The 104 U.S. reactors operating in 40 of the 50 states routinely discharge used coolant water into the nation's major streams, the Great Lakes, the Gulf of Mexico, and the Atlantic and Pacific Oceans.142 While much of a reactor's coolant water is released as steam (heating the atmosphere), the remainder—heated and contaminated with radioactive isotopes—is vented back into waters where it wreaks damage on river and ocean life. 143, 144 Thermal pollution of the Hudson River from Indian Point kills more than 2 billion fish a year. The Salem Nuclear Generating Station, which swallows 3 billion gallons a day from the Delaware Bay, has caused a 31 percent reduction in bay anchovy. California's two coastal plants at San Onofre and Diablo Canyon suck in nearly a million gallons of seawater every minute to use as a free coolant.145 San Onofre's two reactors pour 2,400 million gallons of water (heated to 19°F over ambient temperatures) into the Pacific every day. 146 When it comes to producing electricity, nuclear is an extravagantly water-wasting technology. 147 A nuclear power station requires between 20 to 83 percent more water than any other kind of power plant. Even Westinghouse's "Generation III" AP1000 needs to consume as much as 750,000 gallons per minute to operate safely.148 IMPACTS ON LAND Life on land suffers gross impacts too. By 1978, the U.S. "uranium rush" had left 140 million tons of crushed-rock tailings at 16 operating mills and 22 abandoned sites—with additional wastes piling up at an average of 6 to 10 tons a year. The 1.7-million-ton tailings pile at Shiprock, New Mexico, covers 72 acres. All tailings piles release radon gas and long-lived radioactive isotopes into the air, rivers, arroyos and aquifers. Radon gas (believed responsible for a five-fold increase in lung cancer among uranium miners) continues to poison the winds blowing over abandoned piles of mining wastes that lie scattered around the world. 149 In 1979, 94 million gallons of contaminated liquid tailings burst from a containment dam in New Mexico, sweeping 1,100 tons of radioactive wastes into the Rio Puerco River, which flows into the Little Colorado River and on to Lake Mead, a major source of drinking water for Las Vegas and Los Angeles.150 In 1984, a flash flood flushed four tons of tailings into a tributary of the Colorado River, which provides irrigation for farms and drinking water for cities in Nevada and southern California. Less dramatic but also deadly is the imperceptibly slow, toxic seepage from tailing ponds that has steadily poisoned critical subsurface aquifers across the Colorado Plateau.151 The devastation to portions of America's landscape has been so vast and long-lasting that the government has no hope of ever repairing the damage. Instead, it has created a term to describe these irreparably damaged, nuclear no-man's-lands—"National Sacrifice Areas." Despite the environmental and health damages wrought by uranium mining, there have never been any binding standards requiring operators to minimize harm to the local land or people.152 The World Nuclear Association (a trade body representing 90% of the industry) is considering a "Charter of Ethics" but it would be voluntary and self-policed. At best, some local activist communities have been able to demand a higher price for the ore extracted from their damaged lands. In 2008, in a rare victory, the people of Niger forced the French firm AREVA to increase the price of a kilogram of uranium.

This is offense—trades-off with effective water management

Phil Dickie, WWF's Global Freshwater Programme, 2007, MAKING WATER: Desalination: option or distraction for a thirsty world?, http://awsassets.panda.org/downloads/desalinationreportjune2007.pdf

All of the areas where seawater desalination is rapidly assuming a more prominent water supply role had more cost effective and less potentially environmentally damaging alternatives available. This is particularly true of demand management, water conservation and water efficiency measures, where many of even the more advanced economies such as Australia do not uniformly require easily achievable water and energy efficiency standards in new buildings. The extent to which a furore in favour of desalination is associated with unsustainable urban development, excess water intensive tourism development for arid areas, and unsustainable arid area export agriculture is also disturbing. Many of these relatively dry or drying areas have high levels of water consumption. Many of the areas where there is most intensive desalination activity also have a history of damaging or degrading natural water resources, particularly groundwater. What such societies need is a new attitude to water, not a new water supply. It is in this sense that desalination, which fits a familiar supply paradigm, caters to the edifice complex of institutions and politicians, and offers up opportunities of a new stream of contracts to the infrastructure industry, is essentially a distraction to the need to use all water wisely for the maintenance of both human societies and the natural systems on which they depend. The World Bank, in conducting a study of desalination in Asia, the Middle East and North Africa, sounded a strong and similar note of caution about desalination. "A key conclusion of the study is that desalination alone cannot deliver the promise of improved water supply. The ability to make the best use of desalination is subject to a series of wider water sector related conditions. In some countries weak water utilities, politically determined low water tariffs, high water losses and poor sector policies mean that desalinated water, just like any other new source of bulk water, may not be used wisely or that desalination plants are at risk of falling into disrepair. Under these conditions, there is a risk that substantial amounts of money are used inefficiently, and that desalination cannot alleviate water scarcity nor contribute to the achievement of the MDGs. It may be preferable not to engage in desalination on a large scale unless the underlying weaknesses of the water sector are seriously addressed. A programme to address these weaknesses should include a reduction of non-revenue water; appropriate cost recovery; limited use of targeted subsidies; sound investment planning; integrated water resources management; proper environmental impact assessments; and capacity building in desalination as well as in water resources management and utility management. In any case, desalination should remain the last resort, and should only be applied after cheaper alternatives in terms of supply and demand management have carefully been considered. (emphasis added)

1 – Turn - Conservation

A) Higher price in desal now will cause shift to conservation

Boals 9 (Connor Boals, infographics by Hannah Nester, Circle of Blue, "Drinking From The Sea," June 29, http://www.circleofblue.org/waternews/2009/world/drinking-from-the-sea-demand-for-desalination-plants-increases-worldwide/)

"The most reliable, most cost effective and most environmentally friendly source of water is conservation, increased efficiency and waste prevention," Scow said. "We have so many opportunities to save water. Those needs need to be addressed first." Many in the industry see a silver lining in the higher pricing of desalinated water: people will be thriftier and use less. "Yes, the price is obscenely high, but what's the alternative if you don't have any water?" Pankratz said. "Until we look at water differently and start valuing it for what its real cost is, we won't have a good picture, and people won't be conserving water like they should." Palmer said that the pricing of water in Australia has always been too cheap. "We are the driest continent, and our prices for municipal water are about half of what people charge in Europe, where there is admittedly more water," he said. "[Desalinated] water is three times more expensive, therefore you don't want to waste it," he said. "So water authorities have to charge accordingly, and people will use less water and waste less water."

B) Conservation alone can solve the world's water problems

Bouguerra 8 (Mohamed Larbi Bouguerra, "Environmental and economic challenges of water desalination," 02/2008, http://base.d-ph.info/fr/fiches/dph/fiche-dph-7355.html, author's lecture during the roundtable on «Natural resources and security» during the seminar on «Natural resources» organized on the 18th of January 2008 by the French Embassy in Amman and the Institut Français du Proche-Orient.)

For some analysts, water desalination may appear as a technological fix to the water needs of our modern societies or, sometimes, as a political trick as in the case of the Israeli-Palestinian conflict. Natural resources such as water are of course limited and finite. Desalination is deceiving. It's a fool's paradise rubbing that fact. Unlimited abundance in any field or realm is a hoax. Rather, one must take into account all the techniques aiming at wise water use, at conserving the resource, and processes intended to save water. One must manage water in order to eliminate leakages, which amount up to 20-30% on average worldwide (NAFW, not accounted for water). According to recent studies, it appears that conservation measures may meet the new water needs for a cost which is 10 to 25% of incurred expenses of water desalination. In that regard, water efficiency must be improved. Leakages and wastings must be eliminated. According to the Washington-based Worldwatch Institute, we can thus avoid desalination and its negative effects on the environment and the atmosphere. Finally one must point to the fact that desalinated water quality must be carefully monitored for bromate, a suspected carcinogen. According to international regulations, bromate levels may not exceed 10 ppb on average over a year in a reservoir.

2. Turn – Trade-off – Desal stops better water-saving options – and it causes unsustainable urban development

Dickie 7 (Phil Dickie, MAKING WATER: Desalination: option or distraction for a thirsty world? This report was prepared for WWF's Global Freshwater Programme by Phil Dickie (www.melaleucamedia.com), June 2007, http://waterwebster.org/documents/desalinationreportjune2007.pdf)

All of the areas where seawater desalination is rapidly assuming a more prominent water supply role had more cost effective and less potentially environmentally damaging alternatives available. This is particularly true of demand management, water conservation and water efficiency measures, where many of even the more advanced economies such as Australia do not uniformly require easily achievable water and energy efficiency standards in new buildings. The extent to which a furore in favour of desalination is associated with unsustainable urban development, excess water intensive tourism development for arid areas, and unsustainable arid area export agriculture is also disturbing. Many of these relatively dry or drying areas have high levels of water consumption. Many of the areas where there is most intensive desalination activity also have a history of damaging or degrading natural water resources, particularly groundwater. What such societies need is a new attitude to water, not a new water supply. It is in this sense that desalination, which fits a familiar supply paradigm, caters to the edifice complex of institutions and politicians, and offers up opportunities of a new stream of contracts to the infrastructure industry, is essentially a distraction to the need to use all water wisely for the maintenance of both human societies and the natural systems on which they depend. The World Bank, in conducting a study of desalination in Asia, the Middle East and North Africa, sounded a strong and similar note of caution about desalination. "A key conclusion of the study is that desalination alone cannot deliver the promise of improved water supply. The ability to make the best use of desalination is subject to a series of wider water sector related conditions. In some countries weak water utilities, politically determined low water tariffs, high water losses and poor sector policies mean that desalinated water, just like any other new source of bulk water, may not be used wisely or that desalination plants are at risk of falling into disrepair. Under these conditions, there is a risk that substantial amounts of money are used inefficiently, and that desalination cannot alleviate water scarcity nor contribute to the achievement of the MDGs. It may be preferable not to engage in desalination on a large scale unless the underlying weaknesses of the water sector are seriously addressed. A programme to address these weaknesses should include a reduction of non-revenue water; appropriate cost recovery; limited use of targeted subsidies; sound investment planning; integrated water resources management; proper environmental impact assessments; and capacity building in desalination as well as in water resources management and utility management. In any case, desalination should remain the last resort, and should only be applied after cheaper alternatives in terms of supply and demand management have carefully been considered. (emphasis added) A second conclusion is that the private sector can play a useful and important role in funding and operating desalination plants, but only if the above conditions are met. If these conditions are absent, there is a risk that excessive investments in desalination become a drain to the national budget, either directly under public financing or indirectly through implicit or explicit guarantees under private financing." 72

3 - Turn – Coastal Growth – Desal causes it – results in water shortages, ag runoff destroying natural sources of water

Cooley 6 (Heather Cooley, B.S. in Molecular Environmental Biology and M.S. in Energy and Resources from the University of California at Berkeley; Peter H. Gleick, co-founder and President of the Pacific Institute for Studies in Development, Environment, and Security in Oakland, California; and Gary Wolff, Ph.D., Principal Economist and Engineer, B.S. in Renewable Energy Engineering Technology from Jordan College in 1982, M.S. in Civil and Environmental Engineering from Stanford University in 1984, and Ph.D. in Resource Economics from the University of California at Berkeley in 1997. DESALINATION, WITH A GRAIN OF SALT, Pacific Institute, June 2006, http://www.pacinst.org/reports/desalination/desalination_report.pdf, accessed July 24, 2007)

In addition to affecting the coastal environment through water intake and discharge, desalination can also affect the coast through impacts on developments, land use, and local growth, which are often controversial and contentious topics. Rapid, unplanned growth can damage local environmental resources as well as the social fabric of a community anywhere. For example, building new homes and businesses without investing in infrastructure can cause overcrowded schools, traffic, and water shortages. Urban and agricultural runoff and increases in wastewater flows create water-quality problems in local rivers, streams, and/or the ocean. Coastal developments are often particularly divisive. Some developments can change the nature of views, beach access, and other environmental amenities.

4. Efficiency outweighs desal – only way to stop ag overuse – makes up 70%

Kaldany 12 (Rashad Kaldany, vice president of global industries at the International Finance Corporation, "Africa: The Water Crisis is Now," July 4, http://allafrica.com/stories/201207050683.html)

One promising conclusion of the group's work is that investments in efficiency can make a huge difference at a reasonable cost. In some countries the greatest room for increased efficiency is in the industrial sector. In China, for example, it takes almost 3000 litres of water to produce one cotton shirt, so water savings here could have dramatic effects. However, most countries should focus first on the agricultural sector, since it uses 70 percent of water worldwide - with half of it wasted. Investment in more efficient irrigation makes a big difference. The Indian company Jain Irrigation, the second largest drip irrigation company in the world, is a good example. IFC has helped it expand its operations in India, where its microirrigation products have resulted in water savings equal to the annual consumption of about 15m households. Jain is now expanding to Africa, a promising initiative in South-South cooperation. This is only one of many examples of innovations that can help contain the water crisis. Investors, governments, and international organisations can and must work together, and they must do so now. Since water is a common good, its use and conservation require common solutions.

Turns

Efficiency outweighs desal – only way to stop ag overuse – makes up 70%

Kaldany 12 (Rashad Kaldany, vice president of global industries at the International Finance Corporation, "Africa: The Water Crisis is Now," July 4, http://allafrica.com/stories/201207050683.html)

One promising conclusion of the group's work is that investments in efficiency can make a huge difference at a reasonable cost. In some countries the greatest room for increased efficiency is in the industrial sector. In China, for example, it takes almost 3000 litres of water to produce one cotton shirt, so water savings here could have dramatic effects. However, most countries should focus first on the agricultural sector, since it uses 70 percent of water worldwide - with half of it wasted. Investment in more efficient irrigation makes a big difference. The Indian company Jain Irrigation, the second largest drip irrigation company in the world, is a good example. IFC has helped it expand its operations in India, where its microirrigation products have resulted in water savings equal to the annual consumption of about 15m households. Jain is now expanding to Africa, a promising initiative in South-South cooperation. This is only one of many examples of innovations that can help contain the water crisis. Investors, governments, and international organisations can and must work together, and they must do so now. Since water is a common good, its use and conservation require common solutions.

Desal causes overuse in the ME, raises energy prices, and stops necessary conservation efforts

Barton 12 (Alexandra, "Water in Crisis – Middle East," http://thewaterproject.org/water-in-crisis-middle-east.php)

Desalination plants are an overuse of water resources in the Middle East. Seventy percent of desalination plants in the world are located in this area, found mostly in Saudi Arabia, the United Arab Emirates, Kuwait, and Bahrain. While the plants produce water needed for the arid region, they can manufacture problems for health and the environment. The seawater used most in desalination plants has high amounts of boron and bromide, and the process can also remove essential minerals like calcium. Also, the concentrated salt is often dumped back into oceans where the increased salinity affects the ocean's environment. The plants harm local wildlife and add pollutants to the region's climate. In addition, desalination is the most energy-costing water resource. The Pacific Institute explains that the high use of energy results in raised energy prices and higher prices on water produced, hurting the consumer. The water produced can be beneficial towards substituting any lack of freshwater, but these areas have tendencies towards overuse of their natural resources. Concerns with the large amount of desalination plants in the Middle East focus on the improper dependency they will cause, instead of encouraging alternate forms of water and energy and conserving freshwater. The Middle East has numerous struggles with its current water resources, and the region needs more than one solution to generate an optimistic environmental position for the future.

Intentional oil spills will destroy desal plants

Lovell 98 (James E. Lovell, Maj, USAF, "The Threat of Intentional Oil Spills to Desalination Plants in the Middle East: A U.S. Security Threat," Air Command and Staff College, Air University, a research report submitted to the faculty in partial fulfillment of the graduation requirements, advisor: Major Robert L. Fant, Maxwell Air Force Base, Alabama, April 1998, http://www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA398768)

The New York Times articles discussed in chapter one best indicate the growing awareness of the vulnerability of desalination plants to oil. However, there are some other interesting indications that should be noted. Perhaps most dramatic are recent statements by former deputy assistant secretary for energy emergencies and the U.S. Department of Energy, Edward Badolato. He said the "U.S. government 'is doing nothing' to anticipate sabotage of pumping stations, treatment plants, pipelines, or dams in the Middle East." 1 When commenting on U.S. plans to defend international water facilities, Baldolato said, "We're not equipped to deal with it. We haven't focused on the water problem. We're barely capable of focusing on oil." 2 John Bullock and Adel Darwish, in their book Water Wars, are also very clear about how vulnerable they see desalination facilities. Every desalination plant built is a hostage to fortune; they are easily sabotaged; they can be attacked from the air or by shelling from off-shore; and their intake ports have to be kept clear, giving another simple way of preventing their operation. 3 Why is Oil a Threat? Oil is a threat for two primary reasons. First, it contains pollutants not normally found in sea water that the desalination facilities do not normally have to remove. For instance, benzene is a human carcinogen contained in oil that can not exceed 5 parts per billion in potable water. 4 Continued operation of desalination plants with even a small amount of processed benzene could pose a threat to public health. Second and perhaps the most obvious reason oil threatens desalination plants is the damage it can do to sea water intake filters and heat exchangers. Oil in sea water can take the form of the well recognized slick, but it can also form large tarballs and "sunken oil globs" that can be drawn into intake filters. 5 Obviously, the oil can then foul the filters limiting the amount of water intake as well as foul internal membranes disrupting the reverse osmosis process if affecting a reverse osmosis facility. The required clean-up and problems caused by oil affecting a multistage flash facility are no less troublesome. "If oil enters a (multi-stage flash) desalination plant it may be necessary to first use a solvent to loosen the oil particles. Next, the plant should be flushed with soap and water and finally it should be flushed with fresh water. Oil clings to heat transfer surfaces, disrupting the heat transfer process." 6 Proximity Concerns The Middle East is the one place in the world uniquely positioned to have the highest concentration of both desalination and oil facilities, thus providing many opportunities for the terrorist scenario suggested by this paper. The Central Intelligence Agency's "Middle East Area Oil and Gas Map" shows a stunning concentration of oil facilities in the region. The Persian Gulf has 29 major tanker ports and 16 major shore-based refineries. 7 Kuwait alone accounts for 4 tanker ports and 4 coastal refineries 8 while the International Desalination Association's plant inventory indicates Kuwait had 23 major desalination facilities operational by 1997. 9 Considering its north/south coastline is only approximately 100 miles long, Kuwait clearly has a concentration of facilities that makes another Gulf War spill scenario very possible. The United Arab Emirates is another example of how concentrated the desalination and oil facilities are. Ten tanker ports and three coastal refineries 10 share approximately 300 miles of coastline with 40 major desalination plants. 11 Other Middle East countries share a similar story, so the point to be made here is an obvious one; many desalination plants share limited coastal areas with many oil facilities. Thus, proximity alone presents another threat to desalination plants in the Middle East.

Problem in Pakistan is WATER MANAGEMENT and USE – increasing supply or desal will not solve the problem

Shiekh 12 (Naseem Shiekh, "No Doubt Pakistan's Water Crisis Is Predominantly Manmade – OpEd," Eurasia Review, http://www.eurasiareview.com/23012012-no-doubtpakistan%E2%80%99s-water-crisis-is-predominantly-manmade-oped/)

There is no doubt Pakistan's water crisis is predominantly a (hu)manmade (sic) problem. Pakistan's climate is not particularly dry, in fact semi arid to arid, nor is it lacking in rivers and groundwater. Extremely poor management, unclear laws, government corruption, and industrial and human waste have caused this water supply crunch and rendered what water is available practically useless due to the huge quantity of pollution. According to World Bank report of 2006 Pakistan was fast moving from being a water-stressed country to a water-scarce country, primarily because of its high population growth, over-exploitation of ground water, pollution, poor repair in water infrastructures and financially no sustainability of water management system. Interestingly, the country's large parts have good soil, sunshine and excellent farmers and these can get much more value from the existing flows. The most water-rich country in terms of the run-off from rain-fall to population is Iceland, with more than 500,000 cubic meters per person per year; the most water-poor are Egypt, with just 0.02 cubic meters. Water is absolutely essential for plant life. It is pertinent to mention here that the major source of drinking water in Pakistan is groundwater, so water availability is the second most serious issue. Future water demand will be affected by many factors, including population growth, wealth and sharing. Globally, it is estimated that between half a billion and almost two billion people are already under high water stress, and this number is expected to increase significantly by 2025, due primarily to population growth and increasing to climate change. We live in an agricultural region; water is key for survival, water lost through mismanagement mainly. A big investment in the repair of existing dams and the large scale construction of new water storage is simple solution of problem. In managing water resources, the Pakistani government must balance competing demands between urban and rural, rich and poor, the economy and the environment. However, because people have triggered this crisis, by changing their actions they have the power to prevent water scarcity from devastating Pakistan's population, agriculture, and economy.

Saudi Arabia will use solar desal now – ensures low cost water in that country for the long-term

Picow 10 (Maurice Picow, writes feature articles for The Jerusalem Post as well as being a regular contributor to Green Prophet; he has also written a non-fiction study on Islam. "Saudi Arabia to Replace Oil with Sun Power for Desalination Plants," February 1st, http://www.greenprophet.com/2010/02/saudi-arabia-desalination-solar/)

You would imagine that a desert country like Saudi Arabia would have to rely a lot on desalination for a good part of the fresh water it uses. For example, a previous Green Prophet article told about the Kingdom building what they say is the world's largest desalination plant in the Al Jubail Industrial Zone on the shores of the Persian Gulf. Up to now, the more than 28 desalination plants scattered around the Kingdom have had to rely on fossil fuel, most notably fuel oil, to provide the power to run the equipment used to extract salt and other minerals from sea water. Much of this may be changing, however, as Saudi Arabia is now interested in using solar energy to provide the power needed, instead of oil. According to an article on the UAE Top News media site, the Kingdom is now planning to build solar energy based desalination plants in order to save on energy costs, as well as be in tune with new environmental policies. This might be to secure membership in the International Renewable Energy Agency, otherwise known as IRENA. Saudi Finance Minister Ibrahim Al Assaf said "desalination is our strategic choice to supply an adequate supply of drinking water to people across the Kingdom." He added that by using solar energy instead of oil, it will focus more on using renewable energy and even become an exporter of this clean form of energy as it has been doing with oil. A tremendous amount of oil is currently being used to provide power for the country's desalination plants; around 1.5 million barrels per day. This has caused the price of desalinated water to rise as oil prices have risen. The use of solar energy to power desalination plants is just one of several projects in the Kingdom that are more environmentally friendly. The Kingdom is also embarking on projects to improve its inland transport systems including building a high speed train network to carry pilgrims to and from the annual Hajj pilgrimage in the Holy Cities of Mecca and Medina. The new rail network, when completed, will be able to carry large numbers of people, and help to eliminate many of the thousands of buses which are currently used. In addition to desalination, solar energy will also be supplying energy to a country which has been historically known as being a world supplier of oil, especially to countries like the US. Solar energy will eventually enable the Saudis to not only have a renewable energy source to supply their own energy needs but will significantly reduce the cost of fresh water, as well as being able to export renewable energy, as well as oil.

Uniqueness

Short term scarcity in the squo key to high water prices

Standard and Poors 12 ("Is The U.S. Water Sector Approaching A Tipping Point?", March 7, 2012, http://www.standardandpoors.com/spf/swf/water/data/document.pdf)

Water scarcity can force a utility to spend more on expensive marginal sources of drinking water (such as desalination and wastewater reuse) or reduce the volume available to customers, which means utilities must raise the price per unit of water sold so total revenues will cover fixed costs.

High water prices cause tech innovations, water reuse and recycling

Standard and Poors 12 ("Is The U.S. Water Sector Approaching A Tipping Point?", March 7, 2012, http://www.standardandpoors.com/spf/swf/water/data/document.pdf)

Yet, evidence is mounting that water stress is increasing, and water prices in the U.S. will inevitably have to rise. Over time, as stress turns into scarcity and regulators face requests for significant rate increases, economic decisions will have to be depoliticized. Still, we believe that as prices rise, so will incentives for technological innovations, ways to reduce demand and opportunities to recycle and reuse this commodity. Innovations will also occur in the financial markets and in the structure adopted by sponsoring entities. For example, the introduction of public/private partnerships such as leases and concession contracts can introduce competition and provide greater flexibility for private-sector providers to meet the needs of municipally owned water utilities.

Link Turn

Desal causes massive emissions –

Rosenfeld 11 (David, "Conservationists Push Back Against Desalination in California," http://www.dcbureau.org/20110303169/natural-resources-newsservice/conservationists-push-back-against-desalination-in-california.html)

There are problems too with desalination's byproduct, the heavy concentrates of salt and the remains of other chemicals that could be dumped into the ocean. Desalination also has a massive carbon footprint. For the most common type of ocean desalination method called reverse osmosis, which pushes water through membranes, some 40 percent of the operating cost is electricity to power the plant. The $700 million proposed plant in Carlsbad by investor-owned Poseidon Resources expects to satisfy around 8 percent of San Diego County's water supply while at the same time consuming as much electricity as 45,000 homes. Greenhouse gas emissions would total about 200 million pounds a year, according to the project's environmental impact assessment. Advocates say the technology is becoming more efficient by re-capturing energy and using renewable resources as much as possible. But it is a lot to overcome.

Makes water scarcity worse

Nicholas Stern—Head of the British Government Economic Service—2007 (Former Head Economist for the World Bank, I.G. Patel Chair at the London School of Economics and Political Science, "The Economics of Climate Change: The Stern Review," the report of a team commissioned by the British Government to study the economics of climate change, led by Siobhan Peters, Head of G8 and International Climate Change Policy Unit, Cambridge University Press, p. 62-63)

People will feel the impact of climate change most strongly through changes in the distribution of water around the world and its seasonal and annual variability. Water is an essential resource for all life and a requirement for good health and sanitation. It is a critical input for almost all production and essential for sustainable growth and poverty reduction. 12 The location of water around the world is a critical determinant of livelihoods. Globally, around 70% of all freshwater supply is used for irrigating crops and providing food. 22% is used for manufacturing and energy (cooling power stations and producing hydro-electric power), while only 8% is used directly by households and businesses for drinking, sanitation, and recreation. 13 Climate change will alter patterns of water availability by intensifying the water cycle. 14 Droughts and floods will become more severe in many areas. There will be more rain at high latitudes, less rain in the dry subtropics, and uncertain but probably substantial changes in tropical areas.15 Hotter land surface temperatures induce more powerful evaporation and hence more intense rainfall, with increased risk of flash flooding. Differences in water availability between regions will become increasingly pronounced. Areas that are already relatively dry, such as the Mediterranean basin and parts of Southern Africa and South America, are likely to experience further decreases in water availability, for example several (but not all) climate models predict up to 30% decrease in annual runoff in these regions for a 2°C global temperature rise (Figure 3.2) and 40 – 50% for 4°C. 16 In contrast, South Asia and parts of Northern Europe and Russia are likely to experience increases in water availability (runoff), for example a 10 – 20% increase for a 2°C temperature rise and slightly greater increases for 4°C, according to several climate models. These changes in the annual volume of water each region receives mask another critical element of climate change – its impact on year-to-year and seasonal variability. An increase in annual river flows is not necessarily beneficial, particularly in highly seasonal climates, because: (1) there may not be sufficient storage to hold the extra water for use during the dry season, 17 and (2) rivers may flood more frequently.18 In dry regions, where runoff one-year-in-ten can be less than 20% of the average annual amount, understanding the impacts of climate change on variability of water supplies is perhaps even more crucial. One recent study from the Hadley Centre predicts that the proportion of land area experiencing severe droughts at any one time will increase from around 10% today to 40% for a warming of 3 to 4°C, and the proportion of land area experiencing extreme droughts will increase from 3% to 30%.19 In Southern Europe, serious droughts may occur every 10 years with a 3°C rise in global temperatures instead of every 100 years if today's climate persisted. 20 As the water cycle intensifies, billions of people will lose or gain water. Some risk becoming newly or further water stressed, while others see increases in water availability. Seasonal and annual variability in water supply will determine the consequences for people through floods or droughts. Around one-third of today's global population live in countries experiencing moderate to high water stress, and 1.1 billion people lack access to safe water (Box 3.3 for an explanation of water stress). Water stress is a useful indicator of water availability but does not necessarily reflect access to safe water. Even without climate change, population growth by itself may result in several billion more people living in areas of more limited water availability. The effects of rising temperatures against a background of a growing population are likely to cause changes in the water status of billions of people. According to one study, temperature rises of 2°C will result in 1 – 4 billion people experiencing growing water shortages, predominantly in Africa, the Middle East, Southern Europe, and parts of South and Central America (Figure 3.3). 21 In these regions, water management is already crucial for their growth and development. Considerably more effort and expense will be required on top of existing practices to meet people's demand for water. At the same time, 1 – 5 billion people, mostly in South and East Asia, may receive more water. 22 However, much of the extra water will come during the wet season and will only be useful for alleviating shortages in the dry season if storage could be created (at a cost). The additional water could also give rise to more serious flooding during the wet season.

Misc

Turn - Plankton

A) Desal destroys plankton and marine species

Matthews 11 (Richard Matthews is a consultant, eco-entrepreneur, green investor and author of numerous articles on sustainable positioning, enviro-politics and eco-economics. He is the owner of THE GREEN MARKET, a leading sustainable business blog and one of the Web's most comprehensive resources on the business of the environment. "Are Desalination Technologies the Answer to the World Water Crisis?", http://globalwarmingisreal.com/2011/03/23/aredesalination-technologies-the-answer-to-the-world-water-crisis/)

In addition to its high cost, desalination technologies are harmful to the environment. Removing salt from seawater produces brine, which contains twice the salt of seawater; they also contain contaminants that can affect marine life when dumped back to the sea. If brine is disposed on land, it could seep through the soil and pollute water reserves underground. The US Environmental Protection Agency found that desalination plants kill at least 3.4 billion fish and other marine life annually. This represents a $212.5 million loss to commercial fisheries. Desalination plants can also destroy up to 90 percent of plankton and fish eggs in the surrounding water.

B) Plankton key to all life on earth

IBMEC 12 (Island Bay Marine Education Centre, a not-for-profit charitable organisation focussed on conservation through education, promoting the on-going care and sustainable use of Our Ocean, http://www.octopus.org.nz/Plankton.html)

WHY SHOULD WE CARE ABOUT PLANKTON? Plankton are the basis of all life in the ocean and food for larger marine animals from shellfish to large fish and even whales. The largest fish in the world, the Whale Shark, is a plankton feeder and "krill", one of the ocean's smallest animals, is dinner for its largest, the blue whale! Studying plankton can tell scientists about water quality and the amount of nutrients in different areas of the oceans, and how many fish there are likely to be in future years. Almost 70% of the oxygen we breathe comes from the oceans and is made by phytoplankton. Without phytoplankton, there would be no life in the oceans or on Earth!! Plankton also absorbs most of the carbon dioxide (CO2) in the atmosphere (caused by cutting down forests and burning fossil fuels) by converting it to oxygen (O2) or by sinking it to the bottom of the sea where it can't escape. Land plants are really important too, but the health of the oceans is even more important. Plankton are the most abundant life form on Earth, except for bacteria. In fact, all the plankton in the oceans weigh more than all the dolphins, fish and whales put together! Plankton may be microscopic in size, but they play a giant role in the Earth's ecosystems!! Plankton is very important for all life on this planet. Without it both the ocean and the land would become a desert. Where there's lots of sunlight, phytoplankton grows quickly, mopping up carbon dioxide, releasing oxygen and providing food for zooplankton and the rest of the ocean food web including whales. When plankton die they fall to the bottom of the ocean and break down like compost and help fertilise new plankton growth. But not all dead plankton breaks down quickly. Some of it gets buried in layers of sand and mud which builds up over time crushing and heating the plankton and causing chemical changes.

Turn – Endocrine Disruption

A) Desalination causes boron pollution – resulting in mass endocrine disruption

Food and Water Watch 9 ("Desalination: An Ocean of Problems," Food & Water Watch is a nonprofit consumer organization that works to ensure clean water and safe food, http://documents.foodandwaterwatch.org/doc/Desal-Feb2009.pdf)

Environmental damage is not the only danger from ocean desalination. Desalted water also puts the drinking water supply at risk because both seawater and brackish water can contain chemicals that freshwater does not. These contaminants include chemicals such as endocrine disruptors, pharmaceuticals, personal care products and toxins from marine algae. 85 Some of these contaminants may not be adequately removed in the reverse osmosis process.

Poseidon Resources, Inc. After masterminding the failed Tampa Bay venture, Poseidon Resources, Inc. is trying its hand at ocean desalination a second time. Its proposed plant in Carlsbad, California, would be the largest ocean desalination plant in the western hemisphere — twice as large as the Tampa Bay plant. Poseidon Resources has been trying to get its plan approved for the last 10 years. The company has been relentless in its marketing, however, and is now promising that its plant will be carbon neutral. This claim is misleading. Poseidon's calculation assumes that the amount of energy used by the desalination plant will be mostly offset by the energy that would have been required to import the same amount of water. However, there is insufficient evidence that desalinated water will actually replace imported water in the California water supply. 70 Unfortunately, the Coastal Commission, the governmental body charged with protecting the state's coast, approved a permit for the plant in August 2008. 71 This sets a dangerous precedent. If the plant is built without proper consideration for social and environmental impacts, it may become the first in a long line of polluting, damaging plants along the California coast. Two conservation groups have filed a lawsuit against the San Diego Regional Water Quality Control Board, charging that the board did not adequately study how the plant would harm marine life. 72 In December 2008, the San Diego County Water Authority requested $175 million from the federal government as part of its economic stimulus package to subsidize the $300 million project, which it would give to Poseidon in exchange for the company reducing its rates for the agencies buying the water. 73 The company has yet to secure financing for the movement of the water from the project, despite the fact that it is scheduled for construction in 2009. The federal taxpayer dollars would enable the company to realize a profit faster, while ratepayers will still be paying more than market price for desalinated water. 74

Boron is a chemical of particular concern because much higher levels are found in seawater than freshwater. However, membranes can remove only between 50 and 70 percent of this element. The rest is concentrated in the product water, which enters the drinking water system. 86 While it is possible to remove more boron with a second process, existing plants don't because it is too costly. 87 This is a major problem for the drinking water system because boron is known to cause reproductive and developmental problems in experimental animals and irritation of the human digestive tract. 88 Moreover, the world's largest ocean desalination plant in Ashkelon, Israel found that the boron in the desalted water acted as an herbicide when applied to crops. 89 Current drinking water regulations do not protect the public from boron. Recently, EPA made the preliminary determination that it would not regulate the element as a primary contaminant under the Safe Drinking Water Act because of its low occurrence in traditional sources of drinking water. 90 However, the studies that EPA used to make this decision did not take into account the hike in boron levels that would occur if desalted water was to be added to the system.

B) Endocrine disruption causes extinction

NRDC 2003 (Natural Resources Defense Council, "Toxic Chemicals And Health," http://www.nrdc.org/health/effects/bendrep.asp, accessed 9-12)

Experiments in lab animals indicate that while high doses of PCBs can be toxic, lower doses can cause hypo- or hyper-activity, impaired performance on tests of learning, balance, reaction time, and impaired hearing. The doses of exposure that result in behavioral abnormalities, sex hormone abnormalities, and enzyme abnormalities are close to the current exposure levels in humans. We are concerned about endocrine disruption because this is a means by which subtle effects from human actions can have species- and population-extinction outcomes. Small, but critical, changes in the chemical makeup of an environment are enough to trigger outcomes that could lead to population decline and loss of biodiversity.

Case Neg

***Inherency

1NC DOE

DOE already contracted out SMRs to the industry—solves widespread adoption

McMahon 12

(Jeff, contributing writer for Forbes, May 25, 2012, “Small Modular Nuclear Reactors By 2022 -- But No Market For Them” http://www.forbes.com/sites/jeffmcmahon/2012/05/23/small-modular-reactors-by-2022-but-no-market-for-them/)

The Department of Energy will spend $452 million—with a match from industry—over the next five years to guide two small modular reactor designs through the nuclear regulatory process by 2022. But cheap natural gas could freeze even small nuclear plants out of the energy market well beyond that date. DOE accepted bids through Monday for companies to participate in the Small Modular Reactor program. A number of reactor manufacturers submitted bids, including NuScale Power and a collaboration that includes Westinghouse and General Dynamic. “This would allow SMR technology to overcome the hurdle of NRC certification – the ‘gold standard’ of the international nuclear industry, and would help in the proper development of the NRC’s regulatory framework to deal with SMRs,” according to Paul Genoa, Senior Director of Policy Development at the Nuclear Energy Institute. Genoa’s comments are recorded in a summary released today of a briefing given to Senate staff earlier this month on prospects for small modular reactors, which have been championed by the Obama Administration. DOE defines reactors as SMRs if they generate less than 300 megawatts of power, sometimes as little as 25 MW, compared to conventional reactors which may produce more than 1,000 MW. Small modular reactors can be constructed in factories and installed underground, which improves containment and security but may hinder emergency access. The same summary records doubt that SMRs can compete in a market increasingly dominated by cheap natural gas. Nuclear Consultant Philip Moor told Senate staff that SMRs can compete if natural gas costs $7 to $8 per million BTU—gas currently costs only $2 per MBTU—or if carbon taxes are implemented, a scenario political experts deem unlikely.

2NC DOE

Here’s more ev: DOE is taking the wheel on SMRs now

Chu and Majumdar 12

*Steven, US Department of Energy **Arun, US Department of Energy

“Opportunities and challenges for a sustainable energy future,” Nature 488, 294–303 (16 August 2012)

The US DOE intends to support the engineering designs required by the Nuclear Regulatory Commission for licensing small-modular reactors between 80 MW and 300 MW (refs. 76, 77). It is possible that safe nuclear power can be made more accessible through the economy of constructing dozens of reactors in a factory rather than one at a time at each site. Also, with the risk of licensing and construction delays reduced, small-modular reactors may represent a new paradigm in nuclear construction. The US DOE has also established an Energy Innovation Hub to develop multiphysics computational simulation tools to reduce the time needed to design and certify many aspects of both conventional reactors and small-modular reactors (ref. 78).

Squo commercializes SMRs by 2020

NEI 12

(“Small Reactors,” http://www.nei.org/keyissues/newnuclearplants/small-reactors/)

The Energy Department’s 2012 budget includes research and development funding for small reactors. Legislation recently introduced in the U.S. Senate, S. 512, the Nuclear Power 2021 Act, and S. 1067, the Nuclear Energy Research Initiative Improvement Act, would help move two reactor designs through the safety certification process and study ways to reduce manufacturing and construction costs. The first small light water reactor could be licensed and in operation around 2020. The other types—high-temperature gas-cooled reactors and liquid-metal cooled and fast reactors—will follow.

***Solvency—Generic

Industry Indict 1NC

SMRs fail, and their authors should be viewed with skepticism

-unachievable cost assumptions
-wrong assumptions about a rush to market
-safety regulations

Cooper 14

(Mark, Senior Fellow for Economic Analysis @ Institute for Energy and the Environment at Vermont Law School, May 15, 2014, “The Economic Failure of Nuclear Power and the Development of a Low-Carbon Electricity Future: Why Small Modular Reactors Are Part of the Problem, Not the Solution” http://216.30.191.148/Cooper%20SMRs%20are%20Part%20of%20the%20Problem,%20Not%20the%20Solution%20FINAL2.pdf)

Unachievable assumptions about cost: Even industry executives and regulators believe the SMR technology will have costs that are substantially higher than the failed “nuclear renaissance” technology on a per unit of output. The higher costs result from • lost economies of scale in containment structures, dedicated systems for control, management and emergency response, and the cost of licensing and security, • operating costs between one-fifth and one-quarter higher, and • decommissioning costs between two and three times as high.

Irresponsible assumptions about a rush to market: To reduce the cost disadvantage and meet the urgent need for climate policy, advocates of SMR technology propose to deploy large numbers of reactors (50 or more), close to population centers, over a short period of time. This compressed RD&D schedule embodies a rush to market that does not make proper provision for early analysis, testing, and demonstration to provide an opportunity for experience-based design modifications. This is exactly the problem that arose in the 1970s, when utilities ordered 250 reactors and ended up cancelling more than half of them when the technology proved to be expensive and flawed.

Unrealistic assumptions about the scale of the sector: While each individual reactor would be smaller, the idea of creating an assembly line for SMR technology would require a massive financial commitment. If two designs and assembly lines are funded to ensure competition, by 2020 an optimistic cost scenario suggests a cost of more than $72 billion; a more realistic level would be over $90 billion. This massive commitment reinforces the traditional concern that nuclear power will crowd out the alternatives. Compared to U.S. Energy Information Administration (EIA) estimates of U.S. spending on generation over the same period, these huge sums are equal to • three quarters of the total projected investment in electricity generation and • substantially more than the total projected investment in renewables.

Radical changes in licensing and safety regulation: SMR technologies raise unique safety challenges including inspection of manufacturing and foreign plants, access to below ground facilities, integrated systems, waste management, retrieval of materials with potentially higher levels of radiation, flooding for below-ground facilities, and common designs that create potential “epidemic” failure. Yet, SMR advocates want pre-approval and limited review of widely dispersed reactors located in close proximity to population centers and reductions in safety margins, including shrinking containment structures, limitations of staff for safety and security, consolidation of control to reduce redundancy, and much smaller evacuation zones. In the wake of global post-Fukushima calls for more rigorous safety regulation, policymakers and safety regulators are likely to look askance at proposals to dramatically relax safety oversight.

Commercialization 1NC

SMRs can’t be commercialized—laundry list of reasons

Clean Air Task Force 2012

(Nonprofit organization dedicated to reducing atmospheric pollution through research, advocacy, and private sector collaboration. March 2012, “The Nuclear Decarbonization Option: Profiles of Selected Advanced Reactor Technologies” http://www.catf.us/resources/publications/files/Nuclear_Decarbonization_Option.pdf)

Note: smLWR = small modular light water reactor

Challenges to Commercialization The smLWRs have a number of challenges to successful commercialization. These include regulatory and economic challenges and overcoming the negative public reaction to the Fukushima accident. There are a number of regulatory issues related to the licensing and deployment of smLWRs in spite of the fact that the current licensing base is built around light water reactors. The NRC regulations are designed for large plants. All of the PWRs in the US are non-integral, i.e. separate reactor pressure vessel, steam generators, reactor coolant pumps and pressurizers. Some of the major issues under consideration by the US NRC and the principal contentions are presented below:

• Nuclear insurance – Price Anderson – The premium payments paid annually by the owner/operators for Price Anderson liability insurance are based on the number of reactors owned. The premium is based on large LWRs. It is believed that the second party liability incurred by smLWRs are significantly less than for plants with core thermal ratings as 30 times greater. Proposals to amend the premium structure to account for core power output are under development.

• Annual fees – The NRC assesses their annual fees to the licensees on the basis of the number of plants. The NRC has to collect 90% of their annual operating budget from direct fees charged for review, licensing and inspection or indirect fees charged to each nuclear power plant. In 2010, each nuclear plant was charged $4,719,000 regardless of the power output. It is felt that each smLWR should not be charged the same amount as the larger plants. A sliding scale is proposed to deal with the size discrepancies.

• Staffing – Current control room staffing requirements are based on large reactors with fully analog control room technology. The control rooms and I&C systems for the smLWRs should be fully digital, possibly with a separate analog system to provide redundancy and diversity in the shutdown of the smLWRs. The inherent safety of the new smLWR designs in conjunction with the fully digital control systems with a high degree of automation should permit the safe operation of the smLWRs without the tradition one control team for each reactor, used in the existing plants. Alternative staffing requirements are under discussion.

Summary Table of smLWRs: The following summary table compares and contrasts the three smLWR designs. The information in the table is provided by the NSSS designers. The decay heat removal time listed is the minimum time without any intervention. One design can theoretically go indefinitely with no intervention. The others require minimal intervention, such as filling of unpressurized tanks and pools, to maintain adequate decay heat removal for an indefinite period of time. [smLWR summary table not reproduced]

• Security – Security requirements for US LWRs have increased substantially since the terrorist events of 11 Sept 2001. The requirements are based on new threats and the ability for existing reactors to respond to those threats. The smLWR designs include security in the design and have taken major steps to reduce the security needs. For example, the entire nuclear steam supply system (NSSS), spent fuel pool and containment for all designs are located below grade. The access to control and radioactive material areas is significantly reduced over the existing plants. State of the art security and intrusion detection systems are part of the design. Therefore, it is believed that adequate security of a smLWR can be maintained with simplified security requirements. Proposed simplifications are under development for smLWRs.

• Emergency planning – size of emergency planning zones – The emergency planning and the zone of evacuation for US plants is based on the existing fleet. The smLWRs are significantly different in terms of source term in the case of a core melt event. The smLWR core damage frequencies are orders of magnitude lower than what is required in the NRC regulations. 10 The containments are located below grade and the long term cooling needs of a beyond design basis core damage event are much less. For these reasons, the industry believes the current emergency planning zones and notification requirements can be greatly simplified and still protect the health and safety of the public. Proposed simplifications of emergency planning for the smLWRs are currently under development. Such simplification is required to locate a smLWR near regions of high populations, such as those surrounding the existing coal plants that will likely be shut down. This simplification will be a major challenge in light of the 2011 Fukushima accident in Japan.

Regulatory challenges could make smLWRs noncompetitive. If the licensing of smLWRs become protracted affairs, the attractiveness of such small plants will vanish. The best hope for smLWRs to be competitive lies in the assumption that they can be licensed, built and commissioned quickly. The primary economic challenge to the commercialization of smLWRs is whether the electricity production costs are (1) affordable and (2) competitive with other forms of generation. With regard to affordability, smLWRs offer potential optionality to the US electric utilities, when the only real options for large generation additions are gas fired, coal fired or large nuclear plants. SmLWRs, being smaller and modular, potentially offer a more manageable nuclear option. SmLWRs are more ‘affordable’, i.e. less of a fiscal risk. They can be deployed in much smaller increments, matching the utilities’ load growths better and reduce the ‘single shaft’ generation risk to an acceptable level. Competing with other forms of electricity generation is a much greater challenge today. Vast amounts of natural gas are being discovered across the US in so-called tight gas (shale) deposits, resulting in cheap and abundant natural gas. The current spot market price of natural gas is less than $3.00/MMBTU. Carbon restraints (taxes or credits), which would improve the competitiveness of smLWRs, appear unlikely to arise in the near future. However it is expected that carbon emissions from large stationary sources will be reduced systematically over time one way or another, and US utilities are very interested in reducing their ‘carbon footprints’. If the economics of the smLWRs are what some of the designs claim, there is a real chance to compete with natural gas fired plants, particularly when carbon constraints are in place. The cost competitiveness of smLWR depend heavily on achieving the following opportunities:

• Streamline design and manufacturing are necessary to offset the economies of scale of other generation options, particularly nuclear plants. ALWRs are becoming larger and larger due to the economies of scale. The only prospect to reverse this effect for the smaller smLWRs is to streamline the shop fabrication of the NSSS and other modules, ship them to the site and install them rapidly. The requisite quality standards must be maintained throughout the entire process.

• Modularity of the smLWRs provides the opportunity to transform how we design, build, operate and decommission nuclear power plants.

• Reduce construction time by modularization and construction efficiencies.

• SMRs do not require loan guarantees. This sets the smLWR apart from the larger ALWR, which currently benefit from federal loan guarantees, especially for regulated utilities. Experience shows the loan guarantee process to be a protracted and expensive affair, requiring the expenditure of significant political and fiscal capital.

How the impacts of the Fukushima accident affect smLWR development and deployment is unclear. The passive nature of the safety systems and the reduced need for AC power following shutdown should be positive attributes. Likewise, the depth of the containment should mitigate certain security concerns, but may raise flooding concerns. However, the idea of locating a number, up to twelve, of smLWRs at a single plant site may become a liability in the eyes of the public. The sequential failure of the Fukushima reactors followed by the hydrogen explosions will be long lasting memories for the public. It may be difficult to convince the public that more reactors at a site is safe, in spite of the fact that the single reactor failure source term is much smaller than current reactors and that there is little chance for system interaction in the new designs.
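As an analytic on the annual fee point above, a rough pro-rating sketch is below. The $4,719,000 flat fee and the 45, 180, and 225 MWe design sizes are figures from the Clean Air Task Force evidence; pro-rating by electrical output against an assumed 1,000 MWe reference plant is only one illustrative way a sliding scale could work, not an NRC proposal.

# Illustrative sliding-scale NRC annual fee, pro-rated by electrical output.
# Flat fee and SMR sizes are from the evidence; the pro-rating rule and the
# 1,000 MWe reference size are assumptions for illustration.
flat_fee = 4_719_000        # dollars per plant in 2010, regardless of size
reference_mwe = 1000        # assumed large-plant reference

for mwe in (45, 180, 225, 1000):   # NuScale, mPower, Westinghouse SMR, large LWR
    prorated = flat_fee * mwe / reference_mwe
    print(f"{mwe:>4} MWe: flat ${flat_fee:,} vs pro-rated roughly ${prorated:,.0f}")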

Licensing Barrier 1NC

Licensing questions prevent solvency- takes too long

O’Connor ’11

(Dan O’Connor is a Policy Fellow in AEL’s New Energy Leaders Project and will be a regular contributor to the website, American Energy League, “Small Modular Reactors: Miracle, Mirage, or Between?”, http://leadenergy.org/2011/01/small-modular-reactorsmiracle-mirage-or-medium/, January 4, 2011, LEQ)

Judging only by this promising activity, it is tempting to dub the SMR a miracle. But the majority of these diverse designs have yet to be demonstrated. In fact, the demonstration stage of the South African project, Pebble Bed Modular Reactor (a HTR), stalled and faded in 2010 after losing government funding due to lack of customer interest. The importance of demonstration, especially in the highly-regulated US industry, cannot be overstated. But even in the stages before the crucial demonstration step, skepticism over the SMR’s promises abounds. The ASME EnComm noted regulatory, financial, operational, and logistical challenges. Treading the uncharted waters of Lego-like power plant construction will not be easy. In a traditional plant, one reactor provides heat for one or a few steam turbines. In an SMR-based plant, each module drives one turbine with its own controls and operators. As such, few of the costs associated with these systems scale down with reactor capacity. The turbines do not come in a complimentary plug-and-play form either – they would have to be built on site. And while decentralization enables partial operation and online refueling, it also introduces the challenge of module co-operation, the need for numerous highly-trained operator personnel, and brand new reviews by the Nuclear Regulatory Commission (NRC). This goes without mentioning the urgent and increased need for a more dynamic national approach to waste storage. Licensing questions remain too. The one-time approval of a module before its mass production, bypassing a regulatory damper for each unit, is a highly-desirable advantage of SMR design. But if a utility would like to increase its capacity over two decades by incrementally adding more modules, will it face the choice between building licensed, though dated, technology or waiting again for a license to build with state of the art modules? Furthermore, as addressed in my past article, “Putting the Cart Before the Horse with Nuclear R&D” and its comments, the waiting time even for a traditional design license is considerable. With each new SMR innovation, from an individualized control room to coolant choice, the licensing duration increases by as much as a decade, pushing the vital demonstration step further away. Additional costs associated with these regulatory complications and non-scalable systems could combine to nullify the SMR’s affordability argument.

Long Timeframe 1NC

Tech isn’t there for a decade

Tomich 12

(Jeffery, April 25, 2012, “Small nuclear reactors generate hype, questions about cost” http://www.stltoday.com/business/local/smallnuclear-reactors-generate-hype-questions-about-cost/article_39757dba-8e5c-11e1-9883-001a4bcf6878.html)

For all the hype, small reactors are still at least a decade away. And that’s if design, licensing and commercial development go at the pace hoped for by the nuclear industry. And even then, the potential for small reactors hinges on how they compete in the energy marketplace. More than concerns about nuclear safety in the wake of Fukushima disaster in Japan or the dilemma of where to dispose of highly radioactive spent nuclear fuel, the technology’s future will be dictated by economics.

Long Timeframe 2NC

2020 before it can be commercialized

Clean Air Task Force 2012

(Nonprofit organization dedicated to reducing atmospheric pollution through research, advocacy, and private sector collaboration. March 2012, “The Nuclear Decarbonization Option: Profiles of Selected Advanced Reactor Technologies” http://www.catf.us/resources/publications/files/Nuclear_Decarbonization_Option.pdf)

Three smLWR designs appear to have the greatest potential for commercial success in the 2020 timeframe. The designs are integral, pressurized water reactors (PWRs), i.e. designs in which the major nuclear steam supply system (NSSS) components, including reactor and core, steam generators, pressurizer and pumps (if part of the design), are housed in a single pressure vessel. The original integral PWR was designed to power the German commercial ship, N.S. Otto Hahn, commissioned in 1968, which sailed without incident for over one million kilometers. The ship was converted to conventional power in 1979 for economic reasons. The three designs selected are: • NuScale reactor (45 MWe, natural circulation) under development by NuScale Power, Inc., • mPower reactor (180 MWe, forced circulation) under development by the team led by the Babcock & Wilcox Company, • SMR (>225 MWe, forced circulation) under development by Westinghouse Electric Company LLC. We briefly summarize each design, provide a figure showing the overall plant, the containment structure and the integral reactor pressure vessel, and identify the commercialization strategy. Following the individual smLWR descriptions, a table compares and contrasts the three designs. Each of the designs uses a shortened length variant of standard commercial 17X17 PWR fuel. All of the three smLWRs are designed for 60 years of operation.

Natural Gas 1NC

No market for SMRs—natural gas means it can’t compete

McMahon 12

(Jeff McMahon, Contributor for Forbes, “Small Modular Nuclear Reactors By 2022 -- But No Market For Them”, http://www.forbes.com/sites/jeffmcmahon/2012/05/23/small-modular-reactors-by-2022-but-no-market-for-them/ , May 23, 2012,)

The Department of Energy will spend $452 million—with a match from industry—over the next five years to guide two small modular reactor designs through the nuclear regulatory process by 2022. But cheap natural gas could freeze even small nuclear plants out of the energy market well beyond that date. DOE accepted bids through Monday for companies to participate in the Small Modular Reactor program. A number of reactor manufacturers submitted bids, including NuScale Power and a collaboration that includes Westinghouse and General Dynamic. “This would allow SMR technology to overcome the hurdle of NRC certification – the ‘gold standard’ of the international nuclear industry, and would help in the proper development of the NRC’s regulatory framework to deal with SMRs,” according to Paul Genoa, Senior Director of Policy Development at the Nuclear Energy Institute. Genoa’s comments are recorded in a summary released today of a briefing given to Senate staff earlier this month on prospects for small modular reactors, which have been championed by the Obama Administration. DOE defines reactors as SMRs if they generate less than 300 megawatts of power, sometimes as little as 25 MW, compared to conventional reactors which may produce more than 1,000 MW. Small modular reactors can be constructed in factories and installed underground, which improves containment and security but may hinder emergency access. The same summary records doubt that SMRs can compete in a market increasingly dominated by cheap natural gas. Nuclear Consultant Philip Moor told Senate staff that SMRs can compete if natural gas costs $7 to $8 per million BTU—gas currently costs only $2 per MBTU—or if carbon taxes are implemented, a scenario political experts deem unlikely. “Like Mr. Moor, Mr. Genoa also sees the economic feasibility of SMRs as the final challenge. With inexpensive natural gas prices and no carbon tax, the economics don’t work in the favor of SMRs,” according to the summary.
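As an analytic, the Moor breakeven figures can be restated in fuel-cost terms. The $2 and $7-8 per MMBtu prices are from the card; the assumed combined-cycle heat rate of about 7,000 Btu/kWh is an outside assumption used only to show the order of magnitude.

# Rough fuel-only cost of gas-fired power at the prices quoted in the card.
# Assumption (not from the card): combined-cycle heat rate of about 7,000 Btu/kWh.
heat_rate_btu_per_kwh = 7000

for gas_price in (2, 7, 8):  # dollars per MMBtu, figures cited by Moor
    fuel_cost_per_mwh = gas_price * heat_rate_btu_per_kwh / 1_000_000 * 1000
    print(f"gas at ${gas_price}/MMBtu -> fuel cost of roughly ${fuel_cost_per_mwh:.0f}/MWh")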

Prices 1NC

SMRs are expensive-smaller size means more cost

Makhijani 11

(Arjun, electrical and nuclear engineer who is President of the Institute for Energy and Environmental Research, June 15, 2011, “The problems with small nuclear reactors” http://thehill.com/blogs/congress-blog/energy-a-environment/166609-the-problems-with-smallnuclear-reactors)

The devil, as usual, is in the details. For instance, the cost of a nuclear reactor per unit of electrical generating capacity declines with increasing size. This is because, contrary to intuition, larger reactors use less material per unit of capacity than smaller reactors. When the size of given type of reactor is reduced from 1,000 to 100 megawatts, the amount of material used per megawatt will more than double. And the notion that U.S. workers would get the bulk of the factory jobs is entirely fanciful, given the rules of the World Trade Organization on free trade. Most likely the reactors would be made in China or another country with industrial infrastructure and far lower wages. And what would we do if the severe quality problems with Chinese products, such as drywall and infant formula, afflict reactors? Will there be a process for recalls, as has happened with factory products from Toyotas to Tylenol? How do you recall a radioactively-contaminated, mass-produced nuclear reactor if it has problems? There are economies of scale associated with security, too. Today, large crews staff a reactor control room round-the-clock and guard the site. To reduce operating costs, some vendors are advocating to lower the number of security staff and to require only one operator for three modules, raising serious questions about whether there would be sufficient personnel in the event of an accident or attack.
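The “more than double” figure follows from simple surface-to-volume scaling. The sketch below assumes that material requirements track vessel surface area and that linear dimensions scale with the cube root of power output; both are simplifying assumptions used only to check the card’s claim, not part of the evidence.

# Surface-area-per-MW scaling check: if material tracks surface area ~ power^(2/3),
# then material per MW scales as power^(-1/3).  Cube-root geometric scaling assumed.
def relative_material_per_mw(power_mw, reference_mw=1000.0):
    return (power_mw / reference_mw) ** (-1.0 / 3.0)

print(relative_material_per_mw(100))  # roughly 2.15x the material per MW of a 1,000 MW unit
print(relative_material_per_mw(45))   # roughly 2.81x for a 45 MW module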

Prices 2NC

SMRs are expensive

a) materials

Makhijani & Boyd 10

*Arjun, electrical and nuclear engineer who is President of the Institute for Energy and Environmental Research **Michele, former director of the Safe Energy Program at Physicians for, Staff Scientist at the Institute for Energy and Environmental Research

(“Small Modular Reactors No Solution for the Cost, Safety, and Waste Problems of Nuclear Power” http://ieer.org/wp/wpcontent/uploads/2010/09/small-modular-reactors2010.pdf)

SMR proponents claim that small size will enable mass manufacture in a factory, enabling considerable savings relative to field construction and assembly that is typical of large reactors. In other words, modular reactors will be cheaper because they will be more like assembly line cars than hand-made Lamborghinis. In the case of reactors, however, several offsetting factors will tend to neutralize this advantage and make the costs per kilowatt of small reactors higher than large reactors.

First, in contrast to cars or smart phones or similar widgets, the materials cost per kilowatt of a reactor goes up as the size goes down. This is because the surface area per kilowatt of capacity, which dominates materials cost, goes up as reactor size is decreased. Similarly, the cost per kilowatt of secondary containment, as well as independent systems for control, instrumentation, and emergency management, increases as size decreases. Cost per kilowatt also increases if each reactor has dedicated and independent systems for control, instrumentation, and emergency management. For these reasons, the nuclear industry has been building larger and larger reactors in an effort to try to achieve economies of scale and make nuclear power economically competitive.

b) containment structures

Makhijani & Boyd 10

*Arjun, electrical and nuclear engineer who is President of the Institute for Energy and Environmental Research **Michele, former director of the Safe Energy Program at Physicians for, Staff Scientist at the Institute for Energy and Environmental Research

(“Small Modular Reactors No Solution for the Cost, Safety, and Waste Problems of Nuclear Power” http://ieer.org/wp/wpcontent/uploads/2010/09/small-modular-reactors2010.pdf)

Proponents argue that because these nuclear projects would consist of several smaller reactor modules instead of one large reactor, the construction time will be shorter and therefore costs will be reduced. However, this argument fails to take into account the implications of installing many reactor modules in a phased manner at one site, which is the proposed approach at least for the United States. In this case, a large containment structure with a single control room would be built at the beginning of the project that could accommodate all the planned capacity at the site. The result would be that the first few units would be saddled with very high costs, while the later units would be less expensive. The realization of economies of scale would depend on the construction period of the entire project, possibly over an even longer time span than present largereactor projects. If the later-planned units are not built, for instance due to slower growth than anticipated, the earlier units would likely be more expensive than present reactors, just from the diseconomies of the containment, site preparation, instrumentation and control system expenditures. Alternatively, a containment structure and instrumentation and control could be built for each reactor. This would greatly increase unit costs and per kilowatt capital costs. Some designs (such as the PBMR) propose no secondary containment, but this would increase safety risks. These cost increases are unlikely to be offset even if the entire reactor is manufactured at a central facility and some economies are achieved by mass manufacturing compared to large reactors assembled on site.

c) NRC Licensing and sodium coolants

Makhijani & Boyd 10

*Arjun, electrical and nuclear engineer who is President of the Institute for Energy and Environmental Research **Michele, former director of the Safe Energy Program at Physicians for, Staff Scientist at the Institute for Energy and Environmental Research

(“Small Modular Reactors No Solution for the Cost, Safety, and Waste Problems of Nuclear Power” http://ieer.org/wp/wpcontent/uploads/2010/09/small-modular-reactors2010.pdf)

Furthermore, estimates of low prices must be regarded with skepticism due to the history of past cost escalations for nuclear reactors and the potential for cost increases due to requirements arising in the process of NRC certification. Some SMR designers are proposing that no prototype be built and that the necessary licensing tests be simulated. Whatever the process, it will have to be rigorous to ensure safety, especially given the history of some of proposed designs. The cost picture for sodium-cooled reactors is also rather grim. They have typically been much more expensive to build than light water reactors, which are currently estimated to cost between $6,000 and $10,000 per kilowatt in the US. The costs of the last three large breeder reactors have varied wildly. In 2008 dollars, the cost of the Japanese Monju reactor (the most recent) was $27,600 per kilowatt (electrical); French Superphénix (start up in 1985) was $6,300; and the Fast Flux Test Facility (startup in 1980) at Hanford was $13,800. 11 This gives an average cost per kilowatt in 2008 dollars of about $16,000, without taking into account the fact that cost escalation for nuclear reactors has been much faster than inflation. In other words, while there is no recent US experience with construction of sodium-cooled reactors, one can infer that (i) they are likely to be far more expensive than light water reactors, (ii) the financial risk of building them will be much greater than with light water reactors due to high variation in cost from one project to another and the high variation in capacity factors that might be expected.

Even at the lower end of the capital costs, for Superphénix, the cost of power generation was extremely high—well over a dollar per kWh since it operated so little. Monju, despite being the most expensive has generated essentially no electricity since it was commissioned in 1994. There is no comparable experience with potassium-cooled reactors, but the chemical and physical properties of potassium are similar to sodium.
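The $16,000 average in the card is just the mean of the three per-kilowatt figures it lists; a quick check (the $6,000-$10,000 light water comparison band is also from the card):

# Average overnight cost of the three sodium-cooled reactors cited, 2008 $/kW(e).
breeder_costs_per_kw = {"Monju": 27600, "Superphenix": 6300, "FFTF (Hanford)": 13800}
average = sum(breeder_costs_per_kw.values()) / len(breeder_costs_per_kw)
print(f"average of roughly ${average:,.0f}/kW vs. $6,000-$10,000/kW cited for US light water reactors")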

Waste Storage Takeout 1NC

SMRs create waste—multiple locations increases storage problems

Makhijani & Boyd 10

*Arjun, electrical and nuclear engineer who is President of the Institute for Energy and Environmental Research **Michele, former director of the Safe Energy Program at Physicians for, Staff Scientist at the Institute for Energy and Environmental Research

(“Small Modular Reactors No Solution for the Cost, Safety, and Waste Problems of Nuclear Power” http://ieer.org/wp/wpcontent/uploads/2010/09/small-modular-reactors2010.pdf)

Proponents claim that with longer operation on a single fuel charge and with less production of spent fuel per reactor, waste management would be simpler. In fact, spent fuel management for SMRs would be more complex, and therefore more expensive, because the waste would be located in many more sites. The infrastructure that we have for spent fuel management is geared toward light-water reactors at a limited number of sites. In some proposals, the reactor would be buried underground, making waste retrieval even more complicated and complicating retrieval of radioactive materials in the event of an accident. For instance, it is highly unlikely that a reactor containing metallic sodium could be disposed of as a single entity, given the high reactivity of sodium with both air and water. Decommissioning a sealed sodium- or potassium-cooled reactor could present far greater technical challenges and costs per kilowatt of capacity than faced by present-day aboveground reactors.

Takes out the aff—means no more development

Spencer 12

(Jack, senior research fellow in Nuclear-Energy Policy at the Heritage Foundation, May 10, 2012, “Uncle Sam, Derelict Nuclear-Waste Disposer”, http://www.heritage.org/research/commentary/2012/05/uncle-sam-derelict-nuclear-waste-disposer)

Solyndra was back in the news recently. The big story: It had left some buckets and barrels of toxic waste at one of its shuttered facilities. Big whoop. The federal government has abandoned 65,000 metric tons of spent nuclear fuel at plants across the country — even though it's legally required to remove it all. The Obama administration has often been criticized for its lawless behavior, but its [non]handling of spent nuclear fuel is an especially egregious example of its complete disregard for the law of the land. The waste currently sitting at nuclear sites across the country was generated with the understanding, codified in law, that the government would be responsible for removing it. But, remove it to where? Enter the proposed Yucca Mountain nuclear-waste repository. The debate over Yucca Mountain has gone on for decades. But this week, a federal appeals court heard arguments that could result in a court order forcing the Nuclear Regulatory Commission (NRC) to complete its long-overdue review of the Yucca proposal. To fully understand the extent to which the Obama administration, specifically the Department of Energy (DOE), has acted in direct contradiction of federal statutes, a quick refresher on the Yucca project may be helpful. In 1982, Congress enacted the Nuclear Waste Policy Act. It mandated DOE to build a national repository and move spent nuclear fuel from reactor sites to the repository. Five years later, Congress enacted a law specifying that the repository would be located in Yucca Mountain, pending an NRC safety review and license. In 2008 — 21 years and some $15 billion later — the DOE finally submitted its licensing application to the NRC. The law requires the NRC to make a decision on DOE's application within three years of submission. With that deadline fast approaching in 2010, Mr. Obama's DOE defied the law's clearly defined statutes and tried to withdraw its licensing application for Yucca Mountain. The president quickly proceeded to striking any future funds for Yucca. Mr. Obama's decision to unilaterally terminate the Yucca project is troubling enough. But his attempted justification was even more objectionable. The administration admitted it had no scientific or technical grounds for terminating the project. Mr. Obama was pulling the plug, he said, simply because it lacked "social and political acceptance." This is a textbook example of letting political motivations trump the rule of law. Essentially, the president decided that, if he disagrees with a law, he needn't uphold it. And why work with Congress to change a law, when it's so much easier just to act as though it doesn't exist? By blocking the Yucca project, the administration has imperiled the future of nuclear energy. The growing waste stockpiles must be safely disposed of somewhere, yet the president has offered no Plan B. Sure, he appointed a blue-ribbon commission to look into the matter, but its recommendations are simply gathering dust. Besides, even the commission acknowledged that the nation needs a geologic repository. The only reason it didn't address Yucca is because Energy Secretary Steven Chu explicitly told the commissioners not to. Yucca matters because the government's inability to fulfill its legal waste-disposal obligations creates a huge impediment to building additional nuclear-power plants. The federal government's refusal to take possession of the used fuel leaves itself (that is, the taxpayers) liable to the plant operators for an increasingly enormous amount. And it leaves plant owners in the tenuous position of having to store ever-increasing amounts of waste on site indefinitely. That creates a great deal of uncertainty for investors in nuclear energy. It's all quite sad. No scientific or technical data existed to merit ending the project. But, desiring to please the anti-nuclear crowd, the administration chose the backdoor route to kill Yucca. Without a solution to allow for the proper disposal of nuclear waste, the administration is slowly killing cheap, reliable and "green" nuclear energy.

Waste Storage Takeout 2NC

Waste management prevents SMR development

Spencer et al 11

Feb 2, Jack Spencer is Research Fellow in Nuclear Energy in the Thomas A. Roe Institute for Economic Policy Studies, and Nicolas D. Loris is a Research Associate in the Roe Institute, at The Heritage Foundation, “A Big Future for Small Nuclear Reactors?” http://scholar.googleusercontent.com/scholar?q=cache:EYpY6X7GidoJ:scholar.google.com/+%22Small+Modular+Reactors%22+%22Rural%22&hl=en&as_sdt=0,5

Nuclear Waste Management. The lack of a sustainable nuclear waste management solution is perhaps the greatest obstacle to a broad expansion of U.S. nuclear power. The federal government has failed to meet its obligations under the 1982 Nuclear Waste Policy Act, as amended, to begin collecting nuclear waste for disposal in Yucca Mountain. The Obama Administration’s attempts to shutter the existing program to put waste in Yucca Mountain without having a backup plan has worsened the situation. This outcome was predictable because the current program is based on the flawed premise that the federal government is the appropriate entity to manage nuclear waste. Under the current system, waste producers are able to largely ignore waste management because the federal government is responsible. The key to a sustainable waste management policy is to directly connect financial responsibility for waste management to waste production. This will increase demand for more waste-efficient reactor technologies and drive innovation on waste-management technologies, such as reprocessing. Because SMRs consume fuel and produce waste differently than LWRs, they could contribute greatly to an economically efficient and sustainable nuclear waste management strategy.

AT Safer 1NC

SMRs not safer and their authors are biased—fuel transport and limited access

Baker 12

(Matthew Baker, staff writer American Security Project, 6/22/12, “Do Small Modular Reactors Present a Serious Option for the Military’s Energy Needs?” http://americansecurityproject.org/blog/2012/do-small-modular-reactors-present-a-serious-option-for-themilitarys-energy-needs/)

Unfortunately all the hype surrounding SMRs seems to have made the proponents of SMR technology oblivious to some of its huge flaws. Firstly like large reactors, one of the biggest qualms that the public has to nuclear is problems associated with nuclear waste. A more decentralized production of nuclear waste inevitably resulting from an increase in SMRs production was not even discussed. The danger of transporting gas into some military bases in the Middle East is already extremely volatile; dangers of an attack on the transit of nuclear waste would be devastating. Secondly, SMRs pose many of the same problems that regular nuclear facilities face, sometimes to a larger degree. Because SMRs are smaller than conventional reactors and can be installed underground, they can be more difficult to access should an emergency occur. There are also reports that because the upfront costs of nuclear reactors go up as surface area per kilowatt of capacity decreases, SMRs will in fact be more expensive than conventional reactors. Thirdly, some supporters of SMR technology seem to have a skewed opinion of public perception toward nuclear energy. Commissioner of the U.S. Nuclear Regulatory Commission, William C. Ostendorff, didn’t seem to think that the recent Fukushima disaster would have any impact on the development on SMRs. Opinion polls suggest Americans are more likely to think that the costs of nuclear outweigh its benefits since the Fukushima disaster. For SMRs to be the philosopher’s stone of the military’s energy needs the public needs to be on board. The DESC’s briefing did illustrate the hype that the nuclear community has surrounding SMRs, highlighting some pressing issues surrounding the military’s energy vulnerability. But proponents of SMRs need to be more realistic about the flaws associated with SMRs and realize that the negative impacts of nuclear technology are more costly than its benefits.

AT Safer 2NC

SMRs not safer—extend Baker

1) Waste transport—not containment method at forward locations, makes target for attacks

2) Limited access—have to be buried underground means no containments of accidents

SMRs uniquely dangerous—decentralizes safety—fuel cycle, heat loads, reactive materials and waste

Smith 11

(Gar, Editor Emeritus of Earth Island Journal, a former editor of Common Ground magazine, a Project Censored Award-winning journalist, and co-founder of Environmentalists Against War, NUCLEAR ROULETTE: THE CASE AGAINST A “NUCLEAR RENAISSANCE” pg 58)

But mini-reactor detractors note that SMRs would still depend on a costly, inefficient and hazardous fuel cycle that generates intense heat loads, employs dangerous materials (like helium and highly reactive sodium) and produces nuclear waste.

Building mini-nukes would decentralize and scatter all the operational risks of supplying, maintaining, safeguarding and dismantling nuclear reactors. A larger, aboveground “pocket-reactor” would still require its own control room operators and security personnel and, because of “economies of scale,” smaller reactors could wind up costing more than large plants. Tom Clements, a nuclear power critic with Friends of the Earth, charged the proposal to develop SMRs at Savannah River was simply a ploy by the nuclear industry to avoid licensing oversight from the NRC. The Department of Energy, which runs the site, denies the charge.

***Solvency—DoD

DoD—NRC Oversight 1NC

DoD doesn’t have the expertise to make up for lack of NRC oversight

King et al 11

Marcus, LaVar Huntzinger • Thoi Nguyen, CNA, March, “Feasibility of Nuclear Power on U.S. Military Installations” https://cna.org/sites/default/files/research/Nuclear%20Power%20on%20Military%20Installations%20D0023932%20A5.pdf

It seems unlikely that DoD would pursue exemption under Section 91b in the future. 10 Regulating power plants is a function that lies beyond DoD's core mission. The Department and the military services are unlikely to have personnel with sufficient expertise to act as regulators for nuclear power plants, and it could take considerable time and resources to develop such expertise. Without NRC oversight DoD would bear all associated risks.

DoD—Ownership 1NC

DoD ownership fails—lack of expertise and resources

King et al 11

Marcus, LaVar Huntzinger • Thoi Nguyen, CNA, March, “Feasibility of Nuclear Power on U.S. Military Installations” https://cna.org/sites/default/files/research/Nuclear%20Power%20on%20Military%20Installations%20D0023932%20A5.pdf

A principal advantage of DoD ownership or operation would be the possibility to tailor a project to best fit needs, objectives, and concerns that might not be adequately expressed in contracts. If the objectives and concerns are simply that the plant is safe and efficient, that can be written into contract terms, and there is little advantage to DoD ownership or operation. A significant liability to DoD ownership and operation is having full responsibility for all risks associated with such an undertaking. The risks are made worse by the fact that such an undertaking would require expertise that is outside DoD core capabilities. All aspects of preparing for, building, and operating nuclear power plants are both complicated and technically challenging. DoD cannot expect to own and/or operate such a project with satisfactory results without devoting considerable time and resources to developing a competent team. Since the expertise of those involved in such a team would be outside core DoD capabilities, it would be difficult for DoD to maintain a satisfactory career path for those personnel. There could be some advantages to creating shore assignments for Navy personnel that would be similar to assignments managing and operating nuclear reactors on ships and submarines. The degree of similarity that would be possible would depend on the type of nuclear power plant built on a DoD installation.

DoD—AT “Must Be Developer” 1NC

DoD can’t be FOAK purchaser—entry market costs are too high

King et al 11

Marcus, LaVar Huntzinger • Thoi Nguyen, CNA, March, “Feasibility of Nuclear Power on U.S. Military Installations” https://cna.org/sites/default/files/research/Nuclear%20Power%20on%20Military%20Installations%20D0023932%20A5.pdf

FOAK expense is a critical input parameter. There are significant costs associated with completing preparations to actually build “a first” small nuclear power plant. If a large amount of FOAK expense is included our estimate of the levelized cost of power for the plant becomes too high to be viable. Feasibility depends on negotiating arrangements for a project that ensure DoD is not responsible for FOAK expense. We identify three types of FOAK expenses: • Final detailed engineering for certification • Resolving FOAK licensing issues • Manufacturing engineering, tooling, and facilities. Completing final detailed engineering for certification will take about 2-3 years and is estimated to cost hundreds of millions of dollars. In addition, there are licensing issues related to small reactors that will need to be resolved. We assess the risks to public safety associated with the proposed small reactors are smaller than the risks associated with large reactors. In addition, the small reactors are designed to require less operator intervention. Consequently, there is general agreement that various safety requirements currently imposed on large reactors will be changed for small reactors. However, the precise details of such changes need to be worked out with the NRC. Resolving FOAK licensing issues will take a few years. Several years will be required to plan for and prepare all the details required for actual manufacturing—manufacturing engineering, tools, facilities, etc. Completing certification and licensing consists of working out and carefully documenting satisfactory answers to various questions and concerns. Therefore, the most important factor influencing the amount of calendar time required for certification and licensing is the intensity of effort and close attention that those seeking certification and licensing expend on accomplishing the objective. We estimate that total FOAK expenses could be about $800 million allocated among the different types as shown in Figure 6.
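As an analytic on why FOAK expense drives the levelized cost, here is a toy sensitivity sketch. Only the $800 million FOAK total comes from the card; the plant size, base overnight cost, fixed charge rate, and capacity factor are illustrative assumptions.

# Toy sensitivity of the capital share of levelized cost to FOAK expense.
# Only the $800M FOAK figure is from the card; all other inputs are assumed.
foak_cost = 800e6            # dollars (from the card)
plant_mwe = 180              # assumed SMR size, mPower-class
base_cost_per_kw = 5000      # assumed nth-of-a-kind overnight cost, $/kW
fixed_charge_rate = 0.10     # assumed annual carrying charge on capital
capacity_factor = 0.90       # assumed

annual_mwh = plant_mwe * 8760 * capacity_factor
base_capital = base_cost_per_kw * plant_mwe * 1000

for capital in (base_capital, base_capital + foak_cost):
    capital_share = capital * fixed_charge_rate / annual_mwh
    print(f"capital ${capital / 1e9:.2f}B -> capital share of levelized cost roughly ${capital_share:.0f}/MWh")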

***AT Prolif Adv

Solvency—First Mover 1NC

First mover status causes prolif

Smith 11

Terrence Smith, program coordinator and research assistant with the William E. Simon Chair in Political Economy at CSIS. An Idea I Can Do Without: “Small Nuclear Reactors for Military Installations”. 2/16/11. http://csis.org/blog/idea-i-can-do-without-smallnuclear-reactors-military-installations.

The report repeatedly emphasizes the point that “DOD’s ‘first mover’ pursuit of small reactors could have a profound influence on the development of the industry,” and cautions that “if DOD does not support the U.S. small reactor industry, the industry could be dominated by foreign companies.” The U.S. nonproliferation agenda, if there is one, stands in opposition to this line of thinking. Pursuing a nuclear technology out of the fear that others will get it (or have it), is what fueled the Cold War and much of the proliferation we have seen and are seeing today. It is a mentality I think we should avoid.

Doesn’t solve prolif

Smith 11

(Terrance, Feb 16, 2011 “An Idea I Can Do Without: “Small Nuclear Reactors for Military Installations’” http://csis.org/blog/idea-ican-do-without-small-nuclear-reactors-military-installations )

The reactors are purely for energy purposes, but in a world that seems to be growing tired of U.S. military intervention, the idea of ensuring our ability to do so through the proliferation of mobile nuclear reactors will hardly quell any hostile sentiment. In addition, it can only add fire to the “nuclear = good” flame. So, while even under best case scenario, the reactors are completely proliferation proof and pose no direct threat to the nonproliferation cause (ignoring the spreading of nuclear tech and knowledge in general), I have a tough time seeing how it helps.

Solvency—SMRs =/= Different 1NC

SMRs deploy exact same tech as larger reactors

McMahon 12

(Jeff, contributing writer for Forbes, May 25, 2012, “Small Modular Nuclear Reactors By 2022 -- But No Market For Them” http://www.forbes.com/sites/jeffmcmahon/2012/05/23/small-modular-reactors-by-2022-but-no-market-for-them/)

“Like Mr. Moor, Mr. Genoa also sees the economic feasibility of SMRs as the final challenge. With inexpensive natural gas prices and no carbon tax, the economics don’t work in the favor of SMRs,” according to the summary. The SMRs most likely to succeed are designs that use the same fuels and water cooling systems as the large reactors in operation in the U.S. today, according to Gail Marcus, an independent consultant in nuclear technology and policy and a former deputy director of the Department of Energy Office of Nuclear Energy, simply because the NRC is accustomed to regulating those reactors. “Those SMR designs that use light water cooling have a major advantage in licensing and development [and] those new designs based on existing larger reactor designs, like Westinghouse’s scaled-down 200 MW version of the AP-1000 reactor, would have particular advantage.” This is bad news for some innovative reactor designs such as thorium reactors that rely on different, some say safer, fuels and cooling systems. Senate staff also heard criticism of the Administration’s hopes for SMRs from Edwin Lyman, Senior Scientist in the Global Security Program at the Union of Concerned Scientists: The last panelist, Dr. Lyman, provided a more skeptical viewpoint on SMRs, characterizing public discussion on the topic as “irrational exuberance.” Lyman argued that, with a few exceptions, safety characteristics were not significantly better than full-size reactors, and in general, safety tended to rely on the same sorts of features. Some safety benefits, he stated, also declined as reactor power approached the upper bound of the SMR category…. Lyman argued that the Fukushima disaster should lead to a “reset” in licensing. In his opinion, the incident exposed numerous weaknesses in how nuclear power is regulated, and in order to remedy these oversights, regulation should be revisited.

1NC SMRs =/= Renaissance

SMRs don’t solve nuclear renaissance and tech fails

Cooper 14

(Mark, Senior Fellow for Economic Analysis @ Institute for Energy and the Environment at Vermont Law School, May 15, 2014, “The Economic Failure of Nuclear Power and the Development of a Low-Carbon Electricity Future: Why Small Modular Reactors Are Part of the Problem, Not the Solution” http://216.30.191.148/Cooper%20SMRs%20are%20Part%20of%20the%20Problem,%20Not%20the%20Solution%20FINAL2.pdf)

The troubling track record: The experience of construction cost escalation in the U.S. and France, two nations that account for the majority of reactors built in advanced industrial market economies, shows that there is little in the track record of nuclear power to suggest that learning and innovation will solve the nuclear cost problem any time soon. Even after the purported learning processes within a technology have taken place, each subsequent technology results in higher cost. The larger the technological change, the larger the ultimate cost increase.

Small Modular Reactors are likely to suffer similar problems: SMR technology represents a particularly challenging leap in nuclear technology that is likely to suffer greatly from the historic problems of nuclear power. SMR technology will suffer disproportionately from material cost increases because they use more material per MW of capacity. The novel, even radically new design characteristics of SMRs pose even more of a challenge than the failed “nuclear renaissance” technology. The untested design and the aggressive deployment strategy for SMR technology raise important safety questions and concerns. Cost estimates that assume quick design approval and deployment are certain to prove to be wildly optimistic. The technology is already failing the market test: Two of the leading U.S. developers have announced they are throttling back on the development of SMR technology because they cannot find customers (Westinghouse) or major investors (Babcock and Wilcox). The harsh judgment of the marketplace on SMR technology is well founded.

1NC Cred

No credibility- The US is not abiding by legally binding disarmament obligations

Compliance Campaign-12

(fight for US compliance with international laws), “Iran or the USA: Who really violates international obligations?”, February 3, http://compliancecampaign.wordpress.com/tag/npt/. Google. 7/8/12. JD.

There is no mention in the defense strategy of pursuing nuclear disarmament, an explicit obligation of the United States as a state party to the Nuclear Non-Proliferation Treaty (NPT) and as the world’s leading possessor of nuclear weapons. As the 2010 NPT Review Conference reminded states parties to the treaty: The Conference recalls that the overwhelming majority of States entered into legally binding commitments not to receive, manufacture or otherwise acquire nuclear weapons or other nuclear explosive devices in the context, inter alia, of the corresponding legally binding commitments by the nuclear-weapon States to nuclear disarmament in accordance with the Treaty. The Conference further regretted that nuclear-armed countries such as the United States have failed to live up to their end of the NPT bargain: The Conference, while welcoming achievements in bilateral and unilateral reductions by some nuclear-weapon States, notes with concern that the total estimated number of nuclear weapons deployed and stockpiled still amounts to several thousands. The Conference expresses its deep concern at the continued risk for humanity represented by the possibility that these weapons could be used and the catastrophic humanitarian consequences that would result from the use of nuclear weapons.

The double standard causes nations to challenge the NPT.

Aboul-Enein 11

Sameh, (PROLIFERATION ANALYSIS), “NPT 2010-2015: The Way Forward”, March 31, http://carnegieendowment.org/2011/03/31/npt-2010-2015-way-forward/10wh . Google. 7/10/12. JD.

In his June 2009 speech in Cairo, President Obama stated that “no nation should pick and choose which nation holds nuclear weapons.” States in the Middle East should be no exception in this “nuclear zero” campaign. The continued application of double standards regarding nuclear haves and have–nots has significantly contributed to instability in the nonproliferation regime and has encouraged those who seek to challenge the NPT.

No credibility- not disarming

Wellen-12

Russ (is the editor of the Foreign Policy in Focus blog Focal Points and holds down the Nukes and Other WMDs Desk at the Faster Times), “West's Idea of Nuclear Disarmament Doesn't Include Itself”. June 19. http://truth-out.org/news/item/9758-wests-idea-ofnuclear-disarmament-doesnt-include-itself. 7/8/12. JD.

When dueling narratives clash and the subject is nuclear weapons, the sparks that fly could make flashing sabers seem dim in comparison. According to conventional thinking in the West, Iran is not abiding by the Nuclear Non-Proliferation Treaty (NPT) and restraining itself from all nuclear weapons activities. Thus, it should be denied its right to enrich uranium. But in the view of much of the rest of the world, the West is making little more than cosmetic efforts to roll back its nuclear arsenals. Therefore, it has no business denying Iran nuclear energy - not to mention nuclear weapons (but that's another story).

In other words, the side that committed to disarming thinks that the side that promised not to proliferate continues to proliferate. And the side that promised not to proliferate thinks that the side that committed to disarming is not disarming. In truth, abundant evidence exists that any nuclear weapons work Iran has done since 2003 is conceptual - if that - work which is not expressly forbidden by the NPT. The uranium Iran enriches to the higher levels that worry the West seems to be for medical isotopes, which are used for radiation therapy, as well as diagnosis. Combined with enrichment at lower levels for nuclear energy, it serves as a bargaining chip in negotiations.

Turn—Commercialization

Prolif 1NC

SMR commercialization causes prolif—exports and understaffing

Makhijani & Boyd 10

*Arjun, electrical and nuclear engineer who is President of the Institute for Energy and Environmental Research **Michele, former director of the Safe Energy Program at Physicians for, Staff Scientist at the Institute for Energy and Environmental Research

(“Small Modular Reactors No Solution for the Cost, Safety, and Waste Problems of Nuclear Power” http://ieer.org/wp/wpcontent/uploads/2010/09/small-modular-reactors2010.pdf)

Mass manufacturing raises a host of new safety, quality, and licensing concerns that the NRC has yet to address. For instance, the NRC may have to devise and test new licensing and inspection procedures for the manufacturing facilities, including inspections of welds and the like. There may have to be a process for recalls in case of major defects in mass-manufactured reactors, as there is with other mass-manufactured products from cars to hamburger meat. It is unclear how recalls would work, especially if transportation offsite and prolonged work at a repair facility were required. Some vendors, such as PBMR (Pty) Ltd. and Toshiba, are proposing to manufacture the reactors in foreign countries. In order to reduce costs, it is likely that manufacturing will move to countries with cheaper labor forces, such as China, where severe quality problems have arisen in many products from drywall to infant formula to rabies vaccine. Other issues that will affect safety are NRC requirements for operating and security personnel, which have yet to be determined. To reduce operating costs, some SMR vendors are advocating lowering the number of staff in the control room so that one operator would be responsible for three modules. 12 In addition, the SMR designers and potential operators are proposing to reduce the number of security staff, as well as the area that must be protected. NRC staff is looking to designers to incorporate security into the SMR designs, but this has yet to be done. 13 Ultimately, reducing staff raises serious questions about whether there would be sufficient personnel to respond adequately to an accident. Of the various types of proposed SMRs, liquid metal fast reactor designs pose particular safety concerns. Sodium leaks and fires have been a central problem—sodium explodes on contact with water and burns on contact with air. Sodium-potassium coolant, while it has the advantage of a lower melting point than sodium, presents even greater safety issues, because it is even more flammable than molten sodium alone. 14 Sodium-cooled fast reactors have shown essentially no positive learning curve (i.e., experience has not made them more reliable, safer, or cheaper). The world’s first nuclear reactor to generate electricity, the EBR I in Idaho, was a sodium-potassium-cooled reactor that suffered a partial meltdown. 22 EBR II, which was a sodium-cooled reactor, operated reasonably well, but the first US commercial prototype, Fermi I in Michigan, had a meltdown of two fuel assemblies and, after four years of repair, a sodium explosion. 23 The most recent commercial prototype, Monju in Japan, had a sodium fire 18 months after its commissioning in 1994, which resulted in it being shut down for over 14 years. The French Superphénix, the largest sodium-cooled reactor ever built, was designed to demonstrate commercialization. Instead, it operated at an average of less than 7 percent capacity factor over 14 years before being permanently shut. 24 In addition, the use of plutonium fuel or uranium enriched to levels as high as 20 percent—four to five times the typical enrichment level for present commercial light water reactors—presents serious proliferation risks, especially as some SMRs are proposed to be exported to developing countries with small grids and/or installed in remote locations. Security and safety will be more difficult to maintain in countries with no or underdeveloped nuclear regulatory infrastructure and in isolated areas. Burying the reactor underground, as proposed for some designs, would not sufficiently address security because some access from above will still be needed and it could increase the environmental impact to groundwater, for example, in the event of an accident.

***AT Warming Adv

AT SMRs Solve Warming 1NC

SMRs don’t solve warming

Makhijani & Boyd 10

*Arjun, electrical and nuclear engineer who is President of the Institute for Energy and Environmental Research **Michele, former director of the Safe Energy Program at Physicians for, Staff Scientist at the Institute for Energy and Environmental Research

(“Small Modular Reactors No Solution for the Cost, Safety, and Waste Problems of Nuclear Power” http://ieer.org/wp/wpcontent/uploads/2010/09/small-modular-reactors2010.pdf )

Efficiency and most renewable technologies are already cheaper than new large reactors. The long time—a decade or more—that it will take to certify SMRs will do little or nothing to help with the global warming problem and will actually complicate current efforts underway. For example, the current schedule for commercializing the above-ground sodium cooled reactor in Japan extends to 2050, making it irrelevant to addressing the climate problem. Relying on assurances that SMRs will be cheap is contrary to the experience about economies of scale and is likely to waste time and money, while creating new safety and proliferation risks, as well as new waste disposal problems.

**Can’t solve fast enough—no infrastructure and doesn’t reduce emissions

-even if nuclear output is tripled by 2050, that delivers only 12.5 to 20 percent of the emissions reductions needed to stabilize the climate

-solving requires building 1,400 large-scale reactors—replacement rates mean construction emissions outweigh reductions

Smith 11

(Gar, Editor Emeritus of Earth Island Journal, a former editor of Common Ground magazine, a Project Censored Award-winning journalist, and co-founder of Environmentalists Against War, NUCLEAR ROULETTE: THE CASE AGAINST A “NUCLEAR RENAISSANCE” pg 18)

More than 200 new reactors have been proposed around the world but not enough reactors can be built fast enough to replace the world’s vanishing fossil fuel resources.2 Even if nuclear output could be tripled by 2050 (which seems unlikely in light of the industry’s record to date), this would only lower greenhouse emissions by 25 to 40 billion annual tons—12.5 to 20 percent of the reductions needed to stabilize the climate.3 The International Energy Agency estimates that renewables and efficiency measures could produce ten times these savings by 2050. The IEA estimates that cutting CO2 emissions in half by mid-century would require building 1,400 new 1,000-MW reactors—32 new reactors every year. But since it usually takes about 10 years from groundbreaking to atom-smashing, these reactors could not be constructed fast enough to prevent an irreversible “tipping” of world climate. This hardly seems feasible since the industry has only managed to bring 30 new reactors on-line over the past ten years. Of the 35 reactors the IEA listed as “under construction” in mid-2008, a third of these had been “under construction” for 20 years or longer. Some may never be completed. By contrast, a 1.5 MW wind turbine can be installed in a single day and can be operational in two weeks.4 Still, the pace of nuclear construction has picked up lately. In 2010, the number of reactor projects underway had ballooned to 66—with most located in China (27) and Russia (11). And it’s not just a matter of designing and building new reactors. The construction of 1,400 new nuclear reactors also would require building 15 new uranium enrichment plants, 50 new reprocessing plants and 14 new waste storage sites—a deal-breaker since the sole proposed U.S. storage site at Yucca Mountain is apparently dead. The cost of this additional nuclear infrastructure has been estimated at $3 trillion.5 Moreover, since the operating lifetime of these new reactors would still be a mere 40 years, even if new construction was practical, quick and affordable, it would only “solve” the global-warming problem for another 40 years, at which point the plants would need to be decommissioned.
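Analyst note (not part of the Smith evidence): a rough arithmetic check of the IEA build-out pace quoted above, assuming the 1,400-reactor program would have to run from roughly 2008, when the IEA construction list cited in the card was compiled, to 2050:

\[
\frac{1{,}400\ \text{reactors}}{2050 - 2008 \approx 42\ \text{years}} \approx 33\ \text{reactors per year}
\]

That is consistent with the card’s figure of 32 new reactors every year, and roughly ten times the pace of about 3 reactors per year (30 over the past decade) that the card says the industry has actually achieved.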

Won’t solve

Squassoni 9

Sharon, Carnegie Endowment, Who's Right?: Climate Change Experts Debate Nuclear Energy, 12/10/9, http://carnegieendowment.org/2009/12/10/who-s-right-climate-change-experts-debate-nuclear-energy/1lii

First, Squassoni questions the practicality of switching to nuclear energy.

Building

sufficient nuclear capacity

would take

many years

, while the need to reduce greenhouse gasses is immediate

, she says. She argues the key to reducing energy consumption lies not just in replacing fuel but in improving energy efficiency.

Switching to nuclear power would not immediately address emissions from other sources, such as cars, homes, businesses and industries

.

While she agrees that a sense of panic won’t speed the process of replacing fossil fuels with nuclear power, Squassoni believes the climate change

issue is urgent enough to require faster solutions

— the leaders of the G8 countries have set 2015 as the year when carbon dioxide emissions cannot rise any higher. She also argues that private financial investors have shown little interest in funding the high cost of nuclear plants and are more focused on smaller renewable projects that offer a faster return

. In addition, the hazards of nuclear waste and the possible proliferation of nuclear fuel for weapons are major concerns. Efficiency, she says, is the fastest and safest way to reduce emissions.

AT SMRs Solve Warming 2NC

Fuel life cycle calculations prove no real reduction in emissions

-complete life-cycle accounting means it takes years to overcome production emissions, and low-grade uranium increases emissions

Smith 11

(Gar, Editor Emeritus of Earth Island Journal, a former editor of Common Ground magazine, a Project Censored Award-winning journalist, and co-founder of Environmentalists Against War, NUCLEAR ROULETTE: THE CASE AGAINST A “NUCLEAR RENAISSANCE” pg 20)

Meanwhile, the argument that “clean” nuclear power can help fight Global Warming has been given the cold shoulder by climate activists. It turns out that nuclear reactors are only marginally useful in countering climate change—and an examination of the complete life-cycle costs reveals that nuclear energy actually helps stoke Global Warming. While it is true that a fully functioning reactor releases little CO2, an honest greenhouse-gas assessment cannot overlook the significant volumes of CO2 generated by the overall operations of the nuclear industry. Vast amounts of CO2 are generated by all the fossil-fuel-powered drills, trucks, locomotives and cargo ships involved in mining the ore and delivering it to refineries, enrichment facilities, power plants and, ultimately, to a radioactive waste storage site. Fossil fuels also are consumed (and CO2 released) in the fabrication of the thick concrete housings and assembly of the huge metal parts that go into making a nuclear power plant. It takes many years for a fully operational nuclear plant to generate sufficient energy to offset the energy consumed in the plant’s construction.10 When the entire fuel cycle is considered, a nuclear reactor burning high-grade uranium produces a third as much CO2 as a gas-fired power plant. But the world’s supply of high-grade ore is running out. When reactors are forced to start enriching low-grade ore (containing only one-tenth the amount of uranium), the nuclear fuel cycle will start pumping out more CO2 than would be produced by burning fossil fuels directly.11

*** AT Grid Adv

1NC - Squo Solves

Status quo solves grid cyber vulnerability

Paul Clark 12, MA Candidate, Intelligence/Terrorism Studies, American Military University; Senior Analyst, Chenega Federal Systems, 4/28/12, “The Risk of Disruption or Destruction of Critical U.S. Infrastructure by an Offensive Cyber Attack,” http://blog.havagan.com/wp-content/uploads/2012/05/The-Risk-of-Disruption-or-Destruction-of-Critical-U.S.-Infrastructure-by-an-Offensive-Cyber-Attack.pdf

An attack against the electrical grid is a reasonable threat scenario since power systems are "a high priority target for military and insurgents" and there has been a trend towards utilizing commercial software and integrating utilities into the public Internet that has "increased vulnerability across the board" (Lewis 2010). Yet the increased vulnerabilities are mitigated by an increased detection and deterrent capability that has been "honed over many years of practical application" now that power systems are using standard, rather than proprietary and specialized, applications and components (Leita and Dacier 2012). The security of the electrical grid is also enhanced by increased awareness after a smart-grid hacking demonstration in 2009 and the identification of the Stuxnet malware in 2010: as a result the public and private sector are working together in an "unprecedented effort" to establish robust security guidelines and cyber security measures (Gohn and Wheelock 2010).

2NC - Squo Solves

Squo solves islanding---the military adapted

Michael Aimone 9-12, Director, Business Enterprise Integration, Office of the Deputy Under Secretary of Defense (Installations and Environment), 9/12/12, Statement Before the House Committee on Homeland Security, Subcommittee on Cybersecurity, Infrastructure Protection and Security Technologies, http://homeland.house.gov/sites/homeland.house.gov/files/Testimony%20-%20Aimone.pdf

DoD’s facility energy strategy is also focused heavily on grid security in the name of mission assurance. Although the Department’s fixed installations traditionally served largely as a platform for training and deployment of forces, in recent years they have begun to provide direct support for combat operations, such as unmanned aerial vehicles (UAVs) flown in Afghanistan from fixed installations here in the United States. Our fixed installations also serve as staging platforms for humanitarian and homeland defense missions. These installations are largely dependent on a commercial power grid that is vulnerable to disruption due to aging infrastructure, weather-related events, and potential kinetic, cyber attack. In 2008, the Defense Science Board warned that DoD’s reliance on a fragile power grid to deliver electricity to its bases places critical missions at risk.1

Standby Power Generation

Currently, DoD ensures that it can continue mission critical activities on base largely through its fleet of on-site power generation equipment. This equipment is connected to essential mission systems and automatically operates in the event of a commercial grid outage. In addition, each installation has standby generators in storage for repositioning as required. Facility power production specialists ensure that the generators are primed and ready to work, and that they are maintained and fueled during an emergency. With careful maintenance these generators can bridge the gap for even a lengthy outage. As further back up to this installed equipment, DoD maintains a strategic stockpile of electrical power generators and support equipment that is kept in operational readiness. For example, during Hurricane Katrina, the Air Force transported more than 2 megawatts of specialized diesel generators from Florida, where they were stored, to Keesler Air Force Base in Mississippi, to support base recovery.

***AT Manufacturing Adv

AT Domestic Jobs 1NC

SMR development would be outsourced

Makhijani 11

(Arjun, electrical and nuclear engineer who is President of the Institute for Energy and Environmental Research, June 15, 2011, “The problems with small nuclear reactors” http://thehill.com/blogs/congress-blog/energy-a-environment/166609-the-problems-with-smallnuclear-reactors)

And the notion that U.S. workers would get the bulk of the factory jobs is entirely fanciful, given the rules of the World Trade Organization on free trade. Most likely the reactors would be made in China or another country with industrial infrastructure and far lower wages.
