Nanotech 1AC Case Neg - Open Evidence Project


Nanotech 1AC Case Neg

Strat Sheet

Notes

Plan Text

Plan: The United States federal government should provide uncapped NOAA Community-Based Removal Grants for nanotechnological recycling programs.

CX Guide

1.

Your Courage 10 evidence at the top of the disease flow says that "Climate has a strong impact on the incidence of disease" – so which piece of evidence says that the affirmative solves climate change? Blah blah blah. None. Cool. (NOTE: THE AFF NOW HAS A WARMING ADVANTAGE)

2.

How do you make the nanotubes? (You are trying to get a distinction between the aff’s tech and the CP’s tech to make the CP competitive)

Marine Debris

1NC

1.

Cleaning the oceans of microplastics faces multiple solvency barriers – the aff can’t solve

Stiv Wilson, 07/17/13,

(Associate Director at The 5 Gyres Institute, Freelance Writer and Photojournalist, previously Rise Above Plastics Campaign Coordinator and Oceanic Voyage Ambassador at Surfrider Foundation, http://inhabitat.com/thefallacy-of-cleaning-the-gyres-of-plastic-with-a-floating-ocean-cleanup-array/, The Fallacy of Cleaning the Gyres of Plastic With a Floating 'Ocean Cleanup Array', SRB)

As the policy director of the ocean conservation nonprofit 5Gyres.org,

I can tell you that

the problem of ocean plastic pollution is massive. In case you didn't know, an ocean gyre is a rotating current that circulates within one of the world's oceans – and recent research has found that these massive systems are filled with plastic waste.

There are no great estimates

(at least scientific) on how much plastic is in the ocean

, but I can say from firsthand knowledge (after sailing to four of the world’s five gyres) that it’s so pervasive it confounds the senses.

Gyre cleanup has often been floated as a solution in the past,

and recently Boyan Slat’s proposed ‘Ocean Cleanup Array’ went viral in a big way. The nineteen-year-old claims that the system can clean a gyre in 5 years with ‘unprecedented efficiency’ and then recycle the trash collected.

The problem is that the barriers to gyre cleanup are so massive that the vast majority of the scientific and advocacy community believe it’s a fool’s errand

– the ocean is big, the plastic harvested is near worthless, and sea life would be harmed.

The solution starts on land.

2.

Their PR News 13 evidence says that disposable containers have been on the rise for 21 years with stagnant recycling – the impact should have already been triggered.

3.

Alt cause to recycling stagnation – their PR News 13 evidence concedes that the surge in bottled water sales and Congress's failure to implement recycling regulations have stagnated plastic recycling – the aff can't resolve these

4.

Their third piece of evidence (Reuseit 14) was written by Reuseit, a company that sells reusable products – of course they are going to say that disposable resources are the worst things ever.

5.

The affirmative's evidence relies on false statistics and estimates – the amount of plastic in the ocean has been greatly overestimated.

Ron Meador, 07/07/14

(writes Earth Journal for MinnPost. He is a veteran journalist whose last decade in a 25-year stint at the Star Tribune involved writing editorials and columns with environment, energy and science subjects as his major concentration, http://www.minnpost.com/earth-journal/2014/07/good-news-probably-ocean-plastic-lot-it-seems-be-disappearing, Good news, probably, on ocean plastic: A lot of it seems to be disappearing, SRB)

New research suggests that conventional estimates of the amount of plastic

accumulating in the

world's five great ocean gyres may be 100 times too high.

Writing in the Proceedings of the National Academy of Sciences last week, a Spanish-led research team says its nine-month sampling of seawater on a round-the-world research cruise

in 2010-11 found floating plastic debris to be widespread

: Of more than 3,000 samples collected, 88 percent contained bits of plastic. However, the samples showed a strange dropoff at particle sizes smaller than 1 millimeter, suggesting that as solar radiation and weathering break the wastes into smaller and smaller pieces, some of it may simply disappear from the ocean surface.

Where it goes remains a mystery.

The team's estimate of the current volume of plastic litter is

on the order of 7,000 to 35,000 tons — less than 1

percent of what it calls a conservative estimate of the amount of plastic waste released into the ocean in the last several decades

.

6.

Their NJ.gov 13 evidence indicates that NOAA-funded debris removal is already happening in the status quo – they don't do anything to enhance this program, which means the status quo solves the advantage

7.

Plastics in the ocean are self-correcting – they are being broken down and naturally degraded now.

Ron Meador, 07/07/14

(writes Earth Journal for MinnPost. He is a veteran journalist whose last decade in a 25-year stint at the Star Tribune involved writing editorials and columns with environment, energy and science subjects as his major concentration, http://www.minnpost.com/earth-journal/2014/07/good-news-probably-ocean-plastic-lot-it-seems-be-disappearing, Good news, probably, on ocean plastic: A lot of it seems to be disappearing, SRB)

And while it is widely known that plastic materials can be obnoxiously persistent in all environments

— including seawater, which has a remarkable ability to break down and absorb all kinds of waste, including crude oil — that doesn't mean the yoke from a six-pack of soda cans remains intact.

Plastics pollution found on the ocean surface is dominated by particles smaller than 1 cm in diameter, commonly referred to as microplastics. Exposure of plastic objects on the surface waters to solar radiation results in their photodegradation, embrittlement, and fragmentation by wave action.

However, plastic fragments are considered to be quite stable and highly durable, potentially lasting hundreds to thousands of years. Persistent nano-particles may be generated during the weathering of plastic debris, although their abundance has not been quantified in ocean waters. The Malaspina's surface tow nets were capable of collecting microplastics with diameters larger than 200 microns, or about four times the thickness of a human hair. Its sampling found the greatest abundance of microplastics in the size range around 2 mm (about one and a half times the thickness of a dime) and a dramatic falloff in particles smaller than 1 mm. And this was puzzling, because our knowledge of plastics degradation suggests that "progressive fragmentation of the plastic objects into more and smaller pieces should lead to a gradual increase of fragments toward small sizes" and a steadier distribution across allsizes.

So where is all the microplastic going?

One possibility is that something accelerates sunlight's degradation of microplastic fragments below 1 mm; another is that the finest bits of debris are washing ashore and remaining there.

In the absence of any observations to support either pathway, the research team considers them unlikely. Two other avenues seem more probable:

"

Biofouling" by accumulations of bacteria could be hastening further breakdown of the plastics, or "ballasting" them with enough additional weight to cause the particles to sink below the surface.

Something could be eating the sub-millimeter particles, which just happen to be about the same size as the zooplankton that form a foundation of the ocean's food web. The paper notes other research documenting the presence of plastics in the guts of fish — especially "mesopelagic" fish, feeding at depths of 200 to 1,000 meters, where the light isn't so good — and also in the guts of birds and other predators that feed on fish (also, in their poop).

8. Nanotechnology has adverse environmental effects – replicates aff impacts

Zhang et al. 11 (B. Zhang, H. Misak, P.S. Dhanasekaran, and R. Asmatulu, Department of Mechanical Engineering, Wichita State University, and D. Kalla, Department of Engineering Technology, Metropolitan State College of Denver, Environmental Impacts of Nanotechnology and Its Products, Midwest Section Conference of the American Society for Engineering Education, 2011, https://www.asee.org/documents/sections/midwest/2011/ASEE-MIDWEST_0030_c25dbf.pdf)//rh

Nanoparticles have higher surface areas than the bulk materials which can cause more damage to the human body and environment compared to the bulk particles . Therefore, concern for the potential risk to the society due to nanoparticles has attracted national and international attentions. Nanoparticles are not only beneficial to tailor the properties of polymeric composite materials and environment in air pollution monitoring, but also to help reduce material consumption and remediation (Figure 1). For example, carbon nanotube and graphene based coatings have been developed to reduce the weathering effects on composites used for wind

turbines and aircraft. Graphene has been chosen to be a better nanoscale inclusion to reduce the degradation of UV exposure and salt. By using nanotechnology to apply a nanoscale coating on existing materials, the material will last longer and retain the initial strength longer in the presence of salt and UV exposure. Carbon nanotubes have been used to increase the performance of data information system. However, there are few considerations of potential risks need to be considered using nanoparticles: The major problem of nanomaterials is the nanoparticle analysis method . As nanotechnology improves, new and novel nanomaterials are gradually developed. However, the materials vary by shape and size which are important factors in determining the toxicity. Lack of information and methods of characterizing nanomaterials make existing technology extremely difficult to detect the nanoparticles in air for environmental protection . information of the chemical structure is a critical factor to determine how toxic a nanomaterial is, and minor changes of chemical function group could drastically change its properties . Full risk assessment of the safety on human health and environmental impact need to be evaluated at all stages of nanotechnology. The risk assessment should include the exposure risk and its probability of exposure, toxicological analysis, transport risk, persistence risk, transformation risk and ability to recycle. environmental impacts. imental design in advance of manufacturing a nanotechnology based product can reduce the material waste. Carbon nanotubes have applications in many materials for memory storage, electronic, batteries, etc. However, some scientists have concerns about carbon nanotubes because of unknown harmful impacts to the human body by inhalation into lungs, and initial data suggests that carbon nanotubes have similar toxicity to asbestos fiber 11 . Lam et al. and Warheit et al. studied on pulmonary toxicological evaluation of single-wall carbon nanotubes12 .

From Lam's research, carbon nanotube showed to be more toxic than carbon black and quartz once it reaches lung13, and Warheit found multifocal granulomas were produced when rats were exposed to single-wall carbon nanotubes14. Also, previous disasters need to be re-analyzed to compare with current knowledge as well. In the 1980s, a semiconductor plant contaminated the groundwater in Silicon Valley, California. This is a classic example of how nanotechnology can harm the environment even though there are several positive benefits.

. As current nanoscale materials are becoming smaller, it is more difficult to detect toxic nanoparticles from waste which may contaminate the environment (Figure 2). Nanoparticles may interact with environment in many ways: it may be attached to a carrier and transported in underground water by bio-uptake, contaminants, or organic compounds . Possible aggregation will allow for conventional transportation to sensitive environments where the nanoparticles can break up into colloidal nanoparticles. As Dr. Colvin says “we are concerned not only with where nanoparticles may be transported, but what they take with them”16 . There are four ways that nanoparticles or nanomaterials can become toxic and harm the surrounding environment 17: researchers are currently working on TiO2 powder as a coating inclusion that will reduce the weathering effects, such as salt rain degradation on composite materials. Ivana Fenoglio, et al. 18 expressed their concern that the effect of TiO2 nanoparticles to be assessed when leaked into the environment . Mobility of contaminants: There are two general methods that nanoparticle

can be emitted into atmosphere. 19 Nanoparticles are emitted into air directly from the source called primary emission, and are the main source of the total emissions. However, secondary particles are emitted naturally, such as homogeneous nucleation with ammonia and sulfuric acid presents. As Figure 2 demonstrates that nanoparticles can easily be attached to contaminations and transported to a more sensitive environment such as aqueous environments . For example, nuclear waste traveled almost 1 mile from a nuclear test site in 30 years20. However, after 40 years of the incident the first flow mechanism model is being developed to describe the methods of nanoparticle based waste travels21 .

Nanoparticles are invented and developed in advance of the toxic assessment by scientists. Many of the nanoparticles are soluble in water, and are hard to separate from waste if inappropriately handled. Any waste product, including nanomaterials, can cause environmental concerns/problems if disposed inappropriately.

9. Climate change is an alt cause to coral reef destruction

Roberts, 12

(Callum, marine conservation biologist at the University of York, “Corrosive Seas,”

The Ocean of Life, May 31, pg. 109-110, it’s a book, AW)

As carbon dioxide levels in the sea rise, carbonate saturation will fall , and the depths at which carbonate dissolves will become shallower.

Recent estimates suggest that this horizon is rising by three feet to six feet per year in some places . So far, most carbon dioxide added by human activity remains near the surface. It has mixed more deeply—to depths of more than three thousand feet

—in areas of intense downwelling in the polar North and South

Atlantic, where deep bottom waters of the global ocean conveyor current are formed. Elsewhere the sea has been stirred to only a thousand feet deep or less. All tropical coral reefs inhabit waters that are less than three hundred feet deep, so they will quickly come under the influence of ocean acidification.

If carbon dioxide in the atmosphere doubles from its current level, all of the world's coral reefs will shift from a state of construction to erosion .

They will literally begin to crumble and dissolve, as erosion and dissolution of carbonates outpaces deposition . What is most worrying is that this level of carbon dioxide will be reached by 2100 under a low-emission scenario of the Intergovernmental Panel on Climate Change .

The 2009 Copenhagen negotiations sought to limit carbon dioxide emissions so that levels would never exceed 450 parts per million in the atmosphere. That target caused deadlock in negotiations, but even that, according to some prominent scientists, would be too high for coral reefs. Just as

Ischia's carbonated volcanic springs provide a warning of things to come, bubbling carbon dioxide released beneath reefs in Papua New Guinea give us tangible proof of the fate that awaits coral reefs.

Reef growth has failed completely in places where gas bubbles froth vigorously, reducing pH there to levels expected everywhere by early in the twenty-second century under a business-as-usual scenario. The few corals that survive today have been heavily eroded by the corrosive water.

The collapse of coral reefs in the Galapagos following

El Nino in the early 1980s was hastened by the fact that eastern Pacific waters are naturally more acid due to their deep-water upwelling than those in other parts of the oceans. Corals there were only loosely cemented into reef structures and collapsed quickly.

10. Climate change is an alt cause to biodiversity loss – laundry list of reasons it affects all levels of ecosystems

Bellard et al, 12 (Celine – PhD; postdoc work on impact of climate change at the Universite Paris, Cleo Bertelsmeier, Paul Leadley, Wilfried Thuiller, and Franck Courchamp, "Impacts of climate change on the future of biodiversity," US National Library of Medicine National Institutes of Health – accepted for publication in a peer reviewed journal, January 4, http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3880584/, AW)

The multiple components of climate change are anticipated to affect all the levels of biodiversity, from organism to biome levels (Figure 1, and reviewed in detail in, e.g.,

Parmesan 2006). They primarily concern various strengths and forms of fitness decrease , which are expressed at different levels, and have effects on individuals, populations, species, ecological networks and ecosystems. At the most basic levels of biodiversity, climate change is able to decrease genetic diversity of populations due to directional selection and rapid migration, which could in turn affect ecosystem functioning and resilience (Botkin et al.

2007) (but, see Meyers & Bull 2002). However, most studies are centred on impacts at higher organizational levels, and genetic effects of climate change have been explored only for a very small number of species. Beyond this, the various effects on populations are likely to modify the “web of interactions” at the community level

(Gilman et al. 2010; Walther 2010). In essence, the response of some species to climate change may constitute an indirect impact on the species that depend on them. A study of 9,650 interspecific systems, including pollinators and parasites, suggested that around 6,300 species could disappear following the extinction of their associated species (Koh et al. 2004). In addition, for many species, the primary impact of climate change may be mediated through effects on synchrony with species’ food and habitat requirements

(see below). Climate change has led to phenological shifts in flowering plants and insect pollinators, causing mismatches between plant and pollinator populations that lead to the extinctions of both the plant and the pollinator with expected consequences on the structure of plant-pollinator networks (Kiers et al. 2010;

Rafferty & Ives 2010). Other modifications of interspecific relationships (with competitors, prey/predators, host/parasites or mutualists) also modify community structure and ecosystem functions (Lafferty 2009; Walther 2010; Yang & Rudolf 2010) At a higher level of biodiversity , climate can induce changes in vegetation communities that are predicted to be large enough to affect biome integrity . The Millenium Ecosystem Assessment forecasts shifts for 5 to 20% of Earth’s terrestrial ecosystems, in particular cool conifer forests, tundra, scrubland, savannahs, and boreal forest (Sala et al. 2005). Of particular concern are “tipping points” where ecosystem thresholds can lead to irreversible shifts in biomes

(Leadley et al.

2010). A recent analysis of potential future biome distributions in tropical South America suggests that large portions of Amazonian rainforest could be replaced by tropical savannahs (Lapola et al. 2009). At higher altitudes and latitudes, alpine and boreal forests are expected to expand northwards and shift their tree lines upwards at the expense of low stature tundra and alpine communities (Alo & Wang 2008). Increased temperature and decreased rainfall mean that some lakes, especially in Africa , might dry out (Campbell et al.

2009). Oceans are predicted to warm and become more acid, resulting in widespread degradation of tropical coral reefs (Hoegh-Guldberg et al. 2007). The implications of climate

change for genetic and specific diversity have potentially strong implications for ecosystem services. The most extreme and irreversible form of fitness decrease is obviously species extinction . To avoid or mitigate these effects, biodiversity can respond in several ways, through several types of mechanisms.

11. Ocean biodiversity is getting better – disproves their impact

Panetta 13

(Leon, former US Secretary of Defense, co-chaired the Pew Oceans Commission and founded the Panetta Institute at California State University, Monterey Bay, "Panetta: Don't take oceans for granted," http://www.cnn.com/2013/07/17/opinion/panetta-oceans/index.html, ND)

Our oceans are a tremendous economic engine

, providing jobs for millions of Americans, directly and indirectly, and a source of food and recreation for countless more. Yet, for much of U.S. history, the health of America's oceans has been taken for granted, assuming its bounty was limitless and capacity to absorb waste without end. This is far from the truth.

The situation the commission found in 2001 was grim. Many of our nation's commercial fisheries were being depleted and fishing families and communities were hurting. More than 60% of our coastal rivers and bays were degraded by nutrient runoff from farmland, cities and suburbs.

Government policies

and practices, a patchwork of inadequate laws and regulations at various levels, in many cases made matters worse.

Our nation needed a wake-up call.

The situation, on many fronts, is dramatically different today because of a combination of leadership initiatives from the White House and old-fashioned bipartisan cooperation on Capitol Hill. Perhaps the most dramatic example can be seen in the effort to end overfishing in U.S. waters

. In 2005, President George W.

Bush worked with congressional leaders to strengthen America's primary fisheries management law,

the Magnuson-Stevens Fishery Conservation and Management Act. This included establishment of science-based catch limits to guide decisions in rebuilding depleted species.

These reforms enacted by Congress are paying off.

In fact, an important milestone was reached last June when the National Oceanic and Atmospheric Administration announced it had established annual, science-based catch limits for all U.S. ocean fish populations.

We now have some of the best managed fisheries in the world. Progress also is evident in improved overall ocean governance and better safeguards for ecologically sensitive marine areas.

In 2010, President Barack

Obama issued a historic executive order establishing a national ocean policy directing federal agencies to coordinate efforts to protect and restore the health of marine ecosystems.

President George W.

Bush set aside new U.S. marine sanctuary areas from 2006 through 2009.

Today, the Papahanaumokuakea Marine National Monument, one of several marine monuments created by the Bush administration, provides protection for some of the most biologically diverse waters in the

Pacific.

12. No extinction – we can isolate ourselves from the environment

Powers 2002

(Lawrence, Professor of Natural Sciences, Oregon Institute of Technology, The Chronicle of Higher Education, August 9, ND)

Mass extinctions

appear to result from major climatic changes

or catastrophes, such as asteroid impacts.

As far as we know, none has resulted from the activities of a species, regardless of predatory voracity

, pathogenicity, or any other interactive attribute.

We are the first species with the potential to

manipulate global climates and to destroy habitats, perhaps even ecosystems -- therefore setting the stage for a sixth mass extinction

. According to Boulter, this event will be an inevitable consequence of a "self-organized Earth-life system."

This

Gaia-like proposal might account for many of the processes exhibited by biological evolution before

man's technological intervention, but ... the rules are now dramatically different.

... Many species may vanish, ... but that doesn't guarantee

, unfortunately, that we will be among the missing. While other species go bang

in the night, humanity will technologically isolate itself

further from the natural world and will rationalize the decrease in biodiversity in the same manner as we have done so far.

I fear, that like the fabled cockroaches of the atomic age,

we may be one of the last life-forms to succumb, long after the "vast tracts of beauty" that Boulter mourns we will no longer behold vanish before our distant descendants' eyes.

2NC Extension Coral Reefs Impact D

1.

Coral is incredibly resilient. If it survived 300 million years of badness it can survive anything we throw at it.

Ridd ‘7

(Peter, Reader in Physics – James Cook U. Specializing in Marine Physics and

Scientific Advisor – Australian Environment Foundation, “The Great Great Barrier Reef

Swindle”, 7-19, http://www.onlineopinion.com.au/view.asp?article=6134)

In biological circles, it is common to compare coral reefs to canaries, i.e. beautiful and delicate organisms that are easily killed.

The analogy is pushed further by claiming that

, just as canaries were used to detect gas in coal mines, coral reefs are the canaries of the world and their death is a first indication of our apocalyptic

greenhouse future.

The bleaching events of 1998 and 2002 were our warning. Heed them now or retribution will be visited upon us.

In fact a more appropriate creature with which to compare corals would be cockroaches - at least for their ability to survive. If our future brings us total self-annihilation by nuclear war, pollution or global warming, my bet is that both cockroaches and corals will survive

.

Their track-record is impressive. Corals have survived

300 million years of massively varying climate both much warmer and much cooler than today, far higher CO2 levels than we see today, and enormous sea level changes

.

Corals saw the dinosaurs come and go, and cruised through mass extinction events that left so many other organisms as no more than a part of the fossil record

.

Corals are particularly well adapted to temperature changes and in general, the warmer the better

.

It seems odd that coral scientists are worrying about global warming because this is one group of organisms that like it hot

. Corals are most abundant in the tropics and you certainly do not find fewer corals closer to the equator. Quite the opposite, the further you get away from the heat, the worse the corals

.

A cooling climate is a far greater threat. The scientific evidence about the effect of rising water temperatures on corals is very encouraging. In the GBR, growth rates of corals have been shown to be increasing over the last 100 years, at a time when water temperatures have risen. This is not surprising as the highest growth rates for corals are found in warmer waters. Further, all the species of corals we have in the GBR are also found in the islands, such as

PNG, to our north where the water temperatures are considerably hotter than in the GBR. Despite the bleaching events of 1998 and 2002, most of the corals of the GBR did not bleach and of those that did, most have fully recovered.

Coral Reef extinction is inevitable by 2100 – ocean acidification is destroying them – means the aff can’t solve

Kintisch 12 (Eli, American science journalist, Coral Reefs Could Be Decimated by 2100, Science AAAS, http://news.sciencemag.org/sciencenow/2012/12/coral-reefs-could-bedecimated-b.html)//rh

Nearly every coral reef could be dying by 2100 if current carbon dioxide emission trends continue, according to a new review of major climate models from around the world. The only way to maintain the current chemical environment in which reefs now live, the study suggests, would be to deeply cut emissions as soon as possible. It may even become necessary to actively remove carbon dioxide from the atmosphere, say with massive treeplanting efforts or machines. The world's open-ocean reefs are already under attack by the combined stresses of acidifying and warming water, overfishing, and coastal pollution .

Carbon emissions have already lowered the pH of the ocean a full 0.1 unit, which has harmed reefs and hindered bivalves' ability to grow. The historical record of previous mass extinctions suggests that acidified seas were accompanied by widespread die-offs but not total extinction. To study how the world's slowly souring seas would affect reefs in the future,

scientists with the Carnegie Institution for Science in Palo Alto, California, analyzed the results of computer simulations performed by 13 teams around the world. The models include simulations of how ocean chemistry would interact with an atmosphere with higher carbon dioxide levels in the future. This so-called "active biogeochemistry" is a new feature that is mostly absent in the previous generation of global climate models. Using the models' predictions for future physical traits such as pH and temperature in different sections of the ocean, the scientists were able to calculate a key chemical measurement that affects coral. Corals make their shells out of the dissolved carbonate mineral known as aragonite. But as carbon dioxide pollution steadily acidifies the ocean, chemical reactionschange the extent to which the carbonate is available in the water for coral. That availability is known as its saturation, and is generally thought to be a number between 3 and 3.5.

No precise rule of thumb exists to link that figure and the health of reefs. But the Carnegie scientists say paleoclimate data suggests that the saturation level during preindustrial times—before carbon pollution began to accumulate in the sky and seas—was greater than 3.5. The models that the Carnegie scientists analyzed were prepared for the major global climate report coming out next year: the

Intergovernmental Panel on Climate Change report. The team compared the results of those simulations to the location of 6000 reefs for which there is data, two-thirds of the world total.

That allowed them to do what amounted to a chemical analysis of future reef habitats. In a talk reviewing the study at the fall meeting of the American Geophysical Union earlier this month, senior author and Carnegie geochemist Ken Caldeira showed how the amount of carbon emitted in the coming decades could have huge impacts on reefs' fates. In a low-emissions trajectory in which carbon pollution rates were slashed and carbon actively removed from the air by trees or machines, between 77% and 87% of reefs that they analyzed stay in the safe zone with the aragonite saturation above 3. "If we are on the [business as usual] emissions trajectory, then the reefs are toast ," Caldeira says. In that case, all the reefs in the study were surrounded by water with Aragonite saturation below 3, dooming them . In that scenario,

Caldeira says, "details about sensitivity of corals are just arguments about when they will die."

" In the absence of deep reductions in CO2 emissions, we will go outside the bounds of the chemistry that surrounded all open ocean coral reefs before the industrial revolution ," says

Carnegie climate modeler Katharine Ricke, the first author on the new study. Greg Rau, a geochemist at Lawrence Livermore National Laboratory in California, says the work sheds new light onto the future of aragonite saturation levels in the ocean, also known as "omega." "There is a very wide coral response to omega—some are able to internally control the [relevant] chemistry," says Rau, who has collaborated with Caldeira in the past but did not participate in this research. Those tougher coral species could replace more vulnerable ones "rather than a wholesale loss" of coral. "[But] an important point made by [Caldeira] is that corals have had many millions of years of opportunity to extend their range into low omega waters.

With rare exception they have failed. What are the chances that they will adapt to lowering omega in the next 100 years ?"

2NC Extension Bio-D

Changes in biodiversity are both inevitable and unstoppable — their impacts are empirically disproven.

Dodds 2k

(Donald J. Dodds, M.S., P.E., former president of the North Pacific Research Board, "The Myth of Biodiversity," published in 2000, available online at http://www.docstoc.com/docs/97436583/THE-MYTH-OF-BIODIVERSITY, ND)

Biodiversity is a corner stone of the environmental movement. But there is no proof that biodiversity is important to the environment.

Something without basis in scientific fact is called a Myth. Lets examine biodiversity through out the history of the earth. The earth has been a around for about 4 billion years. Life did not develop until about 500 million years later. Thus for the first 500 million years bio diversity was zero. The planet somehow survived this lack of biodiversity.

For the next 3 billion years, the only life on the planet was microbial and not diverse

. Thus, the first unexplainable fact is that the earth existed for 3.5 billion years, 87.5% of its existence, without biodiversity.

Somewhere around 500 million years ago life began to diversify and multiple celled species appeared. Because these species were partially composed of sold material they left better geologic records, and the number of species and genera could be cataloged and counted. The number of genera on the planet is a indication of the biodiversity of the planet. Figure 1 is a plot of the number of genera on the planet over the last 550 million years. The little black line outside of the left edge of the graph is 10 million years.

Notice

the left end of this graph.

Biodiversity has never been higher than it is today. Notice next that at least ten times biodiversity fell rapidly; none of these extreme reductions in biodiversity were caused by humans

. Around 250 million years ago the number of genera was reduce 85 percent from about 1200 to around 200, by any definition a significant reduction in biodiversity. Now notice that after this extinction a steep and rapid rise of biodiversity.

In fact, if you look closely at the curve, you will find that every mass-extinction was followed by a massive increase in biodiversity

. Why was that? Do you suppose it had anything to do with the number environmental niches available for exploitation? If you do, you are right.

Extinctions are necessary for creation

. Each time a mass extinction occurs the world is filled with new and better-adapted species. That is the way evolution works, its called survival of the fittest.

Those species that could not adapted to the changing world conditions simply disappeared and better species evolved. How efficient is that? Those that could adapt to change continued to thrive. For example, the cockroach and the shark have been around well over 300 million years. There is a pair to draw to, two successful species that any creator would be proud to produce. To date these creatures have successful survived six extinctions, without the aid of humans or the EPA.

Now notice that only once in the last 500 million years did life ever exceed 1500 genera, and that was in the middle of the

Cretaceous Period around 100 million years ago, when the dinosaurs exploded on the planet. Obviously, biodiversity has a bad side. The direct result of this explosion in biodiversity was the extinction of the dinosaurs that followed 45 million years later at the KT boundary. It is interesting to note, that at the end of the extinction the number of genera had returned to the 1500 level almost exactly.

Presently biodiversity is at an all time high and has again far exceeded the 1500 genera level. Are we over due for another extinction?

A closer look at the KT extinction 65 million years ago reveals at least three things. First the

1500 genera that remained had passed the test of environmental compatibility and remained on the planet. This was not an accident. Second, these extinctions freed niches for occupation by better-adapted species. The remaining genera now faced an environment with hundreds of thousands of vacant niches. Third, it only took about 15 million years to refill all of those niches and completely replaced the dinosaurs, with new and better species. In this context, a better species is by definition one that is more successful in dealing with a changing environment.

Many of those genera that survived the KT extinction were early mammals, a more sophisticated class of life that had developed new and better ways of facing the environment. These genera were now free to expand and diversify without the presences of the life dominating dinosaurs. Thus, as a direct result of this mass extinction humans are around to discuss the consequences of change. If the EPA had prevented the dinosaur extinction, neither the human race, nor the EPA would have existed. The unfortunate truth is that the all-powerful human species does not yet have the intelligence or the knowledge to regulate evolution. It is even questionable that they have the skills to prevent their own extinction.

Change is a vital part of the environment

. A successful species is one that can adapt to the changing environment, and the most successful species is one that can do that for the longest duration. This brings us back to the cockroach and the shark. This of course dethrones egotistical homosapien-sapiens as god’s finest creation, and raises the cockroach to that exalted position. A fact that is difficult for the vain to accept. If humans are to replace the cockroach, we need to use our most important adaptation (our brain) to

prevent our own extinction. Humans like the Kola bear have become over specialized, we require a complex energy consuming social system to exist.

If one thing is constant in the universe, it is change. The planet has change significantly over the last 4 billion years and it will continue to change over the next 4 billion years. The current human scheme for survival, stopping change, is a not only wrong, but futile because stopping change is impossible.

Geologic history has repeatedly shown that species that become overspecialized are ripe for extinction. A classic example of overspecialization is the Kola bears, which can only eat the leaves from a single eucalyptus tree. But because they are soft and furry, look like a teddy bear and have big brown eyes, humans are artificially keeping them alive. Humans do not have the stomach or the brain for controlling evolution. Evolution is a simple process or it wouldn’t function. Evolution works because it follows the simple law: what works—works, what doesn’t work—goes away. There is no legislation, no regulations, no arbitration, no lawyers, scientists or politicians. Mother Nature has no preference, no prejudices, no emotions and no ulterior motives. Humans have all of those traits.

Humans are working against nature when they try to prevent extinctions and freeze biodiversity.

Examine the curve in figure one, at no time since the origin of life has biodiversity been constant

. If this principal has worked for 550 million years on this planet, and science is supposed to find truth in nature, by what twisted reasoning can fixing biodiversity be considered science? Let alone good for the environment. Environmentalists are now killing species that they arbitrarily term invasive, which are in reality simply better adapted to the current environment. Consider the Barred Owl, a superior species is being killed in the name of biodiversity because the Barred Owl is trying to replace a less environmentally adapted species the Spotted Owl. This is more harmful to the ecosystem because it impedes the normal flow of evolution based on the idea that biodiversity must remain constant

.

Human scientists have decided to take evolution out of the hands of Mother Nature and give it to the EPA.

Now there is a good example of brilliance. We all know what is wrong with lawyers and politicians, but scientists are supposed to be trustworthy. Unfortunately, they are all to often, only people who think they know more than anybody else. Abraham Lincoln said, “Those who know not, and know not that the know not, are fools shun them.” Civilization has fallen into the hands of fools. What is suggested by geologic history is that the world has more biodiversity than it ever had and that it maybe overdue for another major extinction.

Unfortunately, today many scientists have too narrow a view. They are highly specialized. They have no time for geologic history. This appears to be a problem of inadequate education not ignorance. What is abundantly clear is that artificially enforcing rigid biodiversity works against the laws of nature, and will cause irreparable damage to the evolution of life on this planet and maybe beyond. The world and the human species may be better served if we stop trying to prevent change, and begin trying to understand change and positioning the human species to that it survives the inevitable change of evolution.

If history is to be believed, the planet has 3 times more biodiversity than it had 65 million years ago. Trying to sustain that level is futile and may be dangerous.

The next major extinction, change in biodiversity, is as inevitable as climate change.

We cannot stop either from occurring, but we can position the human species to survive

those changes.

2NC Extension Environment Turn

1.

Aff replicates its impacts – Nanotech increases toxicological pollution

Zhang et al. 11 (B. Zhang, H. Misak, P.S. Dhanasekaran, and R. Asmatulu, Department of Mechanical Engineering, Wichita State University, and D. Kalla, Department of Engineering Technology, Metropolitan State College of Denver, Environmental Impacts of Nanotechnology and Its Products, Midwest Section Conference of the American Society for Engineering Education, 2011, https://www.asee.org/documents/sections/midwest/2011/ASEE-MIDWEST_0030_c25dbf.pdf)//rh

Nanotechnology increases the strengths of many materials and devices, as well as enhances efficiencies of monitoring devices, remediation of environmental pollution, and renewable energy production. While these are considered to be the positive effect of nanotechnology, there are certain negative impacts of nanotechnology on environment in many ways, such as increased toxicological pollution on the environment due to the uncertain shape, size, and chemical compositions of some of the nanotechnology products (or nanomaterials). It can be vital to understand the risks of using nanomaterials, and cost of the resulting damage. It is required to conduct a risk assessment and full life-cycle analysis for nanotechnology products at all stages of products to understand the hazards of nanoproducts and the resultant knowledge that can then be used to predict the possible positive and negative impacts of the nanoscale products. Choosing right, less toxic materials (e.g., graphene) will make huge impacts on the environment. This can be very useful for the training and protection of students, as well as scientists, engineers, policymakers, and regulators working in the field.

Disease

1NC

1.

Can’t solve disease - Tech barriers in squo

Cario 12 (Elke, Associate Editor of Mucosal Immunology, "Nanotechnology-based drug delivery in mucosal immune diseases: hype or hope?", Mucosal Immunology (2012) 5, 2–3, http://www.nature.com/mi/journal/v5/n1/full/mi201149a.html)//rh

Despite this optimistic outlook on nanotechnology-based drug delivery, many hurdles must still be overcome . Naturally, the future nanoparticle-preparation process should be contaminant-free, standardized and reproducible, relatively inexpensive, and easy to scale up . To increase drug efficacy and ensure patient compliance, the nanocarrier formulation must be safe, simple to administer, and, most important, nontoxic. Very little is known about the potential effects of nanoparticles and individual components on the human immune system (not to mention with long-term administration). Protective mucus usually traps and removes foreign particles from the mucosal surface . Biodegradable polymeric particles of larger size (200 nm) have been shown to be capable of rapidly penetrat ing healthy

(e.g., cervicovaginal14) or diseased (e.g., chronic rhinosinusitis15) human mucus barriers .

During this process, however, nanoparticles can alter the microstructure of the mucus barrier,16 but the functional impact of this observation remains to be examined in vivo (e.g., do nanoparticle-induced “holes” disrupt the mucus barrier, allowing bacterial translocation?). Once the mucus is crossed, nanoparticles should not cause toxicity to the epithelial cells or immune hyper-/hyposensitivity in the underlying lamina propria.

2.

The first piece of evidence they read on this flow says that disease is inevitable in the squo due to climate change – since the affirmative can't solve climate change, they can't solve disease.

3.

Ocean cleanup nanotech and medical nanotech are not the same – the plan only funds NOAA nanotech development.

4.

New cures solve all diseases

ASNS, 2008 (Africa Science News Service, Uganda, 9-15-2008, "AIDS cure may lie in supercharged 'mineral water'")

Antibiotics and vaccines that prompt side effects, genetic mutations, and resistant germs may soon be obsolete pending the results of an AIDS trial sponsored by volunteers, humanitarian groups, and The Republic of

Uganda

. At the Victoria Medical Center, in this nation at the epicenter of the pandemic , a new type of "mineral water" will be tested to compete with the drug industry's most profitable weapons

against disease. As governments worldwide are stockpiling defenses against bioterrorist attacks and deadly new outbreaks, Uganda will test a new possible cure for infectious diseases made from energized water and silver. It is called

UPCOSHTM, short for "Uniform Picoscaler Concentrated Oligodynamic Silver

Hydrosol." OXYSILVERTM is the leading brand. The base formula was developed by NASA scientists to protect astronauts in space

.

The solution of pure water and energized silver and oxygen uniquely boasts a covalent electromagnetic bond

between these two non-toxic elements that kills most harmful germs, oxygenates the blood, alkalines the body, helps feed essential nutrients to healthy cells and desirable digestive bacteria, and even relays a musical note upon which active DNA depends. These factors are crucial for developing mega-immunity and winning the war against cancer and infectious diseases

experts say.

According to

the product's developers, including famous health scientists, this entirely new class of liquids and gels is performing "miraculously" in killing HIV, the AIDS virus, tuberculosis, and malaria in initial tests.

Africa's greatest killers (after starvation, dehydration, and

resulting immunological destruction

) are no match for a few drops of UPCOSHTM

. Even using a germ infested glass, as is commonly the case in the poorest communities, you need not fear.

This water safely disinfects everything it touches

. Ugandan officials were encouraged by the nation's leading AIDS activist, Peter Luyima, co-founder of the WASART African

Youth Movement, to study OXYSILVERTM. Mr. Luyima invited several humanitarian doctors, researchers, organizations, and corporations to sponsor this promising human experiment on 70 terminally-ill patients. If successful , the government plans to grant funding to

Mr.

Luyima's youth organization to establish an OXYSILVERTM manufacturing plant to supply this life-saving liquid to distributors across Africa

. " Better late than never,

OXYSILVERTM may prove to be civilization's greatest hope for surviving against the current and coming plagues," says

Dr. Leonard

Horowitz

, an award-winning public health and emerging diseases expert

who contributed to the product's electro-genetic formulation.

Author of the American bestseller, Emerging Viruses: AIDS & Ebola--Nature, Accident or Intentional?, and the scientific text, DNA: Pirates of the Sacred Spiral, Dr. Horowitz is most critical of the drug cartel profiting from humanity's suffering.

5.

Empirics should really frame this debate

Richard Posner, Senior Lecturer in Law at the University of Chicago, judge on the United States Court of Appeals for the Seventh Circuit, January 1, 2005, Skeptic, "Catastrophe: the dozen most significant catastrophic risks and what we can do about them," http://goliath.ecnext.com/coms2/gi_0199-4150331/Catastrophe-the-dozen-mostsignificant.html#abstract

Yet the fact that Homo sapiens has managed to survive every disease to assail it in the 200,000 years or so of

its existence is a source of genuine comfort

, at least if the focus is on extinction events .

There have been enormously destructive plagues, such as the Black Death, smallpox, and now

AIDS, but none has come close to destroying the entire human race. There is a biological reason. Natural selection favors germs of limited lethality

; they are fitter in an evolutionary sense because their genes are more likely to be spread if the germs do not kill their hosts too quickly.

The AIDS virus is an example of a lethal virus, wholly natural, that by lying dormant yet infectious in its host for years maximizes its spread. Yet there is no danger that AIDS will destroy the entire human race.

The likelihood of a natural pandemic that would cause the extinction of the human race is probably even less today than in the past

(except in prehistoric times, when people lived in small, scattered bands, which would have limited the spread of disease), despite wider human contacts that make it more difficult to localize an infectious disease.

The reason is improvements in medical science.

But the comfort is a small one. Pandemics can still impose enormous losses and resist prevention and cure: the lesson of the AIDS pandemic. And there is always a first time.

6.

Nanotechnology does not improve health – in fact, it has been linked to numerous health risks.

Kevin Bullis, 3/22/08 (MIT Technology Review's senior editor for energy, http://www.technologyreview.com/news/410172/somenanotubes-could-cause-cancer/, Some Nanotubes Could Cause Cancer, SRB)

Certain types of carbon nanotubes could cause the same health problems as asbestos

, according to the results of two recent studies. In one, published yesterday, tests in mice showed that long and straight multiwalled

carbon nanotubes cause the same kind of inflammation and lesions in the type of tissues that surround the lungs that is caused by asbestos.

The other study

, also done in mice, showed that similar carbon nanotubes eventually led to cancerous tumors.

Carbon nanotubes, tube-shaped carbon molecules just tens of nanometers in diameter, have excellent electronic and mechanical properties that make them attractive for a number of applications. They have already been incorporated into some products, such as tennis rackets and bicycles, and eventually they could be used in a wide variety of applications, including medical therapies, water purification, and ultrafast and compact computer chips.

“It’s a material that’s got many unique characteristics,” says Andrew Maynard, a coauthor of one of the studies, which appears in the current issue of Nature Nanotechnology. “But of course nothing comes along like this that is completely free from risk.” Carbon nanotubes that are straight and 20 micrometers or longer in length–qualities that are well suited for composite materials used in sports equipment–resemble asbestos fibers. This has long led many experts to suggest that these carbon nanotubes might pose the same health risks as asbestos, a fire-resistant material that can cause mesothelioma, a cancer of a type of tissue surrounding the lungs. But until now, strong scientific evidence for this theory was lacking.

The new studies partially confirm the carbon nanotubes’ similarity to asbestos by showing that long, straight carbon nanotubes injected into mesothelial tissues in mice cause the sort of lesions and inflammation that also develop as a result of asbestos.

Such reactions are a strong indicator that cancer will develop with chronic exposure. One of the studies, which appeared in the Journal of Toxicological Study and was done by researchers at Japan’s National Institute of Health Sciences, also showed actual cancerous tumors. The Nature Nanotechnology study was done primarily by researchers in the United Kingdom at the University of Edinburgh and elsewhere. What isn’t known is whether, during nanotubes’ manufacture, use, and disposal, they can become airborne and be inhaled in sufficient quantities to cause problems. Indeed, earlier work has shown that it is actually difficult to get carbon nanotubes airborne, since they tend to clump together, says Maynard, the chief science advisor for the Project on Emerging Nanotechnologies, at the Woodrow Wilson Center for

Scholars, in Washington, DC. He says that this could decrease the chance that they will be inhaled. He adds that further research is needed to confirm this. Not all types of carbon nanotubes behave like cancer-causing asbestos. TheNature Nanotechnology article showed that short nanotubes (those less than 15 micrometers long) and long nanotubes that have become very tangled do not cause inflammation and lesions. Also, while the study did not look explicitly at single-walled nanotubes, these tend to be shorter and more tangled than multiwalled nanotubes, so they probably won’t act like asbestos, the researchers say. The authors suggest that this could be because such nanotubes can easily be taken up by immune cells called macrophages, and long, straighter ones can’t. (Macrophages can only stretch to 20 micrometers, which makes it difficult for them to engulf nanotubes longer than that.) This finding is consistent withresults published in

January that suggest that certain types of short carbon nanotubes are nontoxic to mice, says Hongjie Dai, the professor of chemistry at Stanford University who published the earlier work. Short nanotubes are likely to be useful in electronics and medical applications, while long, multiwalled nanotubes are more attractive for composite materials because of their mechanical strength. Dai says that it’s important not to lump all carbon nanotubes together, since they can have very different characteristics depending on how they are manufactured.

The Nature

Nanotechnology study is a strong one because it establishes the link between a particular type of nanotube and asbestos-like symptoms, while controlling for chemical impurities

that are a by-product of manufacturing carbon nanotubes, says Vicki Colvin, a professor of chemistry and chemical and biological engineering at Rice University in Houston, TX. Such chemical impurities have led to contradictory results in earlier toxicity studies on nanoparticles. The Journal of Toxicological Study paper, which showed not only that long carbon nanotubes could cause lesions, but also that these can actually lead to cancerous tumors, had the drawback that the researchers used genetically modified mice that are particularly sensitive to asbestos, Colvin says. But that study still shows a relationship between these particular kinds of carbon nanotubes and mesothelioma. As is the case with asbestos, carbon nanotubes are not likely to cause problems while they’re embedded inside products. It’s most important to protect workers involved in the manufacturing and disposal of these products, at which point the nanotubes could be released into the air, the authors of the Nature Nanotechnology study say. This could be done with established methods for handling fibrous particles, Colvin says, and by starting to keep track of what products have the potentially dangerous nanotubes–something that’s not done systematically now. Armed with the results, engineers could possibly use types of carbon nanotubes that are safer, Maynard says.

Anthony Seaton, one of the authors of the Nature Nanotechnology paper, a researcher at the University of Aberdeen, and a medical doctor who has treated people exposed to asbestos, draws a connection between the promise of carbon nanotubes and the hope people once had for asbestos. Asbestos, like carbon nanotubes, was seemingly ideal for many applications. At one point,

Seaton says, asbestos was “almost ubiquitous.” But whereas the dangers of asbestos weren’t recognized and dealt with until people got sick, the new findings present a chance to keep people from being hurt, he says, by taking preventative measures. “We’ve learned a serious lesson from asbestos,” Seaton says.

7. Virulent diseases cannot cause extinction because of burnout theory

Gerber 5 (Leah R. Gerber, PhD, Associate Professor of Ecology, Evolution, and Environmental Sciences, Ecological Society of America, Exposing Extinction Risk Analysis to Pathogens: Is Disease Just Another Form of Density Dependence?, August 2005)//rh

The density of a population is an important parameter for both PVA and host–pathogen theory. A fundamental principle of epidemiology is that the spread of an infectious disease through a population is a function of the density of both susceptible and infectious hosts. If infectious agents are supportable by the host species of conservation interest, the impact of a pathogen on a declining population is likely to decrease as the host population declines. A pathogen will spread when, on average, it is able to transmit to a susceptible host before an infected host dies or eliminates the infection (Kermack and McKendrick 1927, Anderson and May 1991). If the parasite affects the reproduction or mortality of its host, or the host is able to mount an immune response, the parasite population may eventually reduce the density of susceptible hosts to a level at which the rate of parasite increase is no longer positive. Most epidemiological models indicate that there is a host threshold density (or local population size) below which a parasite cannot invade, suggesting that rare or depleted species should be less subject to host-specific disease.

This has implications for small, yet increasing, populations. For example, although endangered species at low density may be less susceptible to a disease outbreak, recovery to higher densities places them at increasing risk of future disease-related decline (e.g., southern sea otters; Gerber et al. 2004). In the absence of stochastic factors (such as those modeled in PVA), and given the usual assumption of disease models that the chance that a susceptible host will become infected is proportional to the density of infected hosts (the mass action assumption), a host-specific pathogen cannot drive its host to extinction (McCallum and Dobson 1995). Extinction in the absence of stochasticity is possible if alternate hosts (sometimes called reservoir hosts) relax the extent to which transmission depends on the density of the endangered host species. Similarly, if transmission occurs at a rate proportional to the frequency of infected hosts relative to uninfected hosts (see McCallum et al. 2001), endangered hosts at low density may still face the threat of extinction by disease. These possibilities suggest that the complexities characteristic of many real host–pathogen systems may have very direct implications for the recovery of rare endangered species.
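The threshold-density logic in this card can be illustrated with a minimal mass-action SIR sketch (the parameter values below are hypothetical placeholders, not the authors' own model; they only show that an infection fails to invade below the threshold density S* = gamma/beta):

```python
# Illustrative sketch of the mass-action threshold described above (hypothetical
# parameters, not from Gerber 2005): an infection only spreads when susceptible
# host density exceeds S* = gamma / beta (Kermack & McKendrick 1927).

def peak_infected(s0, beta=0.002, gamma=0.1, i0=1.0, dt=0.1, steps=5000):
    """Euler-integrate dS/dt = -beta*S*I, dI/dt = beta*S*I - gamma*I; return peak I."""
    s, i, peak = s0, i0, i0
    for _ in range(steps):
        new_infections = beta * s * i * dt
        s -= new_infections
        i += new_infections - gamma * i * dt
        peak = max(peak, i)
    return peak

threshold = 0.1 / 0.002  # S* = 50 hosts per unit area with these toy parameters
for density in (10, 40, 60, 200):
    outcome = "outbreak grows" if peak_infected(density) > 1.0 else "pathogen fades out"
    print(f"host density {density:>3} (threshold {threshold:.0f}): {outcome}")
```

Below the toy threshold the infected count only declines, which is the "burnout" dynamic the tag references; above it the outbreak grows until susceptible density falls back under S*.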

Their impact is hype and out of context – their impact card is about the Chytrid Fungus, but there are several problems. First, fungi spread by spores that require air, which means these deadly fungi are found on land, so exploring the ocean can’t solve. Second, the Chytrid Fungus only infects amphibians and therefore can’t cause human extinction. Third, the Chytrid Fungus has been infecting amphibians for over 50 years and there has not been any serious extinction scenario. Finally, current research into fungal pathogens solves

Science Codex 12 (Super cool and accurate science website, Preserved Frogs Hold Clues to Deadly Pathogen, June 20, 2012, http://www.sciencecodex.com/preserved_frogs_hold_clues_to_deadly_pathogen-93651)//rh

A Yale graduate student has developed a novel means for charting the history of a pathogen deadly to amphibians worldwide. Katy Richards-Hrdlicka, a doctoral candidate at the Yale School of Forestry & Environmental Studies, examined 164 preserved amphibians for the presence of Batrachochytrium dendrobatidis, or Bd, an infectious pathogen driving many species to extinction. The pathogen is found on every continent inhabited by amphibians and in more than 200 species. Bd causes chytridiomycosis, which is one of the most devastating infectious diseases to vertebrate wildlife. Her paper, "Extracting the Amphibian Chytrid Fungus from Formalin-fixed Specimens," was published in the British Ecological Society's journal Methods in Ecology and Evolution and can be viewed at http://onlinelibrary.wiley.com/doi/10.1111/j.2041-210X.2012.00228.x/full. Richards-Hrdlicka swabbed the skin of 10 species of amphibians dating back to 1963 and preserved in formalin at the Peabody Museum of Natural History.

Those swabs were then analyzed for the presence of the deadly pathogen . The frog being swabbed is a Golden Toad (Cranopsis periglenes) of Monteverde, Costa Rica. The species is extinct as a result of a lethal Bd infection. (Photo Credit: Michael Hrdlicka) "I have long proposed that the millions of amphibians maintained in natural-history collections around the world are just waiting to be sampled, " she said. The samples were then analyzed using a highly sensitive molecular test called quantitative polymerase chain reaction (qPCR) that can detect Bd DNA, even from specimens originally fixed in formalin. Formalin has long been recognized as a potent chemical that destroys DNA. "This advancement holds promise to uncover Bd's global or regional date and place of arrival, and it could also help determine if some of the recent extinctions or disappearances could be tied to Bd," said Richards-Hrdlicka.

" Scientists will also be able to identify deeper molecular patterns of the pathogen, such as genetic changes and patterns relating to strain differences, virulence levels and its population genetics ." Richards-Hrdlicka found Bd in six specimens from Guilford, Conn., dating back to 1968, the earliest record of Bd in the Northeast. Four other animals from the

1960s were infected and came from Hamden, Litchfield and Woodbridge. From specimens

collected in the 2000s, 27 infected with Bd came from Woodbridge and southern

Connecticut.

In other related work, she found that nearly 30 percent of amphibians in

Connecticut today are infected , yet show no outward signs of infection. Amphibian populations and species around the world are declining or disappearing as a result of land-use change, habitat loss, climate change and disease. The chytrid fungus , caused by Bd, suffocates amphibians by preventing them from respiring through their skin . Since Bd's identification in the late 1990s, there has been an intercontinental effort to document amphibian populations and species infected with it. Richards-Hrdlicka's work will enable researchers to look to the past for additional insight into this pathogen's impact

2NC Extension Health Problems

1.

Nanotechnology has been linked to numerous health problems

Philip E. Ross, May 1, 2006 (Guest contributor for MIT Tech Review, http://www.technologyreview.com/featuredstory/405743/tinytoxins/page/3/, Tiny Toxins? SRB)

It was just the type of event that many in the nanotechnology community have feared – and warned against. In late March, six people went to the hospital with serious (but nonfatal) respiratory problems after using a German household cleaning product called Magic Nano. Though it was unclear at the time what had caused the illnesses – and even whether the aerosol cleaner contained any nanoparticles – the events reignited the debate over the safety of consumer products that use nanotechnology. The number of products fitting that description has now topped 200, according to a survey published in March by the Project on Emerging Nanotechnologies in Washington, DC. Among them are additives that catalyze combustion in diesel fuel, polymers used in vehicles, high-strength materials for tennis rackets and golf clubs, treated stain-resistant fabrics, and cosmetics. These products incorporate everything from buckyballs – soccer ball-shaped carbon molecules named after Buckminster Fuller – to less exotic materials such as nanoparticles of zinc oxide. But they all have one thing in common: their “nano” components have not undergone thorough safety tests.

Nanoparticles, which are less than 100 nanometers in size, have long been familiar as byproducts of combustion or constituents of air pollution; but increasingly, researchers are designing and synthesizing ultrasmall particles to take advantage of their novel properties.

Most toxicologists agree that nanoparticles are neither uniformly dangerous nor uniformly safe, but that the chemical and physical properties that make them potentially valuable may also make their toxicities differ from those of the same materials in bulk form.

One of the reasons for concern about nanoparticles’ toxicity has to do with simple physics.

For instance, as a particle shrinks, the ratio of its surface area to its mass rises. A material that’s seemingly inert in bulk thus has a larger surface area as a collection of nanoparticles, which can lead to greater reactivity. For certain applications, this is an advantage; but it can also mean greater toxicity. “The normal measure of toxicity is the mass of the toxin, but with nanomaterials, you need a whole different set of metrics,” says Vicki Colvin, a professor of chemistry at Rice University in Houston and a leading expert on nanomaterials.
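The surface-to-mass point can be made concrete with a rough back-of-the-envelope sketch (the particle sizes and the density value below are hypothetical, used only to illustrate the 1/radius scaling, not data from the article):

```python
# Back-of-the-envelope illustration of the scaling claim above (hypothetical
# particle sizes and an assumed density): for a sphere, surface area per unit
# mass = 3 / (density * radius), so shrinking the radius multiplies the
# reactivity-relevant surface area available per gram of material.
import math

def surface_area_per_mass(radius_m, density_kg_m3=2000.0):  # density is a placeholder
    area = 4 * math.pi * radius_m ** 2                            # m^2
    mass = density_kg_m3 * (4.0 / 3.0) * math.pi * radius_m ** 3  # kg
    return area / mass                                            # m^2 per kg

for radius_nm in (10_000, 1_000, 100, 10):                        # 10 um down to 10 nm
    print(f"radius {radius_nm:>6} nm -> {surface_area_per_mass(radius_nm * 1e-9):,.0f} m^2/kg")
```

With these placeholder numbers, dropping from a 10-micrometer particle to a 10-nanometer particle raises surface area per unit mass a thousandfold, which is the physical basis for Colvin's point that mass alone is a poor toxicity metric.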

Beyond the question of increased reactivity, the sheer tininess of nanoparticles is itself a cause for concern. Toxicologists have known for years that relatively small particles could create health problems when inhaled. Researchers have found evidence that the smaller particles are, the more easily they can get past the mucus membranes in the nose and bronchial tubes to lodge in the alveoli, the tiny sacs in the lungs where carbon dioxide in the blood is exchanged for oxygen.

In the alveoli, the particles face the white-cell scavengers known as macrophages, which engulf them and clear them from the body.

But at high doses, the particles overload the clearance mechanisms.

It is the potential growth, however, of technologies involving precisely engineered nanoparticles, such as buckyballs and their near cousins, carbon nanotubes, and the use of these new materials in consumer products that has made the question of toxicity particularly urgent. In addition to questions about how easily nanoparticles can penetrate the body, there is also debate over where they could end up once inside. Günter Oberdörster, a toxicologist at the University of Rochester, found that various kinds of carbon nanoparticles, averaging 30 to 35 nanometers in diameter, could enter the olfactory nerve in rodents and climb all the way up to the brain. “There is a possibility that because of their small size, nanoparticles can reach sites in the body that large particles cannot, cross barriers, and react,” says Oberdörster. In 2004, Oberdörster’s daughter, Eva Oberdörster, a toxicology researcher at Duke University, put largemouth bass into water containing buckyballs at the concentration of one part per million. After two days, the lipids in the brains of the fish showed 17 times as much oxidative damage as those of unexposed fish.

Carbon nanotubes, which are basically cylindrical versions of the spherical buckyballs, are one of the stars of nanotech, with potential uses in everything from solar cells to computer chips. But in 2003, researchers at NASA’s Johnson Space Center in Houston, headed by Chiu-Wing Lam, showed that in the lungs of mice, carbon nanotubes caused lesions that got progressively worse over time. Under the conditions of the experiment, the researchers concluded, carbon nanotubes were “much more toxic than carbon black [that is, soot] and can be more toxic than quartz, which is considered a serious occupational health hazard.

” Another extremely promising nanoparticle is the fluorescent “quantum dot,” now being explored for use in bioimaging. Researchers envision applications in which they tag the glowing nanodots with antibodies, inject them into subjects, and watch as they selectively highlight certain tissues or, say, tumors. Quantum dots are typically made of cadmium selenide, which can be toxic as a bulk material, so researchers encase them in a protective coating. But it is not yet known whether the dots will linger in the body, or whether the coating will degrade, releasing its cargo. Sensible regulation of nanoparticles will require new methods for assessing toxicity, which take into account the qualitative differences between nanoparticles and other regulated chemicals. Preferably, those methods will be generally applicable to a wide spectrum of materials. Today’s assays are not adequate for the purpose, says Oberdörster. “We have to formalize a tiered approach,” he says, “beginning with noncellular studies to determine the reactivity of particles, then moving on to in vitro cellular studies, and finally in vivo studies in animals. We have to establish that some particles are benign and others are reactive, then benchmark new particles against them.”

Separately testing every newly developed type of nanoparticle would be a Herculean task

, so Rice’s Colvin wants to develop a model that indicates whether a particular nanoparticle deserves special screening. “My dream is that there would be a predictive algorithm that would say, for a certain size and surface coating, this particular type of material is one you’d want to stay away from,” she says. “We should be able to do it, with the advance we have made in computing power, but we have to ask the right questions. For instance, is it acute cytotoxicity, or is it something else?” Amidst all the uncertainty about evaluating nanoparticles’ toxicity, regulatory agencies are in something of a quandary. In the United States, the Food and Drug Administration will assess medical products that incorporate nanoparticles, such as the quantum dots now being tested in animals; the Occupational Safety and Health Administration is responsible for the workplace environment in the factories that make the products involving nanoparticles; and the Environmental Protection Agency looks at products or chemicals that broadly permeate the environment, like additives to diesel fuel. In principle, these federal agencies have sweeping power over nanomaterials, but at the moment, their

traditional focus, their limited resources, and the sheer lack of test tube and clinical data make effective oversight next to impossible. For example, the National Institute for Occupational Safety and Health, the part of the Centers for Disease Control and Prevention in Atlanta responsible for studying and tracking workplace safety, acknowledges that “minimal information” is available on the health risks of making nanomaterials. The agency also points out that there are no reliable figures on the number of workers exposed to engineered nanomaterials. The EPA seems further along. In its draft “Nanotechnology White Paper,” issued in December, it proposed interagency negotiations to hammer out standards and pool resources. It acknowledged that at present, some nanoparticles that should be under its review are not, because they are not included in the inventory of chemicals controlled under the Toxic Substances Control Act. The EPA must defend the safety not only of human beings but of the natural environment – plants and ecological systems that may be exposed to a regulated material. There is scant data on the effects of nanomaterials in the environment, but some of it is troubling.

One study, for example, showed that alumina nanoparticles, which are already commonly used, inhibit root growth in some plants.

In a report written for the Project on Emerging Technologies, J.

Clarence Davies, assistant administrator for policy, planning, and evaluation at the EPA from 1989 to 1991, advocates passing a new law assigning responsibility for nanomaterial regulation to a single interagency regulatory authority. Davies would also require manufacturers to prove their nanotech products safe until enough evidence had been gained to warrant exemptions. But some executives in the nanotech industry cringe at the prospect of such regulations. Alan Gotcher, head of Altair Nanotechnologies, a manufacturer in Reno, NV, that makes various types of nanoparticles, testified before the U.S. Senate in February and cited the Davies report. “

To fall into ‘a one-size-fits-all’ approach to nanotechnology,” he said, “is irresponsible and counterproductive

.” Gotcher would prefer a government-funded effort to amass the necessary data and build the necessary models before setting any standards. It is doubtful, however, that the nanotech community will stop developing new products, or that the public will stop buying them, while awaiting a new regulatory framework that could take years and millions of dollars to finalize. While few agree on how to efficiently determine the toxicity of nanoparticles, or how to regulate them, nearly everyone agrees on the urgency of quickly tackling both questions. The use of nanoparticles in consumer products like cosmetics and cleaners represents only a tiny sliver of nanotech’s potential, but any unresolved safety concerns could cast a huge shadow. “If I was someone producing these materials, I would be afraid that one health problem, anywhere, would hurt the entire industry,” says Peter Hoet, a toxicologist at the Catholic University of Leuven, in Belgium. The large consumer corporations DuPont and Procter and

Gamble participated in a study on nanoparticles’ toxicity. But the nanotech community needs to put pressure on manufacturers using the “nano” label for marketing purposes to stand up and take responsibility for their products. That means contributing resources and money to toxicity studies and freely disclosing which nanotechnologies they are relying on.

Warming

1NC

1.

CO2 is the ONLY solution to stop the impending food crisis that will kill millions.

Idso, 11 (Craig D., PhD Center for the Study of Carbon Dioxide and Global Change, 6/15/11, Center for the Study of Carbon Dioxide and Global Climate Change, “Estimates of Global Food Production in the Year 2050: Will We Produce Enough to Adequately Feed the World?” AS)

As indicated in the material above, a very real and devastating food crisis is looming on the horizon, and continuing advancements in agricultural technology and expertise will most likely not be able to bridge the gap between global food supply and global food demand just a few short years from now.

However, the positive impact of Earth’s rising atmospheric CO2 concentration on crop yields will considerably lessen the severity of the coming food shortage. In some regions and countries it will mean the difference between being food secure or food insecure; and it will aid in lifting untold hundreds of millions out of a state of hunger and malnutrition, preventing starvation and premature death.

For those regions of the globe where neither enhancements in the techno-intel effect nor the rise in CO2 are projected to foster food security, an Apollo moon-mission-like commitment is needed by governments and researchers to further increase crop yields per unit of land area planted, nutrients applied, and water used. And about the only truly viable option for doing so (without taking enormous amounts of land and water from nature and driving untold numbers of plant and animal species to extinction) is to have researchers and governments invest the time, effort and capital needed to identify and to prepare for production the plant genotypes that are most capable of maximizing CO2 benefits for important food crops.

Rice, for example, is the third most important global food crop, accounting for 9.4% of global food production. Based upon data presented in the CO2 Science Plant Growth Database, the average growth response of rice to a 300-ppm increase in the air’s CO2 concentration is 35.7%. However, data obtained from De Costa et al. (2007), who studied the growth responses of 16 different rice genotypes, revealed CO2-induced productivity increases ranging from -7% to +263%. Therefore, if countries learned to identify which genotypes provided the largest yield increases per unit of CO2 rise, and then grew those genotypes, it is quite possible that the world could collectively produce enough food to supply the needs of all its inhabitants. But since rising CO2 concentrations are considered by many people to be the primary cause of global warming, we are faced with a dilemma of major proportions.
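To make the genotype-selection arithmetic in this card concrete, here is a minimal sketch (the baseline yield figure is a hypothetical placeholder; only the response percentages come from the card above):

```python
# Minimal arithmetic sketch of the genotype argument above. The 4.0 t/ha baseline
# is a hypothetical placeholder; the -7%, +35.7%, and +263% responses are the
# figures quoted in the card for a 300-ppm CO2 increase.
baseline_t_per_ha = 4.0
responses = {"worst genotype": -0.07, "average genotype": 0.357, "best genotype": 2.63}
for genotype, response in responses.items():
    print(f"{genotype}: {baseline_t_per_ha * (1 + response):.1f} t/ha under +300 ppm CO2")
```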

If proposed regulations restricting anthropogenic CO2 emissions (which are designed to remedy the potential global warming problem) are enacted, they will greatly exacerbate future food problems by reducing the CO2-induced yield enhancements that are needed to supplement increases provided by advances in agricultural technology and expertise. And as a result of such CO2 emissions regulations, hundreds of millions of the world’s population will be subjected to hunger and malnutrition.

Even more troubling is the fact that thousands would die daily as a result of health problems they likely would have survived had they received adequate food and nutrition. About the only option for avoiding the food crisis, and its negative ramifications for humanity and nature alike, is to allow the atmospheric CO2 concentration to continue to rise as predicted (no CO2 emission restrictions), and then to learn to maximize those benefits through the growing of CO2-loving cultivars.

2.

The food crisis outweighs any impact of climate change.

Idso, 11 (Craig D., PhD Center for the Study of Carbon Dioxide and Global Change, 6/15/11, Center for the Study of Carbon Dioxide and Global Climate Change, “Estimates of Global Food Production in the Year 2050: Will We Produce Enough to Adequately Feed the World?” AS)

In light of the host of real-world research findings discussed in the body of this report, it should be evident to all that the looming food shortage facing humanity mere years to decades from now is far more significant than the theoretical and largely unproven catastrophic climate- and weather-related projections of the world’s climate alarmists. And it should also be clear that the factor that figures most prominently in both scenarios is the air’s CO2 content.

The theorists proclaim that we must drastically reduce anthropogenic CO2 emissions by whatever means possible, including drastic government interventions in free-market enterprise systems. The realists suggest that letting economic progress take its natural unimpeded course is the only way to enable the air’s CO2 content to reach a level that will provide the aerial fertilization effect of atmospheric CO2 enrichment that will be needed to provide the extra food production that will be required to forestall massive human starvation and all the social unrest and warfare that will unavoidably accompany it, as well as humanity’s decimation of what little yet remains of pristine nature, which will include the driving to extinction of untold numbers of both plant and animal species.

Climate alarmists totally misuse the precautionary principle when they ignore the reality of the approaching lack-of-food-induced crisis that would decimate the entire biosphere, and when they claim instead that the catastrophic projections of their climate models are so horrendous that anthropogenic CO2 emissions must be reduced at all costs. Such actions should not even be contemplated without first acknowledging the fact that none of the catastrophic consequences of rising global temperatures have yet been conclusively documented, as well as the much greater likelihood of the horrendous global food crisis that would follow such actions. The two potential futures must be weighed in the balance, and very carefully, before any such actions are taken.

3.

Warming irreversible b/c of feedback loops

Mims 12 (Christopher, staff writer for Grist, “Climate scientists: It’s basically too late to stop warming”, 3/26/12, http://grist.org/list/climate-scientists-its-basically-too-late-to-stop-warming/, HG)

If you like cool weather and not having to club your neighbors as you battle for scarce resources, now’s the time to move to Canada, because the story of the 21st century is almost written, reports Reuters. Global warming is close to being irreversible, and in some cases that ship has already sailed. Scientists have been saying for a while that we have until between 2015 and 2020 to start radically reducing our carbon emissions, and what do you know: That deadline’s almost past! Crazy how these things sneak up on you while you’re squabbling about whether global warming is a religion. Also, our science got better in the meantime, so now we know that no matter what we do, we can say adios to the planet’s ice caps. For ice sheets — huge refrigerators that slow down the warming of the planet — the tipping point has probably already been passed, Steffen said. The West Antarctic ice sheet has shrunk over the last decade and the Greenland ice sheet has lost around 200 cubic km (48 cubic miles) a year since the 1990s. Here’s what happens next:

Natural climate feedbacks will take over and, on top of our prodigious human-caused carbon emissions, send us over an irreversible tipping point . By

2100, the planet will be hotter than it’s been since the time of the dinosaurs, and everyone who lives in red states will pretty much get the apocalypse they’ve been hoping for. The subtropics will expand northward, t he bottom half of the U.S. will turn into an inhospitable desert, and everyone who lives there will be drinking recycled pee and struggling to salvage something from an economy wrecked by the destruction of agriculture, industry, and electrical power production. Water shortages, rapidly rising seas, superstorms swamping hundreds of billions of dollars’ worth of infrastructure: It

’s all a-coming, and anyone who is aware of the political realities knows that the odds are slim that our government will move in time to do anything to avert the biggest and most avoidable disaster short of all-out nuclear war. Even if our government did act, we can’t control the emissions of the developing world.

China is now the biggest emitter of greenhouse gases on the planet and its inherently unstable autocratic political system demands growth at all costs. That means coal . Meanwhile, engineers and petroleum geologists are hoping to solve the energy crisis by harvesting and burning the nearly limitless supplies of natural gas frozen in methane hydrates at the bottom of the ocean, a source of atmospheric carbon previously considered so exotic that it didn’t even enter into existing climate models. So, welcome to the 21st century. Hope you packed your survival instinct.

4.

Regardless of previous or future climate policies, ice-melt is unstoppable.

Rignot 14 – Eric Rignot, glaciologist at NASA's Jet Propulsion Laboratory, the lead author of last week's landmark scientific paper on West Antarctica (“Global warming: it's a point of no return in West Antarctica. What happens next?” The Guardian, May 17, 2014, Available at: http://www.theguardian.com/commentisfree/2014/may/17/climate-change-antarctica-glaciersmelting-global-warming-nasa, Accessed on: 7/17/2014, IJ)

We announced that we had collected enough observations to conclude that the retreat of ice in the Amundsen sea sector of West Antarctica was unstoppable , with major consequences – it will mean that sea levels will rise one metre worldwide. What's more, its disappearance will likely trigger the collapse of the rest of the West Antarctic ice sheet, which comes with a sea level rise of between three and five metres. Such an event will displace millions of people worldwide.

Two centuries – if that is what it takes – may seem like a long time, but there is no red button to stop this process.

Reversing the climate system to what it was in the 1970s seems unlikely; we can barely get a grip on emissions that have tripled since the Kyoto protocol, which was designed to hit reduction targets.

Slowing down climate warming remains a good idea, however – the Antarctic system will at least take longer to get to this point.

The Amundsen sea sector is almost as big as France. Six glaciers drain it. The two largest ones are Pine Island glacier (30km wide) and Thwaites glacier (100km wide). They stretch over 500km.

What this means is that we may be ultimately responsible for triggering the fast retreat of West Antarctica.

This part of the continent was likely to retreat anyway, but we probably pushed it there faster. It remains difficult to put a timescale on it, because the computer models are not good enough yet, but it could be within a couple of centuries, as I noted. There is also a bigger picture than West Antarctica. The Amundsen sea sector is not the only vulnerable part of the continent. East Antarctica includes marine-based sectors that hold more ice. One of them, Totten glacier, holds the equivalent of seven metres of global sea level.

5.

CFCs are responsible for global warming, not CO2 – disregard the affirmative because they focus on CO2

Bastasch 13 (Michael Bastasch, quoting studies, “REPORT: CO2 IS NOT RESPONSIBLE FOR GLOBAL WARMING”, May 30, 2013, http://dailycaller.com/2013/05/30/report-co2-notresponsible-for-global-warming/2/, HG)

Chlorofluorocarbons ( CFCs ) — not carbon emissions — are the real culprit behind global warming, claims a new study out of the University of Waterloo.

“Conventional thinking says that the emission of human-made non-CFC gases such as carbon dioxide has mainly contributed to global warming. But we have observed data going back to the Industrial Revolution that convincingly shows that conventional understanding is wrong,” said Qing-Bin Lu, a science professor at the University of Waterloo and author of the study.¶ “In fact, the data shows that CFCs conspiring with cosmic rays caused both the polar ozone hole and global warming,” Lu said.¶ Lu’s findings were published in the International Journal of Modern Physics B and analyzed data from 1850 to the present.¶

Lu’s study runs counter to the long-standing argument that carbon dioxide emissions were the driving force behind global warming. Recently scientists warned that carbon concentrations were nearing the 400 parts per million level. Scientists say that carbon dioxide levels must be lowered to 350 ppm to avoid the severe impacts of global warming.¶ “The 400-ppm threshold is a sobering milestone and should serve as a wake-up call for all of us to support clean-energy technology and reduce emissions of greenhouse gases before it’s too late for our children and grandchildren,” said Tim Lueker, an oceanographer and carbon cycle researcher who is a member of the Scripps CO2 Group.¶ Lu notes that data from 1850 to 1970 show carbon emissions increasing due to the Industrial Revolution. However, global temperatures stayed constant.¶ “The conventional warming model of CO2, suggests the temperatures should have risen by 0.6°C over the same period, similar to the period of 1970-2002,” reads the study’s press release.¶ CFCs “are nontoxic, nonflammable chemicals containing atoms of carbon, chlorine, and fluorine” that are used to make “aerosol sprays, blowing agents for foams and packing materials, as solvents, and as refrigerants” according to the National Oceanic and Atmospheric Administration. The Montreal Protocol phased out the production of CFCs as they were believed to be linked to ozone depletion. According to the National Institutes of Health, CFCs are considered a greenhouse gas, like carbon dioxide, because they absorb heat in the atmosphere and send some of it back to the earth’s surface, which contributes to global warming.

“From the University of Waterloo, an extraordinary claim,” writes global warming blogger Anthony Watt. “While plausible, due to the fact that CFC’s have very high [Global Warming Potential] numbers, their atmospheric concentrations compared to CO2 are quite low, and the radiative forcings they add are small by comparison to CO2.”

“This may be nothing more than coincidental correlation,” Watt added. “But, I have to admit, the graph is visually compelling. But to determine if his proposed cosmic-ray-driven electron-reaction mechanism is valid, I’d say it is a case of ‘further study is needed’, and worth funding.”

When Barack Obama promised to slow the earth’s rising sea levels and heal the planet during the 2008 campaign, he probably had no idea that curbing carbon dioxide emissions might not lower the sea levels.¶ A study published in the Journal of Geodesy found that the sea level has only risen by 1.7 millimeters per year over the last 110 years — about 6.7 inches per century — all while carbon dioxide concentrations in the air have risen by a third, suggesting that rising carbon concentrations have not impacted the rate at which sea levels are rising.¶ The study used data from the Gravity Recovery And Climate Experiment satellite mission and analyzed “continental mass variations on a global scale, including both land-ice and land-water contributions, for 19 continental areas that exhibited significant signals” over a nine-year period from 2002 to 2011.¶
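The rate quoted here can be sanity-checked with simple unit arithmetic (this is only a conversion check, not data from the study):

```python
# Unit-conversion check of the rate quoted above (arithmetic only).
mm_per_year = 1.7
inches_per_century = mm_per_year * 100 / 25.4   # 25.4 mm per inch
print(f"{inches_per_century:.1f} inches per century")  # ~6.7, matching the card
```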

The results echoed a study conducted last year, which also found that sea level has been rising on average by 1.7 mm/year over the last 110 years. This was also suggested by two other studies conducted in the last decade.¶ “The latest results show once again that sea levels are not accelerating after all, and are merely continuing their modest rise at an unchanged rate,” said

Pierre Gosselin, who runs the climate skeptic blog NoTricksZone. “The more alarmist sea level rise rates some have claimed recently stem from the use of statistical tricks and the very selective use of data. Fortunately, these fudged alarmist rates do not agree with real-life observations. Overall the latest computed rates show that there is absolutely nothing to be alarmed about.”¶

Other experts agree, citing data regarding the Earth’s rate of rotation.¶ “For the last 40-50 years strong observational facts indicate virtually stable sea level conditions,” writes Nils-Axel Mörner, former head of the Paleogeophysics and Geodynamics department at

Stockholm University , in the Journal Energy and Environment. ”The Earth’s rate of rotation records a mean acceleration from 1972 to 2012, contradicting all claims of a rapid global sea level rise, and instead suggests stable, to slightly falling, sea levels.”¶ But in the wake of

Hurricane Sandy, U.S. coastal states have been more concerned about the possible effects of global warming on rising sea levels.¶ A report by 21 U.S. scientists, commissioned by Maryland

Democratic Gov. Martin O’Malley, found that the sea levels are rising faster than they predicted five years ago. Florida Keys residents are also concerned about sea levels by the island that have risen 9 inches in the past decade, according to a tidal gauge that has operated since pre-Civil War days.¶ “It doesn’t need a lot of rocket science,” said Donald Boesch, president of the University of Maryland Center for Environmental Science. “We’ve got tide gauges that show us sea level is increasing. This is a real phenomenon. We should take it seriously and have to plan for it.”¶ The

Maryland report found that ocean waters and the Chesapeake Bay might only rise about one foot

by 2050, but the study’s authors said that it would be prudent to plan for a two-foot rise in sea levels to account for the risks of flooding caused by storms. The state has already seen sea levels rise by about a foot in the past century — half coming from the natural sinking of the land and the other half coming from rising seas from a warming ocean.¶ New York City Mayor Michael

Bloomberg has also announced a $20 billion plan to adapt to global warming to prepare the city for rising sea levels and hotter summers.¶ A report commissioned by New York City found that the number of sweltering summer days could double, maybe even triple, and that waters surrounding the city could rise by 2 feet or more¶ New York City can “do nothing and expose ourselves to an increasing frequency of Sandy-like storms that do more and more damage,”

Bloomberg remarked. “Or we can make the investments necessary to build a stronger, more resilient New York — investments that will pay for themselves many times over in the years go to come.”

Solvency

1NC

1. No Investment means they can’t solve – too risky, low return, and safety concerns

Sweeney 11 (Deborah, CEO, MyCorporation.com, Should Investors Roll the Dice with Nanotechnology? Nanotech-now, August 8, 2011, http://www.nanotechnow.com/columns/?article=566)//rh

Great rewards typically only come with great risks, and investing in Nanotechnology is a definite and substantial risk.

But with all investment opportunities, potential backers of Nano-products need to determine whether the gamble is worth the initial risk, or if their money would be better spent on safer, older products.

The answer to that is not very simple, and all investments should be considered carefully, but there are signs that it may not be such a bad idea to place a bet on Nanotech's future.

The US dominates the Nanotechnology field, but that may soon change if steps aren't taken to protect the fledgling industry from outside competition and give it the necessary funding to grow.

There is a lot of money being pumped into Nanotech research and development but, so far, not enough return.

The commercial applications of nanotechnology are limitless, but too few innovations of the field are tweaked to become viable, commercial products.

This needs to change if investment is expected to continue.

Before looking at a project, and potentially sinking money into it, investors must ask if this can be sold to someone.

Whether it is the military, the health care industry, or just to the average consumer; what is the end game for this product? If they, or the person seeking funding, cannot answer that question, then investors may want to look elsewhere in the field. Recently, the US government has become more involved with the safety of Nanotechnology, which means investors are just as concerned.

The United States federal government's nanotech program, the National Nanotechnology Initiative, has begun to work closely with public health organizations like the FDA, indicating safety is a priority.

If a project you invest in is found to have a harmful impact on the environment, or worse consumer health, it will likely be shut down. The United States invests quite a bit of federal money into Nanotechnology as well, so they have the economic, as well as legal, sway to ensure the demise of any questionable innovation. The company seeking investment should be as open as possible when it comes to their product's safety. An in-house safety group is great, but a third-party testing service is even better.

2. They can’t solve without regulations – their 1AC Matsuura 6 evidence says that regulations are key to solvency, but their plan text doesn’t mandate implementing any regulations

3. Nanotech is inevitable – pork barrel spending and private companies

Crews Jr. 3 (Clyde Wayne, vice president for policy and director of technology studies at the Competitive Enterprise Institute, Washington’s Big Little Pork Barrel: Nanotechnology, CATO Institute, May 29, 2003, http://www.cato.org/publications/commentary/washingtons-big-littlepork-barrel-nanotechnology)//rh

But now Republican advocacy of science pork is back. Exhibit A for 2003 is nanotechnology , the cutting-edge science of direct manipulation of matter at the molecular level. Government wants to get involved in a big way, despite companies such as IBM, Hewlett Packard and Intel

— and numerous venture capitalists — already taking the lead. Promised applications include smaller and cheaper computer chips, nano-scale “punch cards” to boost computer storage, stronger-than-steel carbon “nanotubes” with myriad applications, and new materials and coatings including responsive clothing. The field sports its share of hype :

Surely, promised “nanobots” to attack cancers and other human ailments-or even repair cellular damage and revive cryogenically frozen human beings-remain in the far-distant future. Similarly, the proposed “Starlight Express” carbon-nanotube elevator to outer space-from a NASA-funded outfit called Highlift Systems-belongs to the realm of science fiction. Perhaps more representative are today’s uses in cosmetics and sunscreens, and “NanoTitanium” fishing rods that incorporate nano-particle titanium and carbon fiber. Regardless, the little technology has clearly reached the big time. Michael Crichton’s best-selling novel “Prey,” the story of destructive, out-of-control nanobots is surely only the latest in pop culture’s speculations on the dark side of microengineering. Meanwhile, the ETC Group, while alarmed about the potential hazards of unrestrained nanotechnology, points out that yearly scientific citations to “nano” have grown nearly 40-fold, the number of nano-related patents is surging, and nine nanotechnology-related Nobel prizes have been awarded since 1990. To many in Congress, what’s needed is not a free hand for technology entrepreneurs to explore this blossoming field, but government money.

President Bush’s proposed 2004 fiscal year budget for the National Nanotechnology Initiative is $847 million, a 9.5 percent increase over 2003. The NNI was created by the Bush administration in 2001. In addition, the House Science Committee authorized a $2.4 billion funding program for nanotechnology, and the full House approved it last week. That’s not huge by Washington standards, but such programs only grow.

Politicians have no innate ability to pick among competing technologies, whether nano, macro or otherwise. If they did, they’d be entrepreneurs themselves . And they’re particularly bad at the job when using taxpayer money . Politicians can merely transfer wealth, which automatically invites wasteful porkbarreling to propel funds to one’s home state. Scientific merit need not carry the day. But even if it did, taxpayers should get to decide for themselves which technologies to invest in.

Nanotechnology is plainly viable on its own, moving forward on fronts too numerous to catalog, all seeking to make breakthroughs before others. Nanotech venture capitalist Josh Wolfe told Wired that most business proposals he sees now have “nano” in the title. Venture capitalists have plowed in hundreds of millions of dollars over the past five years. And according to the National Science Foundation, the market in nanotech products could be $1 trillion a year by 2015. That’s nearly 10 percent the size of today’s gross domestic product.

The vigorous calls for government research seem in part a reaction to the technology market downturn. But we ought not look for a technology savior in emergent biotech or nanotech spawned in government labs. Forthcoming technologies should be products of capitalism and entrepreneurship, not central planning, government R&D, and pork barrel. Tomorrow’s nanotechnology markets have too much potential and are too important to be creatures of government. It’s still early enough in this particular pork game to stop it before it goes any further. My Cato Institute colleague Tom Miller put it best when asked by technology reporter Declan McCullagh about federal nanotechnology funding: “I suggest giving them nanodollars.”

4. Plastic is too difficult to clean up – multiple warrants – current models do more harm than good.

Discovery Channel 11 (What is preventing the cleanup of the Great Pacific Garbage Patch? Curiosity.com From Discovery, 2011, http://curiosity.discovery.com/question/prevents-cleanupgreat-pacific-garbage)//rh

Several organizations are working on figuring out how to take on such a massive cleanup project.

One group, called Project Kaisei, proposed an interesting idea that involved dredging all the plastic from the site and recycling it by turning it into fuel, but in the process of removing the plastic, sea life would undoubtedly be harmed . Unfortunately, as beneficial as a cleanup would be to the environment and the local ecosystems, no group has yet been able to come up with a feasible plan that would not do more harm than good; most believe that cleaning up the patch is simply too big of a job. There are three primary obstacles to any effort proposed to clean up the Great Pacific Garbage Patch: Distance: The patch is not near any port or supplies. A cleanup would consume unrealistic amounts of time, fuel and other resources.

Photodegradation: This is the process by which sunlight degrades plastic. The sun's rays essentially dry out the plastic to the point that it breaks into countless tiny pieces . These bits float as far down as 300 feet below the water's surface, and no good method of picking them out of the water has been developed [source: Berton]. Cost: Any project that could overcome these challenges would be prohibitively expensive and would probably go bankrupt . Rather than planning a cleanup, it seems that the more realistic option right now is to prevent the patch from spreading, and encourage recycling as much as possible

2NC Extension Won’t Invest

1.

Can’t solve – no one will invest

Sargent 8 (John F., Specialist in Science and Technology Policy, Resources, Science, and Industry Division, Nanotechnology and U.S. Competitiveness: Issues and Options, CRS Report for Congress, May 15, 2008, http://fas.org/sgp/crs/misc/RL34493.pdf)//rh

However, research and development investments, scientific papers, and patents may not provide reliable indicators of the United States’ current or future competitive position.

Scientific and technological leadership may not necessarily result in commercial leadership and/or in national competitiveness for the following reasons: Basic research in nanotechnology may not translate into viable commercial applications.

Though no formal assessment of the composition of the NNI budget has been made, there is general consensus that the NNI investment since its inception has been focused on basic research. The National Science Foundation defines the objective of basic research as seeking “to gain more comprehensive knowledge or understanding of the subject under study without applications in mind.”

Therefore, while basic research may underpin applied research, development, and commercialization, that is not its primary focus or intent . In general, basic research can take decades21 to result in commercial applications, and many advances in scientific understanding may not present commercial opportunities .

Off Case

T of the Ocean

A. Interpretation – “Of” indicates the object of an action

Merriam-Webster 14 (2014, Merriam-Webster, Incorporated, http://www.merriam-webster.com/dictionary/of)

9a — used as a function word to indicate the object of an action denoted or implied by the preceding noun <love of nature> b — used as a function word to indicate the application of a verb <cheats him of a dollar> or of an adjective <fond of candy>

Ocean means the body of salt water that covers the Earth

Merriam-Webster 14 (2014, Merriam-Webster, Incorporated, http://www.merriam-webster.com/dictionary/ocean)

Full Definition of OCEAN

1 a : the whole body of salt water that covers nearly three fourths of the surface of the earth

b : any of the large bodies of water (as the Atlantic Ocean) into which the great ocean is divided

2: a very large or unlimited space or quantity

B. Violation – The aff only establishes nanotech recycling that occurs on land, not in the ocean

C. Standards –

1. Limits – The aff explodes the limits of the topic by allowing any aff that does land-based development premised on already existing ocean infrastructure.

2. Extra T – The aff claims advantages off of land-based development – this makes the negative research burden impossible because we have to research every possible land-based effect of ocean development – this explodes the topic

D. T and Extra-T are voters for fairness and education

Politics Links??

Nanotech empirically popular

Sargent 8 (John F., Specialist in Science and Technology Policy, Resources, Science, and Industry Division, Nanotechnology and U.S. Competitiveness: Issues and Options, CRS Report for Congress, May 15, 2008, http://fas.org/sgp/crs/misc/RL34493.pdf)//rh

Many areas of public policy could affect the ability of the United States to capture the future economic and societal benefits associated with these investments. Congress established programs, assigned responsibilities, authorized funding levels, and initiated research to address key issues in the 21st Century Nanotechnology Research and Development Act. The agency budget authorizations provided for in this act extend through FY2008 (see text box, “National Nanotechnology Initiative,” for discussion of authorizations and appropriations). Both the House and Senate have held committee hearings related to amending and reauthorizing this act in 2008. A companion report, CRS Report RL34401, The National Nanotechnology Initiative: Overview, Reauthorization, and Appropriations Issues, by John F. Sargent, provides an overview of nanotechnology; the history, goals, structure, and federal funding of the National Nanotechnology Initiative; and issues related to its management and reauthorization. As the state of nanotechnology knowledge has advanced, new policy issues have emerged. In addition to providing funding for nanotechnology R&D, Congress has directed increased attention to issues affecting the U.S. competitive position in nanotechnology and related issues, including nanomanufacturing; commercialization; environmental, health, and safety concerns; workforce development; and international collaboration. Views and options related to these issues are presented later in this report.

Neolib Links

Nanotechnology has become means of production within neoliberalism – their appeal to technology emphasizes efficiency and production while ceding the will to power to technocratic elites

Armitage 2 (John, Professor at the Winchester School of Art, University of Southampton, Resisting the Neoliberal Discourse of Technology, February 19, 2002, http://sami.is.free.fr/Oeuvres/armitage_resisting_discourse_technology.html)//rh

The Neoliberal Discourse of Technology

Contemporary neoliberalism is the pan-capitalist theory and practice of explicitly technologized, or "telematic", societies. [4] Neoliberalism is of course a political philosophy which originated in the advanced countries in the 1980s. It is associated with the idea of "liberal fascism": free enterprise, economic globalization and national corporatism as the institutional and ideological grounds for the civil disciplining of subaltern individuals, "aliens" and groups.

However, while pan-capitalism appears largely impregnable to various oppositional political forces and survives broadly uncontested, it nonetheless relies extensively on a specifically neoliberal discourse of technology. What is more, this discourse is principally concerned with legitimating the political and cultural control of individuals, groups, and new social movements through the material and ideological production, promotion, distribution, and consumption of self-styled "virtual" technologies like virtual reality (VR) and cyberspace. These contentions about pan-capitalism, telematics, and the neoliberal discourse of virtual technologies derive from the fact that human labour is no longer central to market-driven conceptions of business and political activities. Actually, as far as some neoliberals are concerned, technology is now the only factor of production. [5]

Artefacts like VR, cyberspace, and the Internet thus embody not "use value" but what Arthur Kroker and Michael Weinstein term "abuse value": "The primary category of the political economy of virtual reality is abuse value. Things are valued for the injury that can be done to them or that they can do. Abuse value is the certain outcome of the politics of suicidal nihilism. The transformation, that is, of the weak and the powerless into objects with one last value: to provide pleasure to the privileged beneficiaries of the will to purity in their sacrificial bleeding, sometimes actual (Branch Davidians) and sometimes specular (Bosnia)." [6]

The neoliberal analysis of production under the conditions of pan-capitalism and telemetry accordingly focuses not on the outmoded Marxian conception of the "labor process", but on the technological and scientific processing of labour. [7] The result is that surplus labor is transformed by relentless technological activity, and the means of virtual production produce abuse value.

Technology and the Politics of Cyberculture

The technological fixations of the neoliberals are, of course, presently extending themselves from virtual production to virtual culture; to technoscience and to cyberculture, including the culture of cyborgs, cyberfeminism, cyberspace, cyberwarfare, and cyberart. [8] Nietzsche emphasizes, in The Wanderer and His Shadow, that technologies and machines are "...premises whose thousand year conclusion no one has yet dared to draw." [9] Yet, in scarcely over one hundred years, it has become clear that technology is not only voraciously consuming what is left of "nature," but is also busily constructing it anew.

Nanotechnology, for example, brings together the basic atomic building blocks of nature effortlessly, cheaply, and in just about any molecular arrangement we ask. [10] Information and communications technologies evoke the virtual architecture and circuitry of fiber-optics, computer networks, cybernetic systems, and so on. These technologies, these assemblages, though, need to be appreciated for what they are: synthetic materials transformed into instruments of "the will to virtuality," or of human incorporation - even "disappearance" - into cybernetic machinery. Cybercultural technologies are agents of physical colonization, imperialists of the human sensorium, created, like Frankenstein, by our own raw desire.

They represent what Virilio calls "the third revolution", the impending bodily internalization of science and technology

. As Virilio recently defined the third revolution: "By this term I mean that technology is becoming something physically assimilable, it is a kind of nourishment for the human race, through dynamic inserts, implants and so on.

Here, I am not talking about implants such as silicon breasts, but dynamic implants like additional memory storage. What we see here is that science and technology aim for miniaturisation in order to invade the human body

." [11] As a result, the division between living bodies and technology is increasingly difficult to maintain

; both are now so hopelessly entwined in the "cyborgian" sociotechnical imagination. [12] We are well on our way to "becoming machinic". As Deleuze and Guattari comment: "This is not animism, any more than it is mechanism; rather it is universal machinism: a plane of consistency occupied by an immense abstract machine comprising an infinite number of assemblages." [13] Nevertheless, the technologically determinist assemblages of sundry neoliberal computer mystics

, like Jaron Lanier and John Perry Barlow, are questionable because

cybercultural technologies, like all technologies, are innately political. Technologies like VR do not appear - like rainfall - as heavenly gifts. They have to be willed into existence, they have to be produced by real human beings.

Information and communications technologies, for instance, both contain and signify the cultural and political values of particular human societies.

Accordingly, these technologies are always expressions of socioeconomic, geographical, and political interests, partialities, alignments and commitments.

In brief, the will to technical knowledge is the will to technical power. It is crucial, then, to redefine, and to develop a fully conscious and wholly critical account of the neoliberal discourse of technology at work

in the realm of cyberculture; one that exposes not only the economic and social interests embodied within cultural technologies, but also their underlying authoritarianism

. Maybe

Marshall McLuhan was right? The medium is the message. The question is, what does it say? Moreover, how does it manage to say it so eloquently, so perfectly, that some among us are more than "willing" to trade corporeality for virtuality?

And all for what? A chance to dance to the (pre- programmed) rhythms of technologized bodies?

Indeed, it is hard to disagree with Hakim Bey when he writes: "Physical separateness can never be overcome by electronics, but only by

"conviviality", by "living together" in the most literal physical sense. The physically divided are also the conquered and the Controlled. "True desires" - erotic, gustatory, olfactory, musical, aesthetic, psychic, & spiritual - are best attained in a context of freedom of self and other in physical proximity & mutual aid. Everything else is at best a sort of representation." [14]

Nanotechnology is the next innovation in the technological revolution – this appeal to innovation is a strategy neoliberalism uses to promise a solution to the next problem without ever having to change the system.

Reynolds and Szerszynski 12 (Laurence Reynolds and Bronislaw Szerszynski, Neoliberalism and technology: Perpetual innovation or perpetual crisis?, published as Chapter 1 of Neoliberalism and Technoscience: Critical Assessments, ed. Luigi Pellizzoni & Marja Ylönen, Farnham: Ashgate, 2012, pp. 27-46, http://www.academia.edu/1937914/Neoliberalism_and_technology)//rh

The neoliberal era has often been imagined as a period of intense technological revolution. The goal of a high-tech ‘knowledge based economy’ (KBE) of perpetual innovation has been elevated into a key guiding principle and salvationary strategy for advanced capitalist economies. Innovation is held up as the solution to multiple problems that became apparent in the 1970s, including the crisis in capital accumulation, the globalization of competition and the rise of environmental degradation. Since that decade we have been dazzled by a seemingly escalating proliferation of innovations, from information technology and mobile telephony through to biotechnology and nanotechnology. Yet, at the same time, there is also a sense that the high-tech promise of the 1970s, of a ‘space age’ where robots would replace workers, or the later prediction of a ‘biotech century’, have somehow not been realized. ‘Tomorrow’s world’ never quite came about. 1 Skeptical commentators have questioned the idea of the technological fecundity of the neoliberal period, arguing instead that the last quarter of the twentieth century and after has been a ‘great stagnation’, where we have reached a ‘technological plateau’ (Cowen 2011). A popular trope, deployed by both Gordon (2000: 60) and Cowen (2011), has been to compare the dramatic changes in everyday life that resulted from technological transformations during the first half of the twentieth century with the much more modest changes experienced since then. Save for the Internet and mobile telephony, the basic technological infrastructure (based on cars, oil, etc.) has seen little radical transformation. In this trope, our contemporary experience is indeed one that looks like a plateau when compared with the radical techno-social change that someone reaching old age in the 1960s would have experienced over the preceding half century.

Rotating Reactors CP

1NC

CP Text: The United States federal government should invest in the Continuous Production Method of Carbon Nanotubes using a Rotation Reactor.

Rotating reactors produce better and cheaper nanotech

Universiti Sains Malaysia 12 (Universiti Sains Malaysia, New method for continuous production of carbon nanotubes, Science Daily, April 12, 2012, http://www.sciencedaily.com/releases/2012/04/120412105109.htm)//rh

A group of researchers from Universiti Sains Malaysia (USM) have successfully created a new method for producing carbon nanotubes. The new method is capable of reducing the price of carbon nanotubes from $100 - $700 US to just $15 to $35 US for each gram, much lower than world market prices. The method, known as the Continuous Production Method of Carbon Nanotubes using Rotation Reactor, is the first ever created in Southeast Asia. Carbon nanotubes are widely used in the production of end products such as memory chips, rechargeable batteries, tennis rackets, badminton rackets, bicycles, and composites used to manufacture cars, airplanes and so forth. The research team leader, Assoc. Dr. Abdul Rahman Mohamed, said a new rotating reactor system is designed to enable the continuous production of carbon nanotubes without compromising the quality and authenticity. "The system is capable of producing up to 1000 grams of carbon nanotubes a day," he said. He added that the developed system is also environmentally friendly as it operates at atmospheric conditions, is cost effective and does not require a large space to operate the reactor.
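
A quick back-of-the-envelope check of the cost claim in this card (a rough sketch using only the numbers quoted above; the "value of a day's output" figure is an implication of those numbers, not a claim from the source):

# Rough arithmetic on the cost figures quoted in the USM card (illustrative only).
old_low, old_high = 100.0, 700.0   # prior cost range, USD per gram
new_low, new_high = 15.0, 35.0     # claimed cost with the rotation reactor, USD per gram
daily_output_g = 1000.0            # claimed production capacity, grams per day

print(f"cost reduction factor: roughly {old_low / new_high:.0f}x to {old_high / new_low:.0f}x")
print(f"value of one day's output at the new price: "
      f"${new_low * daily_output_g:,.0f} to ${new_high * daily_output_g:,.0f}")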

2NC Solves Environment

Rotating reactors solve and are more efficient

Leong 12 (Jasmine, NEWS: NEW METHOD FOR CONTINUOUS PRODUCTION OF CARBON NANOTUBES BY USM RESEARCHERS, April 23, 2012, Scientific Malaysian, http://www.scientificmalaysian.com/2012/04/23/continuous-production-carbon-nanotubesusm/)//rh

Being the first in South East Asia, a team of researchers led by Prof. Dr Abdul Rahman Mohamed from Universiti Sains Malaysia (USM) discovered the continuous production method of carbon nanotubes (CNTs) using a rotation reactor. The method developed is reported to be significantly more efficient than the previous methods used. CNTs are widely applied as composite material in the production of biomedical and electronic end products, electrochemical devices, sensors and probes. Due to their mechanical strength, light weight and ideal electronic properties, CNTs are highly in demand. Japan, China, the United States and Korea are among the countries currently producing this valuable material. Using this improved method, the production cost of CNTs is reduced from USD 100-700 to USD 15-35 for each gram, with a production capacity of 1000 g a day. The system is reported to be environmentally friendly as it operates at atmospheric conditions with minimal reactor space required.

Prof. Abdul Rahman together with Dr. Chai Siang Piao, Seah Choon Ming, Assoc. Prof. Dr. Lee Keat Teong and Yeoh Wei Ming contributed to the success of this project. Currently, CNTs are produced using this improved method by Advance Nanocarbon Sdn. Bhd., a spin-off company of USM. Multiple international awards have been attained by Advance Nanocarbon, including a Gold Medal at the International Jury of IENA in 2009 as well as a Special Prize for Technical Culture-Creation and a Gold Medal at the 33rd International Exhibition of Invention, New Techniques and Products in Geneva in 2005. The success of this project marks the impact of research in Malaysia on the economy and technological advancement in the area of engineering.

Rotating reactors are better than the aff – they operate at lower temperatures, which minimizes pollution

Pirard et al. 14 (Dominique Toye and Jean-Paul Pirard, Laboratoire de Génie Chimique, B6a, Université de Liège, Influence of heat exchanges and of temperature profile for carbon nanotube synthesis in a continuous rotary reactor, http://www.nanopt.org/14Abstracts/2014_Pirard_Sophie_Sophie.Pirard@ulg.ac.be_Abstract_SPirard.pdf)//rh

The influence of the reaction exothermicity has been taken into account for the modeling of a continuous inclined mobile-bed rotating reactor for carbon nanotube synthesis by the CCVD method using ethylene as carbon source. The modeling of the continuous reactor was performed according to the reaction chemical engineering approach, which consists in studying the four factors governing the reactor, i.e. geometric, hydrodynamic, physical and physicochemical factors [1]. So the reactor equations have been written by applying the co-current plug-flow hypothesis and by taking the true kinetic equation and the sigmoid catalytic deactivation into account [2]. The four reactor equations correspond to the three mass balances and to the heat balance. The optimal temperature to maximize the productivity and to avoid the formation of soot and tars is equal to 700°C with ethylene [3, 4]. In small scale reactors, the heat exchange between the carbon nanotube growing bed and the atmosphere of the furnace surrounding the reactor is efficient enough to evacuate the heat released by the reaction and to keep the temperature constant along the reactor. However, for higher production capacity reactors, the global heat released by reaction increases, and the heat exchange has to be efficient enough to evacuate the heat released by the reaction. Otherwise, one may observe a runaway phenomenon. So the heat released by reaction influences the temperature profile through the reactor, and heat exchanges have to be taken into account to model the axial temperature profile. The model has been validated with data obtained on two industrial reactors equipped with heating systems comprising several distinct heating zones of the same length, providing adequate control of the temperature in the reactor. To avoid too high a reaction speed, possibly leading to excessive heat release and to hot points responsible for cracking of ethylene and for reactor fouling by tar and soot deposition in the first reactor sections, the feed temperature of the reacting gas has to be fixed at a value lower than or equal to 650°C. Indeed, Fig. 1 highlights the temperature profiles for initial temperatures equal to 650°C and 700°C for a given experimental set. When the initial gas temperature is equal to 700°C, the heat release leads to a significant temperature increase beyond 700°C, leading to a great deposition of soot and tars. Fig. 1 shows the corresponding temperature profile, which is continuously increasing and tends towards an adiabatic profile. Furthermore, the model shows that the temperature profile along the reactor has to be as close as possible to the temperature profile of an isothermal reactor at 700°C (Fig. 2). This temperature profile can be reached with several heating zones (at least four) and with an initial temperature at the inlet of the reactor smaller than 700°C, due to the exothermicity of the reaction. This result is in agreement with experimental data obtained with the two industrial reactors. Several articles in the literature show that the fluidized-bed reactor works correctly and produces CNTs of good quality [5-7]. According to some references, the fluidized-bed reactor is the only one able to produce CNTs continuously at a large scale and has already been adopted worldwide for the commercial production of CNTs, because compared with the moving-bed reactor, the fluidized-bed reactor has excellent heat and mass transfer properties and good mixing behavior. However, the residence time of each catalyst particle is not constant in a fluidized-bed reactor, leading to an inhomogeneous quality of produced CNTs. The present article shows that by imposing an adequate temperature at the inlet of the reactor and by controlling the temperature of the heating zones in order to regulate heat exchanges between the CNT growing bed and the atmosphere of the heating zones through the reactor wall, the temperature profile along an industrial continuous mobile-bed reactor is almost isothermal, and this kind of reactor is very well adapted to continuously produce CNTs of homogeneous quality at a large scale.
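
To make the temperature-profile argument above concrete, here is a minimal numerical sketch of the kind of co-current plug-flow heat and mass balance the card describes. All rate, heat-release and heat-exchange parameters below are invented, order-of-magnitude assumptions for illustration, not values from Pirard et al.; the only point the sketch is meant to show is the qualitative one in the card, namely that a hotter feed gas produces a higher hot spot along the reactor.

# Illustrative plug-flow sketch (assumed parameters, not data from Pirard et al.):
# integrate conversion and temperature along the bed for two feed temperatures,
# with an Arrhenius-type exothermic reaction and heat exchange to a 700 C wall.
import math

def axial_profile(feed_C, wall_C=700.0, length_m=6.0, steps=6000):
    """Integrate dX/dz = k(T)*(1-X) and dT/dz = dTad*k(T)*(1-X) - U*(T - Twall)."""
    A, Ea_over_R = 1.7e9, 2.0e4   # assumed Arrhenius pre-factor [1/m] and activation temperature [K]
    dTad, U = 120.0, 3.0          # assumed adiabatic temperature rise [K] and wall-exchange coefficient [1/m]
    dz = length_m / steps
    T = feed_C + 273.15           # bed temperature [K]
    Twall = wall_C + 273.15       # heating-zone (furnace wall) temperature [K]
    X = 0.0                       # ethylene conversion [-]
    peak = T
    for _ in range(steps):
        k = A * math.exp(-Ea_over_R / T)                     # local rate per metre of bed
        dX = k * (1.0 - X) * dz                              # conversion gained over this slice
        dT = (dTad * k * (1.0 - X) - U * (T - Twall)) * dz   # heat release minus wall exchange
        X = min(X + dX, 1.0)
        T += dT
        peak = max(peak, T)
    return peak - 273.15, X

for feed_C in (650.0, 700.0):
    peak_C, conversion = axial_profile(feed_C)
    print(f"feed {feed_C:.0f} C -> hot spot {peak_C:.0f} C, outlet conversion {conversion:.2f}")

In the reactor the card describes, the same effect is managed in practice by holding the feed at or below 650°C and splitting the furnace into several independently controlled heating zones so the profile stays close to an isothermal 700°C.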

AT Normal Means

Not normal means – current nanotech is produced by chemical vapor deposition

De Guire 13 (Eileen, American Ceramic Society, Commercialization of carbon nanotubes and their surprisingly long history, March 12, 2013, http://ceramics.org/ceramic-techtoday/commercialization-of-carbon-nanotubes-and-their-surprisingly-long-history)//rh

The article reports that most CNTs produced today are disorganized (unaligned), which limits the ability to capitalize on the interesting properties of aligned structures such as yarns, "forests," and sheets. Nevertheless, there are enough applications for CNTs incorporated into bulk composites and thin films that the authors say CNT powders are "now entering the growth phase of their product life cycle." Chemical vapor deposition is the prevailing manufacturing-scale method for synthesizing production quantities of MWNTs. SWNTs require much closer control of the CVD process, which keeps their prices much higher than MWNTs (by orders of magnitude, according to the article). However, CNTs can still be pricey themselves (and the reason your touring bike is not a CNT composite) - up to about $100 per kilogram, which is as much as 10 times the price of carbon fiber.

Useless Counterplans?

Clothes CP

This is stupid but hilarious

Greenburg 14 (Zack O'Malley, Forbes Magazine, From Blue To Green: Inside Pharrell's Latest Fashion Venture, 2/10/14, http://www.forbes.com/sites/zackomalleygreenburg/2014/02/10/fromblue-to-green-inside-pharrells-latest-fashion-venture/)//rh

Halfway between Japan and North America, a swirling vortex of trash known as the Great Pacific Garbage Patch occupies a slice of the ocean's surface twice the size of Texas. It's not visible via satellite and is often hard to discern with the naked eye, mostly because it's made up of pieces of clear plastic in various states of degradation. These materials come from bags, bottles and other mundane household items, but the consequences of their presence are dire. Half of the 400,000 albatrosses born every year in the nearby Midway Atoll die after consuming plastic they've mistaken for food. Over 250 species suffer similar fates–or ingest garbage and send it all the way up the food chain. A dead whale recently washed up near Seattle with a stomach full of plastic bags, surgical gloves, duct tape and a pair of pants. Oddly enough, clothing may be part of the solution to this problem, thanks to Pharrell Williams. The superproducer earned headlines in recent weeks for his Grammy night haberdashery, but his latest fashion move could make even larger waves. He's helping G-Star design and launch a new line of clothing made from yarn sourced from recycled ocean plastic. Dubbed Raw For The Oceans, the collaboration will hit stores this summer. The inspiration dates back to his involvement with the Live Earth concerts many years ago. "When I got there I was struck by all the people who were so into the cause, and what it really meant," he told a crowd gathered beneath a giant blue whale at the Museum of Natural History last weekend. "I said to myself, 'Maybe there's something I could do, let me find something that makes sense for me.' [Raw For The Oceans] was an interesting thing that popped up, and it was the first thing that made sense."

The story begins about a year ago, when Bionic Yarn—which Williams co-owns—started talks with G-Star, together with ocean pollution awareness outfit Parley. Williams and other key players met in Berlin to discuss a collaboration, proposing a simple question with a complex answer: Can you make denim out of recycled ocean plastic? "It's Pharrell, so you say 'yes,' of course," says G-Star CMO Thecla Schaeffer. "Since then, we've been on this creative journey to get the plastic out of the ocean and then to integrate it into denim that's as beautiful and feels as good as regular denim." To make Raw For The Oceans happen, G-Star had to turn its entire supply chain upside down. Instead of purchasing fabrics from mills, the company had to go all the way back to the raw materials. That meant working with outfits like Bionic Yarn, Parley for the Oceans and the Vortex Project, which finds new ways to extract plastic from the water. The process for making clothing out of this material begins in the water with the collection of plastic, and continues as Bionic Yarn breaks down the material into tiny chips, then yarn. G-Star also had to run tests with its mills to make sure the yarn would work in their machines. So far, so good. Next up: the design phase—and even more hands-on involvement from Williams. "We buy the yarn from him and he co-designs the collection with us," explains Schaeffer. "It's very personal and goes very deep. He's very much involved in the zippers and the colors of the items." The multi-year, seven-figure deal calls for Williams to earn a per-unit royalty, most of which he'll contribute to the Vortex Project. In addition to launching capsule collections with Williams, who also boasts clothing brands Ice Cream and Billionaire Boys Club, G-Star is looking to integrate Bionic Yarn into all of its lines. That's good news for the species affected by the plastic buildup in the Great Pacific Garbage Patch—and the citizens of Earth on the whole. "If the oceans die," said Captain Paul Watson, founder of the Sea Shepherd Conservation Society, at last weekend's event, "we die."

SNUR Advantage CP

1NC

CP Text: The United States federal government should mandate that the Environmental Protection Agency pass and enforce a Nanomaterial Significant New Use Rule and an information reporting rule.

CP solves – key to safe use of nanotechnology

Votaw 13 (James G, Partner, Manatt, Phelps & Phillips, LLP, focuses on conventional, nanoscale, industrial, pesticidal and specialty chemical product regulation, policy and approval matters, Nanotechnology Regulation – EPA Developing Rule to Regulate All New Uses of Engineered Nanoscale Materials, July 25, 2013, Environmental Leader.com, http://www.environmentalleader.com/2013/07/25/nanotechnology-regulation-epa-developingrule-to-regulate-all-new-uses-of-engineered-nanoscale-materials/)//rh

EPA's Omnibus Nanomaterial Regulation in Development. Nevertheless, in 2009, EPA began work on a TSCA regulation that would be applicable to all nanoscale materials. The working text of the rule has not been publicly released, but EPA's description of the regulation in the Unified Regulatory Agenda[2] explains that it would have two components: a Significant New Use Rule and an information reporting rule. Nanomaterial Significant New Use Rule. As described, the anticipated SNUR would be applicable to any use of a nanoscale material, except commercial applications already in use at the time that the rule is proposed. Such 'grandfathered' uses would be exempt from most requirements. For new uses, the rule would require importers, manufacturers, and processors to submit a dossier (a "significant new use notice" or "SNUN") to EPA detailing how the nanomaterial substance would be manufactured and used by the proponent and its downstream customers. The SNUN would have to be submitted at least 90 days prior to any new use of a nanomaterial. EPA would conduct a risk assessment of that use, and could then choose to ban or restrict the use under an order, or compel testing. The user would be obligated to notify EPA before exporting the material, and required to keep records. Based on recent EPA SNURs for new nanomaterials, the restriction might tightly bind the user to a particular use of the material, restrict the user to material made by a particular manufacturer, and require submission of a new SNUN and a new 90-day review period if there were any changes. These operational constraints could be made to apply to grandfathered uses as well. The SNUR might also require the user to have its customers enter into a parallel order with EPA, at least for 12 or 18 months until EPA issued a regulation imposing those restrictions. Nanomaterial Information Collection Rule. Coupled with the nanomaterial SNUR, EPA is also poised to issue a TSCA §8(a) information collection rule applicable to all existing nanoscale materials. EPA has great flexibility to determine the extent of the information to be collected on manufacturing, processing, use and exposure. In this case, it appears EPA anticipates very significant information collection efforts by industry, as it estimates each response will require nearly 160 man hours to complete.[3] EPA would use that information to identify existing uses that may present risks warranting future EPA regulation. EPA also would use the information responses to inventory all existing commercial uses of nanomaterials. Any use not identified in the 'inventory' presumably would be deemed to be a "new use" and prohibited unless first notified to EPA under the nanomaterial SNUR. And while small entities usually are exempt from such information collection rules, that exemption does not necessarily apply when coupled with a SNUR. TSCA §8(a)(3)(A).

2NC

CP Solves toxic risks of nanotech

EPA 11 (Environmental Protection Agency, Control of Nanoscale Materials under the Toxic Substances Control Act, Pollution Prevention and Toxics, April 29, 2011, http://www.epa.gov/oppt/nano/)//rh

Significant New Use Rule (SNUR). The Agency is developing a SNUR under section 5(a)(2) of TSCA to ensure that nanoscale materials receive appropriate regulatory review. The SNUR would require persons who intend to manufacture, import, or process new nanoscale materials based on chemical substances listed on the TSCA Inventory to submit a Significant New Use Notice (SNUN) to EPA at least 90 days before commencing that activity. The SNUR would identify existing uses of nanoscale materials based on information submitted under the Agency's voluntary Nanoscale Materials Stewardship Program (NMSP) and other information. The SNUNs would provide the Agency with a basic set of information on nanoscale materials, such as chemical identification, material characterization, physical/chemical properties, commercial uses, production volume, exposure and fate data, and toxicity data. This information would help the Agency evaluate the intended uses of these nanoscale materials and to take action to prohibit or limit activities that may present an unreasonable risk to human health or the environment. Information Gathering Rule. As part of the Agency's efforts to ensure a more comprehensive understanding of nanoscale materials that are already in commerce, EPA is also developing a proposed rule under TSCA section 8(a) to require the submission of additional information. This rule would propose that persons who manufacture these nanoscale materials notify EPA of certain information including production volume, methods of manufacture and processing, exposure and release information, and available health and safety data.

TOCA CP

1NC

CP Text: The United States federal government should develop and fund The Ocean Cleanup Array.

The CP Solves

Boyan Slat et al. 14 (Slat is the founder & president of The Ocean Cleanup, Hester Jansen is the feasibility study editor for The Ocean Cleanup, Jan De Sonneville is the lead engineer for The Ocean Cleanup, http://www.theoceancleanup.com/fileadmin/mediaarchive/theoceancleanup/press/downloads/TOC_Feasibility_study_lowres.pdf, executive summary, SRB)

The world’s oceans are characterized by a system of large-scale rotating currents, called ‘gyres’. The ocean systems are constantly moving as a result of the turning of the earth and wind patterns. The five major gyres are the Indian Ocean Gyre, the North Atlantic Gyre, the North Pacific Gyre, the South Atlantic Gyre and the South Pacific Gyre. If the ocean’s water is constantly moving according to predictable patterns, so is the plastic pollution. This led to the idea of a ‘passive cleanup’: using an array of floating barriers fixed to the sea bed to catch the debris as it flows past on the natural ocean currents. The Ocean Cleanup Array utilizes long floating barriers which - being at an angle - capture and concentrate the plastic, making mechanical extraction possible. One of the main advantages of this passive cleanup concept is that it is scalable. Using the natural circulation period of the North Pacific Subtropical Gyre, cleanup duration could be reduced to a minimum of just 5 years. Using a passive collection approach, operational expenses can potentially be very low, making the cleanup more viable. Furthermore, converting the extracted plastic into either energy, oil or new materials could partly cover execution costs. Because no nets would be used, a passive cleanup may well be harmless to the marine ecosystem and could potentially catch particles that are much smaller than what nets could capture.

The CP avoids the negative harms of nanotubes

Jorn van Dooren, 2011 (Science Communications Consultant at Wired Science Communications, http://www.bitsofscience.org/carbon-nanotube-environment-toxicity-4038/, Carbon nanotubes have unexpected negative impact on environment, SRB)

Carbon nanotubes are stronger than steel, harder than diamond, light as plastic and conduct electricity better than copper. It is no wonder they can be found in an increasing range of products, ranging from tennis rackets to solar cells, and consequently end up in the environment in increasing quantities. But much is still unknown about the effect of these nanotubes on the environment. Especially aquatic systems are of interest, since that is where most nanoparticles eventually end up. New research now shows that carbon nanotubes at least are not toxic for green algae. Good news for the algae. Or is it? Earlier findings have shown that carbon nanotubes are harmful for cells in the human body, more or less having the same effect on them as a spear has on game. Finding that the toxic effects on green algae are minimal was therefore a pleasant surprise. When further examining the effects of the nanotubes on green algae, Empa researchers found that they nevertheless have a negative impact on the growth of algae. The reasons for this are twofold: high concentrations of nanotubes form a layer that partly blocks sunlight, and they stimulate algae to clump together, depriving them of light and room. Luckily these effects only occur at concentrations as high as one milligram per litre, a concentration that is unlikely to be met presently. But in a few years they may. Currently hundreds of tons of carbon nanotubes are produced each year, but this amount is on the rise and with it the quantity that can be released into the environment. Proceed with caution.

2NC
