Novice SPS 1ac

Plan: The United States federal government
should deploy space based solar power.
Contention 1 is Inherency
Tech barriers to SPS have been resolved – all that remains is government
inaction
Edmonton Journal, Austin Mardon received an honorary doctorate of laws from the University of Alberta on
Friday. He is a member of the Order of Canada and is a full member of the International Academy of Astronautics.
Pauline Balogun is a U of A student who is interested in green technologies for the future, 6/12/11, “Solar satellites
key to green energy”,
http://www.edmontonjournal.com/technology/Solar+satellites+green+energy/4933251/story.html//jchen
With gas prices on the rise, the race is on for cheap alternative fuel sources, including solar power,
but amid a wash of criticism, the solar industry may not even be in the running. The major
criticisms against solar power facilities, such as wind farms, are unreliability and inefficiency. Solar
power depends on environmental factors beyond human control and that makes investors anxious.
These facilities also require areas with high amounts of sunlight, usually hundreds if not thousands
of acres of valuable farmland and all for relatively little power production. This is why, in the
1960s, scientists proposed solar-powered satellites (SPSs). SPSs have about the most favourable
conditions imaginable for solar energy production, short of a platform on the sun. Earth's orbit
sees 144 per cent of the maximum solar energy found on the planet's surface and takes up next to
no space in comparison to land-based facilities. Satellites would be able to gather energy 24 hours
a day, rather than the tenuous 12-hour maximum that land-based
plants have, and direct the transmitted energy to different locations, depending on
where power was needed most. So, with so many points in its favour, why hasn't anyone built one yet? Obviously, putting anything into outer space takes a lot of money. Many governments claim there simply isn't any money in the budget for
launching satellites into space, but in 2010, amid an economic crisis, the United States managed to find $426 million for nuclear fusion research and $18.7 billion for NASA, a five-per-cent increase from 2009. The most recent projections, made in the
1980s, put the cost of launching an SPS at $5 billion, or around 8-10 cents/ kWh. Nuclear power plants cost a minimum of $3 billion to $6 billion, not including cost overruns, which can make a plant cost as much as $15 billion. In the U.S., nuclear
power costs about 4.9 cents/kWh, making SPS power supply only slightly more expensive. But these estimates are over two decades old and the numbers likely need to be re-examined. The idea for space-based solar energy has been around since the
'60s; given the technological advancements since then, surely governments would have invested in making an SPS power supply more budget-friendly. That is not the case. Governments and investors
are rarely willing to devote funding to something that doesn't have quick cash returns. The projected cost of
launching these satellites once ranged from $11 billion to $320 billion. These figures have been adjusted for inflation, but the original estimates were made back in the 1970s, when solar technology
was in its infancy, and may have since become grossly inaccurate. How long an SPS would survive in orbit is anybody's guess, given the maintenance due to possible damage to solar panels from solar
winds and radiation. As for adding to the ever-expanding satellite graveyard in Earth's orbit, most solutions to satellite pollution remain theoretical. Still, these satellites should not be so largely dismissed. There is a
significant design flaw keeping these satellites from production. One of the major shortfalls in the design of SPSs is simply in getting the power from point A to point B. This remains the most controversial aspect of SPSs: the use of microwaves to
transmit power from high orbit to the ground. Critics often cite the dangers of microwave radiation to humans and wildlife, however, the strength of the radiation from these beams would be equal to the leakage from a standard microwave oven,
which is only slightly more than a cellphone. A NASA report from 1980 reveals that the major concern with solar-powered satellites was problems with the amplifier on the satellite itself. Several workable solutions were proposed in that same report.
The report also recommended that NASA develop and invest in SPS technology, so that by the 2000s,
these satellites would be a viable alternative fuel source. This recommendation was ignored. We
should already have the technology and the infrastructure in place for green energy, but we don't.
Instead, we are engaged in a mad dash for the quickest, cheapest alternative to oil and that may be
the source of our downfall. For the sake of the future, expediency must take a back seat to longevity
and longevity may just be found in outer space.
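The card's yield claim can be sanity-checked with simple arithmetic. A minimal sketch, assuming only the figures quoted above (144 per cent irradiance, 24 versus 12 hours of collection) and ignoring transmission losses, which the article does not quantify:

```python
# Illustrative comparison of orbital vs. ground solar energy yield,
# using only the figures quoted in the card above.

ORBITAL_IRRADIANCE_FACTOR = 1.44   # orbit sees 144% of peak surface irradiance
ORBITAL_HOURS_PER_DAY = 24         # satellites can collect around the clock
GROUND_HOURS_PER_DAY = 12          # the card's "tenuous 12-hour maximum"

def daily_yield_ratio() -> float:
    """Energy collected per unit collector area, orbit relative to ground."""
    orbital = ORBITAL_IRRADIANCE_FACTOR * ORBITAL_HOURS_PER_DAY
    ground = 1.0 * GROUND_HOURS_PER_DAY
    return orbital / ground

if __name__ == "__main__":
    print(f"Orbital collector: ~{daily_yield_ratio():.2f}x the daily energy "
          "of an equal ground array, before transmission losses")
```

On these assumptions an orbital collector gathers roughly 2.9 times the daily energy of an equal ground array, which is the comparison the card gestures at.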
Contention 2 is Hegemony and Deterrence
Scenario 1 is Power Projection
SPS is key to force mobility– provides the only sustainable power source to the
military
Taylor Dinerman, senior editor at the Hudson Institute’s New York branch and co-author of the
forthcoming Towards a Theory of Spacepower: Selected Essays, from National Defense University Press,
11/24/2008, “Space solar power and the Khyber Pass”, The Space Review,
http://www.thespacereview.com/article/1255/1
Last year the National Security Space Office released its initial report on space solar power (SSP).
One of the primary justifications for the project was the potential of the system to provide power
from space for remote military bases. Electrical power is only part of the story. If the military really
wants to be able to operate for long periods of time without using vulnerable supply lines it will
have to find a new way to get liquid fuel to its forward operating forces. This may seem impossible
at first glance, but by combining space solar power with some of the innovative alternative fuels
and fuel manufacturing systems that are now in the pipeline, and given enough time and effort, the
problem could be solved. The trick is, of course, to have enough raw energy available so that it is possible to transform whatever is available into liquid fuel. This may mean
something as easy as making methanol from sugar cane or making jet fuel from natural gas, or something as exotic as cellulosic ethanol from waste products. Afghanistan has coal and natural gas that
could be turned into liquid fuels with the right technology. What is needed is a portable system
that can be transported in standard containers and set up anywhere there are the resources needed
to make fuel. This can be done even before space solar power is available, but with SSP it becomes
much easier.
In the longer run Pakistan’s closure of the Khyber Pass
supply route justifies investment in SSP as a technology that landlocked nations can use to avoid the pressures and threats that they now have to live with. Without access to the sea, nations such as Afghanistan are all too vulnerable to machinations
from their neighbors. Imagine how different history would be if the Afghans had had a “Polish Corridor” and their own port. Their access to the world economy might have changed their culture in positive ways. Bangladesh and Indonesia are both
Muslim states whose access to the oceans have helped them adapt to the modern world.
That’s key to deter nuclear conflict
Gerson 9 – Michael S. Gerson, Research analyst @ Center for Naval Analyses, a federally funded research
center, where he focuses on deterrence, nuclear strategy, counterproliferation, and arms control, Autumn
2009, “Conventional Deterrence in the Second Nuclear Age,” Parameters
Conventional deterrence also plays an important role in preventing nonnuclear aggression by
nuclear-armed regimes. Regional nuclear proliferation may not only increase the chances for the
use of nuclear weapons, but, equally important, the possibility of conventional aggression. The
potential for conventional conflict under the shadow of mutual nuclear deterrence was a perennial
concern throughout the Cold War, and that scenario is still relevant. A nuclear-armed adversary
may be emboldened to use conventional force against US friends and allies, or to sponsor terrorism,
in the belief that its nuclear capabilities give it an effective deterrent against US retaliation or
intervention.15 For example, a regime
might calculate that it could undertake conventional aggression against a neighbor and, after achieving a relatively quick victory, issue implicit or
explicit nuclear threats in the expectation that the United States (and perhaps coalition partners) would choose not to get involved. In this context, conventional deterrence can be an important mechanism to limit options for regional aggression
below the nuclear threshold. By deploying robust conventional forces in and around the theater of potential conflict, the United
States can credibly signal that it can respond to conventional aggression at the outset, and therefore
the opponent cannot hope to simultaneously achieve a quick conventional victory and use nuclear
threats to deter US involvement. Moreover, if the United States can convince an opponent that US
forces will be engaged at the beginning of hostilities—and will therefore incur the human and
financial costs of war from the start—it can help persuade opponents that the United States would
be highly resolved to fight even in the face of nuclear threats because American blood and treasure
would have already been expended.16 Similar to the Cold War, the deployment of conventional
power in the region, combined with significant nuclear capabilities and escalation dominance, can
help prevent regimes from believing that nuclear possession provides opportunities for
conventional aggression and coercion.
Scenario 2 is Soft Power
SPS has immense international support – US development key to soft power
NSSO, National Security Space Office, 10/10/07, “Space‐Based Solar Power: As an Opportunity for Strategic
Security”, http://www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA473860&Location=U2&doc=GetTRDoc.pdf//jchen
FINDING: The SBSP Study Group found that no outright policy or legal showstoppers exist to
prevent the development of SBSP. Full‐scale SBSP, however, will require a permissive
international regime, and construction of this new regime is in every way a challenge nearly equal
to the construction of the satellite itself. The interim review did not uncover any hard show‐
stoppers in the international legal or regulatory regime. Many nations are actively studying Space‐
Based Solar Power. Canada, the UK, France, the European Space Agency, Japan, Russia, India, and
China, as well as several equatorial nations have all expressed past or present interest in SBSP.
International conferences such as the United Nations‐connected UNISPACE III are continually held
on the subject and there is even a UN‐affiliated non‐governmental organization, the Sunsat Energy
Council, that is dedicated to promoting the study and development of SBSP. The International
Union of Radio Science (URSI) has published at least one document supporting the concept, and a
study of the subject by the International Telecommunications Union (ITU) is presently ongoing.
There seems to be significant global interest in promoting the peaceful use of space, sustainable
development, and carbon neutral energy sources, indicating that perhaps an open avenue
exists for the United States to exercise “soft power” via the development of SBSP.
That there are no show‐
stoppers should in no way imply that an adequate or supportive regime is in place. Such a regime must address liability, indemnity, licensing, tech transfer, frequency allocations, orbital slot assignment, assembly and parking orbits, and transit
corridors. These will likely involve significant increases in Space Situational Awareness, data‐sharing, Space Traffic Control, and might include some significant similarities to the International Civil Aviation Organization’s (ICAO) role for facilitating
safe international air travel. Very likely the construction of a truly adequate regime will take as long as the satellite technology development itself, and so consideration must be given to beginning work on the construction of such a framework
immediately.
SPS is key to international collaboration in space
Martin Schwab, Professor of Philosophy and Professor of English, School of Humanities; Director of
the Humanities and Law Minor, April 15, 2002, “The New Viability of Space
Solar Power: Global Mobilization for a Common Human Endeavor”,
http://scholar.google.com/scholar?start=40&q=unilateral+solar+powered+satellites&hl=en&as_sdt=0,30&as_ylo=2000, Date accessed June 25, 2011
If a non-integrated, decentralized SSP system were to be a truly international effort, perhaps costs
for such an effort could be reduced. It is conceivable that a sense of global mobilization (being part
of a common human endeavor) might take hold in an international effort to build thousands of SSP
space and ground segments. The peoples of poor nations might be able to find employment in
digging the foundations for and in the maintenance of SSP assembly and launch facilities and
ground rectennae. Borrowing from FDR’s New Deal philosophy, these facilities could purposely be
built around the globe so that vocational training in aerospace technology could also be offered,
adding to
the human capital in developing countries. This new environment of international cooperation could and should be constantly verified by UN inspectors to ensure that these new facilities remain true to peaceful purposes. There are of course risks in
any new relationship, but in light of the track record of other attempts to maintain international security, these acceptable risks are perhaps worth the effort to make them work. U.S. Secretary of Defense Donald Rumsfeld is conscious of making every
member of the U.S. Military feel needed in the war on terror. This is the same approach that could be taken when building a system of SSP for the peoples of Earth. Making poor people of the world
actually feel needed should be a focal point of U.S. foreign policy. This would reduce the general
sense of marginalization in many parts of the world, perhaps making terrorism at least flourish less.
This approach could start by abandoning “diplomatic” terms
such as “periphery” and “international development.” These terms only reinforce the idea that
other countries and other cultures have nothing of inherent value to offer the West. When Rumsfeld was a CEO in the pharmaceutical industry, he said that the role of serendipity in developing new products increased with the number of separate
areas of research and development that were funded. This idea should be even more true as human capital is developed around the world. Some see involvement in space as a luxury that much of the world cannot afford. This same logic would also
deny golf lessons for inner city youth. Perhaps this worldview fails to see the value in “teeing up” unknown lessons to be learned, both by
playing golf and by exploring space.14 A global mobilization for a common human endeavor via the
common language of science and technology, as it relates to outer space, need not be seen as naïve
or some call for one world government. Ronald Reagan, for instance, characteristically
and perhaps instinctively invoked the rhetorically inclusive phrase, “the people of this planet” when he attempted to marshal international condemnation against terrorism during his administration.26
Independently, international space cooperation cements US leadership
CSIS “National Security and the Commercial Space Sector”, 2010 CSIS Draft for Comment, April 30th,
http://csis.org/files/publication/100430_berteau_commercial_space.pdf
“New opportunities for partnership and collaboration with both international and commercial
space actors have the potential to support future national security space activities and enhance U.S.
leadership.” Forming alliances and encouraging cooperation with foreign entities could provide
several benefits to the United States, including ensuring continued U.S. access to space after a
technical failure or a launch facility calamity, strengthening the competitive position of the U.S.
commercial satellite sector, enhancing the U.S. position in partnerships, and reinforcing
collaboration among other space-faring nations.
As the Booz, Allen & Hamilton 2000 Defense Industry Viewpoint notes, strategic commercial alliances: (1) provide capabilities
to expand quickly service offerings and markets in ways not possible under time and resource constraints; (2) earn a rate of return 50 percent higher than base businesses—“returns more than double as firms gain experience in alliances”; and (3)
are a powerful alternative to acquiring other companies because they “avoid costly accumulation of debt and buildup of balance sheet goodwill.” In those respects, international commercial alliances could help U.S. firms access foreign funding,
business systems, space expertise, technology, and intellectual capital and increase U.S. industry’s market share overseas, thus providing economic benefits to the United States. Moreover, U.S. experiences with foreign entities in foreign markets could
help those entities obtain the requisite approvals to operate U.S. government satellite
systems in other countries, resolve satellite spectrum and
coordination issues, and mitigate risks associated with catastrophic domestic launch failures by
providing for contingency launch capabilities from foreign nations. Multinational alliances would
also signal U.S. policymakers’ intent to ensure U.S. commercial and military access to space within a
cooperative, international domain, help promote international cooperation, and build support for
U.S. positions within various governmental and business forums. First, partnerships could allow the
United States to demonstrate greater leadership in mitigating those shared risks related to
vulnerability of space assets through launch facility and data sharing, offering improved space
situational awareness, establishing collective security agreements for space assets, exploring space
deterrence and satellite security doctrines, and formulating and agreeing to rules of the road on the
expected peaceful behavior in the space domain. Second, partnerships could also help the United
States build consensus on important space-related issues in bilateral or multilateral organizations
such as the United Nations, the International Telecommunication Union, and the World Trade
Organization; working with emerging space-faring nations is particularly important because of their
growing presence in the marketplace and participation in international organizations. Third,
alliances could serve as a bridge to future collaborative efforts between U.S. national security
forces and U.S. allies. For example, civil multinational alliances such as the International Space
Station and the international search and rescue satellite consortium, Cospas-Sarsat, involve
multiple countries partnering to use space for common public global purposes. Finally, developing
government, business, and professional relationships with people in other countries provides
opportunities for the United States to further the principles upon which U.S. national security
relies—competition, economic stability, and democracy.
Soft power prevents extinction
Joseph Nye, Harvard, US MILITARY PRIMACY IS FACT - SO, NOW, WORK ON 'SOFT POWER' OF PERSUASION,
April 29, 2004, http://www.ksg.harvard.edu/news/opeds/2004/nye_soft_power_csm_042904.htm
Soft power co-opts people rather than coerces them. It rests on the ability to set the agenda or
shape the preferences of others. It is a mistake to discount soft power as just a question of image,
public
relations, and ephemeral popularity. It is a form of power - a means of pursuing national interests. When America discounts the importance of its attractiveness to other countries, it pays a price. When US policies lose their legitimacy
and credibility in the eyes of others, attitudes of distrust tend to fester and further reduce its leverage. The manner with which the US went into Iraq undercut American soft power. That did not prevent the success of the four-week military campaign,
but it made others less willing to help in the reconstruction of Iraq and made the American occupation more costly in the hard-power resources of blood and treasure. Because of its leading edge in the information revolution and its past investment in
military power, the US probably will remain the world's single most powerful country well into the
21st century. But not all the important types of power come from the barrel of a gun. Hard power is
relevant to getting desired outcomes, but transnational issues such as climate change, infectious
diseases, international crime, and terrorism cannot be resolved by military force alone. Soft power
is particularly important in dealing with these issues, where military power alone simply cannot
produce success, and can even be counterproductive. America's success in coping with the new
transnational threats of terrorism and weapons of mass destruction will depend on a deeper
understanding of the role of soft power and developing a better balance of hard and soft power in
foreign policy.
Contention 3 is Warming
SPS is the ideal clean energy to solve warming – minimal pollution and
resource use
Garretson, a Council on Foreign Relations (CFR) International Fellow in India, previously the Chief of
Future Science and Technology Exploration for Headquarters Air Force, Directorate of Strategic Plans and
Programs, 09 (Peter A., “Sky’s No Limit: Space-Based Solar Power, The Next Major Step In The Indo-US
Strategic Partnership?”, http://spacejournal.ohio.edu/issue16/papers/OP_SkysNoLimit.pdf)
While no energy source is entirely benign, the SBSP concept has significant things to recommend it
for the environmentally conscious and those wanting to develop green energy sources. An ideal
energy source will not add to global warming, produce no greenhouse gasses, have short energy
payback time, require little in the way of land, require no water for cooling and have no adverse
effects on living things. Space solar power comes very close to this ideal. Almost all of the
inefficiency in the system is in the space segment and waste heat is rejected to deep space instead
of the biosphere.14 SBSP is, therefore, not expected to impact the atmosphere. The amount of heat
contributed by transmission loss through the atmosphere and reconversion at the receiver-end
is significantly less than an equivalent thermal (fossil fuel
), nuclear power plant, or terrestrial solar plant, which rejects significantly more heat to the biosphere on
a per unit (per megawatt) basis.15 The efficiency of a Rectenna is above 80 per cent (rejects less than 20 per cent to the biosphere), whereas for the same power into a grid, a concentrating solar plant (thermal) is perhaps 15 per cent efficient
(rejecting 85 per cent) while a fossil fuel plant is likely to be less than 40 per cent efficient
(rejecting 60 per cent to the biosphere). The high efficiency of the receivers
also means that unlike thermal and nuclear power plants, there is no need for active cooling and so
no need to tie the location of the receiver to large amounts of cooling water, with the accompanying
environmental problems of dumping large amounts of waste heat into rivers or coastal areas.
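The efficiency figures in the card imply very different amounts of waste heat dumped locally per megawatt delivered. A back-of-envelope sketch using only the quoted efficiencies (80, 40, and 15 per cent); real plant figures vary by design:

```python
# Back-of-envelope waste heat rejected to the biosphere per megawatt
# delivered, using the efficiencies quoted in the card. Illustrative only.

EFFICIENCY = {
    "SBSP rectenna": 0.80,        # card: "above 80 per cent"
    "fossil fuel plant": 0.40,    # card: "less than 40 per cent"
    "solar thermal plant": 0.15,  # card: "perhaps 15 per cent"
}

def waste_heat_per_mw(efficiency: float) -> float:
    """MW of heat rejected locally for each MW of electricity delivered."""
    return (1.0 - efficiency) / efficiency

if __name__ == "__main__":
    for name, eff in EFFICIENCY.items():
        print(f"{name}: {waste_heat_per_mw(eff):.2f} MW heat per MW delivered")
```

On these numbers a rectenna rejects about 0.25 MW of heat per delivered MW, against roughly 1.5 MW for a fossil plant and over 5 MW for a concentrating solar thermal plant, which is the per-megawatt comparison the card is making.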
ONLY SPS supplies the power needed for a sustainable energy transition
James M. Snead, P.E., is a senior member of the American Institute of Aeronautics and Astronautics (AIAA) a
past chair of the AIAA’s Space Logistics Technical Committee, and the founder and president of the
Spacefaring Institute LLC, 5/4/2009, “The vital need for America to develop space solar power”, The Space
Review, http://www.thespacereview.com/article/1364/1
A key element of a well-reasoned US energy policy is to maintain an adequate surplus of
dispatchable electrical power generation capacity. Intelligent control of consumer electrical power
use to moderate peak demand and improved transmission and distribution systems to more
broadly share sustainable generation capacity will certainly help, but 250 million additional
Americans and 5 billion additional electrical power consumers worldwide by 2100 will need
substantially more assured generation capacity. Three possible energy sources that could achieve
sufficient generation capacity to close the 2100 shortfall are methane hydrates, advanced nuclear
energy, and SSP. The key planning consideration is: Which of these are now able to enter
engineering development and be integrated into an actionable sustainable energy transition plan?
Methane hydrate is a combination of methane and water ice where a methane molecule is trapped
within water ice crystals. The unique conditions necessary for forming these hydrates exist at the
low temperatures and elevated pressures under water, under permafrost, and under cold rock
formations. Some experts estimate that the undersea methane hydrate resources are immense and
may be able to meet world energy needs for a century or more. Why not plan to use methane
hydrates? The issues are the technical feasibility of recovering methane at industrial-scale
levels (tens to hundreds of billions BOE per year) and doing so with acceptable environmental
impact. While research into practical industrial-scale levels of recovery with acceptable environmental impact is underway, acceptable production solutions have not yet emerged. As a result, a rational US energy plan cannot yet include
methane hydrates as a solution ready to be implemented to avoid future energy scarcity. Most people would agree that an advanced nuclear generator scalable from tens of megawatts to a few gigawatts, with acceptable environmental impact and
adequate security, is a desirable long-term sustainable energy solution. Whether this will be an improved form of enriched uranium nuclear fission; a different fission fuel cycle, such as thorium; or, the more advanced fusion energy is not yet known.
Research into all of these options is proceeding with significant research advancements being
achieved. However, until commercialized reactor designs are
demonstrated and any environmental and security issues associated with their fueling, operation,
and waste disposal are technically and politically resolved, a rational US energy plan cannot yet
include advanced nuclear energy as a solution ready to be implemented to avoid future energy
scarcity. We are left with SSP. Unless the US federal government is willing to forego addressing the
very real possibility of energy scarcity in dispatchable electrical power generation, SSP is the one
renewable energy solution capable of beginning engineering development and, as such, being
incorporated into such a rational sustainable energy transition plan. Hence, beginning the
engineering development of SSP now becomes a necessity. Planning and executing a rational US
energy policy that undertakes the development of SSP will jump-start America on the path to
acquiring the mastery of industrial space operations we need to become a true spacefaring nation.
Of course, rapid advancements in advanced nuclear energy or methane hydrate recovery or the emergence of a new industrial-scale sustainable energy source may change the current circumstances favoring the start of the development of SSP. But not
knowing how long affordable easy energy supplies will remain available and not knowing to what extent terrestrial nuclear fission and renewable energy production can be practically and politically expanded, reasonableness dictates that the serious
engineering development of SSP be started now.
And, we’re nearing the point of no return—holding the line on current
emissions while transitioning to solar power key to check feedback cycles
David Biello, award winning journalist and associate editor for Scientific American, 9/9/10, Scientific
American, “How Much Global Warming Is Guaranteed Even If We Stopped Building Coal-Fired Plants Today?”,
http://www.scientificamerican.com/article.cfm?id=guaranteed-global-warming-with-existing-fossil-fuelinfrastructure{jchen}
Humanity has yet to reach the point of no return when it comes to catastrophic climate change,
according to new calculations. If we content ourselves with the existing fossil-fuel infrastructure we
can hold greenhouse gas concentrations below 450 parts per million in the atmosphere and limit
warming to below 2 degrees Celsius above preindustrial levels—both common benchmarks for
international efforts to avoid the worst impacts of ongoing climate change—according to a new
analysis in the September 10 issue of Science. The bad news is we are adding more fossil-fuel
infrastructure—oil-burning cars, coal-fired power plants, industrial factories consuming natural
gas—every day.
A team of scientists analyzed the existing fossil-fuel infrastructure to determine how much greenhouse gas emissions we have committed to if all of that kit is utilized for its entire expected lifetime. The
answer: an average of 496 billion metric tons more of carbon dioxide added to the atmosphere between now and 2060 in "committed emissions". That assumes life spans of roughly 40 years for a coal-fired power plant and 17 years for a typical car—
potentially major under- and overestimates, respectively, given that some coal-fired power plants still in use in the U.S. first fired up in the 1950s. Plugging that roughly 500 gigatonne number into a computer-generated climate model predicted CO2
levels would then peak at less than 430 ppm with an attendant warming of 1.3 degrees C above
preindustrial average temperature. That's just 50 ppm higher than present levels and 150 ppm
higher than preindustrial atmospheric concentrations. Still, we are rapidly approaching a point of
no return, cautions climate modeler Ken Caldeira of the Carnegie Institution for Science's Department of Global Ecology at Stanford University, who participated in the study. "There is little doubt that more CO2-emitting devices will be
built," the researchers wrote. After all, the study does not take into account all the enabling infrastructure—such as highways, gas stations and refineries—that contribute inertia that holds back significant changes to lower-emitting alternatives, such
as electric cars. And since 2000 the world has added 416 gigawatts of coal-fired power plants, 449 gigawatts of natural gas–fired power plants and even 47.5 gigawatts of oil-fired power plants, according to the study's figures. China alone is already
responsible for more than a third of the global "committed emissions," including adding 2,000 cars a week to the streets of Beijing as well as 322 gigawatts of coal-fired power plants built since 2000. The U.S.—the world's largest emitter of
greenhouse gases per person, among major countries—has continued a transition to less CO2-intensive energy use that started in the early 20th century. Natural gas—which emits 40 percent less CO2 than coal when burned—now dominates new
power plants (nearly 188 gigawatts added since 2000) along with wind (roughly 28 gigawatts added), a trend broadly similar to other developed nations such as Japan or Germany.
But the U.S. still generates half of its electricity via coal burning—and what replaces those power
plants over the next several decades will play a huge role in determining the ultimate degree of
global climate change. Coal-burning poses other threats as well, including the toxic coal ash that
can spill from the impoundments where it is kept; other polluting emissions that cause acid rain
and smog; and
the soot that
causes and estimated 13,200 extra deaths and nearly 218,000 asthma attacks per year, according to a report from the Clean Air Task Force, an environmental group. "Unfortunately, persistently elevated levels of fine particle pollution are common
plants, diesel trucks, buses and
cars." Of course, those are the same culprits contributing the bulk of greenhouse gas emissions. Yet
"programs to scale up 'carbon neutral' energy are moving slowly a
across wide swaths of the country," reveals the 2010 report, released September 9. "Most of these pollutants originate from combustion sources such as power
t best," notes physicist Martin Hoffert of New York University in a perspective on
the research also published in Science on September 10. "The difficulties posed by generating even [one terawatt] of carbon-neutral power led the late Nobel laureate Richard Smalley and colleagues to call it the 'terawatt challenge'." That is because
power plants. At least 10
terawatts each from nuclear; coal with carbon capture and storage; and renewables, such as solar
and wind, would be required by mid-century to eliminate CO2 emissions from energy use. As
Caldeira and his colleagues wrote: "Satisfying growing demand for energy without producing CO2
emissions will require truly extraordinary development and deployment of carbon-free sources of
energy, perhaps 30 [terawatts] by 2050."
all carbon-free sources of energy combined provide a little more than two of the 15 terawatts that power modern society—the bulk of that from nuclear and hydroelectric
Substantial U.S. action will guarantee spillover
Trevor Houser, visiting fellow at the Peterson Institute for International Economics; Shashank Mohan,
research analyst with the Rhodium Group; --AND-- Robert Heilmayr, research analyst at the World
Resources Institute, 09
[“A Green Global Recovery? Assessing US Economic Stimulus and the Prospects for International
Coordination,” http://www.iie.com/publications/papers/houser0309.pdf]
The G-20 group of developed and developing countries has emerged as the lead forum for orchestrating an
international response to the economic crisis. At their meeting last November, G-20 leaders pledged to work
together to combat the global recession through coordinated fiscal stimulus. Washington is not alone in
looking to meet long-term energy and environmental goals while bolstering short-term economic growth.
Japan and South Korea have both trumpeted their stimulus plans as “Green New Deals,” China has earmarked
much of its $586 billion in spending for energy and environmental projects, and the United Kingdom and
Germany have followed suit.14 The G-20 will meet again in April to compare notes on the recovery effort and take stock of each country’s plan of attack. Leaders will be looking to coordinate their respective stimulus packages to ensure the greatest economic bang for the buck. Given that this same group of countries will be tackling climate change later in the year—either in a small grouping like the Major Economies Process, or through the UN-led negotiations in Copenhagen—they would be wise to assess the cumulative effect of these efforts on global carbon dioxide emissions and work together to ensure that various green stimulus efforts complement each other as well as long-term emissions reduction goals. Discussion of the energy and environmental components of national recovery efforts provides three benefits: 1. Investments by one country in an emerging low-carbon technology reduce the cost of that technology for everyone. Coordinating government-driven R&D and government-funded demonstration projects can maximize the energy and environmental benefits of every public dollar spent. 2. Efficiency investments that reduce energy demand in one country impact energy prices around the world and thus the cost-benefit analysis used by national policymakers when evaluating domestic programs. 3. The cumulative effects of green recovery programs on national emissions will shape international climate negotiations and the type of commitments that are made as part of a multilateral climate agreement. [Footnote 14: Michael Casey, “UN welcomes Korea, Japan green stimulus plans,” Associated Press, January 22, 2009; Li Jing, “NDRC: 350b yuan to pour into environment industry,” China Daily, November 27, 2008, available at www.chinadaily.com.cn (accessed on February 2, 2009).]
Runaway warming causes extinction
Deibel ‘7 (Terry L. Professor of IR @ National War College, 2007. “Foreign Affairs Strategy: Logic for
American Statecraft”, Conclusion: American Foreign Affairs Strategy Today)
Finally, there is one major existential threat to American security (as well as prosperity) of a nonviolent
nature, which, though far in the future, demands urgent action. It is the threat of global warming to the
stability of the climate upon which all earthly life depends. Scientists worldwide have been observing
the gathering of this threat for three decades now, and what was once a mere possibility has passed
through probability to near certainty. Indeed not one of more than 900 articles on climate change
published in refereed scientific journals from 1993 to 2003 doubted that anthropogenic warming is
occurring. “In legitimate scientific circles,” writes Elizabeth Kolbert, “it is virtually impossible to find
evidence of disagreement over the fundamentals of global warming.” Evidence from a vast international
scientific monitoring effort accumulates almost weekly, as this sample of newspaper reports shows: an
international panel predicts “brutal droughts, floods and violent storms across the planet over the next
century”; climate change could “literally alter ocean currents, wipe away huge portions of Alpine
Snowcaps and aid the spread of cholera and malaria”; “glaciers in the Antarctic and in Greenland are
melting much faster than expected, and…worldwide, plants are blooming several days earlier than a
decade ago”; “rising sea temperatures have been accompanied by a significant global increase in the
most destructive hurricanes”; “NASA scientists have concluded from direct temperature measurements
that 2005 was the hottest year on record, with 1998 a close second”; “Earth’s warming climate is
estimated to contribute to more than 150,000 deaths and 5 million illnesses each year” as disease
spreads; “widespread bleaching from Texas to Trinidad…killed broad swaths of corals” due to a 2-degree rise in sea temperatures. “The world is slowly disintegrating,” concluded Inuit hunter Noah
Metuq, who lives 30 miles from the Arctic Circle. “They call it climate change…but we just call it breaking
up.” From the founding of the first cities some 6,000 years ago until the beginning of the industrial
revolution, carbon dioxide levels in the atmosphere remained relatively constant at about 280 parts per
million (ppm). At present they are accelerating toward 400 ppm, and by 2050 they will reach 500 ppm,
about double pre-industrial levels. Unfortunately, atmospheric CO2 lasts about a century, so there is no way immediately to reduce levels, only to slow their increase. We are thus in for significant global warming; the only debate is how much and how serious the effects will be. As the newspaper stories
quoted above show, we are already experiencing the effects of 1-2 degree warming in more violent
storms, spread of disease, mass die offs of plants and animals, species extinction, and threatened
inundation of low-lying countries like the Pacific nation of Kiribati and the Netherlands. At a warming of 5 degrees or less, the Greenland and West Antarctic ice sheets could disintegrate, leading to a sea level rise of 20 feet that would cover North Carolina’s Outer Banks, swamp the southern third of Florida, and
inundate Manhattan up to the middle of Greenwich Village. Another catastrophic effect would be the
collapse of the Atlantic thermohaline circulation that keeps the winter weather in Europe far warmer
than its latitude would otherwise allow. Economist William Cline once estimated the damage to the
United States alone from moderate levels of warming at 1-6 percent of GDP annually; severe warming
could cost 13-26 percent of GDP. But the most frightening scenario is runaway greenhouse warming,
based on positive feedback from the buildup of water vapor in the atmosphere that is both caused by
and causes hotter surface temperatures. Past ice age transitions, associated with only 5-10 degree
changes in average global temperatures, took place in just decades, even though no one was then
pouring ever-increasing amounts of carbon into the atmosphere. Faced with this specter, the best one
can conclude is that “humankind’s continuing enhancement of the natural greenhouse effect is akin to
playing Russian roulette with the earth’s climate and humanity’s life support system.” At worst, says
physics professor Marty Hoffert of New York University, “we’re just going to burn everything up; we’re
going to heat the atmosphere to the temperature it was in the Cretaceous when there were crocodiles at
the poles, and then everything will collapse.” During the Cold War, astronomer Carl Sagan popularized a
theory of nuclear winter to describe how a thermonuclear war between the United States and the Soviet
Union would not only destroy both countries but possibly end life on this planet. Global warming is the
post-Cold War era’s equivalent of nuclear winter, at least as serious and considerably better supported
scientifically. Over the long run it puts dangers from terrorism and traditional military challenges to
shame. It is a threat not only to the security and prosperity of the United States, but potentially to the
continued existence of life on this planet.
And, there’s no question about warming—it’s real and anthropogenic
Stefan Rahmstorf, Professor of Physics @ Potsdam University, Member of the German Advisory Council on
Climate Change, 2008 (Global Warming: Looking Beyond Kyoto, ed. Ernesto Zedillo, Prof. IR @ Yale, pp. 42-49)
It is time to turn to statement B: human activities are altering the climate. This can be broken into two parts.
The first is as follows: global climate is warming. This is by now a generally undisputed point (except by
novelist Michael Crichton), so we deal with it only briefly. The two leading compilations of data measured
with thermometers are shown in figure 3-3, that of the National Aeronautics and Space Administration
(NASA) and that of the British Hadley Centre for Climate Change. Although they differ in the details, due to the
inclusion of different data sets and use of different spatial averaging and quality control procedures, they
both show a consistent picture, with a global mean warming of 0.8°C since the late nineteenth
century. Temperatures over the past
ten years clearly were the warmest since measured records have been available. The year 1998 sticks out well above the long-term trend due to the occurrence of a major El Nino event that year (the last El Nino so far and one of the strongest on record). These events are
examples of the largest natural climate variations on multiyear time scales and, by releasing heat from the ocean, generally cause positive anomalies in global mean temperature. It is remarkable that the year 2005 rivaled the heat of 1998 even though no El Nino event
occurred that year. (A bizarre curiosity, perhaps worth mentioning, is that several prominent "climate skeptics" recently used the extreme year 1998 to claim in the media that global warming had ended. In Lindzen's words, "Indeed, the absence of any record breakers during
the past seven years is statistical evidence that temperatures are not increasing.")33 In addition to the surface measurements, the more recent portion of the global warming trend (since 1979) is also documented by satellite data. It is not straightforward to derive a reliable surface temperature trend from satellites, as they measure radiation coming from throughout the atmosphere (not just near the surface), including the stratosphere, which has strongly cooled, and the records are not homogeneous due to the short life span of individual satellites, the problem of orbital decay, observations at different times of day, and drifts in instrument calibration. Current analyses of these satellite data show trends that are fully consistent with surface measurements and model simulations. If no reliable temperature measurements existed, could we be sure that the climate is warming? The "canaries in the coal mine" of climate change (as glaciologist Lonnie Thompson puts it) are mountain glaciers. We know, both from old photographs and from the position of the terminal moraines heaped up by the flowing ice, that mountain glaciers have been in retreat all over the world during the past century. There are precious few exceptions, and they are associated with a strong increase in precipitation or local cooling.36 I have inspected examples of shrinking glaciers myself in field trips to Switzerland, Norway, and New Zealand. As glaciers respond sensitively to temperature changes, data on the extent of glaciers have been used to reconstruct a history of Northern Hemisphere
temperature over the past four centuries (see figure 3-4). Cores drilled in tropical glaciers show signs of recent melting that is unprecedented at least throughout the Holocene-the past 10,000 years. Another powerful sign of warming, visible clearly from satellites, is the
shrinking Arctic sea ice cover (figure 3-5), which has declined 20 percent since satellite observations began in 1979. While climate clearly became warmer in the twentieth century, much discussion particularly in the popular media has focused on the question of how
"unusual" this warming is in a longer-term context. While this is an interesting question, it has often been mixed incorrectly with the question of causation. Scientifically, how unusual recent warming is-say, compared to the past millennium-in itself contains little information
about its cause. Even a highly unusual warming could have a natural cause (for example, an exceptional increase in solar activity). And even a warming within the bounds of past natural variations could have a predominantly anthropogenic cause. I come to the question of causation shortly, after briefly visiting the evidence for past natural climate variations. Records from the time before systematic temperature measurements were collected are based on "proxy data," coming from tree rings, ice cores, corals, and other sources. These proxy
data are generally linked to local temperatures in some way, but they may be influenced by other parameters as well (for example, precipitation), they may have a seasonal bias (for example, the growth season for tree rings), and high-quality long records are difficult to
obtain and therefore few in number and geographic coverage. Therefore, there is still substantial uncertainty in the evolution of past global or hemispheric temperatures. (Comparing only local or regional temperatures, as in Europe, is of limited value for our purposes, as
regional variations can be much larger than global ones and can have many regional causes, unrelated to global-scale forcing and climate change.) The first quantitative reconstruction for the Northern Hemisphere temperature of the past millennium, including an error
estimation, was presented by Mann, Bradley, and Hughes and rightly highlighted in the 2001 IPCC report as one of the major new findings since its 1995 report; it is shown in figure 3-6.39 The analysis suggests that, despite the large error bars, twentieth-century warming is
indeed highly unusual and probably was unprecedented during the past millennium. This result, presumably because of its symbolic power, has attracted much criticism, to some extent in scientific journals, but even more so in the popular media. The hockey stick-shaped
curve became a symbol for the IPCC, and criticizing this particular data analysis became an avenue for some to question the credibility of the IPCC. Three important things have been overlooked in much of the media coverage. First, even if the scientific critics had been right,
this would not have called into question the very cautious conclusion drawn by the IPCC from the reconstruction by Mann, Bradley, and Hughes: "New analyses of proxy data for the Northern Hemisphere indicate that the increase in temperature in the twentieth century is
likely to have been the largest of any century during the past 1,000 years." This conclusion has since been supported further by every single one of close to a dozen new reconstructions (two of which are shown in figure 3-6). Second, by far the most serious scientific criticism
raised against Mann, Hughes, and Bradley was simply based on a mistake. 40 The prominent paper of von Storch and others, which claimed (based on a model test) that the method of Mann, Bradley, and Hughes systematically underestimated variability, "was [itself] based on
incorrect implementation of the reconstruction procedure."41 With correct implementation, climate field reconstruction procedures such as the one used by Mann, Bradley, and Hughes have been shown to perform well in similar model tests. Third, whether their
reconstruction is accurate or not has no bearing on policy. If their analysis underestimated past natural climate variability, this would certainly not argue for a smaller climate sensitivity and thus a lesser concern about the consequences of our emissions. Some have argued
that, in contrast, it would point to a larger climate sensitivity. While this is a valid point in principle, it does not apply in practice to the climate sensitivity estimates discussed herein or to the range given by IPCC, since these did not use the reconstruction of Mann, Hughes, and
Bradley or any other proxy records of the past millennium. Media claims that "a pillar of the Kyoto Protocol" had been called into question were therefore misinformed. As an aside, the protocol was agreed in 1997, before the reconstruction in question even existed. The
overheated public debate on this topic has, at least, helped to attract more researchers and funding to this area of paleoclimatology; its methodology has advanced significantly, and a number of new reconstructions have been presented in recent years. While the science has
moved forward, the first seminal reconstruction by Mann, Hughes, and Bradley has held up remarkably well, with its main features reproduced by more recent work. Further progress probably will require substantial amounts of new proxy data, rather than further
refinement of the statistical techniques pioneered by Mann, Hughes, and Bradley. Developing these data sets will require time and substantial effort. It is time to address the final statement: most of the observed warming over the past fifty years is anthropogenic. A large
number of studies exist that have taken different approaches to analyze this issue, which is generally called the "attribution problem." I do not discuss the exact share of the anthropogenic contribution (although this is an interesting question). By "most" I simply mean "more than 50 percent.” The first and crucial piece of evidence is, of course, that the magnitude of the warming is what is expected from the anthropogenic perturbation of the radiation balance, so anthropogenic forcing is able to explain all of the temperature rise. As discussed here, the rise in greenhouse gases alone corresponds to 2.6 W/m2 of forcing. This by itself, after subtraction of the observed 0.6 W/m2 of ocean heat uptake, would cause 1.6°C of warming since preindustrial times for medium climate sensitivity (3°C). With a current "best guess" aerosol forcing of 1 W/m2, the expected warming is 0.8°C. The point here is not that it is possible to obtain the exact observed number-this is fortuitous because the amount of aerosol forcing is still very uncertain-but that the expected magnitude is roughly right. There can be little doubt that the anthropogenic forcing is large enough to explain most of the warming. Depending on aerosol forcing and climate sensitivity, it could explain a large fraction of the warming, or all of it, or even more warming than has been observed (leaving room for natural processes to counteract some of the warming). The second important piece of evidence is clear: there is no viable alternative explanation. In the scientific literature, no serious alternative hypothesis has been proposed to explain the observed global warming. Other possible causes, such as solar activity, volcanic activity, cosmic rays, or orbital cycles, are well observed, but they do not show trends capable of explaining the observed warming. Since 1978, solar irradiance has been measured directly from satellites and
shows the well-known eleven-year solar cycle, but no trend. There are various estimates of solar variability before this time, based on sunspot numbers, solar cycle length, the geomagnetic AA index, neutron monitor data, and carbon-14 data. These indicate that solar activity
probably increased somewhat up to 1940. While there is disagreement about the variation in previous centuries, different authors agree that solar activity did not significantly increase during the last sixty-five years. Therefore, this cannot explain the warming, and neither
can any of the other factors mentioned. Models driven by natural factors only, leaving the anthropogenic forcing aside, show a cooling in the second half of the twentieth century (for an example, See figure 2-2, panel a, in chapter 2 of this volume). The trend in the sum of
natural forcings is downward. The only way out would be either some as yet undiscovered unknown forcing or a warming trend that arises by chance from an unforced internal variability in the climate system. The latter cannot be completely ruled out, but has to be
considered highly unlikely. No evidence in the observed record, proxy data, or current models suggests that such internal variability could cause a sustained trend of global warming of the observed magnitude. As discussed, twentieth-century warming is unprecedented over the past 1,000 years (or even 2,000 years, as the few longer reconstructions available now suggest), which does not support the idea of large internal fluctuations. Also, those past variations correlate well with past forcing (solar variability, volcanic activity) and thus appear to be largely forced rather than due to unforced internal variability. And indeed, it would be difficult for a large and sustained unforced variability to satisfy the fundamental physical law of energy conservation. Natural internal variability generally shifts heat around
different parts of the climate system-for example, the large El Nino event of 1998, which warmed, the atmosphere by releasing heat stored in the ocean. This mechanism implies that the ocean heat content drops as the atmosphere warms. For past decades, as discussed, we
observed the atmosphere warming and the ocean heat content increasing, which rules out heat release from the ocean as a cause of surface warming. The heat content of the whole climate system is increasing, and there is no plausible source of this heat other than the heat
trapped by greenhouse gases. A completely different approach to attribution is to analyze the spatial patterns of climate change. This is done in so-called fingerprint studies, which associate particular patterns or "fingerprints" with different forcings. It is plausible that the pattern of a solar-forced climate change differs from the pattern of a change caused by greenhouse gases. For example, a characteristic of greenhouse gases is that heat is trapped closer to the Earth's surface and that, unlike solar variability, greenhouse gases tend to warm more in winter and at night. Such studies have used different data sets and have been performed by different groups of researchers with different statistical methods. They consistently conclude that the observed spatial pattern of warming can only be explained by greenhouse gases.49 Overall, it has to be considered highly likely that the observed warming is indeed predominantly due to the human-caused increase in greenhouse gases. This paper discussed the evidence for the anthropogenic increase in atmospheric CO2 concentration and the effect of CO2 on climate, finding that this anthropogenic increase is proven beyond reasonable doubt and that a mass of evidence points to a CO2 effect on climate of 3°C ± 1.5°C global warming for a doubling of concentration. (This is the classic IPCC range; my personal assessment is that, in the light of new studies since the IPCC Third Assessment Report, the uncertainty range can now be narrowed somewhat to 3°C ± 1.0°C.) This is based on consistent results from theory, models, and data analysis, and, even in the absence of any
computer models, the same result would still hold based on physics and on data from climate history alone. Considering the plethora of consistent evidence, the chance that these conclusions are wrong has to be considered minute. If the preceding is accepted, then it follows
logically and incontrovertibly that a further increase in CO2 concentration will lead to further warming. The magnitude of our emissions depends on human behavior, but the climatic response to various emissions scenarios can be computed from the information presented here. The result is the famous range of future global temperature scenarios shown in figure 3-6.50 Two additional steps are involved in these computations: the consideration of anthropogenic forcings other than CO2 (for example, other greenhouse gases and aerosols) and
the computation of concentrations from the emissions. Other gases are not discussed here, although they are important to get quantitatively accurate results. CO2 is the largest and most important forcing. Concerning concentrations, the scenarios shown basically assume that
ocean and biosphere take up a similar share of our emitted CO2 as in the past. This could turn out to be an optimistic assumption; some models indicate the possibility of a positive feedback, with the biosphere turning into a carbon source rather than a sink under growing
climatic stress. It is clear that even in the more optimistic of the shown (non-mitigation) scenarios, global temperature would rise by 2-3°C above its preindustrial level by the end of this century. Even for a paleoclimatologist like myself, this is an extraordinarily high
temperature, which is very likely unprecedented in at least the past 100,000 years. As far as the data show, we would have to go back about 3 million years, to the Pliocene, for comparable temperatures. The rate of this warming (which is important for the ability of
ecosystems to cope) is also highly unusual and unprecedented probably for an even longer time. The last major global warming trend occurred when the last great Ice Age ended between 15,000 and 10,000 years ago: this was a warming of about 5°C over 5,000 years, that is, a rate of only 0.1°C per century.52 The expected magnitude and rate of planetary warming is highly likely to come with major risk and impacts in terms of sea level rise (Pliocene sea level was 25-35 meters higher than now due to smaller Greenland and Antarctic ice sheets), extreme events (for example, hurricane activity is expected to increase in a warmer climate), and ecosystem loss. The second part of this paper examined the evidence for the current warming of the planet and discussed what is known about its causes. This part showed that global warming is already a measured and well-established fact, not a theory. Many different lines of evidence consistently show that most of the observed warming of the past fifty years was caused by human activity. Above all, this warming is exactly what would be expected given the anthropogenic rise in greenhouse gases, and no viable alternative explanation for this warming has been proposed in the scientific literature. Taken together, the very strong evidence accumulated from thousands of independent studies has over the past decades convinced virtually every climatologist around the world (many of whom were initially quite skeptical, including myself) that anthropogenic global warming is a reality with which we need to deal.
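The forcing arithmetic Rahmstorf quotes above can be checked line by line. This is an editorial sketch using only the numbers stated in the card; the sensitivity factor of roughly 0.8°C per W/m² (equivalent to the card's medium climate sensitivity of 3°C for a CO2 doubling's roughly 3.7 W/m² of forcing) is an assumption supplied here:

```latex
% Greenhouse forcing minus ocean heat uptake, before accounting for aerosols:
\Delta T \approx 0.8\,\tfrac{^{\circ}\mathrm{C}}{\mathrm{W/m^2}}
  \times (2.6 - 0.6)\,\mathrm{W/m^2} \approx 1.6\,^{\circ}\mathrm{C}
% Subtracting the "best guess" aerosol forcing of 1 W/m^2 as well:
\Delta T \approx 0.8\,\tfrac{^{\circ}\mathrm{C}}{\mathrm{W/m^2}}
  \times (2.6 - 1.0 - 0.6)\,\mathrm{W/m^2} \approx 0.8\,^{\circ}\mathrm{C}
```

Both lines reproduce the card's own figures (1.6°C without aerosols, 0.8°C with them), which is the passage's point: the expected magnitude of anthropogenic warming roughly matches what is observed.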
Contention 4 is Competitiveness
The U.S. is ceding tech leadership—our space program is dying. The timeframe
is now.
Dominic Gates, Seattle Times aerospace reporter, 6/12/11, “Boeing's Albaugh worries about 'intellectual
disarmament' of U.S.”, The Seattle Times,
http://seattletimes.nwsource.com/html/businesstechnology/2015304417_albaughside13.html
Jim Albaugh is worried about the future of American technological supremacy in the world. "The
biggest fear I have is what I call the intellectual disarmament of this country," said the Boeing
Commercial Airplanes chief, who is also this year's chairman of the Aerospace Industries
Association, the trade group for U.S. defense, space and aviation companies. "We still are the leader
in aerospace," he added. "Are we going to be the leader in aerospace in another 20 years?" Albaugh
is troubled that the nation's lead in aerospace, the fruit of Cold War military and space-race
projects, will be allowed to wither through lack of government funding of new challenges.
In a wide-ranging interview in advance of the global aviation gathering at the Paris Air Show, he ticked off a list of broad national problems that transcend Boeing: • Brain drain of talented immigrants: "The best and brightest used to come to the United States and stay," Albaugh said. "Now, the best and brightest come to the United States, get trained, and leave, and go back and compete against us." • Defense cuts: "There is no industrial base policy in the Department of Defense other than market forces," he said. "Right now, the Boeing Company is the only company in the United States that has a design team working on a new airplane. There are no [all-new] airplanes being developed for the Department of Defense probably for the first time in 100 years." • Competition from China: "The law of large numbers would dictate that they are going to have more smart people than we are going to have. And their government has identified aerospace as an industry that they've targeted," Albaugh said. "The question is, can they be innovative and can they handle the complex systems integration?" When Defense Secretary Robert Gates visited China in January, the Chinese military made a very public test flight of its previously secret J-20 Stealth fighter. "A lot of people saw that as a military threat," Albaugh said. "I didn't. I saw it more as an economic threat. They will sell that airplane around the world and will take away a lot of the market that's been enjoyed by U.S. defense contractors." • NASA cuts and private space ventures: "They are trying to commercialize space. ... Getting the reliability requires a lot of redundancy, which requires a lot of cost," Albaugh said. "I think it's going to be a money pit for a lot of them." He lamented the U.S. government's withdrawal from space exploration as the space-shuttle program winds down: "My prediction is that the Chinese will walk on the moon before we launch an American into orbit again in a U.S. spacecraft."
SPS development prompts industrial spin-off tech and attracts private capital
Lyle M. Jenkins, Jenkins Enterprises, Project Engineer, NASA Lyndon B. Johnson Space Center, December
2009, “Development of Space-Based Solar Power”, Intech,
http://www.intechopen.com/articles/show/title/development-of-space-based-solar-power//jchen
Summary Space-Based Solar Power is a huge project. It might be considered comparable in scale to the national railroads, highway system, or electrification project rather than the Manhattan or Apollo endeavors. However, unlike such purely national projects, this project also has components that are analogous to the development of the high-volume international civil aviation system. Such a large endeavor includes significant international and environmental implications. As such it would require a corresponding amount of political will to realize its benefits. Most of America’s spending in space does not provide any direct monetary revenue. SBSP will create new markets and produce new products. Great powers have historically succeeded by finding or inventing products and services not just to sell to themselves, but to sell to others. Today, investments in space are measured in billions of dollars. The energy market is trillions of dollars and will generate substantial new wealth for our nation and our world. Investments to develop SBSP have significant economic spin-offs. They open up or enable other new industries such as space industrial processes, space tourism, enhanced telecommunications, and use of off-world resources. After the fundamental technological risks have been defined, shifting SBSP from a research and development project to a financial and production program is needed. Several major challenges will need to be overcome to make SBSP a reality, including the creation of low-cost space access and a supporting infrastructure system on Earth and in space. The opportunity to export energy as the first marketable commodity from space will motivate commercial-sector solutions to the challenges. The delivered commodity can be used for base-load terrestrial electrical power, wide-area broadcast power, carbon-neutral synthetic fuels production, military tactical support, or as an in-space satellite energy utility.
SPS development ensures continued American tech and scientific
competitiveness
NSSO, National Security Space Office, a joint office supporting the Executive Agent for Space and the
newly formed Defense Space Council, 10/10/2007, Space‐Based Solar Power As an Opportunity For
Strategic Security, Phase 0 Architecture Feasibility Study, http://www.nss.org/settlement/ssp/library/finalsbsp-interim-assessment-release-01.pdf
FINDING: The SBSP Study Group found that SBSP offers a path to address the concerns over US
intellectual competitiveness in math and the physical sciences expressed by the Rising Above the
Gathering Storm report by providing a true “Manhattan or Apollo project for energy.” In absolute
scale and implications, it is likely that SBSP would ultimately exceed both the Manhattan and Apollo
projects which established significant workforces and helped the US maintain its technical and
competitive lead. The committee expressed that it was "deeply concerned that the scientific and
technological building blocks critical to our economic leadership are eroding at a time when many
other nations are gathering strength.” SBSP would require a substantial technical workforce of
high‐paying jobs. It would require expanded technical education opportunities, and directly
support the underlying aims of the American Competitiveness Initiative.
Contention 5 is Solvency
Solar Power Satellites are sustainable, cheap, and technologically feasible
Noam Lior, University of Pennsylvania, Department of Mechanical Engineering and Applied
Mechanics, Philadelphia, PA, April 2011 “Solar orbital power: Sustainability analysis”,
http://www.sciencedirect.com/science/article/pii/S0360544210005931, Date accessed June
24, 2011
We have analyzed some economic, environmental and social aspects of sustainability for electricity
production in solar space power plants using current technology. While space solar power is still
way too expensive for launches from the Earth, there are several technological possibilities to
reduce this price. For a large scale application of orbital power stations both environmental impact
and costs can be significantly reduced. The first option is to build and employ reusable space vehicles
for launching the satellites, instead of rockets, which is the main recommendation by NASA, and the
second option is to build the satellites and rockets in space (e.g. on the Moon). An old NASA estimate
shows that this would be economical for as few as 30 orbital satellites with 300 GWe of total power [17].
The costs could be even further reduced if the first satellite is launched into low Earth orbit and then
uses its produced energy to lift itself into a higher GEO orbit or even to the Moon [35]. If the satellites
and rockets are then built on the Moon in robotic factories, we estimate that:
- The environmental impact of the orbital solar power plants would become significantly lower than for
any Earth-based power plant except perhaps nuclear fusion. Measured by CO2 emissions, it would be about
0.5 kg per W of useful power, and this number would even decrease with improved technology and larger scope;
- The production cost of the orbital solar power plants could also become significantly lower than for any
Earth-based power plant except perhaps nuclear fusion. It is estimated as about US $1 per W of useful power,
and would also decrease with improved technology and larger scope;
- The social impact of cheap and clean energy from space is more difficult to estimate, because space power
satellites seem to be connected to a significant loss of jobs. It is however difficult to estimate the benefits
of a large amount of cheap clean energy, which would most likely more than offset the negative effects of lost
jobs, and we estimate that about 3 jobs would be created in the economy per 1 MW of installed useful power.
One could therefore expect a net positive effect of solar power satellites on sustainability. These effects
seem to be the most positive if thermal power satellites are used, which are built in a robotic factory on
the Moon and then launched into GEO orbit. The concept presented in this paper has some significant advantages
over many other proposed concepts for large scale energy production on Earth. For example, nuclear fusion
promises to become a clean and cheap source of energy; however, even in the best case scenario
it can't become operational before 2040. The solar orbital power concept can become operational in
less than a decade and produce large amounts of energy in two decades. It is also important that
the price as well as environmental impact of solar orbital power are expected to decrease with
scale. In addition to the expected increase in employment, this makes solar orbital power an important
alternative to other sustainable energy sources.
Earth orbit, and it would take many thousands of years before the SPS's orbit could possibly decay to cause atmospheric entry. Notably, large scale space development using asteroid-derived fuel propellants will ensure that dead satellites in low orbit do not crash to Earth, even old satellites.
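Lior's per-watt figures can be combined into a quick back-of-envelope check of what the card is claiming at fleet scale. A minimal sketch, taking only numbers stated in the evidence (the old NASA estimate of 30 satellites totaling 300 GWe, about 0.5 kg of CO2 per W, about US $1 per W, and about 3 jobs per MW); the aggregation itself is our illustration, not a calculation from the source:

```python
# Inputs taken directly from the Lior card; the totals below are derived.
TOTAL_CAPACITY_W = 300e9   # 300 GWe of total power (old NASA estimate [17])
SATELLITES = 30            # fleet size in the same estimate
CO2_KG_PER_W = 0.5         # CO2 emissions per watt of useful power
COST_USD_PER_W = 1.0       # estimated production cost per watt
JOBS_PER_MW = 3            # net jobs created per MW of installed power

per_satellite_gw = TOTAL_CAPACITY_W / SATELLITES / 1e9
total_cost_usd = TOTAL_CAPACITY_W * COST_USD_PER_W
total_co2_tonnes = TOTAL_CAPACITY_W * CO2_KG_PER_W / 1000  # kg -> tonnes
jobs = TOTAL_CAPACITY_W / 1e6 * JOBS_PER_MW                # W -> MW

print(f"{per_satellite_gw:.0f} GWe per satellite")    # 10 GWe per satellite
print(f"${total_cost_usd / 1e9:.0f}B production cost")  # $300B production cost
print(f"{total_co2_tonnes / 1e6:.0f}M tonnes CO2")      # 150M tonnes CO2
print(f"{jobs / 1e3:.0f}k jobs")                        # 900k jobs
```

On these assumptions the fleet works out to roughly $300 billion in production cost and about 900,000 jobs, which is the scale comparison the sustainability argument turns on.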
Funding isn’t enough - government R&D is key to successful SPS
George Friedman, is an American political scientist and author. He is the founder, chief intelligence officer,
financial overseer, and CEO of the private intelligence corporation Stratfor, 2011 “The Next Decade: Where
We’ve Been and Where We’re
Going”,http://books.google.com/books?id=y5plTzPTw8YC&pg=PA235&dq=Space+based+solar+power&hl=e
n&ei=99cDTq3bHIfEgAfTypSODg&sa=X&oi=book_result&ct=result&resnum=10&ved=0CGAQ6AEwCQ#v=on
epage&q=Space%20based%20solar%20power&f=false, Date accessed June 23, 2011
At the same time we must prepare for long-term increases in energy generation from
nonhydrocarbon sources - sources that are cheaper and located in areas that the United States will
not need to control by sending in armies. In my view, this is space-based solar power. Therefore, what should be
under way and what is under way is private-sector development of inexpensive booster rockets. Mitsubishi has
invested in space-based solar power to the tune of about $21 billion. Europe's EADS is
also investing, and California's Pacific Gas and Electric has signed a contract to purchase solar energy from
space by 2016, although I think fulfillment of that contract on that schedule is unlikely.
However, whether the source is space-based solar power or some other technology, the president must make
certain that development along several axes is under way and that the potential for building them is realistic.
Enormous amounts of increased energy are needed, and the likely source of the technology, based on history,
is the U.S. Department of Defense. Thus the government will absorb the cost of early development and private
investment will reap the rewards. We are in a period in which the state is more powerful than the market,
and in which the state has more resources. Markets are superb at exploiting existing science
and early technology, but they are not nearly as good in basic research. From aircraft to
nuclear power to moon flights to the Internet to global positioning satellites, the state is much
better at investing in long-term innovation. Government is inefficient, but that inefficiency and the ability
to absorb the cost of inefficiency are at the heart of basic research. When we look at the projects we need to
undertake in the coming decade, the organization most likely to execute them successfully is the Department of
Defense. There is nothing particularly new in this intertwining of technology, geopolitics, and economic
well-being. The Philistines dominated the Levantine coast because they were great at making armor. To connect
and control their empire, the Roman army built roads and bridges that are still in use. During a war aimed at
global domination, the German military created the foundation of modern rocketry; in countering, the British
came up with radar. Leading powers and those contending for power constantly find themselves under military
and economic pressure. They respond to it by inventing extraordinary new technologies. The United States is
obviously that sort of power. It is currently under economic pressure but declining military pressure. Such a
time is not usually when the United States undertakes dramatic new ventures. The government is heavily funding
one area we have discussed, finding cures for degenerative diseases. The Department of Defense is funding a
great deal of research into robotics. But the fundamental problem, energy, has not had its due. For this decade,
the choices are pedestrian. The danger is that the president will fritter away his authority on projects such as
conservation, wind power, and terrestrial solar power, which can't yield the magnitude of results
required. The problem with natural gas in particular is that it is pedestrian. But like so much of
what will take place in this decade, accepting the ordinary and obvious is called for first, followed
by great dreams quietly expressed.
The USfg is key to aerospace competition - export controls and mergers have
weakened the private sector
ICAF, the Industrial College of the Armed Forces, a senior service school providing graduate level
education to senior members of the US armed forces, Spring 2007, “The Final Report: The Space Industry”
Industrial College of the Armed Forces, http://www.dtic.mil/cgibin/GetTRDoc?AD=ADA475093&Location=U2&doc=GetTRDoc.pdf
The U.S. government has long understood that access to space and space capabilities are essential
to U.S. economic prosperity and national security. U.S.
space policy from 1962 to 2006 served to ensure national leadership in space and governance of space activities,
including science, exploration, and international cooperation. The current Administration has issued five space-specific policies to provide goals and objectives for the U.S. Space Program. In addition to the National Space Policy, these policies are
Space Exploration; Commercial Remote Sensing; Space Transportation; and Space-Based Positioning, Navigation, and Timing. Each policy endeavors to maintain U.S. space supremacy, reserving the right to defend assets in space, and to continue to
exploit space for national security and economic prosperity. 9 America’s success in space is
dependent on government involvement, motivation, and inspiration. It is significant that the
Bush Administration has taken the time and effort to update all of the U.S. space policies. The
consolidation of the major space industry players and a general downturn in commercial space
market demand, coupled with export restrictions, have left the U.S. space industry reliant on the
government for revenue and technology development.
Federal development is key to spur the private sector
MORRING 07, Frank: Senior Space Technology Editor, Aerospace Daily and
Defense Report
[“NSSO backs space solar power,” http://www.aviationweek.com/aw/generic/story_channel.jsp?channel=space&id=news/solar101107.xml]
As a clean source of energy that would be independent of foreign supplies in the strife-torn Middle East
and elsewhere, space solar power (SSP) could ease America's longstanding strategic energy
vulnerability, according to the "interim assessment" released at a press conference and on the Web site
spacesolarpower.wordpress.com. And the U.S. military could meet tactical energy needs for forward-deployed forces with a demonstration system, eliminating the need for a long logistical tail to deliver
fuel for terrestrial generators while reducing risk for eventual large-scale commercial development of
the technology, the report says. "The business case still doesn't close, but it's closer than ever," said
Marine Corps Lt. Col. Paul E. Damphousse of the NSSO, in presenting his office's report. That could
change if the Pentagon were to act as an anchor tenant for a demonstration SSP system, paying above-market rates for power generated with a collection plant in geostationary orbit beaming power to U.S.
forces abroad or in the continental U.S., according to Charles Miller, CEO of Constellation Services
International and director of the Space Frontier Foundation. By buying down the risk with a
demonstration at the tactical level, the U.S. government could spark a new industry able to meet not just
U.S. energy needs, but those of its allies and the developing world as well. The technology essentially
exists, and needs only to be matured. A risk buy-down by government could make that happen, as could
experiments on materials that might be used in building the large structures needed to collect sunlight
in meaningful amounts. The Internet-based group of experts who prepared the report for the NSSO
recommended that the U.S. government organize itself to tackle the problem of developing SSP; use its
resources to "retire a major portion of the technical risk for business development; establish tax and
other policies to encourage private development of SSP, and "become an early
demonstrator/adopter/customer" of SSP to spur its development. That, in turn, could spur development
of space launch and other industries. Damphousse said a functioning reusable launch vehicle - preferably
single-stage-to-orbit - probably would be required to develop a full-scale SSP infrastructure in geostationary
orbit. That, in turn, could enable utilization of the moon and exploration of Mars under NASA's vision for
space exploration.