Stability and Fragility of the Electric Power
Grid in the U.S.
Harin J. Contractor
Scott Duncombe
Troid O. Edwards
Chris Whaley
Executive Summary
Blackouts are mainly caused by two physical features of the electrical grid. The first is the shutdown of electrical lines when they are loaded above their rated capacity. The second is the grid's reliability, or its ability to maintain a synchronized AC frequency by adequately balancing load and supply. As the market has become deregulated, managing both of these features has grown increasingly difficult without better information on the grid's current load.
Electricity consumers currently pay average rather than marginal prices. By raising prices during peak periods, marginal pricing decreases peak electricity demand. Such a mechanism allows consumers to purchase cheaper electricity during off-peak periods and reduces capacity strains during peak periods. If correctly implemented, marginal prices can be set so that the electrical grid never reaches full capacity, greatly reducing the likelihood of catastrophic blackouts.
The current, confusing market structure for electricity, which consists of three distinct parts (generation, transmission, and distribution), is the result of efforts to move from a regulated natural monopoly to a deregulated product-chain structure. Smart Grid technologies, which encompass SCADA (system control and data acquisition) and AMI (advanced metering infrastructure), offer a practical means to modernize the national grid and increase reliability and energy efficiency. A revolutionary change in customer participation in energy markets is realized by combining AMI with personal energy management systems (PEM). Together, these two technologies deliver both cost and consumption information to the end-user in a real-time application that encourages customer participation, resulting in reliability and energy efficiency gains.
There are steps and policy recommendations that utilities, states, and the federal government can take to move toward a more effective, efficient, and environmentally conscious way to transmit power. SmartGrids are a sensible solution that will bring not only competitive, demand-responsive rates but also reliability to a traditionally unstable system. Pushing all states to adopt the Energy Policy Act of 2005 will help clear many of the hurdles to moving toward a single grid system and implementing Smart Meters. Moving toward a cap-and-trade market will give utilities and consumers strong incentives to adopt AMI systems along with renewable alternatives. Establishing 'Real Time Pricing' (RTP) is critical to moving toward a SmartGrid system, and giving providers a guaranteed rate of return and decoupling rates are essential to RTP. This is a long and expensive transition, but benefits are already being seen around the world. Once integrated grids with two-way responses are established, the energy system will be stable, saving money and preventing wasteful consumption.
Background
On August 14, 2003, modern technology in New York City stopped. On the tail of a summer heat wave, the electricity used to power air conditioners, light bulbs, and traffic signals came to a halt. New York was not alone: all across the Northeast, from Detroit to Toronto, the electrical power grid failed for three days. The 2003 Northeast Blackout impacted 50 million people in the United States and Canada and caused over $10 billion in economic losses. As a direct result of the Blackout, 11 people in New York City died, mainly from the unrelenting heat or from an inability to contact emergency personnel. Although looting had marred past blackouts, during the 2003 Blackout many New York restaurants served food for free, as their perishable food was going to spoil anyway.
The 2003 Blackout was not a random event. Instead, several technical problems and mistakes led to the Blackout. The days preceding the Blackout were among the hottest of the year; on the day of the blackout, the temperature in New York City was 93 degrees. On top of their normal uses of electricity, people across the Northeast had their air conditioners on full blast. As a result, there was exceptionally high demand for electricity along the eastern corridor. As an abnormal amount of electricity flowed through the power lines, they began to heat up as they approached their full capacity. The increased heat caused the lines to elongate and sag. Eventually, a power line operated by FirstEnergy in rural Ohio stretched just low enough to hit an underlying tree, causing the line to go down. In normal circumstances, backup generators would kick in and power from other sources would be diverted to replace the inoperable line. However, because demand was so high, there were no other power sources with the capacity to increase their production, and the backup generators quickly failed to meet such a large electricity demand. As other lines carried increased loads, they also began to fail, and downed power lines cascaded throughout the entire Northeast grid, bringing all electricity distribution to a standstill.1
Many people blamed the 2003 Blackout on technical failures and the need for more investment in the electrical grid. Although, on the surface, these supply-side and technical failures are the basic cause of the Blackout, they are not the underlying or most important cause. Mechanical failures are outweighed by the absence of consumer incentives, such as charging more for electricity when demand is high, to reduce electricity consumption during peak periods. However, policy officials discarded methods of operating the current electrical grid more efficiently because electricity would cost more when people demand it most. Of the rejected incentives, the easiest to establish is marginal-cost pricing, a real-time price based on the instantaneous cost of production. Instead, policy officials advocated massive spending to “modernize” the failing power grid. However, regardless of how many new power lines regulators build, until new technology allows consumers to face such incentives, particularly marginal costs for power, blackouts like the 2003 Northeast Blackout are inevitable.
The Physics of Failure
There are two fundamental physical causes of the 2003 Blackout. First, there is the limited capacity of electrical power lines, which is a function of the heat generated in those wires by electrical current. Second, there is the problem of the reliability of the interconnected grid. Short-term failures, such as the overloading of electrical lines, are compounded by the grid's poor reliability. To better understand how a smart grid could prevent these types of failures, we must first understand how breakdowns occur on individual power lines and, more importantly, how the structure of the grid propagates these failures.
The North American electrical grid is system of generators and consumers,
interconnected by thousands of miles of a copper wiring distribution system. It is the
largest single machine on earth, and its complexity exponentially increases as it becomes
more interconnected and less centralized. Originally, the grid was built by utilities with
vertical monopolies, meaning they owned the power plant, the high voltage, long distance
transmission lines, the lower voltage sub transmission lines, and the final distribution
network which ultimately delivered power to the user, illustrated in Figure 1. In the last
half century, and especially in the last decade, the grid has begun to be opened to
deregulate the electrical market. Yet these market changes are not supplemented by
upgrades in the transmission facilities, a fact which has dire consequences as the act of
delivering becomes increasingly complex.
Figure 1: A simplified version of the grid, from generation (a privately owned power plant) through transmission and substations to distribution and the customer's meter. Originally, all of these components were owned and operated by the same entity. Since deregulation, power can be generated, transmitted, and distributed by different entities.
Deregulation increased congestion on power lines, leading to overloading like that
which caused the 2003 Blackout. Prior to deregulation, one power company’s
transmission lines would only transmit electrical power owned by that company. That is,
a line would transmit electrical power that the company was either actively generating or
buying from another utility. Deregulation changed this practice, opening electrical lines
for other utilities to buy and sell power through them. This created an incentive for
power companies to sell extra capacity on their electrical lines, causing congestion. This congestion left power companies with less spare capacity to handle sudden emergencies, leading to overloading and line failures.
A line failure occurs when an electrical line surpasses its capacity, causing the wire to contact a grounded object and trip a circuit breaker. Lines are rated according to temperature, current, and site conditions. As current increases, the heat dissipated in a power line grows with the square of the current, according to the relationship P = I²R (where I is current and R is the wire's resistance). As the line heats up, the copper expands and the wire sags between electrical poles. If the sagging wire hits something that grounds it, like a tree, it will draw even more current, triggering a circuit breaker to shut the line down. Because current divides among all paths in a parallel system (per Kirchhoff's laws), disconnecting one path increases the current along all remaining paths, pushing more of the grid toward its capacity and possibly triggering more lines to go down.
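To make the redistribution effect concrete, the short sketch below (our own simplified illustration, not part of the original analysis) models a set of identical parallel lines sharing a fixed total current and shows how tripping one line raises both the current and the I²R heating on the survivors; the line count, current, and resistance figures are hypothetical.

```python
def current_per_line(total_current_a, n_lines):
    """Split a fixed total current equally across identical parallel lines."""
    return total_current_a / n_lines

def i2r_heating_w(current_a, resistance_ohm):
    """Resistive heating grows with the square of the current (P = I^2 * R)."""
    return current_a ** 2 * resistance_ohm

TOTAL_CURRENT_A = 3000.0   # hypothetical corridor flow
RESISTANCE_OHM = 0.5       # hypothetical per-line resistance

for n in (4, 3, 2):        # lines remaining in service as trips occur
    i = current_per_line(TOTAL_CURRENT_A, n)
    print(f"{n} lines in service: {i:.0f} A per line, "
          f"{i2r_heating_w(i, RESISTANCE_OHM) / 1e3:.0f} kW dissipated per line")
```

Each trip raises the per-line current only linearly, but the heating on the remaining lines grows with the square of that current, which is why a cascade accelerates once it begins.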
In the case of the 2003 Blackout, the failure of a power line in Ohio created a cascading effect in which electrical lines already running at high capacity were overloaded and major sources of generation were disconnected from major sources of consumer demand.1 The failure in Ohio came during the summer of 2003, when power usage is at its highest due to air conditioning, and caused a power company to draw heavily from other generating sources. This increased the current on several poorly maintained electrical lines, which, coupled with the summer's high temperatures, caused a wire to sag into a tree and fail. More lines failed as load was further shifted onto other overstressed lines, cutting generators off from the grid. Severe voltage dips in Ohio, as generators dropped off the grid, caused Ohio to draw heavily from Michigan, tripping still more lines. Faults in Michigan and Ohio took down major transmission lines that were moving power toward the Eastern Seaboard; this caused an enormous imbalance between load and supply, causing more generators to fail and blacking out cities from Cleveland to Ontario to New York City. Running power lines near capacity meant the system was unable to handle the increased stress, causing the blackout to cascade across the entire power grid.
1. Lerner, Eric J. "What's Wrong with the Electric Grid?" The Industrial Physicist, 2003: 8-13.
This narrative of the 2003 Blackout shows the importance of proper line maintenance and of not running lines at full capacity, so that the system can handle sudden changes. But it also reveals the importance of grid reliability, which is maintained by balancing load and supply. As load, or demand, in the Eastern Corridor surpassed supply from western power sources in Michigan and Ohio, the frequency of the AC power slowed down. The grid's frequency, 60 Hz, is an important constant of an AC power grid. The grid's structure works to keep this value constant and is subdivided into several 'interconnects' that maintain a highly synchronized frequency. Inside these interconnects there can be multiple power companies and utilities, all of which must stay within their interconnection's frequency range, which for the Eastern Interconnect is 20 mHz.2 Figure 2 illustrates the frequency boundaries for the Eastern Interconnect. The importance of a stable AC frequency, or the reliability of the electrical grid, was demonstrated by the 2003 Blackout, when a sudden spike in load threw off the AC frequency and caused vital generators and transformers to shut down, creating further disruptions to the AC frequency.
2. North American Electric Reliability Council. "Standard BAL-004-1 — Time Error Correction." NERC, September 13, 2007. ftp://www.nerc.com/pub/sys/all_updl/standards/rs/BAL-004-1.pdf (accessed June 3, 2008).
Figure 2: Frequency Range for the Eastern Interconnection. At a lower frequency, the maximum voltage is
delivered at higher rates, whereas at a higher frequency the maximum voltage is delivered at a lower rate.
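As a rough illustration of the load-supply balancing described above (our own sketch, not drawn from the report), the snippet below models frequency drift as proportional to the supply-load imbalance and checks whether it stays inside an interconnection's tolerance band; the droop constant and the gigawatt figures are hypothetical.

```python
NOMINAL_HZ = 60.0
BAND_HZ = 0.020          # the 20 mHz tolerance cited above for the Eastern Interconnect
DROOP_HZ_PER_GW = 0.01   # hypothetical sensitivity of frequency to imbalance

def grid_frequency(supply_gw, load_gw):
    """Approximate frequency: excess supply pushes it up, excess load pulls it down."""
    return NOMINAL_HZ + DROOP_HZ_PER_GW * (supply_gw - load_gw)

for supply, load in [(100.0, 100.0), (100.0, 101.0), (100.0, 104.0)]:
    f = grid_frequency(supply, load)
    ok = abs(f - NOMINAL_HZ) <= BAND_HZ
    status = "within band" if ok else "OUT OF BAND: shed load or add generation"
    print(f"supply {supply} GW, load {load} GW -> {f:.3f} Hz ({status})")
```

The point of the sketch is simply that a sustained shortfall of supply drags the frequency outside the band, at which point operators must shed load or add generation before protective equipment starts tripping.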
A key part of maintaining an electrical network's frequency is balancing load and supply; we will investigate supply first. Supply is managed using different types of power sources, which generally fit into one of three categories based on their operating cost and responsiveness: Base Load, Intermediate, and Peaking. Base Load generators are typically power sources with very low operating costs; they generate 60-70% of annual energy requirements, run constantly, and shut down only for maintenance or emergencies because of their long restart times. Intermediate generators tend to be less efficient than Base Load units but require less capital and can easily be ramped up or down to change power output; they account for 20-30% of annual energy output. Peak generators are used only to cover times of maximum load and can be turned on in less than ten minutes to meet a sudden demand; they generally account for 5% of a system's total energy requirements. As these three categories indicate, electrical supply comes from a variety of sources that can be called upon at a variety of speeds. Power companies must balance the expected load by creating supply from Base Load, Intermediate, and Peak units.3
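A small dispatch sketch (ours, with hypothetical capacity figures) illustrates how these three categories are stacked to meet an expected load, drawing first on base load, then intermediate, then peaking units.

```python
# Hypothetical fleet: (category, available capacity in MW), ordered from
# cheapest/slowest (base load) to most expensive/fastest (peaking).
FLEET = [("base load", 6000), ("intermediate", 3000), ("peaking", 1500)]

def dispatch(load_mw):
    """Fill the load from the cheapest category up; report any shortfall."""
    remaining = load_mw
    plan = {}
    for category, capacity_mw in FLEET:
        used = min(remaining, capacity_mw)
        plan[category] = used
        remaining -= used
    plan["unserved"] = remaining  # > 0 means the system cannot meet demand
    return plan

print(dispatch(7000))   # met by base load plus some intermediate generation
print(dispatch(10800))  # all three categories exhausted, 300 MW unserved
```

When the last line of the stack is exhausted, the only remaining options are the load-management techniques listed in Table 1.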
Table 1: Possible Load Management Techniques4

Utility Controlled, Supply Side: Energy Storage; Interruptible Power; Power Pooling
Utility Controlled, Demand Side: Remote Control of Customer Load
Customer Controlled, Supply Side: Backup or Storage; Distributed Energy (or on-peak generation); Customer Energy Storage
Customer Controlled, Demand Side: End-Use Modification; Load Deferral; Load Curtailment (voluntary in response to incentives, or as part of a contractual obligation)
In the last paragraph, we described how supply is created from a variety of generating sources, each with a different ability to respond to change. But there is a second component to the AC frequency, and that is load: excess supply creates a higher frequency, whereas insufficient supply (high load) creates a lower frequency. Power companies lack up-to-the-minute information on the existing load; instead, it is estimated from models that track consumers' usage patterns. This lack of information severely limits the actions of both power companies and consumers. Power companies act on the basis of models or when they see sudden disturbances at the high-voltage transmission level, which severely limits their possible courses of action. See Table 1 for possible steps companies can take. Most of these solutions, especially on the consumer side, are impossible because consumers have very little knowledge of the grid's current load conditions and are generally charged for electricity based on the average load over a longer time period. Even action by the utility is limited to what can be done with supply, as its demand-side options are limited to cutting off customers. The information lag on consumer load limits preventative action by both the power company and the consumer, leading to more breakdowns.5

3. Casazza, Jack, and Frank Delea. Understanding Electric Power Systems. Piscataway: IEEE Press, 2003.
4. Casazza, Jack, and Frank Delea. Understanding Electric Power Systems. Piscataway: IEEE Press, 2003, 49.
The problems associated with the 2003 Blackout, the overloading of wires and the grid's reliability, are a function of new rules overstretching an old electrical grid that was not designed for deregulation. Since the implementation of the Federal Energy Regulatory Commission's (FERC) Order 888 in 2000, which opened electrical transmission to long-distance sales of electricity, there has been a startling increase in two key indicators of grid stress: transmission loading relief procedures and hourly deviation of the grid's frequency. A transmission loading relief (TLR) procedure is implemented to immediately shift power from one line to another; May 2000 had six times as many TLRs as May 1999, indicating the increase in line congestion under the new rules. The hourly frequency deviation has increased as well, from 1.3 mHz in May 1999, to 4.9 mHz in May 2000, to 7.6 mHz in 2001.6 While these deviations are within the Eastern Interconnection's frequency limits, 80 mHz, they show the new stresses affecting the grid. The new rules, which should have made electricity cheaper, have had a detrimental effect on the grid because of its older design, as these statistics reflect. If deregulation is going to work, both power companies and consumers need more information to better maintain the grid's stability.
5. Casazza, Jack, and Frank Delea. Understanding Electric Power Systems. Piscataway: IEEE Press, 2003.
6. Fairley, Peter. "The Unruly Power Grid." IEEE Spectrum, 2004.
Networking
For most of the 20th century, American electric utilities were considered vertically integrated, regulated "natural monopolies" and were treated as such. The industry was thus thought to be better supervised by regulators than by the laws of supply and demand. In 1996, the State of California decided that, with a little deregulation, markets would govern the flow of electricity. Through a trial by fire, this model transformed utilities from vertically integrated monopolies into three distinct and separate entities (see Figure 1). In theory, the separate entities are generation, transmission, and distribution.
The responsibilities of these distinct organizations fall along functional lines. One firm is responsible for generating electricity. The next step, intended to be handled by a separate firm, is the transmission (T) of high-voltage energy to destinations (substations) closer to where it will be used. The last step, getting power to the end-user, is left to a distribution company, typically an Investor Owned Utility (IOU).
Market Structure

Figure 3: The three distinct functions of a competitive electricity market: generation, transmission, and distribution. (Original figure labels: Smart Grid, SCADA, AMI, privately owned power plant, substation, meter, independent generators, Independent Power Exchange, NERC, PUCs, public/private partnerships.)
Not all states applied the "California model" in its totality, if they chose to deregulate at all. Some states are therefore still regulating vertically integrated monopolies at the distribution level. This is where the regulatory confusion begins. Because of states' rights issues, the federal government does not regulate distribution-level service outside of power quality and safety regulation. Distribution oversight is managed by state public utility commissions or similar regulatory bodies (PUCs). Transmission-level service, on the other hand, is regulated by the Federal Energy Regulatory Commission (FERC) but managed at a regional level by nongovernmental organizations (NGOs) called Independent System Operators (ISOs). ISOs are also charged with managing the market for electricity by linking distribution companies with generation volumes on a real-time electronic market, along with all scheduled deliveries. In addition, another NGO, the North American Electric Reliability Corporation (NERC), covers the United States and Canada and manages the larger regional transmission networks that span both nations and their interconnections. Lastly, the generation of energy is managed for the most part by the Department of Energy (DOE), augmented by states on local issues.
Smart Grids
A smart grid is the interconnection of what tend to be two distinct communication platforms employed in the energy industry: SCADA and AMI. SCADA (system control and data acquisition) systems monitor the transmission portion of the transmission and distribution (T&D) network by relaying critical system information about the health and flow of electricity over transmission assets back to grid managers/ISOs and NERC. The SCADA network physically operates from the generator step-up transformer to the substation step-down transformer (Figure below). The SCADA platform also has forward-looking control features that enable operators to shut down lines or bring generation on or off line as needed. Two-way distribution communication is a relatively new development; the majority of distribution networks can only send information downstream to ancillary distribution equipment.
Automated Meter Reading (AMR) is a bit more advanced and can be found on larger distribution networks. The reduced cost of meter reading is the primary benefit of AMR to an IOU: the system reads the meter data and relays it back to the utility via a wireless network.
Advanced Metering Infrastructure (AMI) is an evolutionary shift that enables advanced functions never before seen in an energy management solution. By linking every end-user on the network to a Smart Meter, unlike the meters most American homes have today, AMI enables true two-way communication that sends information between T&D companies and the end-user simultaneously. This is done via a personal energy management system (PEM).

The PEM system offers the end-user flexibility in how they choose to receive price and consumption information, and in how they use that information. PEMs take a few different forms, such as web portals, paging networks, or even smart thermostats/in-home displays (IHDs). Even though the benefits of AMI are drastically reduced in the absence of PEM systems, AMI still offers IOUs a robust communication platform that increases system reliability. PEMs, by contrast, are virtually useless without AMI. The combination of AMI with PEM, enabled by a smart meter, is not a cheap upgrade to a distribution system, but the savings that can be achieved over time through increased reliability far outweigh the costs.
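To make the two-way exchange concrete, the sketch below (our illustration; the message fields and the price figure are hypothetical and not drawn from any specific AMI product) shows a smart meter reporting interval consumption upstream while the utility pushes the current price downstream to the customer's PEM.

```python
from dataclasses import dataclass

@dataclass
class MeterReading:          # upstream: meter -> utility
    meter_id: str
    interval_end: str        # e.g. "2003-08-14T15:00"
    kwh_used: float

@dataclass
class PriceSignal:           # downstream: utility -> meter/PEM
    interval_start: str
    price_per_kwh: float

def pem_display(reading: MeterReading, price: PriceSignal) -> str:
    """Combine the two message directions into what the in-home display shows."""
    cost = reading.kwh_used * price.price_per_kwh
    return (f"Last hour: {reading.kwh_used:.1f} kWh at "
            f"${price.price_per_kwh:.2f}/kWh = ${cost:.2f}")

reading = MeterReading("meter-0042", "2003-08-14T15:00", 3.2)
price = PriceSignal("2003-08-14T15:00", 0.35)   # hypothetical peak-hour price
print(pem_display(reading, price))
```

The point is simply that both directions of information must exist before the end-user can react to prices in real time.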
Figure 5: The three steps toward energy efficiency: AMI (two-way communication), Zigbee-enabled smart meters, and PEMs (paging networks, in-home displays, web portals). Zigbee is a low-power mesh network communication protocol used to enable home automation through smart energy management.
A Further Look at the Problems
In response to the power outage, people wondered how a catastrophe of such magnitude could happen in the United States. Most of the attention focused on the basic infrastructure and regulation of the power grid. Bill Richardson, the former head of the Department of Energy, characterized the United States as a "superpower with a third world power grid."2 Politicians and the media quickly clamored for more investment in electrical lines. The Department of Energy advocated physically changing the grid to accommodate increased capacity demands: "In reports before and after the blackout, the DOE urged expanding transmission lines and easing environmental rules that limit their construction."3 The logic for such proposals is straightforward: if the electrical grid reaches capacity, simply expand capacity so utilities can switch electricity from line to line without overloading the grid. Ultimately, the U.S.-Canada Power System Outage Task Force Final Report offered 46 recommendations to make sure such a huge outage never happens in the United States again.4 Most of the recommendations focused on expanding supply-side regulation, but only two focused on managing consumer demand.
However, even with perfect regulation and billions of dollars of new investment, massive power outages will almost certainly happen again, just as they happen regularly in nearly every country in the world.5 Because the sources of electrical blackouts often lie less in infrastructure problems and more in consumer-use problems, increased supply-side regulation will not prevent future blackouts. Instead, increased regulation often underlies power blackouts because regulators focus explicitly on infrastructure and fail to address important consumer-demand factors. Regulators often shy away from addressing demand problems because electricity cannot be stored over time and most electricity is consumed less than a second after being produced. This physical constraint of power generation prevents using excess supply to meet excess demand, as happens in other markets, and will contribute to future blackouts.
However, many other industries, like the telephone industry, face similar constraints but still manage their distribution efficiently. These industries do so by using incentives to shape supply and demand, preventing system failures and their industry's equivalents of blackouts. The fact that excess capacity cannot meet excess demand later is irrelevant because incentives ensure that future demand meets future capacity. Since the telephone industry was deregulated, reliability has increased while prices have decreased enormously.6 If this industry, which faces the same physical constraints, can use markets effectively, so can the electrical industry. Yet in the electricity industry, consumers have no incentive to reduce their consumption when transmission lines reach their breaking point.
Problem of Average Cost
While nearly every market works exceptionally well with marginal costs,
electricity markets still use average costs. As Nobel Laureate Vernon Smith writes,
The typical consumer does not know how much he has purchased and
what it will cost him until the end of the month when he gets his bill. If he
should go to the trouble of looking at his meter at 3 p.m. on a hot summer
day [like August 14, 2003] with the air conditioning on high, and the
clothes dryer cranking away, he will note that the little monitoring wheels
are spinning much faster than when he inspects it at 3 a.m.7
Currently, when consumers pay their power bill, they pay for their usage over the entire month. The utility company sets a price, often restricted by a regulator, that covers electricity use at all times of the day, regardless of other factors. Thus, the only incentive consumers have to reduce their electricity consumption is to use less total power. In reality, however, electricity does not cost the same to produce at all times. When demand increases, more resources are needed to distribute power as lines become congested and electricity must be re-routed. Yet in 2003, when the lines were filled to capacity, electricity was much more expensive to produce and distribute than utilities like FirstEnergy were allowed to charge.
During another blackout event, the 2001 rolling blackouts in California, utilities spent $1.10 per kilowatt-hour to deliver electricity during peak periods but were only able to charge the regulated $0.11 per kilowatt-hour.8 In nearly every other market, the price of electricity would have risen as demand increased. The increase in the marginal price of electricity reflects the fact that more people are trying to use a limited supply. If left uncorrected, the lines congest and heat up, and if a tree happens to be close enough to a line, it can shut off electricity for everyone. When the grid is fragile, consumers have no incentive to use electricity more intelligently, because they face only average costs. No amount of new construction will correct for this illogical feature of electricity markets.
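A small worked example (ours; the prices and usage figures are hypothetical) shows why a flat average price gives no signal to shift use away from peak hours, while time-varying marginal prices do.

```python
# Hypothetical one-day usage profile: (block, kWh used, marginal cost in $/kWh)
USAGE = [
    ("overnight", 10.0, 0.05),   # off-peak
    ("daytime",   15.0, 0.10),   # shoulder
    ("peak",       8.0, 0.40),   # hot afternoon, lines near capacity
]

FLAT_RATE = 0.11  # hypothetical regulated average price, $/kWh

total_kwh = sum(kwh for _, kwh, _ in USAGE)
flat_bill = total_kwh * FLAT_RATE
marginal_bill = sum(kwh * price for _, kwh, price in USAGE)

print(f"Flat average pricing:    {total_kwh:.0f} kWh -> ${flat_bill:.2f}")
print(f"Marginal (time-varying): same usage -> ${marginal_bill:.2f}")
print(f"One peak kWh costs $0.40 under marginal pricing vs ${FLAT_RATE:.2f} under the flat rate")
```

Under the flat rate, shifting a kilowatt-hour out of the peak block saves the consumer nothing; under marginal pricing it saves 40 cents, which is exactly the incentive the grid needs when lines approach capacity.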
Need for Marginal Cost Pricing
In the markets for other goods and services, prices more or less reflect demand and supply at a given time. To use another telecommunications example, most cellular telephone plans charge per minute during the day and offer unlimited calling in the evening. The purpose of unlimited evening calling is to encourage customers to shift their non-essential calls to times when fewer people are calling and the costs of transmission are cheaper. Likewise, hotels charge more during peak periods because seasonal renters require the owners to build more rooms. Because off-peak renters do not strain capacity, they are charged less.9 In such markets, because consumption decreases as prices increase, the higher price during peak periods decreases the quantity consumed relative to what it would be if prices were constant across periods. Conversely, people pay lower prices during off-peak periods because costs are lower.
If our goal is to prevent events like the 2003 Blackout, then using marginal prices to limit peak electricity demand makes sense. The peak flow determines the required capacity of the electrical grid. For regulators to build an infallible grid, they must build enough lines to support the highest amount of electricity that will ever be demanded. An obvious solution is to endlessly build more power lines to match the absolute highest demand. As demand increases over time, utilities must build more and more lines to handle that one day a year when demand is exceptionally high; for the rest of the year, large portions of the grid sit unused. At what point do we as a society decide how much we want to pay to enable inefficient electricity use? Instead of spending billions in an effort that encourages waste, why not simply try to limit peak demand? Using the current grid more efficiently will save billions of dollars while still guaranteeing more reliable energy. To allow the marginal cost of production to equilibrate so that the electrical grid never reaches capacity, we developed the following pricing model. It builds on the electricity equilibrium spot pricing model developed by Arthur De Vany and W. David Walls in their 1999 paper.10
Let T_ij ≥ 0 denote the flow shipped from supply node i to demand node j, and let h_k denote the flow on path k connecting i to j. Then

b_i = Σ_j T_ij for all i (total supply shipped from node i)
d_j = Σ_i T_ij for all j (total demand delivered at node j)
T_ij ≥ 0 for all (i, j)

For each path k between i and j, let δ_ak = 1 if line a belongs to path k and 0 otherwise, and denote the upper bound of flow on line a as V_a. The transportation cost of electricity on a line is flow-dependent, with C_a(v_a) the cost of the flow v_a on line a. Due to line losses from inefficiencies and congestion, C_a(v_a) is increasing and concave (i.e., C' > 0, C'' < 0).

The equilibrium price conditions for the network are

S_i(b_i) + Σ_a δ_ak C_a(v_a) − I_j(d_j) = 0 if h_k > 0, for all paths k between i and j
S_i(b_i) + Σ_a δ_ak C_a(v_a) − I_j(d_j) ≥ 0 if h_k = 0, for all paths k between i and j

where S_i(b_i) is the inverse supply function and I_j(d_j) is the inverse demand function. By the law of one price and competitive pricing, the delivered price at j must equal the supply price at i plus the transportation costs along the lines used. Thus, the spot price of electricity at any given time is simultaneously determined as the supply price at i plus the transportation cost on each line. Under this model, the quantity supplied and the price charged balance so that, in equilibrium, consumption is smoothed across periods and lines.
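As a numerical sketch of the equilibrium condition above (our own toy example with made-up linear supply and demand curves and a quadratic line cost, not taken from De Vany and Walls), the code below finds the flow q on a single supply-demand pair at which the supply price plus the transportation cost equals the delivered price.

```python
def inverse_supply(q):      # S(b): price suppliers require to produce q (hypothetical)
    return 20.0 + 0.5 * q

def inverse_demand(q):      # I(d): price consumers will pay for the q-th unit (hypothetical)
    return 100.0 - 1.0 * q

def line_cost(q):           # C(v): per-unit transportation cost, rising with congestion
    return 5.0 + 0.02 * q * q

def excess(q):
    """Equilibrium condition: supply price + transport cost - delivered price."""
    return inverse_supply(q) + line_cost(q) - inverse_demand(q)

# Bisection: excess(q) is negative at q = 0 and positive for large q.
lo, hi = 0.0, 100.0
for _ in range(60):
    mid = (lo + hi) / 2
    if excess(mid) > 0:
        hi = mid
    else:
        lo = mid

q_star = (lo + hi) / 2
print(f"equilibrium flow ~ {q_star:.1f}, spot price ~ {inverse_demand(q_star):.2f}")
```

At that flow, raising the price any further would choke off demand and lowering it would fall short of the delivered cost, which is the smoothing behavior the model describes.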
The World is getting “Smart”
The Smart Grid is no longer just a theory; cities and countries around the world are taking steps in this direction. Currently there are many small initiatives at the local, city, and regional levels to move to a single, real-time Smart Grid system. The European Union appears to be the first to push, not only on a national scale but on a continental scale, for a single Smart Grid.
Many EU nations already share energy or have energy co-operation agreements, so moving to a Smart Grid system is in Europe's best interest. In 2005, the EU pushed for the creation of a research consortium called SmartGrids, which published a blueprint for how each EU nation can move to a state-of-the-art, integrated grid system. The consortium is critical in helping EU nations meet their climate goals. In March 2007, the political leaders of the EU declared that by 2020, 20% of their energy needs would be met by renewable power sources.7 A lofty goal of 50% renewable energy by 2050 is also on the docket, and it coincides with the goal of a fully integrated Smart Grid system.
This implementation will not come cheap. The International Energy Agency estimates that it will cost the EU up to 750 billion euros over the next three decades to modernize its system.8 As in many other developed countries, the power grid is out of date, and creating a Smart Grid system requires massive infrastructure overhauls. Much of Europe's power is generated from central power sources rather than delivered through an integrated grid system. Hence, making this transition and creating cross-border grid infrastructure will take a great deal of time, effort, and money.

7. Stephen Gardner, "The single European energy grid is coming." http://www.climatechangecorp.com/content.asp?ContentID=5309
8. Ibid.
Some countries are further ahead than others. The Italian utility company Enel SpA has 27 million customers.9 Over a five-year time frame, it rolled out "smart meters" to its entire consumer base. Smart meters are an essential first step toward a smart grid system. According to Sentec, a British utility research firm, the Italian effort cost up to 2.1 billion euros.10 The Italians have seen an immediate impact, with savings of roughly 500 million euros per year, implying a payback period of about four years.
Though the Europeans are planning for the future, the world's first SmartGrid system is in Boulder, Colorado. The $100 million project is the work of Xcel Energy, Colorado's largest utility company, which chose Boulder to launch this experimental system because of the city's demographics: 85% of residents have internet access and 70% have at least a bachelor's degree.11 The project is set to go live in 2009; if successful, it could serve as a model for many other cities, states, and regions in the U.S.
Policy Recommendations
There are many obstacles standing in the way of a fully integrated SmartGrid system within the U.S. Some of the barriers lie in political pressure from lobbying groups representing energy companies and natural resource interests, and in the difficulty of regional co-operation. Many of the problems also lie in getting all of the ISOs to integrate and work toward the same outcome. Another huge obstacle is the lack of incentives for PUCs, ISOs, and states to move in this direction.

9. Sentec Presentation, www.cambridgeenergy.com/archive/2006-06-28/cef28jun06colby.pp
10. Ibid.
11. CNNMoney.com, http://money.cnn.com/news/newsfeeds/articles/apwire/84d8ae9f2cf76c4c1ff0ddfa18f1fe85.htm
There are a few policy recommendations that utilities, states, and the country could adopt to move in the direction of SmartGrids. Under the Energy Policy Act of 2005, all states are supposed to move toward implementing "smart meters" and obtaining more demand response from consumers. The vast majority of states have adopted the measures in the act, yet 11 states have not. States such as Tennessee, West Virginia, and Kentucky have very little incentive to do so because their energy costs are already low, thanks to the abundant natural resources in those states. Hence, the Department of Energy, regional energy commissions, and the federal government need to apply more pressure and walk these states through the advantages of adopting the Act.
A large reason why electricity markets are an exception to the rule that market mechanisms are the best method of regulating costs, and why consumers pay average rather than marginal costs for electricity, is technology. Currently, meters show only total electricity consumption for the month. The utility company takes this figure and simply multiplies it by the price. Thus, consumers do not know how much it costs to use electricity at any given time and have no incentive to reduce consumption when demand is high. Even if consumers paid the marginal cost of electricity, they would not know what it was. Hence, real-time pricing is essential to moving toward SmartGrids and efficiency.
Certain key steps need to occur before RTP and smart meters can be implemented nationwide. A guaranteed rate of return needs to be established so that utilities will invest in the infrastructure needed to implement smart meters and eventually Smart Grids. State commissions and PUCs need to create a set of standard laws and guidelines that establish open, Zigbee-enabled architecture. Under the Energy Policy Act of 2005, all states and their utilities will put "Smart Meters" in every home, but the in-home display or personal energy management system (PEM) will be installed by the consumer. This is where the open architecture comes in: it allows companies such as Honeywell and GE to compete to place their PEMs in consumers' homes. State-of-the-art PEMs will allow users to access, monitor, and change their energy usage at all times from the web, on their laptops, and on their phones. This user interface and interaction will create true demand response. AMI brings the stability that utilities and states are looking for, but it is the user-controlled efficiency via the PEM that will appeal to consumers and citizens.
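The kind of user-controlled response described above can be sketched in a few lines (ours; the thresholds, setpoints, and price feed are hypothetical): a PEM watches the real-time price delivered over AMI and relaxes the thermostat setpoint when the price crosses a customer-chosen limit.

```python
def pem_thermostat_setpoint(price_per_kwh, comfort_f=72, economy_f=78,
                            price_limit=0.25):
    """Raise the cooling setpoint when the real-time price exceeds the user's limit."""
    return economy_f if price_per_kwh > price_limit else comfort_f

# Hypothetical price signals pushed to the PEM over the course of a hot afternoon.
for price in (0.09, 0.18, 0.32, 0.45):
    print(f"price ${price:.2f}/kWh -> cooling setpoint "
          f"{pem_thermostat_setpoint(price)} F")
```

Multiplied across thousands of homes, this automated response trims exactly the peak load that would otherwise push lines toward their thermal limits.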
As mentioned above, governments and PUCs also need to establish guidelines for reliability standards. Without these standards, utilities will not be able to get the loans and investments necessary to build this complex infrastructure. Reliability standards allow states and PUCs to guarantee a positive rate of return for utilities.
Decoupling of electric rates is crucial for SmartGrids. Currently, energy companies and utilities charge based on transmission; that is, they get paid for the amount of energy they produce. Decoupling means that companies get paid based on how much energy users actually consume. The idea is similar to a toll road: if, instead of charging each driver the exact cost they impose on other drivers through congestion, the toll road charged a premium price during rush hour, drivers would have an incentive not to drive when demand is high. Although less precise, the same end goal is achieved. Utility companies already use this method with large industrial plants: to encourage substitution toward night production, when demand for electricity is lower, large factories can buy cheaper power at night, precisely because the marginal cost of electricity is lower. This solution does not require any new technology to solve a technical problem; instead, it develops ways to use existing technology more efficiently.
Decoupling can occur through government regulation or through companies acting on their own. By moving toward real-time pricing, utilities will get the immediate savings of the AMI system as well as the efficiency gains from the technology. To maximize savings and profits, companies should move to a consumption-based model.
The cap-and-trade model gives companies and utilities a market standard to work toward in reducing pollution, emissions, and other negative externalities. By putting a price premium on externalities, and hence raising the cost of producing energy, it leads firms to pass those costs on to consumers. The cap-and-trade market gives utilities more incentive to reduce energy usage, and consumers, who do not want to pay for a company's pollutants, will embrace a system in which they can control their own energy usage. Hence, the model is well positioned to capture the benefits of the SmartGrid system while reducing greenhouse gases.
Figure 6: Graph showing the benefits of a cap-and-trade market
There are concerns over initial price hikes during the transition to a competitive and integrated market. Low-income and elderly citizens are the most adversely affected by price changes, so within the pricing model, steps need to be taken to ensure equity in energy consumption. "Financial insurance is a protection against price volatility." Companies can charge a risk premium to insure against price volatility and physical outages, and these insurance premiums can support public purpose programs for needy citizens.
American regulators should recognize that market forces are best able to prevent catastrophic events like the 2003 Blackout. Instead of relying on a highly politicized body to oversee the distribution of a fundamental element of modern society, we should use market forces, which work well in nearly every other instance. As with other goods, consumers use less electricity when prices are higher and shift less essential uses to times when prices are lower. This alignment of demand with prices ensures the stability of the electrical grid and prevents events like the 2003 Blackout.
Regulators and policy officials should encourage consumers to adopt smart-grid
technology, which would decrease the cost of power for consumers and prevent
catastrophic grid failures. If this course is followed and consumers know exactly how
much they are paying for power in a competitive market, they are undoubtedly better off.
1. Pacific Northwest Laboratory. "August 2003 Blackout." http://eiopc.pnly.gov/research/2003blackout.stm
2. Wikipedia. "Northeast Blackout of 2003." http://en.wikipedia.org/wiki/2003_North_America_blackout
3. Lerner, Eric. "What's Wrong with the Electric Grid?" The Industrial Physicist, Vol. 9, No. 5.
4. Kiesling, Lynne, and Michael Giberson. "Analyzing the Blackout Report's Recommendations: Alternatives for a Flexible, Dynamic Grid." Electricity Journal 17 (July 2004): 51-59.
5. IEEE Spectrum. "The Unruly Power Grid: Advanced mathematical modeling suggests that big blackouts are inevitable." August 2004.
6. Smith, Vernon. "Power to the People." The Wall Street Journal, August 12, 2005.
7. Smith, Vernon. "Power to the People." The Wall Street Journal, October 29, 2002.
8. Smith, Vernon. "Power to the People." The Wall Street Journal, October 29, 2002.
9. Smith, Vernon. "Power to the People." The Wall Street Journal, October 29, 2002.
10. De Vany, Arthur, and W. David Walls. "Co-integration analysis of spot electricity prices: insights on transmission efficiency in the western US." Energy Economics 21 (1999): 435-448.