Understanding Electric Utility Customers – 2014 Update: Review of Recent Studies

3002001268

Technical Update, September 2014

EPRI Project Manager
J. Robinson

ELECTRIC POWER RESEARCH INSTITUTE
3420 Hillview Avenue, Palo Alto, California 94304-1338
PO Box 10412, Palo Alto, California 94303-0813 USA
800.313.3774 • 650.855.2121
askepri@epri.com • www.epri.com

DISCLAIMER OF WARRANTIES AND LIMITATION OF LIABILITIES

THIS DOCUMENT WAS PREPARED BY THE ORGANIZATION(S) NAMED BELOW AS AN ACCOUNT OF WORK SPONSORED OR COSPONSORED BY THE ELECTRIC POWER RESEARCH INSTITUTE, INC. (EPRI). NEITHER EPRI, ANY MEMBER OF EPRI, ANY COSPONSOR, THE ORGANIZATION(S) BELOW, NOR ANY PERSON ACTING ON BEHALF OF ANY OF THEM: (A) MAKES ANY WARRANTY OR REPRESENTATION WHATSOEVER, EXPRESS OR IMPLIED, (I) WITH RESPECT TO THE USE OF ANY INFORMATION, APPARATUS, METHOD, PROCESS, OR SIMILAR ITEM DISCLOSED IN THIS DOCUMENT, INCLUDING MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE, OR (II) THAT SUCH USE DOES NOT INFRINGE ON OR INTERFERE WITH PRIVATELY OWNED RIGHTS, INCLUDING ANY PARTY'S INTELLECTUAL PROPERTY, OR (III) THAT THIS DOCUMENT IS SUITABLE TO ANY PARTICULAR USER'S CIRCUMSTANCE; OR (B) ASSUMES RESPONSIBILITY FOR ANY DAMAGES OR OTHER LIABILITY WHATSOEVER (INCLUDING ANY CONSEQUENTIAL DAMAGES, EVEN IF EPRI OR ANY EPRI REPRESENTATIVE HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES) RESULTING FROM YOUR SELECTION OR USE OF THIS DOCUMENT OR ANY INFORMATION, APPARATUS, METHOD, PROCESS, OR SIMILAR ITEM DISCLOSED IN THIS DOCUMENT. REFERENCE HEREIN TO ANY SPECIFIC COMMERCIAL PRODUCT, PROCESS, OR SERVICE BY ITS TRADE NAME, TRADEMARK, MANUFACTURER, OR OTHERWISE, DOES NOT NECESSARILY CONSTITUTE OR IMPLY ITS ENDORSEMENT, RECOMMENDATION, OR FAVORING BY EPRI.
THE FOLLOWING ORGANIZATION, UNDER CONTRACT TO EPRI, PREPARED THIS REPORT:

Energy Resource Economics, LLC

This is an EPRI Technical Update report. A Technical Update report is intended as an informal report of continuing research, a meeting, or a topical study. It is not a final EPRI technical report.

NOTE: For further information about EPRI, call the EPRI Customer Assistance Center at 800.313.3774 or e-mail askepri@epri.com.

Electric Power Research Institute, EPRI, and TOGETHER…SHAPING THE FUTURE OF ELECTRICITY are registered service marks of the Electric Power Research Institute, Inc.

Copyright © 2014 Electric Power Research Institute, Inc. All rights reserved.

ACKNOWLEDGMENTS

The following organization, under contract to the Electric Power Research Institute (EPRI), prepared this report:

Energy Resource Economics, LLC
5524 Heathrow Drive
Knoxville, TN 37919

Principal Investigator
T. Flaim

This report describes research sponsored by EPRI.

This publication is a corporate document that should be cited in the literature in the following manner:

Understanding Electric Utility Customers – 2014 Update: Review of Recent Studies. EPRI, Palo Alto, CA: 2014. 3002001268.

ABSTRACT

How customers use and value electricity has been an ongoing subject of study and debate. During the past several decades, dozens of pilots and field trials have been implemented, some in conjunction with federal funding that supported investments in advanced metering infrastructure (AMI). This report is the third in a series of EPRI reports reviewing the research that has been done to characterize how customers use and value electricity, concentrating on large residential field trials completed in the past decade or so involving time-differentiated rates, feedback, or control technology. The previous reports include:

• Understanding Electric Utility Customers: What We Know and What We Need to Know. EPRI, Palo Alto, CA: February 2012.
1023562.

• Understanding Electric Utility Customers – What We Know and What We Need to Know. EPRI, Palo Alto, CA: November 2012. 1025856.

This report focuses on five Consumer Behavior Studies that were funded in part through the Smart Grid Investment Grant Program, using funding authorized by the American Recovery and Reinvestment Act (ARRA) of 2009.[1] These five studies were selected because they had interim or final reports that summarize their major impacts and findings. Their results generally confirm what previous field trials have found regarding how customers respond to time-differentiated rates and how feedback and control technology may boost that response. In addition, they have added to our understanding of how different recruitment strategies (opt-in vs. opt-out) might affect overall program results. We anticipate updating this research over time as new field trials and rigorous evaluations of ongoing programs are completed.

Keywords
Control technology
Critical Peak Pricing (CPP)
Critical Peak Rebate (CPR)
Dynamic pricing pilots
Elasticity of substitution
Feedback
In-home display (IHD)
Opt-in and opt-out
Peak Time Rebate (PTR)
Time of Use (TOU)
Variable Peak Pricing (VPP)

[1] For more information, see: https://www.smartgrid.gov/recovery_act/consumer_behavior_studies.

EXECUTIVE SUMMARY

How customers use and value electricity has been an ongoing subject of study and debate. During the past several decades, dozens of pilots and field trials have been implemented, some in conjunction with federal funding that supported investments in advanced metering infrastructure (AMI). These pilots have been conducted in different regions by different utilities in different climates and offered to customers with different demographic characteristics.
Thus, careful analysis is required to identify what we know and what we don't know about how customers respond to different inducements, including pricing structures, feedback about electricity use, and control technology.

This report is the third in a series of EPRI reports reviewing the research that has been done to characterize how customers use and value electricity, concentrating on large field trials completed in the past decade or so. It focuses on five Consumer Behavior Studies (CBSs) that were funded in part through the Smart Grid Investment Grant Program, using funding authorized by the American Recovery and Reinvestment Act (ARRA) of 2009.[2] These five studies were selected because they had interim or final reports that summarize their major impacts and findings. The rates and enabling technologies (feedback and control technology) that were the subject of the studies are summarized in Table 1. Their results generally confirm what previous field trials have found regarding how customers respond to time-differentiated rates and the effect of feedback and control technology on that response.

Considering peak period load impacts in percentage terms, results vary widely across all five studies, from a low of zero to a high of 37%. Accounting for differences in relative peak and off-peak prices by considering elasticities of substitution, we see the same range as previous studies: roughly 0.10 for time-varying rates combined with feedback but without control technology, and two to three times those levels in most cases when control technology such as programmable communicating thermostats (PCTs) is used, with higher impacts for utility-controlled than for customer-controlled PCTs.
Other notable findings from the five CBS reports reviewed:

• In most cases, in-home displays (IHDs) offered in conjunction with a dynamic rate did not have any statistically significant incremental impact on peak load reduction compared to the rate alone; SMUD also found there was no material impact of the IHD offer on customers' willingness to accept a dynamic rate.

• In FirstEnergy's pilot, customers with utility-controlled PCTs had the largest event load reductions compared to customers that had IHDs or PCTs that they controlled themselves. However, these customers also exhibited the largest snapbacks after events, with post-event usage increasing by 71% and 54%, respectively, of the total event-period reduction by midnight.

• Considering energy savings over longer time periods (the conservation effect), SMUD found statistically significant monthly energy savings in the 2-4% range for only two of the eight treatment groups. Green Mountain Power estimated the impact of IHDs on monthly energy consumption and found a 4% savings impact.

• In SMUD's pilot, average participation rates were about five times higher for opt-out than for opt-in recruitment approaches, while peak period load impacts were about half as large. Extending these results to an entire service territory, this may suggest an opt-out approach could result in higher total peak period load reductions than an opt-in approach, although questions remain regarding the relative costs and customer satisfaction of the two approaches.

[2] For more information, see: https://www.smartgrid.gov/recovery_act/consumer_behavior_studies.

In exploring what we know about how customers respond to alternative electric service plans, three stages of customer decision-making need to be considered: participation, performance, and persistence. As outlined above, the five studies reviewed here have added to our knowledge regarding performance – how customers perform with regard to energy impacts.
Regarding participation, we still know relatively little about how to identify customers that might be likely to accept different offerings, although all of the consumer behavior studies collected demographic and premise information about participants, and we may see relevant analyses regarding drivers of participation in the future. We also still know the least about persistence, largely because the field trials have been of limited duration. However, future CBS reports will likely include multi-year impact analyses to help address the persistence question, as will full-scale program implementations that can be tracked over the years.

Table 1
General Features of Consumer Behavior Studies

Pilot Features (columns: Oklahoma Gas & Electric (OG&E); FirstEnergy Illuminating Company; Marblehead Municipal Light District; Sacramento Municipal Utility District (SMUD); Green Mountain Power (GMP))

Type of Rate Structure: Time of Use (TOU); Critical Peak Pricing (CPP); Peak Time Rebate (PTR); Variable Peak Pricing (VPP); Real Time Pricing (RTP)
Feedback: In-home Display; Near-real-time Web Portal
Control Technology: PCTs – customer controlled; PCTs – utility controlled; Water Heater Load Controls

CONTENTS

1 INTRODUCTION
  Department of Energy's Consumer Behavior Studies
  DOE SGIG Studies Selected for Review

2 OVERVIEW OF STUDY DESIGNS AND RESULTS
  Study Designs
    Pilot Design Structures
    Customer Recruitment
    Rate Structures
    Feedback
    Control Technology
  Study Results: Load Impacts
    Load Impacts for Peak or Event Hours
    Load Impacts beyond Peak/Event Hours: Energy Savings
  Study Results: Elasticities of Substitution

3 NOTABLE FINDINGS
  Estimates of Snapback Under Utility Control vs. Customer Control
  Opt-in vs. Opt-out Recruitment

4 SUMMARY
  Participation: Who participates in optional pricing and/or enabling technology programs and why?
  Performance: How much do customers respond to these programs?
  Persistence: Will customers continue to participate and respond over time?
A CBS SUMMARY FOR OKLAHOMA GAS & ELECTRIC
B CBS SUMMARY FOR FIRSTENERGY
C CBS SUMMARY FOR MARBLEHEAD MUNICIPAL LIGHT DEPARTMENT
D CBS SUMMARY FOR SACRAMENTO MUNICIPAL UTILITY DISTRICT (SMUD)
E CBS SUMMARY FOR GREEN MOUNTAIN POWER

LIST OF FIGURES
Figure 1-1 DOE Consumer Behavior Studies Reviewed
Figure 2-1 Peak or Event Hour Load Impacts for CBS Studies (Opt-in Treatments only)
Figure 2-2 Elasticities of Substitution for OG&E and FirstEnergy
Figure 3-1 FirstEnergy – Event Hour Load Impacts, Control Group vs. Treatment Group
Figure 3-2 Participation Rates for Pilots with Opt-in vs. Opt-out Enrollment
Figure 3-3 SMUD Pilot: Average Peak Load Impacts for Opt-in vs. Opt-out (Default) Treatments (percent of reference load)

LIST OF TABLES
Table 2-1 General Features of Consumer Behavior Studies
Table 3-1 FirstEnergy Treatment Groups
Table 3-2 FirstEnergy Changes in Event Day Hourly Loads, Control Group vs. Treatment Groups
1 INTRODUCTION

Department of Energy's Consumer Behavior Studies

The American Recovery and Reinvestment Act (ARRA) of 2009 provided $4.5 billion in funding to the Department of Energy to help modernize the electricity grid through a wide variety of programs, including investments in advanced metering infrastructure (AMI). One of the initiatives under the ARRA umbrella was DOE's Smart Grid Investment Grant (SGIG) program, which provided funding for a number of Consumer Behavior Studies (CBSs). The CBSs were aimed at advancing the industry's understanding of how consumers respond to dynamic pricing, improved control technology, and feedback regarding electricity usage. The specific goal was to ". . . advance the electric industry's understanding of consumer behavior by addressing uncertainties surrounding questions of impacts and acceptance using statistically rigorous experimental methods."[3]

Five of the ten Consumer Behavior Studies that were funded by DOE were selected for review in this report. These five studies had interim or final reports as of May 2014. They also met the criteria EPRI established previously,[4] that studies should:

• Involve substantial scale and scope;
• Be completed relatively recently, thus allowing in some cases for the use of new technologies such as advanced metering, communication, and control technology;
• Employ a rigorous experimental design with emphasis on randomized control and treatment groups;
• Analyze interventions, such as feedback and control technology, that can influence behavior alone or in conjunction with pricing structures;
• Provide detailed information about the design, implementation, and evaluation in publicly available reports; and
• Report comprehensive metrics, including, where applicable, not only load impacts but also price elasticities and the effects of other non-price factors (including demographics, premise characteristics, etc.).

[3] https://www.smartgrid.gov/sites/default/files/doc/files/CBS_White_Paper_draft_to_NREL_10_11_2011.pdf.
[4] EPRI 1025856, p. 2-1.
DOE SGIG Studies Selected for Review

The five studies selected for evaluation in this report met all of the above criteria, with the exception of reporting comprehensive metrics. All the studies reported load impacts, but only two reported price elasticities. In some cases, this was because the report was an interim report and the intention is to report price elasticities and other effects in the final report.

The five studies selected for review are:

• Oklahoma Gas & Electric (OG&E) Positive Energy Together® pilot (Norman, OK; June 1–Sept. 30, 2010)[5]
• FirstEnergy's Consumer Behavior Study: Preliminary Evaluation for the Summer 2012
• Marblehead Municipal Light Department, ENERGYSENSE CPP Pilot, Final Evaluation Report
• Sacramento Municipal Utility District (SMUD) Smart Pricing Option Pilot, Interim Load Impact Evaluation
• Green Mountain Power Critical Peak Events During the Summer/Fall of 2012

Each of these studies is summarized in detail in Appendices A–E at the end of this report. Figure 1-1 shows the locations of the five CBS studies.

Figure 1-1
DOE Consumer Behavior Studies Reviewed
(Map showing FirstEnergy, Ohio; Marblehead Municipal Light Dept.; Green Mountain Power; Oklahoma Gas & Electric; Sacramento Municipal Utility Dist.)

[5] This OG&E study was reviewed previously in EPRI 1025856. It is included again here because it is a Consumer Behavior Study, and was the first CBS to be completed. OG&E also filed a final report, but it used a different analytical approach and does not include the comprehensive metrics required for comparison purposes.

2 OVERVIEW OF STUDY DESIGNS AND RESULTS

Study Designs

The general study design features of interest include the overall experimental design, approach to customer recruitment, type of rate structure, feedback to customers regarding usage, and control technologies.
These features are summarized in Table 2-1 for each of the five studies reviewed here.

Table 2-1
General Features of Consumer Behavior Studies

Pilot Features (columns: Oklahoma Gas & Electric (OG&E); FirstEnergy Illuminating Company; Marblehead Municipal Light District; Sacramento Municipal Utility District (SMUD); Green Mountain Power (GMP))

Pilot Design Structure: Randomized Controlled Trial; Randomized Encouragement Design; Within Subjects
Customer Recruitment: Opt-in; Opt-in – Recruit & delay; Opt-out
Type of Rate Structure: Time of Use (TOU); Critical Peak Pricing (CPP); Peak Time Rebate (PTR); Variable Peak Pricing (VPP); Real Time Pricing (RTP)
Feedback: In-home Display; Near-real-time Web Portal
Control Technology: PCTs – customer controlled; PCTs – utility controlled; Water Heater Load Controls

Pilot Design Structures

All five of the Consumer Behavior Study designs used a variation on a randomized controlled trial (RCT). An RCT provides the most robust results because it accounts for all intervening factors.[6] It requires that subjects be randomly selected from the population of interest and assigned to the treatment or control group. For an experimental design to be a true RCT, the treatment subjects must actually receive the treatments, i.e., they must be subscribed to the pricing plan offered or adopt the technology offered. The results then support inference about how the population would react if the treatment were administered to all.

A variation on an RCT that avoids the need to mandate a treatment that customers might not be inclined to adopt is to first recruit volunteers, and then assign them randomly to the treatment or control group. Those assigned to the control group can be offered participation at a later date, or not at all. These approaches are referred to as recruit and delay or recruit and deny, respectively.
The results are robust, but inference extends only to would-be volunteers in the larger population. Other means must be employed to establish who would volunteer.

Yet another variation on an RCT is a randomized encouragement design (RED). Customers are randomly assigned to treatment and control groups ahead of any recruitment. Customers in the treatment groups are then recruited, or encouraged, to take the treatment. In the analysis, all encouraged customers, even those that do not accept the treatment, are compared to the control group, and adjustments using the recruitment uptake rate are necessary to account for self-selection bias. The implications of this approach are that (i) a relatively large pool of customers is needed to draw the control group and the groups to be encouraged, (ii) the uptake rate should be relatively high in order to be able to detect effects, and (iii) data are required for all customers assigned to the encouraged or control groups, even those that do not accept the offer. The benefit of this approach is that no delaying or denying is required, and two sets of impacts are calculated: those that are extensible to an entire service territory, as well as those for the would-be volunteers in the territory.

Marblehead, OG&E, and SMUD executed an RCT as defined above, and each used a recruit and delay approach. Green Mountain Power's pilot was designed and analyzed as a RED, and SMUD also used a RED design and analysis approach for a subset of their pilot treatments. FirstEnergy's pilot design was consistent with a RED, but it was not analyzed as such; it employed a model-based treatment estimation mechanism.

[6] For a complete discussion of alternative designs see: Quantifying the Impacts of Time-Based Rates, Enabling Technology, and Other Treatments in Consumer Behavior Studies – Protocols and Guidelines. EPRI, Palo Alto, CA: July 2013. 3002000282.
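The two sets of RED impacts described above can be sketched numerically. This is an illustrative calculation only: the loads and uptake rate are hypothetical, and the adjustment shown (dividing the intent-to-treat estimate by the uptake rate) is one standard way to correct for the fact that not all encouraged customers accept the offer.

```python
# Sketch of the two impact estimates a randomized encouragement design
# (RED) yields. All loads and the uptake rate below are hypothetical.

def red_impacts(mean_control_kw, mean_encouraged_kw, uptake_rate):
    """Return (intent-to-treat, treatment-on-treated) peak load impacts.

    mean_control_kw:    average peak load of the control group
    mean_encouraged_kw: average peak load of ALL encouraged customers,
                        including those who declined the offer
    uptake_rate:        fraction of encouraged customers who accepted
    """
    # Intent-to-treat: extensible to an entire service territory,
    # because every encouraged customer is counted.
    itt = mean_control_kw - mean_encouraged_kw
    # Dividing by the uptake rate recovers the impact for the
    # would-be volunteers (the self-selection adjustment).
    return itt, itt / uptake_rate

itt, tot = red_impacts(mean_control_kw=2.00,
                       mean_encouraged_kw=1.90,
                       uptake_rate=0.25)
print(f"territory-wide: {itt:.2f} kW; accepters only: {tot:.2f} kW")
```

The example also makes implication (ii) in the text concrete: with a low uptake rate, a small and noisy intent-to-treat difference must be scaled up sharply, so effects are hard to detect.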
Finally, SMUD also used a within-subjects approach to evaluate some of their event-based treatments.[7]

Customer Recruitment

All five studies used an opt-in approach to recruitment for at least some of their treatments, i.e., customers were solicited in various ways (mail, e-mail, telephone, etc.) and invited to participate. Customers then had to affirmatively reply to be enrolled in the study (or be assigned to a control group). As discussed, Marblehead, OG&E, and SMUD employed a recruit and delay approach, whereby customers who replied affirmatively were either enrolled in the pilot or the control group in the first year, and customers assigned to the control group were offered the chance to obtain the rate and/or technology at the end of the pilot treatment period. In the opt-out approach, customers are invited to participate in the program and are automatically enrolled unless they decline. SMUD tested both opt-in and opt-out approaches, the results of which are described in more detail below.

Rate Structures

A variety of dynamic rate structures were tested, including time of use (TOU); critical peak pricing (CPP); peak time rebate (PTR, also called critical peak rebate, CPR); and variable peak pricing (VPP). Real-time pricing (RTP, where prices can change hourly) is listed in the table for completeness, but none of these studies tested RTP. These rate structures range from the least dynamic (TOU) to the most dynamic (RTP), and are described in detail in previous EPRI work.[8] Three of the five utilities tested more than one rate structure (OG&E, SMUD, and Green Mountain Power).
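To make the range from least to most dynamic concrete, the rate structures above can be sketched as price schedules. Only the structures come from the report; all prices, the event-day frequency, and the number of VPP tiers are invented for illustration.

```python
# Hypothetical summer-weekday price schedules ($/kWh, invented values)
# illustrating the rate structures tested in the pilots.

TOU = {"off_peak": 0.08, "peak": 0.23}          # same peak price every weekday

CPP = {"off_peak": 0.08, "peak": 0.23,
       "critical_peak": 0.75}                   # very high price on called event days

PTR = {"flat": 0.11,
       "event_rebate": 0.60}                    # credit per kWh reduced during events

VPP = {"off_peak": 0.08,
       "peak_tiers": [0.05, 0.11, 0.23, 0.46]}  # one of several peak prices, set day-ahead

def peak_price(rate, event_day=False, vpp_tier=0):
    """Price applied during the peak window under each structure."""
    if "critical_peak" in rate:
        return rate["critical_peak"] if event_day else rate["peak"]
    if "peak_tiers" in rate:
        return rate["peak_tiers"][vpp_tier]
    return rate.get("peak", rate.get("flat"))

print(peak_price(CPP, event_day=True))   # 0.75
```

The sketch shows why the structures differ in "dynamism": under TOU the peak price is fixed in advance for the whole season, under CPP and PTR it changes only on called events, and under VPP it can take a different tier each day.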
Feedback

EPRI defines feedback as information about a customer's electricity consumption that is provided to them on some form of ongoing basis, and has developed feedback type categories that distinguish different levels of feedback information and the frequency with which it is provided.[9] Four of the utilities tested the impact of Type 5 feedback, which means customers were provided real-time, premise-level energy consumption information. Three of the utilities (Green Mountain Power, OG&E, and SMUD) provided this feedback to their customers via in-home displays (IHDs). In SMUD's case, the pilot design was such that the IHD offer was what was evaluated, versus the impact of the IHD itself. One utility, OG&E, also provided some of their treatment customers access to a web portal with real-time feedback.

[7] Within-subjects refers to comparing treatment group customers to themselves on event versus non-event days. It is only appropriate for event-based impact evaluations, and is considered to be less rigorous than randomized experiments. For a more detailed discussion see EPRI 3002000282.
[8] A System for Understanding Retail Electric Rate Structures. EPRI, Palo Alto, CA: July 2011. 1021962.
[9] Guidelines for Designing Effective Energy Information Feedback Pilots: Research Protocols. EPRI, Palo Alto, CA: 2013. 1020855. Understanding Electric Utility Customers – Summary Report, What We Know and What We Need to Know. EPRI, Palo Alto, CA: November 2012. 1025856, p. 4-1.

Control Technology

Three of the five utilities offered customers control technology. FirstEnergy and OG&E offered programmable communicating thermostats (PCTs) for customers with central HVAC systems. FirstEnergy offered the options of either utility control or customer control. With utility control, FirstEnergy controlled the customer's thermostat during event periods, although the customer could override this control at any time.
With customer control, customers were alerted of events via the thermostat, but it was up to them to make any changes. In the case of OG&E, customers controlled their own thermostats (i.e., no utility control). After their first test summer without control technology, Marblehead offered PCTs to customers with central AC, and load control switches to customers with electric water heaters. However, the impacts of these technologies could not be evaluated because, owing to technical issues, only limited numbers were installed and operational.

Study Results: Load Impacts

Load Impacts for Peak or Event Hours

All the CBSs reported load impacts for peak or event hours (peak hours for time of use, event hours for CPP and PTR, etc.). The load reductions (in percentages) across the studies are shown in Figure 2-1. Several points are immediately apparent. First, the load impacts vary widely, from a low of zero to a high of 37%. Second, it is extremely difficult to detect a systematic explanation for why the response levels vary across studies. One of the most obvious limitations is that load impacts do not account for differences in relative prices. That is why it is so important to estimate price elasticities, which only two of the five studies do (at this point in time).

Despite the limitations associated with comparing load impacts alone as a performance metric, a few observations can be made about the results presented here. First, it is generally (but not always) true that load impacts increase greatly when customers are offered (and accept) control technology. It was also true that in three out of four cases, IHDs offered in conjunction with a dynamic rate did not have any incremental load impact compared to the rate alone.
The exception is in the SMUD pilot, where the offer of an IHD had a relatively small (~3%) but statistically significant impact on load reductions when compared to customers on the TOU rate who were not offered an IHD; however, comparing SMUD CPP customers with and without the IHD offer, the difference was not statistically significant. In the case of the GMP pilot, the treatment that combined an IHD with the CPP rate had a 3% lower load impact than the CPP rate alone, although it is not known whether this 3% difference is statistically significant.

There are also pilot-specific factors that explain some of the variation in results. For example, Marblehead offered PCTs to all customers with central AC and load switches to all customers who had electric water heating. But technology deployment problems were such that few customers actually received and were able to operate either technology. Thus, no control technology results are reported in Figure 2-1.

Figure 2-1
Peak or Event Hour Load Impacts for CBS Studies (Opt-in Treatments only)

NOTE 1: Results reported here are for the first test summer (summer 2011). The following year, Marblehead offered PCTs to all customers having central AC and load switches to all customers who had electric water heating. However, technology deployment problems were such that few customers actually received and were able to operate the technology.
NOTE 2: The results for GMP's PTR treatments were not statistically different from zero for three of the four events called during the first year of the pilot.
NOTE 3: All 'with PCT' results listed refer to customer-controlled PCTs during events (versus utility-controlled), unless noted otherwise.
NOTE 4: IHD refers to customers who accepted an IHD, except for SMUD, which tested the effect of the IHD offer only.
Load Impacts beyond Peak/Event Hours: Energy Savings

While all the CBS studies estimated load impacts during peak or event hours, some also attempted to estimate energy savings over longer time periods. FirstEnergy and Marblehead estimated average daily energy impacts for event days only, i.e., the change in usage across all 24 hours of the event day, by treatment. FirstEnergy found that average event day usage reductions ranged from not statistically significant to 3.0%.[10] Marblehead found that event-day energy consumption was reduced 12.1% in the first year, but the reductions were much smaller (3.9%) in year 2. (Milder weather was thought to be a contributing factor.) SMUD estimated monthly energy savings (sometimes referred to as a conservation effect), and found that differences were not statistically different from zero for six out of eight treatments. Only the opt-in CPP with an IHD offer and the default TOU with an IHD offer showed savings that were statistically significant, at 4% and 2%, respectively.[11] Green Mountain Power estimated the impact of IHDs on monthly energy consumption averaged across its CPP and PTR customers. They found that customers with IHDs had a 4% reduction in monthly kWh consumption relative to customers without IHDs. (See Appendix E for more details.)

Study Results: Elasticities of Substitution

Figure 2-2 shows the elasticities of substitution[12] for the two CBS pilots that reported them, OG&E and FirstEnergy. (SMUD did not estimate price elasticities for the interim report, but plans to do so for the final report.)

[10] For PTR customers with utility-controlled 4-hour events and utility-controlled 6-hour events, respectively. PTR customers with either the IHD or a customer-controlled PCT had overall event day reductions in the 2% range. See Appendix B for more details.
[11] Stephen S.
George, Michael Perry, Elizabeth Hartmann, Christine Hartmann, SMUD Smart Pricing Option Pilot, Interim Load Impact Evaluation, Prepared for the Sacramento Municipal Utility District, Freeman, Sullivan & Co., San Francisco, CA, September 19, 2013, p. 65.
[12] Elasticity of substitution (EoS) measures the substitution of energy use between high- and low-priced periods (e.g., peak and off-peak) attributable to the pricing structure. "It is defined as the percentage change in the ratio of electricity usage between time periods that is due to a one percent change in the inverse ratio of those periods' electricity prices, all other factors held constant. For example, a 0.1 EoS means that a 1% change in the ratio of peak to off-peak prices would lead to a 0.1% change in the ratio of off-peak to peak consumption. The convention is to report these elasticities as positive numbers." Understanding Electric Utility Customers – Summary Report: What We Know and What We Need to Know. EPRI, Palo Alto, CA: 2012. 1025856, p. 3-14.
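The EoS definition above can be turned into a small worked calculation. The sketch below uses a constant-elasticity form, which is one common way such elasticities are applied; the 5:1 peak/off-peak price ratio and the base usage ratio are hypothetical, while the EoS values of roughly 0.10 and 0.25 mirror the feedback-only and with-PCT ranges reported in the studies.

```python
# Illustrative elasticity-of-substitution (EoS) arithmetic using a
# constant-elasticity form; price ratio and base usage are hypothetical.

def peak_offpeak_usage_ratio(eos, price_ratio, base_usage_ratio=1.0):
    """Peak/off-peak usage ratio implied by an EoS and a peak/off-peak
    price ratio. A higher peak price (price_ratio > 1) shifts usage
    off-peak, so the usage ratio falls as price_ratio rises."""
    return base_usage_ratio * price_ratio ** (-eos)

# EoS ~0.10 (rate plus feedback) vs ~0.25 (rate plus PCT):
for eos in (0.10, 0.25):
    ratio = peak_offpeak_usage_ratio(eos, price_ratio=5.0)
    print(f"EoS {eos:.2f}: usage ratio falls to {ratio:.3f} of its base")
```

With a 5:1 price ratio, an EoS of 0.10 shifts the usage ratio to about 85% of its base, while 0.25 shifts it to about 67%, which is why elasticities let load shifting be compared across pilots with different prices.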
[Figure 2-2: bar chart of elasticities of substitution for Oklahoma Gas & Electric (OG&E) and FirstEnergy Illuminating Company. OG&E: CPP with real-time web portal 0.11; CPP with PCT 0.24; CPP with IHD 0.10; CPP with portal+PCT+IHD 0.24; TOU with real-time web portal 0.11; TOU with PCT 0.24; TOU with IHD 0.10; TOU with portal+PCT+IHD 0.24; VPP with real-time web portal 0.07; VPP with PCT 0.25; VPP with IHD 0.07; VPP with portal+PCT+IHD 0.25. FirstEnergy: PTR with PCT 0.09; PTR with PCT + utility control (4 hrs) 0.30; PTR with PCT + utility control (6 hrs) 0.28; PTR with IHD 0.09. No values are shown for the rate-alone categories.]

Figure 2-2
Elasticities of Substitution for OG&E and FirstEnergy

The results for OG&E (previously reported) and FirstEnergy are consistent with pilots reviewed in previous EPRI reports:
• OG&E customers on CPP or TOU with PCTs had price elasticities about 2 times larger than customers on the same rates with feedback alone (either the real-time portal or the IHD).
• OG&E customers on VPP with PCTs exhibited price response 3.6 times higher than those with feedback alone (either the real-time portal or the IHD).

FirstEnergy (where customers were on a PTR rate) tested the effect of a PCT and an IHD, as well as the impact of utility control vs. customer control of that technology.
• Impact of Utility Control vs. Customer Control. Customers with a PCT controlled by the utility, in all cases with customer over-ride capability, had a price response that was three times larger than that of customers with either a PCT or an IHD but without utility control (i.e., where the customer had to take action to reduce load). These results suggest that utility control of PCTs during events can lead to higher impacts than customer control, although it is interesting to note that OG&E's customer-controlled PCTs yielded results similar to FirstEnergy's utility-controlled PCTs.
OG&E did not test utility control of the PCTs during that phase of its pilot, although it would be interesting to determine whether that increases impacts even further.
• Impact of 4-hour vs. 6-hour Event Length. FirstEnergy also found that there was virtually no difference between the average hourly price response for events that were four hours in duration and events that were six hours in duration, when the utility controlled the PCT.

3 NOTABLE FINDINGS

We noted in a previous report that “. . . very little progress will be made if more utilities simply duplicate the field trials that have already been completed, especially those that simply measure and report load impacts. Instead, more focused research on particular issues is needed to resolve major uncertainties and narrow the research gaps.”13 One of the stated goals of the DOE SGIG Consumer Behavior Studies was to advance our collective understanding of how customers respond to electricity prices and how new control and information technology might enable customers to obtain greater value from their energy services.14 Two notable areas of analysis have emerged from the reviewed consumer behavior studies:
1. The impact of utility versus customer control on the level of snapback after events.
2. The impact of “opt-in” vs. “opt-out” recruitment.

Estimates of Snapback Under Utility Control vs. Customer Control

Snapback is the change in load immediately following a peak period event. If the snapback is large enough, it could create a new peak at a different time of day. Snapback has been noted in earlier studies of the impact of time-based rates, but few recent studies have focused on it. FirstEnergy tested the effects of load reductions during PTR events for four treatment groups (described in more detail in Appendix B). Table 3-1 summarizes the four treatment groups, which vary event duration, customer vs. utility control, and type of technology (PCT vs. IHD).
Table 3-1
FirstEnergy Treatment Groups

Treatment:       B-1        B-2       C-2       B-3
Event duration:  4-hour     4-hour    6-hour    4-hour
Who controls?    Customer   Utility   Utility   Customer
Technology:      PCT        PCT       PCT       IHD

13 EPRI 1025856, p. 2-1.
14 For further information, see: https://www.smartgrid.gov/document/us_department_energy%E2%80%99s_approach_conducting_consumer_behavior_studies_within_smart_grid_inve

Table 3-2 shows the changes in load (kW) per hour on event days for the treatment groups compared to the control group. Figure 3-1 displays the same information graphically.

Table 3-2
FirstEnergy Changes in Event Day Hourly Loads, Control Group vs. Treatment Groups15

        Control Group versus:
Hour    PCT Customer-  PCT Utility-  PCT Utility-  IHD-4hr
        4hr (B1)       4hr (B2)      6hr (C2)      (B3)
1         0.1            0.026         0.05          0.026
2         0.031          0.02          0.049        -0.008
3         0.021          0.029         0.039        -0.008
4         0.025          0.032         0.034        -0.004
5         0.011          0.012        -0.002        -0.011
6         0.028         -0.002         0.026        -0.038
7         0.016         -0.002         0.031         0.023
8        -0.053          0.003         0.036         0.032
9        -0.076         -0.004        -0.034        -0.059
10       -0.084          0.031         0.006         0.011
11       -0.126         -0.018        -0.005         0.022
12       -0.133         -0.011         0.086         0.047
13       -0.115          0.025         0.087         0.046
14       -0.11           0.099        -1.077        -0.034
15       -0.173         -1.034        -1.066        -0.164
16       -0.195         -0.995        -0.898        -0.163
17       -0.263         -0.793        -0.687        -0.216
18       -0.261         -0.556        -0.456        -0.194
19       -0.043          0.687        -0.43          0.014
20        0.046          0.636         0.927        -0.002
21        0.104          0.481         0.836        -0.005
22        0.072          0.283         0.46          0.068
23        0.023          0.157         0.171         0.109
24        0.068          0.145         0.101         0.016

Total Event LI        -0.89    -3.38    -4.61    -0.74
Avg Hourly LI         -0.22    -0.84    -0.77    -0.18
Total Snapback         0.27     2.39     2.50     0.20
Avg Hourly Snapback    0.05     0.40     0.50     0.03

15 D. Hansen, M. Hillbrink (Christensen Associates Energy Consulting) and R. Boisvert (Cornell University), FirstEnergy’s Consumer Behavior Study: Preliminary Evaluation for the Summer of 2012. Prepared by EPRI for FirstEnergy Corp. EPRI, Palo Alto, CA: August 2013. 3002001870, p. 5-3.
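The summary rows of Table 3-2 follow directly from the hourly values. A minimal sketch for the utility-controlled 4-hour PCT group (B2), assuming that hours 15 through 18 are the event window and hours 19 through 24 the post-event window (an assumption inferred from the pattern of the reported values, not stated explicitly in the text):

```python
# Sketch: reproduce Table 3-2's summary rows for the utility-controlled
# 4-hour PCT group (B2). Event/post-event windows are assumed, inferred
# from the pattern of the reported hourly values.

b2_hourly = [0.026, 0.020, 0.029, 0.032, 0.012, -0.002, -0.002, 0.003,
             -0.004, 0.031, -0.018, -0.011, 0.025, 0.099,
             -1.034, -0.995, -0.793, -0.556,            # event hours 15-18
             0.687, 0.636, 0.481, 0.283, 0.157, 0.145]  # post-event 19-24

event = b2_hourly[14:18]        # hours 15-18 (0-indexed slice)
post_event = b2_hourly[18:24]   # hours 19-24

total_event_li = round(sum(event), 2)
total_snapback = round(sum(post_event), 2)
snapback_share = round(total_snapback / -total_event_li, 2)

print(total_event_li)   # -3.38, matching the "Total Event LI" row
print(total_snapback)   # 2.39, matching the "Total Snapback" row
print(snapback_share)   # 0.71, the ~71% rebound cited in the text
```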
[Figure 3-1: FirstEnergy event-day hourly load impacts (kWh), control group vs. each treatment group (PCT Customer-4hr (B1), PCT Utility-4hr (B2), PCT Utility-6hr (C2), IHD-4hr (B3)), plotted by hour of day.]

Figure 3-1
FirstEnergy - Event Hour Load Impacts, Control Group vs. Treatment Group

These results show that the utility-controlled PCT treatment groups (both the 4-hour and the 6-hour) had the largest load reductions, but also the largest snapback. The total event-hour load impact and the average hourly load impact are shown at the bottom of Table 3-2; total snapback and average hourly snapback are shown in the last two rows. For the two utility-controlled groups:

“By midnight, the total post-event usage increase was estimated to be 2.4 kWh (over a typical day’s load) for the customers with the 4-hour event window, which represents 71 percent of the total reduction in event-period usage. For customers with the 6-hour event window, the rebound is 2.5 kWh, 54 percent of the total reduction in event-period usage.”16

The extent to which snapback is a problem will depend on several factors, including the relative costs of energy during peak and off-peak hours. But it is clearly a factor that should be considered in determining when events should be called and for how long, and in weighing the relative advantages of utility vs. customer control.

Opt-in vs. Opt-out Recruitment

Most of the field trials conducted in the past decade have employed an opt-in approach to customer recruitment, whereby customers are invited to participate in a new program or rate offering and then must affirmatively choose to do so. Only two pilots have employed an opt-out approach, whereby customers are informed that they have been enrolled in a program but offered the opportunity to opt out if they do not wish to participate.

16 Ibid., p. 5-2 (emphasis added).
In an opt-in program, utilities must persuade customers to switch from their current service to the new energy service plan (ESP). Inertia is a factor that must be overcome to recruit customers, because the path of least resistance is for customers to simply stay where they are: they must be persuaded to make the effort to consider the alternative and enroll in the program. In opt-out recruitment, inertia instead promotes higher participation, because the path of least resistance is for customers to allow themselves to be enrolled in the new ESP. Many consumer advocates and regulators prefer opt-in recruitment because they believe customers should not be defaulted into a program they have not decided for themselves that they want. Thus, it is no surprise that among the field trials conducted over the past decade and reviewed by EPRI, only two have tested an opt-out approach to customer recruitment: the Commonwealth Edison Customer Applications Program (ComEd CAP17), and now the SMUD pilot reviewed here. The fact that inertia has a powerful effect on customer participation rates is demonstrated in Figure 3-2, which shows the participation rates for those studies that reported them. Participation rates are defined as the number of customers enrolled in the program as a percentage of customers recruited. For the two opt-out pilots, the participation rates are nearly 100%. However, participation is only one of three factors that determine overall program impacts. Equally important are performance (how customers respond once they are on the rate or receive feedback or a control technology) and persistence (whether customers remain on the rate and continue to respond over time). Figure 3-3 shows the peak period load reductions for the opt-in vs. the opt-out (default) TOU and CPP treatments in SMUD's study.
The results show that the average peak period load reductions for the opt-out treatments are about half those for the opt-in treatment groups.

[Figure 3-2: participation rates for pilots with opt-in vs. opt-out enrollment. The opt-in pilots reported participation rates in roughly the 3% to 20% range; the two opt-out pilots reported rates of 96% and 98%.]

Figure 3-2
Participation Rates for Pilots with Opt-in vs. Opt-out Enrollment18

[Figure 3-3: SMUD pilot average peak load impacts (percent of reference load) for the opt-in vs. opt-out (default) TOU and CPP treatments; reported values range from about 6% to 26%.]

Figure 3-3
SMUD Pilot: Average Peak Load Impacts for Opt-in vs. Opt-out (Default) Treatments (percent of reference load)19

By combining the effects of the participation rate and the response rate, the total load impacts ultimately achieved can be compared. Here is how the SMUD study authors summarize their findings:

While the average impact of default customers is lower . . ., the acceptance rate for TOU is much higher among default customers than opt-in customers. The acceptance rate for the opt-in treatment for TOU with an IHD offer was 17.5% whereas the initial dropout rate (prior to going on the rate) for default TOU with an IHD offer was only 3%.

17 Reviewed in EPRI 1023644 and 1024865 (ComEd CAP) and in EPRI 1023562 and 1025856.
18 The participation rates are for pilots EPRI has reviewed that reported participation rates. The first five pilots were summarized in EPRI 1024867. The last three are summarized in the appendices to this report.
19 Stephen S. George, et al., op. cit., pp. 61 and 67.
Thus, if 100,000 customers who met the sample selection criteria had been offered TOU on an opt-in basis during the pilot period compared to defaulting 100,000 customers onto the rate and allowing them to drop out, the aggregate peak-period load reduction would have equaled roughly 4.2 MW (0.24 kW x 100,000 x 0.175) ± 0.5 MW for the opt-in program, and nearly three times as much for the default program, at 11.4 MW (0.12 kW x 100,000 x 0.97) ± 2.2 MW.20

This summary refers to the level of impacts under the two designs. It does not resolve which is more cost-effective – that requires comparing the cost savings associated with the impacts with the costs to implement the program. When the final analysis of the SMUD pilot is completed, it could help improve our overall understanding of which recruitment approaches are likely to be most cost-effective overall – including the costs of recruitment, billing, customer service, infrastructure, and customer satisfaction, as well as the net benefits of the program in terms of aggregate load impacts.

20 Stephen S. George, et al., op. cit., p. 62.

4 SUMMARY

The results of five Consumer Behavior Studies funded by the DOE Smart Grid Investment Grant program were reviewed in this report. The pilots were selected based on the existence of either interim or final reports. As outlined in previous EPRI reports, in exploring what we know about how customers respond to alternative electric service plans, three stages of customer decision-making need to be considered in order to assess, with reasonable accuracy, the likely aggregate response to any alternative rate structure and/or enabling technology: participation, performance, and persistence.

Participation: Who participates in optional pricing and/or enabling technology programs, and why? Who and how many customers will sign up for an optional rate, if it is offered?
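The participation-times-performance arithmetic from the SMUD comparison quoted in the previous section can be sketched directly; the figures are those given in the quotation (the uncertainty bands are not reproduced here):

```python
# Sketch of the SMUD aggregate-impact arithmetic: aggregate MW equals
# per-customer kW impact x eligible customers x acceptance (or
# retention) rate. Figures are those given in the SMUD quotation.

def aggregate_mw(kw_per_customer: float, customers: int, rate: float) -> float:
    """Aggregate peak-period load reduction in MW."""
    return kw_per_customer * customers * rate / 1000.0

opt_in = aggregate_mw(0.24, 100_000, 0.175)   # 17.5% opt-in acceptance
default = aggregate_mw(0.12, 100_000, 0.97)   # 97% remain after default

print(round(opt_in, 1))   # 4.2 MW, as quoted
print(round(default, 1))  # 11.6 MW (the report rounds this to 11.4 MW)
```

The sketch makes the tradeoff explicit: default (opt-out) customers respond only half as strongly per customer, but the far higher retention rate more than offsets the weaker response in aggregate.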
We still know relatively little about how to identify likely candidates by observable characteristics (demographics, premise characteristics), which would help identify target markets, or by attitudes and beliefs, which would help in the design of marketing materials. All CBSs collected demographic and premise information about participants, and we may see relevant analyses of the drivers of participation in future reports. For the most part, however, the five studies reviewed here did not offer insights into these drivers, with some exceptions. OG&E found that, in general, participation rates correlated positively with household income, with lower-income customers proving more difficult to recruit.21 They also considered average household age, and found less variance in participation rates with this attribute than with household income. In SMUD's case, customers on SMUD's Energy Assistance Program Rate (EAPR), a tariff for low-income customers, enrolled at a proportionately higher rate for all opt-in offerings (with and without an IHD offer) than did non-EAPR customers.22 A systematic comparison across studies would be needed to determine whether and how the SMUD and OG&E findings are comparable. SMUD also tested whether offering a free IHD would increase customer acceptance of a dynamic rate, and found that it did not, at least not in any material way: there was no statistically significant impact of the IHD offer on CPP uptake, and only a small statistically significant impact of about one percentage point on TOU uptake.23 Finally, the SMUD report also offers insight into whether customers should be recruited on an opt-in basis (the most prevalent approach today) or on an opt-out basis (which only two utilities, Commonwealth Edison and SMUD, have tested).
The SMUD interim results suggest that opt-out enrollment could yield greater overall load impacts than an opt-in approach: the default treatments achieved much higher participation rates, which could more than offset the lower average response to prices compared to customers who opted in.

21 Mike Farrell, Angela Nichols, and Katie Chiccarelli, Chapter 1 -- Overview: OG&E Consumer Behavior Study Evaluation Report, Oklahoma Gas & Electric, August 2012.
22 Ibid., p. 53.
23 Ibid., pp. 51-52.

Performance: How much do customers respond to these programs?

The results of the five studies reviewed here are difficult to compare because only two estimated price elasticities (which measure response to relative prices and therefore produce results that can be compared more directly than simple load impacts alone). However, the two studies that estimated price response (OG&E and FirstEnergy) found elasticities of substitution in the same range as previous studies: roughly 0.10 for TOU, CPP, and PTR with feedback but without control technology; about twice those levels in most cases when control technology such as a PCT is used; and over three times those levels in the OG&E trial with the VPP rate and in the FirstEnergy trial when the utility, rather than the customer, controlled the PCT. However, FirstEnergy also found that the snapback in the utility-controlled case was proportionately higher as well.

Persistence: Will customers continue to participate and respond over time?

We still know the least about persistence, largely because the field trials, including the five CBS studies reviewed here, have been of limited duration. One exception is Marblehead, which reported impacts for two summers and found that the second summer's CPP impacts were substantially lower; mild weather and a change in evaluation methodology in the second summer are thought to be the main contributing factors.
Other CBSs will include multi-year impacts in their final reports, which will be helpful because fundamental questions about persistence remain: Will customers who elect to participate in a dynamic pricing program (with or without enabling technology) stay in the program over time? If so, will their response to price signals and information persist, improve, or worsen? What types of information and feedback could be provided to improve customer response and satisfaction? Some utilities are moving from field trials to full-scale programs, especially as advanced metering infrastructure, smart meters, and other supporting systems are implemented. Full-scale programs create different evaluation challenges (most notably, establishing an appropriate control group for measuring impacts), but offer important advantages, such as the ability to learn how to create programs that provide customer benefits and improve customer satisfaction over time.

A CBS SUMMARY FOR OKLAHOMA GAS & ELECTRIC

Name of Study: OG&E Smart Study TOGETHER Impact Results, Final Report – Summer 201124
Reference: C. Williamson, Project Manager, Global Energy Partners, Walnut Creek, CA, February 29, 2012.
Location: Phase I: Norman, Oklahoma. Phase II: Norman, Oklahoma and the area surrounding Oklahoma City.
Customer Class: Residential and commercial.
Time/Duration: Summer peak period only. Phase I: four months (June 1 through Sept. 30, 2010). Phase II: four months (June 1 through Sept. 30, 2011).
Pilot Design: Randomized control trial.
Sample Sizes (pp. 3-3 to 3-4): Includes control and 8 treatment groups. Phase I residential: 2,630 customers. Phase I commercial: insufficient recruitment for analysis. Phase II residential: 2,412. Phase II commercial: 712. Regression analysis: 4,423.
Description of Treatments and Recruitment Methods: Eight treatments covering two rates and four technology combinations. Treatments (GEP Report, p.
8):
• Two dynamic rates:
  o TOU-CP (time of use + critical price overlay)
  o VPP-CP (day-ahead variable peak pricing + CP overlay)
• Four technology combinations:
  o Web portal (all customers received)
  o In-home display (IHD)
  o Programmable communicating thermostat (PCT)
  o Combination of the above three
Feedback Treatments: Two feedback treatments were provided in conjunction with the two rate treatments:
• The real-time web portal (Type 5) was “. . . an energy information website providing customers with 15 minute interval data updated every 15 minutes, neighborhood comparisons, bill estimates, environmental impacts, as well as tips and tools to manage energy consumption.” (p. 1-3)
• The in-home display (Type 5) displayed near real-time, whole-premise loads, plus estimated monthly cost and current price. (p. 1-3)
Note: because the feedback was provided along with a rate in all treatment groups, the study design cannot detect effects of the feedback alone.
Control Technology Treatments: A programmable controllable thermostat (PCT) was offered to customers with central AC. A PCT is “. . . a customer controlled device with current pricing information which allows automation of comfort settings based on current energy prices.” (p. 1-3)

24 C. Williamson. OG&E Smart Study Together Impact Results, Final Report – Summer 2011. Global Energy Partners, Walnut Creek, CA. February 29, 2012. (GEP Report.) This report covers both Phase I and Phase II. The Phase I report is: The Cadmus Group, Inc. and OG&E Corp. 2010 Positive Energy Together® Dynamic Pricing Impacts. Oklahoma Gas & Electric. Portland, OR and Oklahoma City, OK. October 3, 2011, which was summarized in Understanding Electric Utility Customers - 2012 Update, Review of Recent Studies, EPRI, Palo Alto, CA: 2012. 1024402.
“Customer controlled” means OG&E did not control the thermostats during events; it was left to the customers to program their thermostats to coincide with event times. The pricing signals were communicated via the thermostats, and customers could access them online for real-time control as well.
Note: because the PCTs were provided in conjunction with the two rate treatments, the study design cannot detect effects of the PCTs alone.
Other Incentives: In Phase I, technologies were provided free of charge with first-year bill protection. (It is assumed the same applied for Phase II.)
Installation Method: In Phase I, after OG&E randomized the technology assignments, a contractor handled the recruiting and installation.25 (It is assumed the same applied for Phase II.)
Recruitment Approach (p. 2-1):
• Opt-in recruitment. See EPRI 1025480 for a further discussion of the recruitment channels used.26
• Based on their random assignment, customers who volunteered were assigned to either TOU-CP or VPP-CP. Customers in the control group were left on their existing standard rates.
• After selecting a rate option, customers were randomly assigned to one of the information delivery options. Customers without central A/C were not eligible for the PCT or the combination of all three technologies, and customers without internet access could not use the web portal; in these cases, customers were randomly assigned to one of the technology options they could utilize.
Dynamic Rates (pp. 1-2, 1-3): Residential and commercial TOU-CP prices for 2011 (¢/kWh):

Price Period                          Residential  Commercial  Days in summer 2011
                                      TOU-CP       TOU-CP      at each price level
Off-peak                              4.2          4.7         36
On-peak (2:00-7:00 p.m. weekdays,     23.0         30.0        86
  June 1 - Sept 30, excluding
  weekends and holidays)
Critical events (can be exercised     46.0         60.0        7*
  on 2 hours' notice by OG&E)

* Note: the 7 critical event days are also included in the 86 days having on-peak prices.

25 C.
Williamson, OG&E Smart Study TOGETHER Impact Results - Interim Report – Summer 2010, Report No. 1299-01, Global Energy Partners: March 28, 2011, p. 3-2. See this report for a description of some of the problems encountered during installation and tracking, and how they affected assignment to treatment groups.
26 Customer Participation in Behavioral Programs: A Review of Recruitment Experiences. EPRI, Palo Alto, CA: 2012. 1025480.

Day-Ahead Variable Peak Pricing - General Description (p. 1-2): The VPP-CP was designed using the existing residential TOU rate structure, with the peak-period TOU price replaced by one of the four variable prices listed below. The price signal is sent to participating customers by 5:00 p.m. the day before, with a single price applying to the entire five-hour window each weekday. As shown in the table below, “. . . there are four defined price levels (low, standard, high and critical) to simplify communications of price level. The prices assigned to each price level are based on the underlying Standard and TOU tariffs.” Low prices, at 4.5 ¢/kWh, are similar to off-peak energy prices; standard prices equate to the standard tariff summer-season tail-block price; and high and critical prices reflect the peak-period energy prices. (p. 1-2)

Residential and Commercial VPP-CP Prices (¢/kWh):

Price Period      Residential  Commercial  Days in summer 2011
                                           at each price level
Low and off-peak  4.5          5.0         63
Standard          11.3         10.0        25
High              23.0         30.0        28
Critical          46.0         60.0        6
Critical events   46.0         60.0        7*

* Note: the 7 critical events fell on different day types; the critical event days are thus also included in the number of days at the various price levels.

Analysis Methods and Results

Residential Customers, Phase I and II:
• Analysis Methods (pp. 2-2 to 2-3; p.
4-1) For Phase I, customers are analyzed by calculating the difference between participant and control group loads. “For phase I participants, the savings for each day type were estimated as follows. We first calculated the average hourly load for each day type for each customer. . . . Then for each rate-technology combination and for the control group, we calculated an average day type load shape for each demographic combination segment (age and income), and then weighted those segments based on the OG&E service territory proportions to each segment. This adjusted the relative representation of the different demographic segments to correspond to OG&E’s service territory instead of to the roughly [equal] sample sizes from the sample design. We also calculated the associated variances and 90% confidence intervals for these estimates.” (p. 4-1)
For Phase II, “. . . customers are analyzed by adjusting the difference between participant and control group loads by the pre-participation differences of the two groups, using a difference of differences method.” (p. 2-2) More specifically, “. . . this ‘first difference’ was subtracted from the control group load for each day type and rate-technology segment combination during the summer of 2011 to create an adjusted control group load, with the segments then combined as described above (for Phase I). This calculation results in a control group load adjusted for the pre-existing differences between the participant and control groups. Because this adjustment differs across the rate/technology groups, the adjusted control group loads also differ. The last step of the process for Phase II is to calculate the savings as the difference between this adjusted control group and the participant group for each day type and rate-technology combination.”
• Commercial Customers, Phase II only: Commercial customers are analyzed as a separate group in Phase II only (p. 2-3).
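The Phase II adjustment described above is a difference-of-differences. A minimal numeric sketch with hypothetical loads (invented values, not OG&E data):

```python
# Hypothetical illustration of the Phase II difference-of-differences
# adjustment described above. All kW values are invented, not OG&E data.

# Average loads (kW) for one day type and one rate-technology segment.
pre_treatment, pre_control = 2.10, 2.00    # pre-participation period
post_treatment, post_control = 1.70, 1.95  # summer of 2011

# "First difference": the pre-existing gap between the two groups.
first_difference = pre_control - pre_treatment   # -0.10 kW

# Subtracting it from the 2011 control load yields a control load
# adjusted for the pre-existing difference.
adjusted_control = post_control - first_difference   # 2.05 kW

# Savings: adjusted control load minus participant load. Here the
# treatment group pre-used 0.10 kW more, so the naive difference
# (0.25 kW) would understate savings; the adjusted estimate is 0.35 kW.
savings = adjusted_control - post_treatment
print(round(savings, 2))
```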
There were insufficient data to analyze commercial customer response in Phase I. “The commercial group for Phase II was treated the same way as the residential group, except that there was no segmentation for the commercial customers, so the step of combining segments was not needed.” (p. 4-1)
Regression Analysis (Phase I and II), Residential Customers Only:
• For Phase I and II, all data are combined and a statistical regression model is used “. . . which quantifies the variability from all other known sources (appliances, building size, etc.) and removes that from the estimate of load impact. This allows estimating the load impact for different temperatures independent of price.” (p. 2-1)
• “For this analysis, the on-peak energy was calculated for each customer for each non-holiday weekday by summing the energy for each interval during the on-peak period. Both summers were included, and event days were excluded, since on-peak energy use would be influenced by the events on those days. This on-peak energy was the dependent variable in the regression model. The model looks at on-peak energy as a function of numerous other variables, and estimates the coefficients of the variables in that function.
• The variables included information from the survey, from weather databases, and information related to participation. Conceptually, information not related to participation went into the model to estimate the baseline on-peak energy use based on all customers (Phase I and Phase II participants and control groups), and the participation (in some cases interacted with weather data) estimate the program impacts. The number of Cooling Degree Days (CDD) for each day was used in the model, both with and without participation, so that the relationship between energy use and temperature as well as savings and temperature is quantified by the model.
• Once the model was estimated, the savings for each price level and technology were estimated using the model coefficients based on several daily average temperatures.” (pp. 4-1 to 4-2)

Sample Sizes Used, Phase II Residential (p. 3-3):

Group              Control  TOU-CP  VPP-CP  Total
Control            511      -       -       511
Web Portal Only    -        273     298     571
IHD, Portal        -        223     232     455
PCT, Portal        -        221     227     448
All 3 Treatments   -        212     215     427
Total              511      929     972     2,412

Sample Sizes Used, Phase II Commercial (p. 3-3):

Treatment Group    Control  TOU-CP  VPP-CP  Total
Control            239      -       -       239
Web Portal Only    -        98      101     199
IHD, Portal        -        48      46      94
PCT, Portal        -        46      36      82
All 3 Treatments   -        51      47      98
Total              239      243     230     712

Event Days Called, Phase II (2011), p. 3-4:

Date    Start Time  End Time  Notice     Original VPP  Daily  Daily  70° Base
        (pm)        (pm)      (hours)    Price Level   Low    High   CDD
8-Jul   1:00        7:00      2          High          79     99     19
15-Jul  1:00        9:00      3          High          76     103    19.5
8-Aug   4:00        6:00      2          Critical      79     107    23
24-Aug  4:00        6:00      Day ahead  High          78     107    22.5
1-Sep   3:00        7:00      2          High          73     102    17.5
13-Sep  1:00        5:00      Day ahead  Standard      73     101    17
27-Sep  4:00        6:00      Day ahead  Low           56     86     1

Load Impacts Measured

Residential Customer Analysis, Energy Savings:
• For Phase I and Phase II, for weekend non-event days, weekday event days, and weekday non-event days, the authors report for each rate and technology combination (and for different price levels for the VPP rate):
  o The baseline average hourly consumption during peak and off-peak periods
  o The change in peak and off-peak consumption (average impacts, kW and %)
• Using the regression analysis described above, the authors also estimate the change in average on-peak consumption for each rate and technology combination as a function of average daily temperature. (pp. 4-5, 4-6)
Commercial Customer Analysis, Energy Savings: The same analysis as outlined above was performed for commercial customers.
Residential Customer, Demand Savings: On-peak demand savings were estimated, by temperature level, using the regression analysis outlined above.
Summary of Results in the Final Report: There was no overall analysis of results; no overarching model of behavior (such as a demand model) was estimated to summarize what was learned. Instead, the report summarizes data for individual events in 20 tables reporting load changes by rate-technology combination, each table containing as many as 172 numbers. Average peak load impacts were estimated using a model of load response by temperature level, with the results plotted by day type in 80 different figures. It is difficult to assess the overall impacts on OG&E in the summers of 2010 and 2011, and extrapolating the results to other utilities' situations would be impossible.
Electricity Price Impacts: No price elasticities were estimated.
Price of Substitutes: The effect of the price of substitutes was not considered in this study.
Customer Information: Age and income data were used in Phase I to weight the load shapes to reflect the OG&E customer population as a whole.
Premise Information: In Phase I, customers were designated as either having or not having central A/C.
Exogenous Factors: Weather. There was no comprehensive model attempting to estimate the relative impact of weather. However, the report contains dozens of cross-tabulations showing load impacts for various combinations of temperature and humidity.
Customer Satisfaction: OG&E conducted extensive customer research in both the design and the assessment stages of the pilot. They reported a high level of customer satisfaction overall, attributing the success in part to the customer research and feedback.27
Self-Reports of Behavior Changes: Not in this report.
Persistence: Persistence of response effects over time was not specifically assessed.
27 Mike Farrell, Angela Nichols, Katie Chiccarelli. Smart Study TOGETHER Research Summary, Oklahoma Gas & Electric, August 2012. Available at: https://smartgrid.gov/sites/default/files/doc/files/Chapter_2_Research_Summary.pdf

B CBS SUMMARY FOR FIRSTENERGY

Name of Study: FirstEnergy's Consumer Behavior Study: Preliminary Evaluation for the Summer 201228

Reference: D. Hansen, M. Hilbrink and R. Boisvert, EPRI 3002001870, August 2013.29

Location (p. 2-1): Participants were selected from a specific geographic area of FirstEnergy's Illuminating Company, a 34-circuit area east of Cleveland, Ohio. This area is the site of several other DOE Smart Grid Modernization Initiative projects, which accommodated the installation of Advanced Metering Infrastructure (AMI) to support the metering and communication system needed to implement the CBS design.

Customer Segment: Residential customers.

Time/Duration:
• Recruitment: fall of 2011 and winter of 2012.
• Events: June 1 to August 31, 2012.

Pilot Design (pp. 2-3 to 2-4): A randomized control trial (RCT) was used.

Control Customers
• An initial population of 15,000 customers was sent a pre-qualifying questionnaire.
• 6,688 customers responded (42%), of which 26 were excluded due to service types that would not support smart metering.
• The remaining 5,489 were offered the installation of a smart meter.
• These customers were then divided into two groups, one with central air conditioning and one without. A control group was drawn from each group (250 for the PCT treatment group and 200 for the IHD treatment group).

Treatment Customers
• For each treatment, subjects were selected randomly as candidates and offered the treatment. Those that accepted the offer were enrolled in the pilot. Those that did not were removed from consideration in any other treatment.

Sample Sizes
• Control group – 250
• Treatment groups – 726 across 5 treatment groups (see details below)
• Total sample size: 976

Description of Base Rates and Treatments

Base Rates: All treatment and control customers were served on the utility's base residential (non-time-differentiated) rate.

Pricing Treatments: Two Peak Time Rebate (PTR) treatments were tested:
• PTR of $0.40/kWh for a 4-hour period
• PTR of $0.40/kWh for a 6-hour period
• PTR is a limited-duration call option that overlays a customer's basic rate for electricity service. Under this service, the utility pays a credit for every kWh below the customer baseline load (CBL). If the customer uses more than the CBL, there is no charge or penalty.
• Calculation of the CBL: "The PTR payment to treatment customers was calculated by comparing the customer's usage during the event period to its average usage from the five prior non-event, non-holiday weekdays (called the baseline usage). In addition an adjustment to the baseline was made if the customer used more electricity prior to the event. This adjustment was to discourage customers from pre-cooling so that not only would demand reduction be achieved, but customers would be encouraged to reduce their overall event-day usage as well." (p. 2-2)
• Total number of events that were called: 15

28 D. Hansen, M. Hillbrink (Christensen Associates Energy Consulting) and R. Boisvert (Cornell University), FirstEnergy's Consumer Behavior Study: Preliminary Evaluation for the Summer of 2012. Prepared by EPRI for FirstEnergy Corp. EPRI, Palo Alto, CA: August 2013. 3002001870.
29 The study was funded in part by the U.S. Department of Energy Smart Grid Investment Grant, Consumer Behavior Study program.

Feedback Treatments: In-home displays (IHDs) that provide real-time usage information (Type 5) were offered to customers without central air conditioning. (p. 2-1)

Information: No information treatments were mentioned in the report.

Control Technology Treatments: Programmable Communicating Thermostats (PCTs) under two types of control were tested:
• Utility controlled: utility controls the thermostat during events, with optional customer override
• Customer controlled: customer decides whether to pre-program and whether to adjust the thermostat in real time to respond to events

Enabling Technology Treatments: Total customers, control and treatment groups = 976.

Sample sizes by control and treatment groups (p. 2-4):
• Control group, basic residential price (no PTR): 250 (PCT) and 200 (IHD)
• 40¢/kWh, 4 hours: PCT customer control 91; PCT utility control 172; IHD 93
• 40¢/kWh, 6 hours: PCT utility control 170 (customer-control and IHD cells not applicable)

Other incentives: None mentioned.

Installation Method (p. 2-1): Contractors installed the PCT and ". . . showed the customer how to program the PCT as well as how to over-ride an event."

Recruitment Approach (p. 2-4): An opt-in recruitment approach was used. Eligible customers were recruited in waves (designed to achieve targeted sample sizes) using a combination of direct mail, e-mail, and phone solicitation, and invited to participate. The report has valuable information about the number of customers marketed in seven marketing waves from October 5, 2011 through February 17, 2012. (See p. 2-6.) The overall percent enrolled in each wave ranged from a low of 4% to a high of 19%.

Overall recruitment and participation rates are as follows (p. 3-3):

Number of customers who:      PCT     IHD     Total
Were offered the technology   4,194   475     4,669
Accepted the technology       433     92      525
Acceptance rate               10.3%   19.4%   11.2%

Analysis Methods and Results

Analysis Methods (Section 4): Two basic analytical strategies were used to estimate the treatment effects:
1. Event-day treatment effects are estimated using a fixed effects model to estimate “. . .
how electricity usage was affected by the PTR and enabling technology treatments. The estimates reveal differences between treatment and control group customer usage levels, controlling for differences in loads on non-event days, weather conditions, day-of-week effects, and customer-specific characteristics that do not vary over time (i.e., the customer fixed effects).” (p. 4-1)
2. The second approach “relies on economic theory in addition to statistical models that impose the principle (sic) tenets of utility maximization for consumers.” (p. 4-1) Two different price elasticities are estimated:
a. The elasticity of substitution – which “. . . measures the percentage change in the ratio of average hourly peak usage to the average hourly off-peak usage due to a one percent change in the inverse price ratio – the ratio of the average hourly off-peak price to the average hourly peak price. . . . based on a constant elasticity of substitution (CES) demand model.”
b. The own-price elasticity – which “. . . measures the percentage change in average hourly daily use of electricity due to a one percent change in the hourly daily price of electricity. This own-price elasticity of demand is based on a log-linear demand model specification.” (pp. 4-1, 4-2)

Load Impacts Measured (pp. 5-4 to 5-5)

Average Load Impacts (percentage):

Technology                     PCT       PCT      PCT      IHD
Who controls?                  Customer  Utility  Utility  Customer
Event duration                 4-hour    4-hour   6-hour   4-hour
Impact for event hours         -8.0      -30.0    -28.0    -11.0
Impact on average daily usage  -2.0      -1.5     -3.0     -2.3

The results show that:
• event load reductions were larger for utility-controlled PCTs than for customer-controlled PCTs
• average daily usage showed a slight (but much smaller) decrease, reflecting a post-event snap-back effect, especially for the utility-controlled treatment groups. (p. 5-4)

Elasticities of Substitution – CES Model (p. 5-12, Table 5-5):

Technology                                   PCT       PCT      PCT      IHD
Who controls?                                Customer  Utility  Utility  Customer
Event duration                               4-hour    4-hour   6-hour   4-hour
Average across all events, not weather adjusted*   0.094     0.299    0.275    0.087
Weather adjusted at average event-day THI**        0.096     0.300    0.275    0.087
*All coefficients are significant (p<0.01)
**Temperature/Humidity Index

Daily (Own) Price Elasticities: Daily demand elasticities were estimated, but were not statistically significant and, thus, are not reported here.

Other Factors

Price of substitutes? Not considered in this analysis.

Income: Yes.

Evaluation of Customer Characteristics (pp. 3-6 through 3-11):
• Size of home
• Type of home (single family, duplex, apartment, mobile, etc.)
• Highest level of education
• Household income

“There are some important differences in the demographic characteristics between customers with and without CAC (central air conditioning). . . . CAC customers report large average home sizes, are more likely to live in single-family homes, have higher education attainment, and report higher family income.” (pp. 3-6 to 3-7)

The authors reported “. . . there are no statistically significant differences in these demographic characteristics between the respective treatment and control groups. . . . with a couple of important exceptions. The distribution of home size for the utility-controlled, 6-hour PCT customers is different from the distribution for its control group, and the distribution of income for IHD treatment customers differs from that of its control group.” (p. 3-7)

Premise Information: Yes, size and type of home.

Exogenous factors: Weather (temperature and humidity).

Customer Satisfaction? The report did not specify whether customer satisfaction was assessed.
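The CES elasticity of substitution defined above can be illustrated schematically: regress the log of the peak/off-peak usage ratio on the log of the inverse price ratio (off-peak price over peak price), and the slope is the elasticity. This is a toy sketch with synthetic data built from a known sigma of 0.3, not the authors' estimator.

```python
import math

# Synthetic CES-style data: usage_ratio = A * (inverse_price_ratio ** sigma),
# with sigma fixed at 0.3 and A = 1.2 (both illustrative).
sigma_true = 0.3
inverse_price_ratio = [0.25, 0.4, 0.6, 0.8, 1.0]   # p_offpeak / p_peak
usage_ratio = [1.2 * r ** sigma_true for r in inverse_price_ratio]

# Log-log regression: slope of ln(usage ratio) on ln(inverse price ratio)
# is the elasticity of substitution.
x = [math.log(r) for r in inverse_price_ratio]
y = [math.log(q) for q in usage_ratio]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
sigma_hat = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
            / sum((xi - x_bar) ** 2 for xi in x)
print(round(sigma_hat, 3))  # recovers 0.3
```

Because the synthetic data satisfy the CES relation exactly, the fit recovers sigma exactly; real pilot data would add noise and, as in the report, weather controls.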
Self-reports of behaviors changed? No.

Persistence? No. This report is the analysis of the first phase of the deployment.

C CBS SUMMARY FOR MARBLEHEAD MUNICIPAL LIGHT DEPARTMENT

Name of Study: Marblehead Municipal Light Department EnergySense CPP Pilot, Final Evaluation Report30

Reference: GDS Associates, Inc., 201331

Location (p. 1): The Town of Marblehead, Massachusetts, a seacoast town 18 miles north of Boston.

Customer Segment: Residential customers, total population = 10,000 customers.

Time/Duration: Two consecutive summers (2011 and 2012).

Pilot Design (pp. 6-7): A randomized control trial (RCT) experimental design was used.
• Initial population = 10,000
• After screening for eligibility based on a minimum monthly electric usage of 200 kWh and a Marblehead billing address, the eligible pool of customers fell to 6,065 accounts
• A total of 532 customers ultimately enrolled in the program, representing a participation rate of 9% (higher than the assumed rate of 5% used for planning purposes)
• "Customers who volunteered to participate were randomly assigned to either the year one treatment or control group using a recruit and delay strategy."
• Customers were surveyed by phone to determine whether they had central air conditioning or electric water heaters. The survey data were used in conjunction with billing data to stratify the sample ". . . to reduce the potential that random assignment could lead to significant differences between the two groups compared with a simple random assignment." (p. 6)

Sample Sizes for Control and Treatment Groups (pp. 6-7):

Control Group              No Electric WH   With Electric WH
Central AC                 49 (19.3%)       17 (6.7%)
No AC or Room AC only      163 (64.2%)      25 (9.8%)

CPP Group                  No Electric WH   With Electric WH
Central AC                 54 (20.5%)       19 (7.2%)
No AC or Room AC only      155 (58.7%)      36 (13.6%)

30 GDS Associates, Inc., Marblehead Municipal Light Department, EnergySense CPP Pilot, Final Evaluation Report. DE-030000308, Manchester, NH. June 2013. 3002001870.
31 1155 Elm Street, Suite 702, Manchester, NH 03101.

Description of Base Rates and Treatments

Base Rates and Pilot Rate Structures (p. 7):

Rate Component        Standard Rate   CPP Rate
Basic Monthly Charge  $4.25/month     $4.25/month
Non-CPP kWh           14.25¢/kWh      9.9¢/kWh
CPP kWh               14.25¢/kWh      $1.05/kWh

Feedback treatments (p. 8): Feedback Type 5: ". . . all pilot customers [treatment and control] were provided with access to an online web portal which allowed them to monitor their consumption in real-time, view historical usage statistics" and obtain an estimate of their monthly bill.
• In-home displays were considered, but ultimately not offered because they were considered to offer too little incremental benefit relative to the cost of deployment. (p. 10)

Information (p. 10):
• A "Ways you can save" booklet provided simple energy conservation tips and was offered to all customers (control and treatment)
• The CPP group only was provided a flyer describing ways to conserve during CPP events

Control technology (pp. 9-10): In the spring of 2012, all pilot participants who reported having central AC and/or electric water heating were offered the following Wi-Fi enabled control technologies:
• Central air conditioning (CAC) customers were offered a programmable controllable thermostat (PCT). The PCT was the CT80 model produced by Radio Thermostat of America, a large touch-screen thermostat with 7-day programming capability. It was reportedly compatible with "nearly every HVAC system, but compatibility problems were encountered [that] limited MMLD's ability to fully deploy and evaluate the PCTs." (p. 10)
• Electric water heating customers were offered a water heater load switch, the EnTek MC140RAC Remote Appliance Controller (Figure 4, p.
10) There were also problems encountered with the installation of the water heater switches (which required a C-wire to operate), which prevented the installation of about 25 switches; once installed, however, the switches worked as expected. (p. 16)
• Customers with both CAC and electric water heating were offered both a PCT and a water heater load switch
• Equipment installation problems resulted in implementation rates that were about half the requested rates. The majority of installation issues were due to either equipment deficiencies or contractor training. (p. 15)
• Overall technology acceptance rates are summarized by customer equipment category (CAC and/or electric water heating) for year 1 and year 2 of the pilot. Acceptance rates were highest for customers with electric water heaters (53-59%). The average acceptance rate for all technology combinations was in the range of 10 to 12% across both years. (See Table 5, p. 15.) Actual technology installation rates were about half the acceptance rates due to equipment deficiencies and problems with contractor training. (See p. 15.)

Installation Method (p. 9): Customers were responsible for having the equipment installed by a licensed professional. A number of installation problems were encountered and are described in the report. (See p. 16.)

Other incentives (p. 9):
• The control technologies were offered free of charge. A credit was offered to the customer upon successful implementation.
• Bill protection was provided to customers in the first year on the CPP rate. In year 2, bill protection was only offered to year 1 control group customers who became participants in year 2. (p. 12)

Recruitment Approach (p. 6):
• Opt-in recruitment approach was used.
• Recruit and delay approach.
• Overall recruitment rate = 9%

Analysis Methods and Results

Analysis Methods (Section IV, p. 23): Three methods were tested:
1. Adjusted comparison of means
2. Panel regression
3. Individual regressions

Modeling Approach for Year 1 (Summer 2011): Adjusted Comparison of Means. The authors noted that all three methodologies are subject to sampling error and that the regression approaches are subject to model specification error. Since the three hottest days of the summer of 2011 were all event days, ". . . the possibility of specification error in the regression may be an issue concerning the relationship between temperature and electricity consumption on hot days. The control and treatment groups in year one were developed using randomized assignment within usage and AC stratum to minimize the sampling bias. Therefore it was decided that the adjusted comparison of means represents the best methodology for measuring impacts in year one, given the randomized control design of the pilot program."

Modeling Approach for Year 2 (Summer 2012): Individual Regression Modeling (pp. 26-30). "In year two, the lack of a control group precluded the adjusted comparison of means approach. Therefore the MMLD team used the individual regression approach to evaluate impacts in year two. . . . Individual regression modeling involves creating a regression model to describe hourly loads for each customer in the pilot. The approach relies on pre- and post-treatment data to estimate load impacts. The results of the individual regression approach are more robust if the pilot includes alternating or repeated patterns for treatment (e.g., multiple hot days that are CPP events and similar hot days that are not CPP events). . . . Unfortunately, many of the hottest days in both years fell on weekends, and weekend days were not eligible to be CPP events. Therefore, weekends had to be excluded from the analysis given how different load patterns are on weekdays versus weekends." (pp. 26-27)

The data set for analysis therefore included:
• 7 non-event days
• 8 event days
• 15 days total

The model is used to estimate the hourly demand for an individual customer. "To estimate the hourly impacts of an event, the model estimate is calculated with and without the event-day variables for each event hour. The difference represents the model-estimated average impact of the critical peak events."

Load Impacts Measured (pp. 26-33): Average load impacts from CPP customers during event hours, year 1 and year 2:

                                    Year 1 Impacts   Year 2 Impacts
Change in event-hour* loads (kW)    -0.74 kW         -0.37 kW
Change in event-hour loads (%)      -36.7%           -21.3%
Change in daily energy use (kWh)    -5.0 kWh         -1.5 kWh
Change in daily energy use (%)      -12.1%           -3.9%
*Event hours are from 12:00 noon to 6:00 p.m.

Year 2 impacts were considerably lower than year 1 impacts. Year 2 weather was considerably milder than year 1's, so a lower impact would not be unexpected. Also, different methodologies were used to estimate impacts in year 1 and year 2, which could also be a contributing factor.

Price Elasticities (Arc elasticities) (p. 31): Only arc price elasticities were estimated, based on the average change in load and the average change in price from the base price (9¢/kWh) to the CPP price ($1.05/kWh).
• Year 1: the average reduction in load was 0.74 kW, and the arc elasticity was -0.035.
• Year 2: the average load reduction was 0.37 kW, resulting in an arc elasticity of -0.020.

Other Factors

Price of substitutes? Not considered in this analysis.

Evaluation of Customer Characteristics (pp. 19-22):
• Size of home
• Type of home (single family, duplex, apartment, town or row house, etc.)
• Highest level of education
• Household income
• Having at least one person in the household working full time for pay
• Someone at home during weekdays, from 1:00 to 5:00 p.m., at least once a week.
• Have a programmable thermostat
• Remembered receiving information asking them to participate in the pilot program

Note: this information was collected, and the sample characteristics were compared to U.S. Census data to address how representative the sample was of the larger Marblehead population. However, no attempt was made to correlate the load response with these characteristics.

Premise Information: Yes, see above.

Exogenous factors: Weather (temperature and humidity).

Customer Satisfaction? Overall response rates to the post-pilot surveys were 45% from the control group and 60% from the treatment group in year 1. The year 2 overall response rate declined to 48%. "In year one 86% of the treatment group reported a positive experience while in year two the combined groups responded with 85% having an overall positive experience." (p. 42)

Self-reports of behaviors changed? Customers were asked if they had a programmable thermostat and, if so, whether they programmed it to change temperatures during the day. (p. 20)

Persistence? Comparisons were made between year 1 and year 2 response. Year 2 response was considerably lower than year 1's. Year 2 weather was considerably milder than year 1's, so a lower impact would not be unexpected. Also, different methodologies were used to estimate impacts in year 1 and year 2, which could also be a contributing factor.

D CBS SUMMARY FOR SACRAMENTO MUNICIPAL UTILITY DISTRICT (SMUD)

Name of Study: SMUD SmartPricing Option Pilot, Interim Load Impact Evaluation32

Reference: Freeman, Sullivan & Co., San Francisco, CA, Sept. 19, 201333

Location: SMUD service territory (Sacramento area). Customers were selected from the population of customers with smart meter installations (~50,000).

Customer Class: Residential customers.

Time/Duration (SMUD Report, p. 33; FSC Report, p. 10):
• Recruitment began: Oct 24, 2011
• Pilot began (new rates in effect): June 1, 2012
• Pilot ended: September 30, 2013
• Final evaluation: April 30, 2014

Pilot Design: The research plan tested three methodologies:
1. Randomized Control Trial (RCT) using a recruit and delay approach
2. Random Encouragement Design (RED)
3. Within Subjects

Sample Sizes: Sample size requirements varied by treatment, based on the following assumptions (SMUD Report, p. 14):
• 15% acceptance of opt-in rates by June 1, 2012
• 50% opt out of the default rates by June 1, 2012
• 60% IHD acceptance across treatments by June 1, 2012
• 20% attrition for each treatment and control group from June 1, 2012 to September 30, 2013.
Based on these assumptions, the targeted sample requirements ranged from a low of 150 (for the opt-in CPP with no IHD offer) to a high of 1,884 for the opt-in TOU (with no IHD offer).

32 The reference for the full interim report is Lupe R. Jimenez, Sacramento Municipal Utility District, Jennifer M. Potter, Sacramento Municipal Utility District, and Stephen S. George, Freeman, Sullivan & Co., SmartPricing Options, Interim Evaluation: An interim evaluation of the pilot design, implementation, and evaluation of the Sacramento Municipal Utility District's Consumer Behavior Study, prepared for the U.S. Department of Energy, Lawrence Berkeley National Laboratory, Sacramento, CA, October 23, 2013. (Cited herein as SMUD Report.)
33 The full reference for the load impact evaluation (a subsection of the full report) is: Stephen S. George, Michael Perry, Elizabeth Hartmann, Christine Hartmann, SMUD Smart Pricing Option Pilot, Interim Load Impact Evaluation, prepared for the Sacramento Municipal Utility District, Freeman, Sullivan & Co., San Francisco, CA, September 19, 2013. (Cited herein as FSC Report.)
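The 20% attrition assumption listed above ties the targeted sample requirements to the recruitment goals reported later in this summary (e.g., a requirement of 1,884 implies a goal of 2,355). A minimal sketch of that arithmetic:

```python
# Recruitment goals inflated for assumed attrition:
# goal = requirement / (1 - attrition), per the SMUD design assumptions.
ATTRITION = 0.20

def recruitment_goal(sample_requirement: float) -> float:
    """Customers to recruit so the target survives the assumed attrition."""
    return sample_requirement / (1.0 - ATTRITION)

# Values match the targeted sample-size table, e.g. opt-in TOU (no IHD offer).
print(recruitment_goal(1884))   # 2355.0
print(recruitment_goal(150))    # 187.5
```

The same 1/(1 - 0.20) factor reproduces every requirement/goal pair in the targeted sample-size table.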
Description of Base Rates and Treatments

The description of SMUD's base (standard) rates, Energy Assistance Program Rates (EAPR, for low-income customers), and the treatment (TOU and CPP) rates is summarized below (all energy rates in ¢/kWh):

Regular Pricing
• Standard: fixed charge $10.00/month; off-peak base 10; off-peak base plus 18
• EAPR: fixed charge $3.50/month; off-peak base 7; off-peak base plus 13; off-peak non-discounted base plus 18

SPO Pricing, Standard
• TOU: fixed charge $10.00/month; on-peak 27; off-peak base 8; off-peak base plus 17
• CPP: fixed charge $10.00/month; critical peak 75; off-peak base 9; off-peak base plus 17
• TOU-CPP: fixed charge $10.00/month; critical peak 75; on-peak 27; off-peak base 7; off-peak base plus 14

SPO Pricing, EAPR
• TOU: fixed charge $3.50/month; on-peak 20; off-peak base 6; off-peak base plus 12; off-peak non-discounted base plus 17
• CPP: fixed charge $3.50/month; critical peak 50; off-peak base 6; off-peak base plus 12; off-peak non-discounted base plus 17
• TOU-CPP: fixed charge $3.50/month; critical peak 50; on-peak 20; off-peak base 5; off-peak base plus 10; off-peak non-discounted base plus 15

Notes:
• SPO = SmartPricing Options, the name of the SMUD pilot.
• On-peak hours are 4:00 p.m. to 7:00 p.m. weekdays, excluding holidays. All other hours are off-peak hours.
• The TOU and CPP rates only apply during the summer months.
• All rates are inclining block rates (i.e., prices increase with usage). The first 700 kWh in the billing period are priced at the base rate. Consumption above 700 kWh during the billing period is billed at the "base plus" rate, as shown above.
• The EAPR rates are discounted, as shown above.

Two types of recruitment were tested (opt-in and opt-out). Within each group, different rate options were combined with an offer of an in-home display (IHD). The treatments and the acceptance rates of the offer are shown in the following table:

Treatments and Acceptance Rates (SMUD Report, p. 11; FSC Report, p. 2)

Recruitment Approach  Rate     IHD Offer  Acceptance Rate
Opt-In                CPP      No         18.80%
Opt-In                CPP      Yes        18.20%
Opt-In                TOU      No         16.40%
Opt-In                TOU      Yes        17.50%
Opt-Out (Default)     CPP      Yes        95.90%
Opt-Out (Default)     TOU      Yes        97.60%
Opt-Out (Default)     TOU-CPP  Yes        92.90%

The authors note that these opt-in acceptance rates (of 16 to nearly 18%) are significantly higher than those that have been found in other pilots, where rates on the order of 5% or less are more common. "The fact that SPO . . . (SmartPricing Options) . . . obtained acceptance rates approaching 20% from the general population in a single campaign suggests that other utilities can achieve similar acceptance rates using a well researched and concerted marketing effort."

They also note that Salt River Project and Arizona Public Service have also received very high levels of enrollment in their optional TOU rates (FSC Report, p. 51): "Based on personal correspondence between Stephen George and representatives from APS and SRP conducted for a confidential client, as of late 2010, Arizona Public Service had roughly 51% of residential customers, and 65% of residential kWh served, enrolled on one of five TOU rates. Around the same time Salt River Project had 28% of its residential accounts on one of two TOU rates, and estimate that it has nearly 50% of its target market of high use customers on these rates." (FSC Report, p. 51, footnote 28)

Feedback treatments:
• All SMUD residential customers have access to day-lagged interval data through the My Account web portal (Type 4). Data for customers on time-variant rates are formatted to show usage by rate period. (p. 14, footnote 12)
• IHDs (Type 5) were offered to specific treatment groups to assess impacts on the offer acceptance rate, as well as usage (more detail below)

Control Technology Treatment: None.

Targeted sample sizes by control and treatment groups (p.
14 of main report):

Recruitment  Rate     IHD Offer  Design           Sample Requirement  Recruitment Goal*
Opt-in       TOU      No         RCT              1884                2355
Opt-in       TOU      Yes        RCT              3140                3925
Opt-in       CPP      No         within subjects  150                 187.5
Opt-in       CPP      Yes        RED              1131                1413
Opt-out      TOU      Yes        RED              992                 1240
Opt-out      CPP      Yes        RED              345                 431
Opt-out      TOU-CPP  Yes        within subjects  300                 375

Notes to table:
• Sample requirement = total enrolls + postpones
• *Recruitment goal = total enrolls + postpones before 20% attrition

Note that for the RED and RCT designs, the opt-in treatments required much larger sample sizes than the opt-out treatments, due to the lower expected acceptance rates.

Other incentives: None mentioned.

Recruitment Approach (p. 2-4): The study tested both opt-in and opt-out (default) recruitment.

Analysis Methods and Results

Analysis of Impact Evaluation Methods (Section 1.5 of the FSC report, p. 6): The authors note that this field trial provided ". . . a very rare opportunity to compare impact estimates based on different analysis methods." For a CPP treatment, load impact estimates are based on three different methodologies:
1. Within-subjects analysis
2. Difference-in-differences estimation based on a control group selected using statistical matching
3. Difference-in-differences analysis based on a randomized encouragement design

For a TOU treatment, load impact estimates were made for two methodologies:
1. Difference-in-differences estimation based on a control group selected using statistical matching
2. Difference-in-differences analysis based on a randomized control trial design

Load Impacts Measured (FSC Report, pp. 61-67)

Average Peak Period Load Reductions for TOU Treatments (FSC Report, p. 61):

Group                        Average Impact per Customer (kW)  Reference Load (kW)  Impact as % of Reference Load
Opt-in TOU, no IHD offer     0.17                              1.71                 10%
Opt-in TOU, IHD offer        0.24                              1.80                 13%
Default TOU, IHD offer       0.12                              1.87                 6%
Default TOU-CPP, IHD offer   0.16                              1.90                 8%

Average Peak Period Load Reductions for CPP Treatments (FSC Report, p. 67):

Group                        Average Impact per Customer (kW)  Reference Load (kW)  Impact as % of Reference Load
Opt-in CPP, no IHD offer     0.52                              2.38                 22%
Opt-in CPP, IHD offer        0.69                              2.62                 26%
Default CPP, IHD offer       0.32                              2.64                 12%
Default TOU-CPP, IHD offer   0.33                              2.60                 13%

Summary observations (FSC Report, pp. 61 and 67):
• Load impacts for both the TOU and the CPP treatments are much larger for the opt-in groups than for the default (opt-out) groups
• For the TOU treatments, the average impact is 1.5 to 2 times greater for the opt-in customers than for the default customers
• For the CPP treatments, the impact is roughly 2 times greater for the opt-in customers than for the default customers

IHD Acceptance Rates and Connection Rates Over Time: This study attempted to assess the impact of offering customers an IHD, not the impact of the IHD itself. They found that:
• Offering a free IHD had no discernible impact on the customer acceptance rate of time-varying rates, because the IHD was offered to all customers in those (opt-in) treatment groups at the time of enrollment and virtually all customers accepted it. (p. 75)
• For the opt-in treatments, 95-96% of the customers accepted the IHD
• For the default (opt-out) treatments, only 21-24% of the customers accepted the IHD
• By the end of the first phase of the study (9/30/12):
  o Only 27-28% of the opt-in customers had IHDs connected
  o Only 8-10% of the default customers had their IHDs connected
They concluded that "(t)he reasons underlying these low connection rates are currently unknown." See FSC Report, pp. 75-77, for a more in-depth discussion of the possible factors that might have contributed to these results.

• Which recruitment approach (opt-in vs. opt-out) results in the largest load impacts overall?
This study is unique in that it tested the impact of the recruitment approach itself on the load reductions achieved overall. Virtually all of the other field trials completed over the past decade or so employed either an opt-in approach (recruiting volunteers) or an opt-out approach (customers are enrolled in the program and must take action to de-enroll), but not both. Opt-out is also referred to as "default" enrollment. Critical questions remain about which approach is likely to be most cost-effective overall – including the costs of recruitment, billing, customer service, infrastructure, and customer satisfaction, as well as the benefits of the program in terms of aggregate load reduction. This study helps shed light on the relative load impacts that might be expected. Here is how the authors summarize their findings:

"A critical policy issue is whether the aggregate demand reduction is likely to be greater based on opt-in or default enrollment of time-variant rates. While the average impact of default customers is lower . . ., the acceptance rate for TOU is much higher among default customers than opt-in customers. The acceptance rate for the opt-in treatment for TOU with an IHD offer was 17.5% whereas the initial dropout rate (prior to going on the rate) for default TOU with an IHD offer was only 3%. Thus, if 100,000 customers who met the sample selection criteria had been offered TOU on an opt-in basis during the pilot period compared to defaulting 100,000 customers onto the rate and allowing them to drop out, the aggregate peak-period load reduction would have equaled roughly 4.2 MW (0.24 kW x 100,000 x 0.175) for the opt-in program ± 0.5 MW, and nearly three times as much for the default program, at 11.4 MW (0.12 kW x 100,000 x 0.97), ± 2.2 MW." (FSC Report, p. 62)

Price Elasticities: Not estimated for the interim report. The plan is to estimate them as part of the final project evaluation. (FSC Report, p. 33)

Other Factors

Price of substitutes? Not considered in this analysis.

Income: Yes.

Evaluation of Customer Characteristics (FSC Report, pp. 37-45):
• Home ownership (vs. rent)
• Home characteristics
  o Size of home (square feet)
  o Age of home
  o Type of home (multi-family vs. single family)
• Central air conditioning ownership
• Programmable thermostat ownership
• Household demographics
  o Household income (whether household is on the Energy Assistance Program Rate)
  o Full-time employment status
  o Working-from-home status
  o Educational attainment
  o Average household income

Demographic data were used to assess whether the control and treatment groups were statistically similar. Only one treatment group (the default TOU-CPP group) was significantly different from its control group: 74% of the treatment group owned their own homes, while only 61% of the control group owned their homes. None of the other treatment and control groups exhibited statistically significant differences in demographic characteristics.

Premise Information: Yes (see above).

Exogenous factors: Weather (temperature and humidity). Load impacts were found to be higher on the hottest days, when reference loads were the highest.

Customer Satisfaction? The report did not specify whether customer satisfaction was assessed.

Self-reports of behaviors changed? No.

Persistence? No. This report is the analysis of the first phase of the deployment.

E CBS SUMMARY FOR GREEN MOUNTAIN POWER

Name of Study: Analysis of Green Mountain Power Critical Peak Events During the Summer/Fall of 2012 (interim report)

Reference: Blumsack and Hines, Nov. 2013.34

Location (p. 8): Rutland, Vermont area.

Customer Segment: Residential customers who pay their own bills and live in Vermont year-round.
Time/Duration   March/April of 2012 (when advanced metering infrastructure was deployed) through December 2012 (CPP/CPR events were called in September and October 2012). IHDs were in place August-December 2012.

Pilot Design (p. 6)   Randomized Control Trial (RCT) consisting of two control groups and seven treatment groups. Required sample sizes were based on GMP's targeted level of precision in measuring effects, assumed customer acceptance rates, and derived oversampling rates (p. 10).

Sample Sizes (p. 10)

Group    Year 1   Year 2   Notification   IHD   Sample Size
1        CPR      CPR      X                    390
2        CPR      CPR      X              X     195
3        CPP      CPP      X                    390
4        CPP      CPP      X              X     195
5        CPR      CPP      X                    390
6        CPR      CPP      X              X     195
7        Flat     Flat     X                    390
C1       Flat     Flat     X                    390
C2       Flat     Flat                          1200
Totals                                          3735

NOTES:
• CPR = Critical Peak Rebate (same as Peak Time Rebate)
• CPP = Critical Peak Pricing
• IHD = In-Home Display
• Notification = indicates whether the group was notified of peak time events
• CPR→CPP = indicates those groups (5 & 6) which were placed on a CPR rate in Year 1 and transitioned to CPP in Year 2

34 Seth Blumsack, Pennsylvania State University, and Paul Hines, University of Vermont, November 19, 2013. Available at: https://smartgrid.gov/document/vermont_transco_green_mountain_power_interim_report .

Description of Base Rates and Treatments

Base Rates and Pilot Rate Structures (p. 8)

Rate                    Base rate (all hours except critical peak events)   Critical peak rate (1pm-6pm on critical peak days)
Flat Rate (Rate 1)      14.8 ¢/kWh                                          14.8 ¢/kWh
Critical Peak Rebate    14.8 ¢/kWh                                          60¢/kWh per kWh reduction from baseline
Critical Peak Pricing   14.4 ¢/kWh                                          60.0 ¢/kWh

Feedback treatments (p. 14 and Appendix 4)   Customers in three groups were mailed In-Home Displays during August 2012.
The technology chosen was the Tendril Insight IHD, which provided:
• Current power usage in kW or dollars per hour
• Notification of critical peak events
• Notification of each customer's baseline power level

Information   No information treatments were part of this study.

Control technology   No control technology was offered.

Installation Method (p. 10)   Customers were to install the IHDs themselves. "Thus, GMP was not able to track whether customers had received their IHD as intended; or whether customers who did receive the IHD were able to install and use the IHD successfully."

Other incentives   No other incentives were mentioned.

Recruitment Approach (p. 6)   An opt-in recruitment approach was used. Overall acceptance rates ranged from 33 to 37%, but these rates reflected customers' willingness to proceed with the survey, not their acceptance of the treatment itself.

Analysis Methods and Results

Analysis Methods for load impacts (pp. 18-24)
• Calculation of customer baseline load (CBL) for the CPR treatment groups. For the study, the authors assumed ". . . that GMP will credit each CPR customer based on the difference between that customer's usage during critical peak events and average usage by the no-notification control group during the critical peak events."35
• Calculation of average customer load impacts for each treatment.
o The authors first tested for differences between each treatment group and the no-notification control group for the months before event notifications occurred. They found small but statistically significant differences between the average loads in the treatment and control groups.
o To control for these differences, they made two types of adjustments. First, the regression includes a fixed-effect parameter for each group, which adjusts for the differences in group means. Second, the authors analyzed the results as if they were generated through a "randomized encouragement" design (RED) so they could control for differences between groups of customers who did or did not opt in/out of the study. (see pp. 23-24)
o The overall approach is difference-in-differences regression modeling to decompose ". . . differences that would be observed during a non-event period; and . . . differences specifically during critical peak events."

35 "In reality, GMP will calculate peak-time rebates using a customer-specific baseline formula (based on a moving average, like the customer baseline used in PJM's demand response programs). There may be differences between the baseline formula that will determine peak-time rebates and average usage by the no-notification control group. We were unable to recover the baseline estimates calculated by GMP for use in this report." p. 27

Load Impacts Measured (pp. 25-27)

Summary of Average Load Reductions during Four Events Called during September-October 2012, by Treatment Group

Treatment Group             Percentage Load Reduction during Events
CPR                         0.0*
CPR + IHD                   0.0*
CPP                         14.3
CPP + IHD                   11.1
Flat Rate w/Notification    2.1

Note that the weather was unusually mild during these events. The load reductions estimated for the CPP group and the CPP group with an IHD were consistently statistically significant at the 95% level. None of the other groups' load reductions were statistically significantly different from zero.
*The nominal results for the CPR and CPR + IHD groups were 5.4 and 5.7% reductions, respectively.
They are shown as zero here, following our prior convention, because they were not statistically significantly different from zero.

Energy Savings (conservation)   This study also estimated the conservation impact of In-Home Displays on monthly energy consumption (pp. 29-30).
• Analytical Approach: A difference-in-differences type of panel regression using all customer data (including those who declined to participate) over the period March through December 2012.
• The two relevant differences in this analysis are:
o Monthly kWh usage by customers with and without IHDs, and
o Monthly kWh usage before and after IHDs were mailed to customers in the relevant treatment groups.
• Results indicated that, ". . . on average, customers with IHDs have monthly energy (kWh) usage that is 28 kWh below the average usage for customers without IHDs. . . This amounts to a 4 percent average reduction in monthly kWh relative to non-IHD customers."

Price Elasticities   Price elasticities were not estimated.

Other Factors

Evaluation of Customer Characteristics   No analysis of customer demographic characteristics was reported in this report.

Premise Information   None was reported.

Exogenous factors   Weather was mentioned as a potentially confounding factor, since the weather was unusually mild during September and October of 2012.

Customer Satisfaction?   Customers were surveyed to determine whether they ". . . appear to have higher, or lower, satisfaction in the different rate and information treatments." (p. 30) Some of the notable findings included:
Recall of Events: Customers recalled that as few as 1 and as many as 11 peak day events were called. (The actual number of events was 4.) The authors noted the possibility that customers might have been recalling the number of times they were notified of events rather than the number of events themselves. (p. 32)
Customer Satisfaction: Satisfaction varied widely across all treatments, ranging from "extremely dissatisfied" to "moderately satisfied." (p. 33) The 31 customers who were "extremely unsatisfied" were asked to explain in more detail why they were dissatisfied. 50% complained that there was no change or an increase in their bill, and 35% said they were dissatisfied with or did not understand the technology (IHD). (p. 41)

Self-reports of behaviors changed?   Customers were surveyed to determine ". . . the degree to which customers recalled taking specific actions to reduce electricity usage following notifications of peak events." (p. 30) The specific actions listed were:
• Changed the settings on their thermostat
• Turned off lights
• Changed timers on thermostats or other household appliances
• Delayed cooking
• Adjusted air conditioning
• Took some other action not in the above list
The report has detailed information on the actions taken by customers within each treatment group.

Persistence?   No. This is the interim report for the first year of the study. Only 4 events were called over a 5-week period in the fall of 2012.

The Electric Power Research Institute, Inc. (EPRI, www.epri.com) conducts research and development relating to the generation, delivery and use of electricity for the benefit of the public. An independent, nonprofit organization, EPRI brings together its scientists and engineers as well as experts from academia and industry to help address challenges in electricity, including reliability, efficiency, affordability, health, safety and the environment. EPRI also provides technology, policy and economic analyses to drive long-range research and development planning, and supports research in emerging technologies. EPRI's members represent approximately 90 percent of the electricity generated and delivered in the United States, and international participation extends to more than 30 countries.
EPRI's principal offices and laboratories are located in Palo Alto, Calif.; Charlotte, N.C.; Knoxville, Tenn.; and Lenox, Mass.