2008 Ex Post Load Impact Evaluation for Pacific Gas and Electric Company's SmartRate™ Tariff

Final Report
December 30, 2008

Freeman, Sullivan & Co.
101 Montgomery St., 15th Floor
San Francisco, CA 94104

Prepared for:
Sara Margaret Gilbert
Pacific Gas and Electric Company
245 Market St., 330 E
San Francisco, CA 94105

Prepared by:
Stephen S. George, Ph.D.
Josh Bode, M.P.P.
Freeman, Sullivan & Co.

TABLE OF CONTENTS

1. EXECUTIVE SUMMARY ............................................................... 2
   1.1. Residential Customer Load Impacts .......................................... 3
   1.2. Non-Residential Customer Load Impacts ...................................... 4
2. SMARTRATE PROGRAM OVERVIEW ..................................................... 6
   2.1. SmartRate Pricing .......................................................... 6
   2.2. SmartRate Marketing ........................................................ 9
   2.3. Report Structure ........................................................... 10
3. ANALYSIS METHODOLOGY ........................................................... 11
   3.1. Residential Regressions .................................................... 12
        3.1.1. Goodness of Fit Measures ............................................ 14
        3.1.2. Model Accuracy and Validity Assessment .............................. 16
        3.1.3. Treatment of Missing Values ......................................... 18
        3.1.4. Estimation of Ex-post Load Impact Confidence Bands ................. 19
   3.2. Non-Residential Regression Models .......................................... 20
        3.2.1. Goodness of Fit Measures ............................................ 22
        3.2.2. Model Accuracy and Validity Assessment .............................. 22
        3.2.3. Treatment of Missing Values ......................................... 24
4. RESIDENTIAL EX POST LOAD IMPACTS ............................................... 25
   4.1. Load Impacts for E-1 Customers ............................................. 25
   4.2. Load Impacts for E-8 Customers ............................................. 33
   4.3. Load Impacts for CARE and Non-CARE Customers ............................... 33
5. NON-RESIDENTIAL EX POST LOAD IMPACT ESTIMATES .................................. 39
   5.1. Load Impact Estimates for A-1 Customers .................................... 39
APPENDIX A: Load Impact Estimates for E-1 Customers ............................... 46
APPENDIX B: Load Impact Estimates for A-1 Customers ............................... 58

1.
EXECUTIVE SUMMARY

Pacific Gas and Electric Company's (PG&E) SmartRate™1 tariff is the first large-scale deployment of pure critical peak pricing in North America.2 During the summer season from May 1st to October 31st, customers on the SmartRate tariff face peak period prices that are significantly higher than the otherwise applicable price on a limited number of days (up to 15), known as SmartDays. Prices vary by time of day only on SmartDays.3

The California Public Utilities Commission (CPUC) ruling that authorized PG&E to proceed with the SmartRate tariff required the Company to conduct an ex post evaluation of the load impacts associated with the tariff.4 Specifically, Ordering Paragraph 5 (p. 67) of the ruling states, "PG&E shall report to DRA and the Energy Division within 60 days of the end of each CPP season the best estimate of demand response achieved during each CPP event, if any, including the number of customers (by class) on the CPP tariff and the participation rate of those customers during CPP events." This ex post load impact report is intended to fulfill the requirement of that order for the 2008 program year. The ex post load impact results reported here were estimated according to the guidance set forth in the Demand Response Load Impact Protocols adopted in CPUC Decision (D.) 08-04-050 on April 24, 2008.5

During the summer of 2008, SmartRate was offered to PG&E SmartMeter™6 customers in the Bakersfield and greater Kern County region, a very hot area where maximum temperatures exceed 100°F on many summer days. The tariff was initially offered to customers that were on PG&E's E-1 and E-8 residential tariffs and the A-1 non-residential tariff, which applies to customers with peak demands below 200 kW. Direct mail materials were sent to roughly 135,000 accounts, of which more than 10,000 enrolled.
Approximately 7.5 percent of residential customers and 5 percent of small commercial customers that received direct mail materials enrolled in the SmartRate program.

1 Any use of the term SmartMeter or SmartRate in this document is intended to refer to the trademarked term, whether or not ™ is included.

2 Gulf Power has offered a combination CPP/TOU rate for several years although, to the best of our knowledge, fewer than 10,000 customers currently participate in this program. PG&E's SmartRate does not include a mandatory TOU component on non-critical peak days, although the CPUC directed PG&E to combine CPP with TOU in Decision (D.) 08-07-045.

3 The SmartRate tariff is an overlay on existing tariffs. If a customer's existing tariff was a TOU tariff, such as E-7, prices would vary on both SmartDays and other weekdays. However, in 2008, the SmartRate overlay was only available to customers on non-time-varying tariffs.

4 Final Opinion Authorizing Pacific Gas and Electric Company to Deploy Advanced Metering Infrastructure, CPUC Decision (D.) 06-07-027, July 20, 2006.

5 Decision Adopting Protocols for Estimating Demand Response Load Impacts, CPUC Decision (D.) 08-04-050, April 24, 2008.

6 SmartMeter™ is a trademark of SmartSynch, Inc. and is used by permission.

The SmartRate pricing structure is an overlay on top of a customer's otherwise applicable tariff. SmartRate pricing consists of an incremental charge that applies during the peak period on SmartDays and a per kilowatt-hour credit that applies for all other hours from June to September. For residential customers, the additional peak-period charge on SmartDays is 60 ¢/kWh. For non-residential customers, the incremental charge is 75 ¢/kWh. The credit consists of two parts. A credit of roughly 3 ¢/kWh applies to all electricity use other than use during the peak period on SmartDays during the months of June through September.
An additional credit of 1 ¢/kWh applies to Tier 3 and higher usage for residential customers regardless of time period.

PG&E called nine SmartDays during the summer of 2008. Three days were called in a row in early July (the 8th, 9th and 10th), three in August (27th, 28th and 29th) and three in September (3rd, 4th and 5th). The August and September event periods spanned the Labor Day weekend and covered six out of seven consecutive work days. Although PG&E initially called an event on August 14th, a problem that occurred with the notification process caused PG&E subsequently to cancel the event. Given the confusion over event notification, rather than treat August 14th as a non-event day, data for this date were dropped from the estimation sample.

1.1. RESIDENTIAL CUSTOMER LOAD IMPACTS

The average load reduction for E-1 customers across the nine event days was 0.4 kW, which amounts to a reduction of 16.6 percent. The aggregate average load drop for the 8,758 customers who were included in the average event calculation equaled 3.5 MW. The number of enrolled customers varied significantly between the early event periods in July, when the number of enrolled customers was less than 4,000, and the events in September, when enrollment equaled more than 9,900.7 If the average reduction across all nine events were grossed up based on the higher September enrollment figures rather than on average enrollment across all nine event days (to reflect what might have occurred if enrollment had been at a steady state for the whole summer of 2008), the aggregate impact would be roughly 4 MW.

There was a modest decrease in the percent load reduction across the five event-period hours for the average event day. In the hour ending at 3 pm, the load drop was 17.8 percent. The load drop fell to roughly 15 percent in the hour ending at 7 pm. The hour with the largest absolute load drop was from 5 pm to 6 pm, where the load reduction for the average customer equaled 0.44 kW.
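As a quick check, the aggregate impacts quoted above are simple products of the per-customer reduction and enrollment. The following sketch uses numbers taken from the text; it is an illustration, not program data:

```python
avg_reduction_kw = 0.4    # average per-customer drop across the nine events
avg_customers = 8758      # customers in the average event calculation
sept_customers = 9900     # approximate enrollment by the September events

# Aggregate drop for the average event, and the "steady state" gross-up
print(round(avg_reduction_kw * avg_customers / 1000, 1))   # 3.5 (MW)
print(round(avg_reduction_kw * sept_customers / 1000, 1))  # 4.0 (MW)
```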
The first and last hours of the event period had the lowest absolute load reduction, at 0.37 kW.

The load impacts during the three July SmartDays were substantially greater in both absolute and percentage terms than during the August or September event days. The average percent reduction across the five-hour event period on July 8th, the first event day, equaled 19.2 percent. The percent load reduction was 19.1 percent on the second July day and 22.5 percent on the third July day. The average reduction over the three July days was 20.3 percent (0.62 kW). During the three-day period in late August, leading up to the Labor Day weekend, the average reduction was 14.9 percent (0.42 kW) and the average during the early September period, immediately after Labor Day, was roughly the same in percentage terms, 14.4 percent (0.29 kW).

7 As explained in Section 2.2, the direct mail through which PG&E recruited its 2008 SmartRate customers was sent out in late May and early June. Due to a time lag between initial customer acceptance of the SmartRate offering and the date that customers were fully enrolled and eligible for SmartDay participation, the number of customers on the rate increased throughout the summer.

Underlying the average load reduction is a wide dispersion of impacts across customers. Roughly 70 percent of customers that received notification provided at least some load drop for the average event. Almost 55 percent of customers provided load reductions that exceeded 10 percent of peak-period electricity use and almost 40 percent of notified participants produced load reductions of 20 percent or more.

The third day of each three-day event period showed a larger percent reduction in peak-period load than either the first or second event day. The third day in each three-day period was also the hottest day in each case and the day on which the reference load was highest.
The net result of these factors is that the third day in each three-day sequence provided substantially larger absolute reductions in load than either of the prior days. For example, the absolute reduction on the third day of the July event period was roughly 25 percent greater than on the first and second days.

High energy users had larger percentage load impacts than did low energy users. The percent reduction in peak-period loads on SmartDays increased from about 12.4 percent for customers with average annual energy use below 5,000 kWh to 22 percent for customers with average annual usage between 10,000 and 12,500 kWh. The percent reduction dropped slightly for the two largest usage segments, but the absolute impact increased steadily across the six size strata. Customers with annual energy use greater than 15,000 kWh provided load impacts that were more than five times larger than those of customers with annual usage below 5,000 kWh.

A disproportionate number of CARE customers enrolled in SmartRate relative to the share of CARE customers in the Bakersfield area. CARE stands for California Alternate Rates for Energy and is a program through which enrolled, low-income consumers receive lower electricity rates. Qualification for CARE is based on self-reported household income and varies with the number of persons per household. Whereas approximately 35 percent of Kern County residential electricity customers who were sent marketing materials were CARE customers, 56 percent of customers who enrolled in SmartRate in 2008 were CARE customers. PG&E has seen similarly high enrollment rates by CARE customers in other programs, as CARE customers seem to have developed a belief that most new offerings from PG&E will reduce their energy costs. CARE customers had much lower load reductions, in both absolute and percentage terms, than did non-CARE customers.
The average load reduction across the five-hour event period for the nine SmartDays was 11.0 percent for CARE customers and 22.6 percent for non-CARE customers.

1.2. NON-RESIDENTIAL CUSTOMER LOAD IMPACTS

The average load reduction for the 194 A-1 customers included in the estimating sample across the nine event days was 16.0 percent and the absolute load reduction was 0.47 kW. The load reduction on the average event day ranged from a low of 12.8 percent between 2 and 3 pm to a high of 18.5 percent in the last event period hour, from 5 to 6 pm. The aggregate load reduction for the average event day estimate was approximately 0.09 MW, or 90 kW.

Underlying this average percent reduction for A-1 customers is a precipitous fall in impacts from the first event sequence to the last. Impacts in the first three event days in July ranged from 29 to 42 percent, which were significantly larger than the residential percent impacts. However, for the three-day event period in late August, leading up to the Labor Day weekend, impacts dropped significantly, ranging from a low of 9.3 percent on August 29th to a high of 14.4 percent the prior event day. Load impacts dropped yet again over the last three-day event period, with a low of just 0.7 percent on September 5th to a high of 8.5 percent on September 3rd.

Due to the small number of participants, the non-residential impact estimates presented here should be used with significant caution. This sample of customers is not representative of the non-residential (A-1) customer population either in Kern County or PG&E's service territory as a whole. Furthermore, as discussed in Section 3, there was a significant amount of missing interval data for some non-residential customers on event days and the percentage of missing data was higher during the event period than during the non-event period. The combination of small sample size and missing data makes these results suggestive at best.

2.
SMARTRATE PROGRAM OVERVIEW

In May 2008, Pacific Gas and Electric Company (PG&E) began offering a critical peak pricing tariff known as SmartRate to residential and small commercial customers in the Bakersfield and greater Kern County area. This region was the first in PG&E's service territory to receive new meters under the Company's advanced metering infrastructure deployment, branded as the SmartMeter™ Program.

SmartRate is a voluntary rate supplement that features critical peak pricing. Under critical peak pricing, peak period prices are significantly higher than the otherwise applicable price on a limited number of "event" days (up to 15), known as SmartDays, during the summer season. The peak period on SmartDays is from 2 pm to 7 pm for residential customers and from 2 pm to 6 pm for non-residential customers. The summer season runs from May 1st through October 31st. Prices only vary by time of day on SmartDays, unless the customer's underlying rate is a time-of-use (TOU) rate.8 Customers are notified that the next day will be a SmartDay by 3 pm on the preceding day. Customers had several options for receiving event notification (e.g., email, phone, etc.), including not being notified at all.

SmartRate was promoted through direct mail. PG&E's expectation was to enroll at least 6,000 customers in SmartRate in the initial summer season. Customers received information on SmartRate starting May 14, 2008. Direct mail materials were sent to roughly 135,000 accounts, of which more than 10,000 enrolled. The enrollment rate for residential customers equaled roughly 7.5 percent (10,001 out of 134,709). For non-residential customers, the enrollment rate equaled about 5 percent (208 out of 4,084). Section 2.2 below contains further information on the SmartRate marketing effort.
The CPUC ruling that authorized PG&E to proceed with the SmartRate tariff required the Company to conduct an ex post evaluation of the load impacts associated with the tariff.9 Specifically, Ordering Paragraph 5 (p. 67) of the ruling states, "PG&E shall report to DRA and the Energy Division within 60 days of the end of each CPP season the best estimate of demand response achieved during each CPP event, if any, including the number of customers (by class) on the CPP tariff and the participation rate of those customers during CPP events." This ex post load impact report is intended to fulfill the requirement of that order.

2.1. SMARTRATE PRICING

The SmartRate pricing structure is an overlay on top of PG&E's other tariff offerings. SmartRate pricing consists of an incremental charge that applies during the peak period on SmartDays and a per kilowatt-hour credit that applies for all other hours from June through September.

8 The SmartRate pricing structure is intended to be available as an overlay on all existing single-family residential tariffs for bundled-service customers, including current TOU tariffs such as E-6 and E-7. Bundled-service commercial customers with billed maximum demands less than 200 kW that take service under rate schedules A-1, A-6, A-10, E-19 and E-19V are eligible for SmartRate. During 2008, SmartRate was only offered to PG&E's standard E-1 tariff, the seasonal E-8 tariff and the small commercial A-1 tariff, due to the geographic nature of the SmartMeter™ deployment process. The E-8 tariff is closed to new enrollment. It has the same five-tier structure as the E-1 tariff but has lower prices in tiers 3, 4 and 5 and a higher monthly charge. Only about 1 percent of customers who signed up for SmartRate were E-8 customers.

9 Final Opinion Authorizing Pacific Gas and Electric Company to Deploy Advanced Metering Infrastructure, CPUC Decision 06-07-027, July 20, 2006.

For residential customers, the additional peak-period charge on SmartDays is 60 ¢/kWh.
For non-residential customers, the incremental charge is 75 ¢/kWh. SmartDays can only be called on non-holiday weekdays from May 1st to October 31st. The SmartRate credit has two components, both of which apply only during the months of June through September.10 The first SmartRate credit applies to all usage other than peak-period usage on SmartDays. For residential customers, the credit equals roughly 3 ¢/kWh. For non-residential customers, this component of the credit varies with the underlying tariff, but equals 2.7 ¢/kWh for PG&E's A-1 tariff, which applies to customers with peak demands less than 200 kW.11 The second credit component for residential customers, equal to 1 ¢/kWh, applies to all electricity usage that is priced according to the top three tiers of PG&E's five-tier, increasing block tariff. For non-residential customers, the second credit component, equal to 0.5 ¢/kWh, applies to all usage across all underlying tariffs.

PG&E's standard residential tariff, E-1, is a five-tier, increasing block rate, with the price per kWh increasing nearly fourfold between Tier 1 and Tier 5. The usage level where prices change is tied to a baseline usage amount that varies by climate zone. The Kern County area has a baseline usage amount for typical customers equal to 19.4 kWh/day during the summer, or roughly 582 kWh per month.12 Table 2-1 shows the prices for each tier of the E-1 tariff for both CARE and non-CARE customers who do not live in all-electric homes. CARE stands for California Alternate Rates for Energy and is a program through which enrolled, low-income consumers receive lower rates than non-CARE customers. Qualification for CARE is based on self-reported household income and varies with the number of persons per household. The maximum qualifying income for a household with 1 or 2 people is $30,500. For a four-person household, the maximum qualifying income is $43,200.
As shown in Table 2-1, the CARE discount is quite significant, especially for low-income households that have usage in Tier 3 and above. The ratio of marginal prices between E-1 and CARE customers is more than 4 to 1 in Tier 5, for example, and the average price ratio is more than 2.5 to 1.

10 Credits were applied only during the four-month period from June through September rather than for the whole summer in an attempt to smooth out the bill impacts across the summer months. Since most event days are likely to fall in the months of June through September (indeed, all of the 2008 events occurred in these months), having the discount apply only in these months would do a better job of partially offsetting the negative bill impacts associated with the higher prices on event days.

11 As mentioned above, only A-1 customers in Bakersfield were offered SmartRate during the summer of 2008. Eventually, SmartRate or a similar dynamic tariff will be available to medium-sized customers that are on PG&E tariffs A-6, A-10 or E-19 (including voluntary).

12 For all-electric homes in this climate zone, baseline usage for the summer period is 27.3 kWh/day. Only about 5 percent of customers enrolled in SmartRate in 2008 were all-electric homes.
Table 2-1
E-1 CARE and Non-CARE Prices for PG&E13

Usage Tier | % of Baseline Usage | Approximate Maximum Monthly Usage in Tier (kWh) | E-1 Price (¢/kWh) | Average E-1 Price Based on Mid-Tier Usage (¢/kWh) | CARE Price (¢/kWh) | Average CARE Price Based on Mid-Tier Usage (¢/kWh)
1 | 100% | 582 | 11.6 | 11.6 | 8.3 | 8.3
2 | 130% | 757 | 13.1 | 11.8 | 9.6 | 8.4
3 | 200% | 1,164 | 24.7 | 14.7 | 9.6 | 8.8
4 | 300% | 1,746 | 35.4 | 20.2 | 9.6 | 9.1
5 | >300% | >1,746 | 41.1 | 25.1 | 9.6 | 9.2

With the tiered pricing used in PG&E's service territory, the price ratio between peak-period prices on SmartDays and the average price on normal days on the SmartRate tariff (which is roughly 3 ¢/kWh lower than the averages in Table 2-1 because of the SmartRate credit during those hours) varies significantly with usage and also varies between CARE and non-CARE customers. For example, for a Tier 1 customer on the E-1 tariff, the peak-period price on SmartDays is more than 8 times higher than the price on non-SmartDays.14 On the other hand, for a Tier 5 customer, the peak-period price would equal roughly 85 ¢/kWh and the price ratio would be less than 4 to 1. For CARE customers, the SmartDay peak-period price is approximately 69 ¢/kWh and the price ratio between SmartDay peak-period prices and non-SmartDay prices is almost 12 to 1.

The A-1 tariff is much simpler than the residential tariff. There is a monthly customer charge that varies between single-phase and poly-phase service: roughly $8.87 for single-phase service and $13.31 for poly-phase service. The A-1 tariff has a flat price for electricity use equal to 19.23 ¢/kWh. As mentioned before, the incremental charge for the peak-period price on SmartDays for A-1 customers is 75 ¢/kWh, so the ratio of the SmartDay peak-period price on the SmartRate tariff to the price on the underlying tariff is roughly 4 to 1.

PG&E called nine SmartDays during the summer of 2008.
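The price-ratio arithmetic above can be reproduced with a short sketch. The prices are the average mid-tier figures from Table 2-1, combined with the 60 ¢/kWh adder and roughly 3 ¢/kWh credit described in the text; the CARE figure uses the approximate 9.2 ¢/kWh average from the table, so the results are approximate illustrations rather than exact tariff calculations:

```python
def smartday_price_ratio(avg_price_cents, adder=60.0, credit=3.0):
    """Ratio of the SmartDay peak-period price to the credited
    non-SmartDay price, both in cents per kWh."""
    return (avg_price_cents + adder) / (avg_price_cents - credit)

# Average prices based on mid-tier usage, from Table 2-1 (¢/kWh)
print(round(smartday_price_ratio(11.6), 1))  # Tier 1 E-1: 8.3 ("more than 8 times")
print(round(smartday_price_ratio(25.1), 1))  # Tier 5 E-1: 3.9 ("less than 4 to 1")
print(round(smartday_price_ratio(9.2), 1))   # CARE: 11.2 ("almost 12 to 1")
```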
Three days were called in a row in early July (the 8th, 9th and 10th), three in August (27th, 28th and 29th) and three in September (3rd, 4th and 5th). The August and September event periods spanned the Labor Day weekend and covered six out of seven consecutive work days. Although PG&E initially called an event day on August 14th, a problem that occurred with the notification process caused PG&E subsequently to cancel the event. Given the confusion over notification, rather than treat August 14th as a non-event day, data for this date were dropped from the estimation sample.

13 For E-1 customers, the fixed monthly charge is approximately $4.44. For CARE customers, it equals roughly $3.55.

14 The peak-period price would equal 11.6 + 60 = 71.6 ¢/kWh. The price at all other times during the summer period would equal 11.6 – 3 = 8.6 ¢/kWh. The price ratio therefore equals 8.3.

2.2. SMARTRATE MARKETING

As indicated above, the primary marketing channel for SmartRate was direct mail. Customers were mailed recruitment materials with email follow-up starting on May 14th and given several options for enrolling: online, contacting PG&E's call center, or through reply mail. Two informational workshops were also held (in both Spanish and English) to encourage customer enrollment. A $50 Visa gift card was offered to residential customers who signed up early (i.e., within roughly 10 days of receiving the recruitment material) and a $150 rebate check was offered for early sign-ups by non-residential customers. Importantly, customers were offered first-year bill protection in order to address the risk aversion that pilot programs and market research have shown to be a significant barrier to participation for customers considering dynamic rate options. The first-year bill protection ensured that, initially at least, customers' bills would not increase under the new rate option relative to what they would have been over the same period under the prior tariff.
Customers who agreed to enroll in SmartRate were sent a Welcome Kit with a confirmation letter, a Smart Tips guide providing information and recommendations on how to shift and/or reduce load during high price periods, and other relevant material. The majority of customers were transferred to the new rate at the beginning of their next billing cycle following acceptance. Figure 2-1 shows the timelines for when customers indicated they were interested in participating in the program (for those who ultimately enrolled) and when enrollment was complete. The number of customers agreeing to enroll peaked about 20 days after receipt of the marketing material and the average time from agreement to enrollment was roughly another 20 days. Program enrollment nearly tripled from around 3,600 accounts during the July events to about 10,000 customer accounts for the events that occurred in late August.

Figure 2-1
Cumulative Customer Acceptance and Enrollment15
[Line chart, May through October 2008: cumulative customer contacts of PG&E and cumulative finalized enrollment, each rising from 0 to roughly 10,000 accounts.]

15 The customer contact line shows when customers that had received material responded with an interest to participate, not when they received the original material.

A disproportionate number of CARE customers enrolled in SmartRate relative to the share of CARE customers in the Bakersfield area. Whereas approximately 35 percent of Kern County residential customers who were sent marketing materials were CARE customers, 56 percent of customers who enrolled in SmartRate in 2008 were CARE customers. PG&E has seen similarly high enrollment rates by CARE customers in other programs, as CARE customers have apparently developed a belief that most new offerings from PG&E will reduce their energy costs. As seen below in Section 4.3, CARE customers responded very differently to SmartRate prices than did non-CARE customers.
An important consideration regarding the 2008 population of SmartRate customers that is not captured by marketing data is the saturation of air conditioning. A survey of roughly 400 SmartRate customers indicated that the majority of residential customers in Kern County have central air conditioning, and those without central air conditioning typically have room air conditioners or evaporative coolers. The saturation of central air conditioning among SmartRate customers was 78 percent. Approximately 92 percent of non-CARE customers owned central air conditioners while roughly 71 percent of CARE customers had central air conditioning.

The Kern County area is quite hot. The daily maximum temperature during the July event period was 108°F on two days and 111°F on the third day. The lowest maximum temperature on any of the nine event days was 98°F, on September 3rd. The minimum temperatures on event days ranged from a low of 66°F to a high of 84°F. Readers should therefore keep in mind that the population of the Bakersfield area is subject to temperatures that are not necessarily representative of the PG&E territory as a whole.

2.3. REPORT STRUCTURE

The remainder of this report is organized as follows. Section 3 summarizes the analysis methods used to estimate ex post load impacts for both residential and non-residential SmartRate customers in the Bakersfield region. Evidence is presented illustrating that the methods used here produce unbiased impact estimates for the participant population. Section 4 summarizes the impact estimates for residential customers and Section 5 summarizes the results for non-residential customers. Appendix A contains estimates for each hour of each event day for E-1 customers and Appendix B contains the same information for A-1 customers. Similarly detailed information for each event day for sub-groups of SmartRate customers has been filed electronically with the CPUC.
The first page of each appendix indicates the customer segments for which impact estimates have been produced and provided to the CPUC.

3. ANALYSIS METHODOLOGY

The 2008 load impacts for the SmartRate tariff were estimated through individual customer time-series regressions. In selecting the analysis methodology, both regression and day-matching methods were considered. Within the family of regression analysis approaches, individual customer time-series regressions, panel regressions, and segmented time series regressions (described below) were considered.

Regression methods were selected over the primary alternative for ex post impact analysis, day-matching methods, for several reasons. With day-matching methods, over- and under-predictions of the reference load can lead to substantially larger errors in the estimated load impacts, whereas with regression methods, the accuracy of the impacts is primarily the result of parameters reflecting the impact.16 Day-matching methods sometimes produce inaccurate results when significant variation exists across customer loads. Because the effect of weather on customer load may follow a non-linear pattern, it may not be captured well with day-matching even when weather adjustments are employed. Finally, with day-matching, no established approaches exist for calculating the statistical uncertainty associated with ex post load impact estimates.

Time series regressions were estimated at the individual customer level rather than for all customers combined for several reasons. Most importantly, PG&E does not typically collect data on a key explanatory variable: the size and type of air conditioning at each household. That being said, by employing individual customer regressions, the presence and use of air conditioning is captured through the temperature variables and their interaction with the hourly binary variables. Put differently, the presence of air conditioning or lack thereof is a fixed effect that interacts with weather.
By allowing individual customer coefficients to vary, the results are more accurate at the customer level – an important feature when results are desired for various customer segments in addition to the average for all participants. In addition, individual customer regressions can be employed to describe accurately the distribution of customer load reductions as well as the distribution of percent load reductions.

16 To better understand this, consider the following simplified example. Suppose the true reference load and percent load reduction were 1,000 MW and 15 percent, respectively. An upward bias of 5 percent in the reference load estimate would result in a load impact estimate of 200 MW (1,050 – 850) using day-matching methods, an overestimate of 33 percent. This occurs with day-matching methods because the load impacts are estimated as the difference between the reference load and the actual load. In contrast, with regression methods, the same bias in the reference load would lead to either no error or, at most, an error of 5 percent because the impacts are based on the regression coefficients—that is, the load impacts reflect the difference between the predicted load with the demand response event and the load without the event. No error would occur if the regressions were properly specified and the parameters were designed to capture absolute impacts. A five percent error would occur if the parameters were designed to capture percent load reductions.

The main regression alternatives, panel regressions and segmented aggregate time series, were not selected due to the unique features of the data and the evolving customer mix and enrollment rates over time. Unlike individual customer regressions, panel regressions can make use of both control groups and pre-enrollment data and can provide very robust average customer impact estimates by controlling for omitted variables.17 While panel regression can increase the accuracy of the impact estimates for the average customer, it cannot be employed to describe meaningfully the distribution of impacts among the participant population. Importantly, the lack of data on the type and size of air conditioners at the customer level precluded the use of panel regression. Because air conditioning is a key driver of electricity demand that interacts with weather, its omission in a panel regression would likely lead to inaccurate results. The other alternative, running time series regressions on customer load aggregated by segment, could not adequately control for the evolving customer mix or provide insights into the distribution of impacts among the participant population. Except for the lower amount of effort required, segmented time series did not yield methodological benefits that were not also captured through individual customer regressions.

The following discussion summarizes the methodology employed in the analysis of residential and commercial customer load impacts. It includes a description of the regression model and explanatory variables employed in the analysis, a summary of the goodness-of-fit statistics, a comparison of the regression predicted values with the actual average hourly demand from the interval data, and a discussion of the treatment of missing values.

3.1. RESIDENTIAL REGRESSIONS

The impact estimates presented in the next section are based on time-series regressions for individual customers. The analysis is based on a proportional random sample of approximately 2,000 customers drawn from the participant population of roughly 10,000.18 The dependent variable in each regression is average hourly demand (kW).
The explanatory variables can be grouped into three main categories:

Variables that reflect the average load shape of customers, absent the need for cooling;
Variables that explain deviations in hourly usage from the average load shape; and
Variables that estimate the change in energy use during event days and the factors that influence the load reductions.

The explanatory variables include hourly binary variables to capture the inherent variation in usage across hours of the day; day-of-week binary variables to capture variation in usage between weekdays and weekends and across weekdays; weather variables to capture the influence of temperature on electricity use; and event-day and event-hour variables to estimate the impact of the higher SmartDay prices on energy use during each hour of the event period as well as the hours leading up to and following the event period. The event variables are interacted with weather, day of week, month, number of consecutive event days, and cumulative number of events throughout the season in order to explain how the impacts vary as a result of changes in those conditions. For customers for whom event notification was unsuccessful or who elected not to be notified, the relevant event days were treated as non-event days in terms of the model specification.

17 Panel regression can account for omitted variables that are unique to customers but fixed over time (fixed effects), such as household income, and can also account for omitted variables that are common across the participant population but unique to specific time periods. It cannot, however, account for omitted variables that vary both by participant and by time period, or for household characteristics (e.g., central air conditioning) that interact with variables that vary over time, such as weather.

18 The exact number of participants and sample points varies by date, as enrollment grew over the 2008 summer months.
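To make the variable construction concrete, the sketch below builds the cooling-degree-hour terms and the hour-by-hour event interactions from hourly interval data. It is a minimal illustration only; the column names and values are hypothetical, not drawn from the report's data, and pandas is assumed:

```python
import pandas as pd

# Hypothetical hourly interval data for one customer.
df = pd.DataFrame({
    "hour":   [13, 14, 15, 16],           # hour of day (1-24)
    "temp_f": [68.0, 95.0, 101.0, 88.0],  # hourly temperature (F)
    "event":  [0, 0, 1, 1],               # SmartDay event-day flag
})

# Cooling degree hours: Max(0, Temperature(F) - 70), and its square.
df["cdh"] = (df["temp_f"] - 70.0).clip(lower=0)
df["cdh2"] = df["cdh"] ** 2

# Hourly binary variables, interacted with CDH (the HOURi * CDH terms).
hour_dummies = pd.get_dummies(df["hour"], prefix="hour").astype(float)
hour_x_cdh = hour_dummies.mul(df["cdh"], axis=0)

# Event interactions (the HOURi * EVENTDAY * CDH terms).
event_hour_x_cdh = hour_x_cdh.mul(df["event"], axis=0)

print(df["cdh"].tolist())  # [0.0, 25.0, 31.0, 18.0]
```

The fitted coefficients on such interaction columns are what carry the estimated load impacts in this specification, which is why biased reference loads matter less than they do under day-matching.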
However, when the average and aggregate impact estimates were developed from the individual customer regressions, impacts for these non-notified customers were assumed to equal zero. Thus, the average and aggregate values reported in Sections 4 and 5 reflect the fact that some customers do not reduce load due to non-notification.

The model specification was intentionally designed to capture a wide variety of household operating schedules as well as different hourly responses to weather and event conditions. The specification performed well for most customers, although for specific customers, some of the parameters may have been irrelevant.19 Given the large number of regressions (roughly 2,000), it was not feasible to customize the regressions for each customer. Importantly, the model performed well in the aggregate, as shown below in Sections 3.1.1 and 3.1.2. The regressions were estimated using generalized least squares (GLS) and Huber-White robust standard errors in order to ensure that the confidence bands around the impact variables were not overstated due to auto-correlation or heteroskedasticity.20 The following equation summarizes the model specification:

KW = α0 + Σ(i=2..24) βi·HOURi·NS·WEEKDAY + Σ(i=2..24) γi·HOURi·S·WEEKDAY + Σ(i=2..24) δi·HOURi·WEEKEND
   + Σ(k=7..10) φk·MONTHk + Σ(i=1..24) µi·HOURi·CDH + Σ(i=1..24) ηi·HOURi·CDH2 + ψ·S·CDH + ζ·S·CDH2
   + Σ(i=1..24) πi·HOURi·EVENTDAY + Σ(i=1..24) θi·HOURi·EVENTDAY·CDH + Σ(i=1..24) λi·HOURi·EVENTDAY·CDH2
   + Σ(k=6..10) ςk·MONTHk·EVENT + Σ(k=1..3) υk·INAROWk·EVENT + ω·CUMEVENTS·EVENT + Σ(l=2..7) ξl·DOWl·EVENT + ε

Where:
KW = Electricity usage in Hour i for Customer j
NS = No School (period during the summer when school is NOT in session)
S = Period during the summer when school is in session
WEEKDAY = Monday – Friday
WEEKEND = Saturday – Sunday
HOURi = Hours of the day, numbered 1-24
MONTHk = Months of the year, numbered 1-12
CDH = Cooling Degree Hours for that hour of the day, defined as Max(0, Temperature(F) − 70)
CDH2 = CDH squared
EVENTDAY = SmartRate event day (all 24 hours)
EVENT = SmartRate event window (2-7 pm)
INAROW = Number of consecutive events in a row
CUMEVENTS = Cumulative number of events in the season
DOW = Day of week
ε = the error term
i = Subscript indicating the hour of day (1-24)
j = Subscript indicating the month of the year (1-12)
k = Subscript indicating the number of consecutive events in a row
l = Subscript indicating the day of week (1-7)

19 Irrelevant parameters can lead to wider standard errors, but do not bias the significant parameters. Given that the number of observations per regression generally exceeded 2,000, statistical power was not a major concern.

20 The GLS method used relied on the Prais-Winsten technique, a form of iterated GLS.

3.1.1. Goodness of Fit Measures

Although the regressions were performed at the individual customer level, from a policy standpoint the focus is less on how the regressions perform for individual customers than on how they perform for the average participant and for specific customer segments. Overall, individual customers exhibited more variation and less consistent energy use patterns than the aggregate participant population. Likewise, the regressions better explained the variation in electricity consumption and load impacts for the average customer (or the average customer within a specific segment) than for individual customers. Put differently, it is more difficult to explain fully how a specific CARE customer behaves on an hourly basis than it is to explain how the average CARE customer behaves on an hourly basis. Because of this, we present measures of explained variation, as described by the R-squared goodness-of-fit statistic, for the individual regressions and for specific segments as well as for the average customer.
Figure 3-1 shows the distribution of R-squared values from the individual residential customer regressions. As peak-period use, annual consumption, and the ratio of summer to non-summer usage increase, the goodness-of-fit of the regressions generally improves. While the individual customer regressions do a reasonably good job of explaining the variation in electricity use, in aggregate, nearly all of the variation in energy use across hours is explained by the model specification.

Figure 3-1: Distribution of Adjusted R-squared Values from Individual Regressions (x-axis: individual customer regression adjusted R-squared, 0.00 to 1.00; y-axis: percent of accounts, 0.0 to 6.0)

When the predicted and actual values are aggregated across the individual results, the model explains 92.2 percent of the variation in energy use. Put another way, only about 8 percent of the variation in energy use over time is attributable to factors not included in the model. In order to estimate the average customer R-squared values, the regression-predicted and actual electricity usage values were averaged across all customers for each date and hour. This process produced regression-predicted and actual values for the average customer, which enabled the calculation of errors for the average customer and the calculation of the R-squared value. The same process was performed to estimate the amount of explained variation for the average customer in specific segments. The R-squared values for the average participant and for the average customer by segment were estimated using the following formula:21

R² = 1 − [ Σt (ŷt − yt)² ] / [ Σt (yt − ȳ)² ]

Where:
yt is the actual energy use at time t
ŷt is the regression-predicted energy use at time t
ȳ is the actual mean energy use across all time periods.

21 Technically, the R-squared value needs to be adjusted based on the number of parameters and observations from each regression.
Given that the number of observations per regression was typically over two thousand, the effects of the adjustment were anticipated to be minimal. As a result, the unadjusted R-squared is presented in order to avoid the complication of tracking the number of observations and parameters from each individual regression.

Table 3-1 summarizes the amount of variation explained by the regression model for the average customer within specific segments. Overall, depending on the specific group assessed, roughly 80 to 93 percent of the variation is explained through the individual regressions.

Table 3-1: Adjusted R-squared Values for the Average Customer by Segment

Annual Consumption (kWh)   R²      Ratio of Summer to Non-Summer Usage   R²      CARE Status   R²
5,000 or less              0.89    <100%                                 0.91    Non-CARE      0.91
5,000 to 7,500             0.78    100%-125%                             0.89    CARE          0.93
7,500 to 10,000            0.90    125%-150%                             0.88
10,000 to 12,500           0.93    150%-175%                             0.94
12,500 to 15,000           0.85    175%-200%                             0.93
15,000+                    0.92    200%+                                 0.85

3.1.2. Model Accuracy and Validity Assessment

The most important feature of load impact analysis is the ability to predict accurately customer load and load reductions under the extreme conditions for which demand response is designed to provide a reliable resource. Unlike with day-matching methods, the accuracy of the load impact estimates depends more on the accuracy of the regression coefficients representing the load impacts than on how well the regression predicts customer load. Put differently, properly designed regressions can accurately estimate load impacts and are more robust to over- or under-predictions of hourly energy consumption than day-matching methods. For SmartRate, not only are we confident that the load impact parameters are accurate, but the regression-predicted values of energy consumption closely mirror and are often nearly indistinguishable from actual energy consumption, further validating the accuracy of the load impact estimates.
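The average-customer comparisons used throughout this section rest on the averaging-then-scoring procedure described in Section 3.1.1: average the actual and predicted series across customers for each date-hour, then compute R-squared on the averaged series. A numpy sketch with made-up load values:

```python
import numpy as np

# Made-up kW values: rows are customers, columns are date-hours.
actual = np.array([[1.0, 2.0, 3.0, 2.5],
                   [0.5, 1.5, 2.0, 1.0],
                   [2.0, 2.5, 4.0, 3.0]])
predicted = np.array([[1.1, 1.9, 3.1, 2.4],
                      [0.6, 1.4, 2.1, 1.1],
                      [1.9, 2.6, 3.8, 3.1]])

# Average across customers for each date-hour to form the "average customer".
y = actual.mean(axis=0)
y_hat = predicted.mean(axis=0)

# R^2 = 1 - sum_t (y_hat_t - y_t)^2 / sum_t (y_t - y_bar)^2
r_squared = 1.0 - np.sum((y_hat - y) ** 2) / np.sum((y - y.mean()) ** 2)

print(round(r_squared, 3))  # 0.998
```

Because individual over- and under-predictions partly cancel when customers are averaged, the average-customer R-squared is typically higher than the individual-customer values, which is the pattern reported in Table 3-1.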
To assess the accuracy and validity of the model, we compared actual and predicted values during event days by hour and by temperature. In addition, given the estimated differences in response between CARE and non-CARE customers discussed in Section 4, we also present comparisons of actual and regression-predicted values for those customer segments. Figure 3-2 shows the actual average hourly energy use of customers during event days compared to the regression-predicted average customer energy use with and without demand response. The close match between predicted values with demand response and actual values reflects the ability of the regressions to predict accurately under event conditions. Figure 3-3 compares the actual and predicted values by temperature, based on data from the nine event days, and illustrates the model's ability to predict accurately customer behavior under event conditions for a wide range of temperatures. It also illustrates that, in general, load response increases as temperature increases. Figure 3-4 compares the actual and regression-predicted values for CARE and non-CARE customers during event days. For each figure, the relevant comparison of accuracy is between the actual load under event conditions (the solid line) and the regression-predicted load under the same conditions (the solid line with squares). For most graphs, the two are nearly indistinguishable. In addition, for informational purposes, we have included the regression-predicted values absent the SmartDay event (the dashed line). All of the comparisons are for time periods with actual interval data reads. Estimates of missing values were not included in the comparisons or in the estimated regressions. As seen, the regressions predict the behavior of both CARE and non-CARE enrollees in Bakersfield extremely well.
Figure 3-2: Average Residential Customer Actual and Predicted Values for the Average SmartDay (average kW by hour ending, 0-24; series: actual kW, predicted without DR, predicted with DR)

Figure 3-3: Average Residential Customer Actual and Predicted Values by Temperature for Event Days (average kW by temperature, 65-110 F; series: actual kW, predicted without DR, predicted with DR)

Figure 3-4: Comparison of CARE and Non-CARE Participants' Actual and Predicted Values for Event Days (average kW by hour ending for CARE and non-CARE customers; series: actual kW, predicted without DR, predicted with DR)

Similar comparisons of actual and predicted values were conducted by month, day of week, individual event days, and various other iterations, all of which indicated that the results were unbiased not only for the average day and average customer, but also across multiple customer segments and temporal characteristics.

3.1.3. Treatment of Missing Values

The regression analysis and the comparison of actual versus predicted values were based on interval meter reads. While some customers had missing data for one or two event days, information was still available on their actual behavior during other event days and on how their load reductions vary with different drivers, such as weather and day of week. Given the accuracy of the regressions during both event and non-event conditions, regression-predicted values were used to fill in gaps associated with missing values for each individual customer in the sample. In other words, the regressions were not based on any missing or estimated values, but instead were employed to fill in missing values. Overall, for residential customers, less than 2.5 percent of the event day values were missing.
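The fill-in step just described amounts to replacing each customer's missing interval reads with that customer's own regression predictions. A minimal sketch (the values are hypothetical; in the actual analysis the predictions come from the fitted individual customer regressions):

```python
import numpy as np

# Hypothetical hourly kW reads for one customer; NaN marks missing intervals.
observed = np.array([1.2, np.nan, 2.8, 3.1, np.nan, 1.9])

# Regression-predicted kW for the same hours (stand-in values here).
predicted = np.array([1.1, 1.6, 2.7, 3.0, 2.4, 2.0])

# Keep actual reads where available; fill gaps with the predictions.
filled = np.where(np.isnan(observed), predicted, observed)

print(filled.tolist())  # [1.2, 1.6, 2.8, 3.1, 2.4, 1.9]
```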
The share of missing values during the 2-7 pm event window was substantially lower, at less than 1.5 percent. However, missing values were concentrated on four event days. Figure 3-5 shows the percent of missing values for residential customers on each SmartRate event day, for both the entire day and the event window. The events on July 8th, 9th, and 10th had the highest share of missing values, but also took place when less than a third of the participant population had enrolled. The August 29th event also had a relatively high share of missing values for the day, over 6 percent, but over 99 percent of the data during the event window were available. For all other event days, less than 1 percent of the data was missing.

Figure 3-5: Percent of Missing Values by Event Day and Hour for Residential Customers (percent estimated for the full day and for event hours, by event day: July 8, 9, 10; August 27, 28, 29; September 3, 4, 5)

During non-event days, there was substantial variation in load and temperatures, allowing the regressions to explain variation in hourly energy consumption under a wide range of conditions. This is true despite a network upgrade and substation outages on six non-event days, during which data could not be collected on actual energy consumption for most customers. The data blackouts did not occur on SmartDays and affected all customers. As a result, given the use of individual customer regressions, the missing data could not bias the regression coefficients or the estimates of load impacts.

3.1.4. Estimation of Ex Post Load Impact Confidence Bands

Because the results focus on historical event days, the only uncertainty associated with the estimates is due to statistical uncertainty from the regressions.
This statistical uncertainty is reflected in the standard error of the regression, which is based on the regression errors or, more specifically, the root mean squared error. The standard errors from the individual regressions are normally distributed, and it is reasonable to assume that they are independent from each other given that the regressions control for many of the factors common across participants, including weather and event day response. These two features, normality and independence of the distributions, allow the combined uncertainty of the load impacts to be calculated through the following relatively simple formulas:

σAverage_Customer = √(σ1² + σ2² + … + σn²) / n

σAggregate = √(σ1² + σ2² + … + σn²)

3.2. NON-RESIDENTIAL REGRESSION MODELS

The impact estimates for non-residential customers were based on time-series regressions for each participant for which adequate data were available. In total, up to 209 non-residential accounts were enrolled in SmartRate during 2008, although the number varied depending on the date. The final estimation was based on 186 individual customer regressions. Customers that were not enrolled for any events and accounts that had more than fifteen percent of the relevant observations missing were excluded from the analysis. The treatment of missing values is detailed in Section 3.2.3.

As with the residential models, the dependent variable in each regression is the average hourly demand (kW). The explanatory variables can be grouped into three main categories:

Variables that reflect the average load shape of customers, absent the need for cooling;
Variables that explain deviations in hourly usage from the average load shape; and
Variables that estimate the change in energy use during event days and the factors that influence the load reductions.

The explanatory variables are similar to those employed in the residential analysis, with a few exceptions.
Explanatory variables include hourly binary variables for weekdays and weekends to capture the inherent variation in usage across hours of the day absent weather effects; weather variables to capture the influence of temperature on electricity use; and event-day variables and interactions to estimate the impact of the higher SmartDay prices on energy use during each hour of the event period as well as the hours leading up to and following the event period. In addition, the event variables were interacted with temperature, number of consecutive event days, cumulative event days, day of week, and month variables in order to explain how load reductions vary with those factors.

The primary difference between the residential and non-residential customer regressions lies in the operating schedules of businesses and homes. By modeling the effect of temperature on each hour separately, the regression identifies differences in operating schedules and in when customers use air conditioning. The specification was intentionally designed to capture a wide variety of operating schedules as well as different hourly responses to weather and event conditions. For residential customers, the regressions explicitly modeled the effects of the school calendar on the household operating schedule and energy consumption; this was not included in the non-residential regression specification. In addition, for non-residential customers, no distinction was made based on whether or not specific accounts were sent notification of events. Although roughly half the accounts chose not to provide contact information for event day notifications, they nonetheless provided substantial impacts. The regressions were developed using the same GLS estimator and robust standard error techniques used for the residential sector analysis. The following equation summarizes the model specification:
KW = α0 + Σ(i=2..24) βi·HOURi·WEEKDAY + Σ(i=2..24) δi·HOURi·WEEKEND + Σ(k=7..10) φk·MONTHk
   + Σ(i=1..24) µi·HOURi·CDH + Σ(i=1..24) ηi·HOURi·CDH2
   + Σ(i=1..24) πi·HOURi·EVENTDAY + Σ(i=1..24) θi·HOURi·EVENTDAY·CDH + Σ(i=1..24) λi·HOURi·EVENTDAY·CDH2
   + Σ(k=6..10) ςk·MONTHk·EVENT + Σ(k=1..3) υk·INAROWk·EVENT + ω·CUMEVENTS·EVENT + Σ(l=2..7) ξl·DOWl·EVENT + ε

Where:
KW = Electricity usage in Hour i for Customer j
WEEKDAY = Monday – Friday
WEEKEND = Saturday – Sunday
HOURi = Hours of the day, numbered 1-24
MONTHk = Months of the year, numbered 1-12
CDH = Cooling Degree Hours for that hour of the day, defined as Max(0, Temperature(F) − 70)
CDH2 = CDH squared
EVENTDAY = SmartRate event day (all 24 hours)
EVENT = SmartRate event window (2-6 pm)
INAROW = Number of consecutive events in a row
CUMEVENTS = Cumulative number of events in the season
DOW = Day of week
ε = the error term
i = Subscript indicating the hour of day (1-24)
j = Subscript indicating the month of the year (1-12)
k = Subscript indicating the number of consecutive events in a row
l = Subscript indicating the day of week (1-7)

3.2.1. Goodness of Fit Measures

Figure 3-6 describes the distribution of R-squared values for the individual non-residential customer regressions. While the individual customer regressions did a good job of explaining the variation in electricity use, in aggregate, nearly all of the variation in energy use across hours was explained by the model specification. When the predicted and actual values were aggregated across the individual results, the model explains roughly 92.9 percent of the variation in energy use.

Figure 3-6: Distribution of Adjusted R-squared Values from Individual Non-Residential Customer Regressions (x-axis: adjusted R-squared, 0.00 to 1.00; y-axis: percent of accounts, 0.0 to 10.0)

3.2.2.
Model Accuracy and Validity Assessment

With any load impact analysis, the most important feature is the ability to predict accurately customer load and load reductions under the extreme conditions for which demand response is designed to provide a reliable resource. For the non-residential customers, the focus on accuracy was even more important due to the relatively high share of missing values, which is detailed further in the next section. Highly accurate regressions can be used to fill in data gaps, provided they accurately predict actual values and explain the behavior of customers during both event and non-event conditions.

To assess the accuracy and validity of the model, we compared actual and predicted values by hour and by temperature during event days. Importantly, missing values and the regression-based predictions of those values were excluded from the accuracy comparisons. Figure 3-7 compares the actual average hourly energy use of non-residential customers across event days to the regression-predicted values under event and non-event conditions. Figure 3-8 compares the actual and predicted values at various hourly temperatures and illustrates that the model predicts accurately across the full range of temperatures. As in the residential section, for each figure the relevant comparison of accuracy is between the actual load under event conditions (the solid line) and the regression-predicted load under the same conditions (the solid line with squares). We have included the regression-predicted values absent the SmartDay event (the dashed line) for informational purposes only. In all of the comparisons, the actual and regression-predicted values under event conditions are virtually identical. In other words, the regressions predict how customers behave under event conditions nearly perfectly. The same is true for non-event days.
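Comparisons like those in Figures 3-7 and 3-8 can be generated by averaging actual and regression-predicted loads by hour across event days and inspecting the gap. A small pandas sketch with made-up observations (none of these values are from the report):

```python
import pandas as pd

# Made-up event-day observations: (hour, actual kW, regression-predicted kW).
rows = [(14, 2.9, 2.8), (14, 3.1, 3.0), (15, 3.4, 3.5),
        (15, 3.6, 3.6), (16, 3.8, 3.7), (16, 3.6, 3.7)]
df = pd.DataFrame(rows, columns=["hour", "actual_kw", "predicted_kw"])

# Average actual and predicted load for each hour across event days.
by_hour = df.groupby("hour")[["actual_kw", "predicted_kw"]].mean()

# Gap between predicted and actual for each hour; near zero if unbiased.
by_hour["gap_kw"] = by_hour["predicted_kw"] - by_hour["actual_kw"]

print(by_hour["gap_kw"].round(2).tolist())  # [-0.1, 0.05, 0.0]
```

The same grouping can be done on temperature bins instead of hours to reproduce a Figure 3-8-style comparison.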
Figure 3-7: Actual and Predicted Values for the Average SmartDay, Average C&I Customer (average kW by hour ending, 0-24; series: actual kW, predicted without DR, predicted with DR)

Figure 3-8: Actual and Predicted Values by Temperature for Event Days, Average Non-residential Customer (average kW by temperature, 65-110 F; series: actual kW, predicted without DR, predicted with DR)

3.2.3. Treatment of Missing Values

The non-residential regressions were based on actual interval meter reads. The regressions were used to fill in missing values during event and non-event days. For almost all customers with missing data during an event, data on their actual behavior across multiple other event days were generally available, enabling the development of regression-based estimates. In total, 20 out of the 210 accounts were not included in the analysis due to the relatively high share of missing values (over 15 percent). Although there is no inherent reason to assume that the load reductions from those customers were any different from those in the estimating sample, assuming the worst case, zero or negative load reductions, for those customers would have a minimal impact on the overall non-residential load impacts, and a far smaller impact on the SmartRate aggregate load reduction.

Overall, 7.4 percent of the average hourly demand values were missing for event days. The proportion of missing values was even higher during the 2-6 pm peak period, where roughly 11.3 percent of the interval data was missing. Missing values were far more prevalent for non-residential customers than for residential customers. Figure 3-9 shows the percent of missing values on each event day for both the entire day and the event window. In contrast to the residential sector, missing values were more prevalent during the event window hours. In addition, the missing values were not clustered among a subset of event days.
That being said, given the accuracy of the regressions, it is possible to estimate missing values reliably through them.

Figure 3-9: Percent of Missing Values by Event Day and Hour for Non-residential Customers (percent estimated for the full day and for event hours, by event day: July 8, 9, 10; August 14, 27, 28, 29; September 3, 4, 5)

4. RESIDENTIAL EX POST LOAD IMPACTS

As indicated in Section 3, the load impact analysis was based on individual customer regressions. The use of individual customer regressions allowed for easy development of load impact estimates for a wide variety of customer segments, distinguished by rate class (E-1 and E-8, and CARE versus non-CARE, for residential customers; A-1 for non-residential customers), average annual energy use, and the ratio of summer to non-summer energy use (as a proxy for air conditioning saturation). Load impact estimates are available for each hour of each event day as well as for the average event day, on both an aggregate and a per-enrolled-customer basis. Some of these detailed results are contained in the appendices, while other detailed results (e.g., hourly impacts) for some SmartRate customer segments have been filed electronically with the CPUC. In this section, we provide a summary of the much more detailed analysis that is available.

4.1. LOAD IMPACTS FOR E-1 CUSTOMERS

Figure 4-1 and Table 4-1 summarize the load impacts for the average E-1 SmartRate customer for the average event day in 2008. Similar figures are shown in Appendix A for the average E-1 customer on each of the nine event days called in 2008. The average impacts for each event day are summarized and discussed later in this section. As seen in Figure 4-1, the average hourly load reduction for E-1 customers across the nine event days was 0.4 kW, a reduction of 16.6 percent.
As indicated in the graph in Figure 4-1 (and in Table 4-1), there was no pre-cooling prior to the event period (which starts at 2 pm). Indeed, there was a modest drop in load relative to the reference load prior to the beginning of the event period. Some payback in cooling occurred after the event period, as evidenced by the observed load exceeding the reference load starting after 7 pm and continuing throughout the evening. As seen at the bottom of Table 4-1, the overall change in energy use on the average event day was a reduction of 4.33 percent, or about 1.56 kWh.

Table 4-1 also shows a modest decrease in the percent load reduction across the five event-period hours. In the hour ending at 3 pm, the load drop was 17.8 percent; it falls to roughly 15 percent in the hour ending at 7 pm. The hour with the largest absolute load drop was from 5 pm to 6 pm, when the load reduction for the average customer equaled 0.44 kW. The first and last hours of the event period had the lowest absolute load reductions, at 0.37 kW. The aggregate average hourly load drop for the 8,758 customers included in the average event calculation equaled 3.5 MW. It should be noted that the number of enrolled customers varied significantly between the early event periods in July, when the number of enrolled customers was less than 4,000, and the events in September, when enrollment exceeded 9,900. If the average reduction across all nine events were grossed up based on the higher September enrollment figures rather than on average enrollment across all nine event days (to reflect a steady-state 2008 enrollment level), the aggregate impact would be almost 4 MW.

Load impacts were also estimated for customers segmented according to the two CAISO Local Capacity Areas (LCA) represented in the 2008 SmartRate sample.22 7,544 SmartRate participants were located in the Kern County LCA and 1,299 were in the Other LCA.
The average load reduction for the Kern County LCA segment was 17.5 percent, or 0.44 kW. The aggregate average load reduction for this LCA across the nine event days totaled 3.32 MW. The average reduction for customers in the Other LCA was 11.2 percent, or 0.20 kW. The aggregate load reduction for the Other LCA was 0.27 MW.

Figure 4-1: Load Impacts for the Average E-1 Customer for the Average Event Day
Type of Results: Avg. Customer; Customer Class: Residential; Participant Category: E-1; Event: Average Event; Event Date: Average Event; Event Notification: Day Ahead; Event Start: 14:00; Event End: 19:00; Accounts for Event: 8,758; Total Enrolled Accounts: 8,758; Avg. Load Reduction for Event Window: 0.40 kW; % Load Reduction for Event Window: 16.6%. (Chart: reference load and observed load, average customer kW by hour ending.)

22 The estimates presented for each LCA include both E-1 and E-8 customers. There were only 82 E-8 customers enrolled in the SmartRate program in 2008.
Table 4-1: Load Impacts by Hour for the Average E-1 Customer for the Average Event Day
(Percentile columns show the uncertainty-adjusted impact, in kW.)

Hour    Reference  Observed   Load        % Load     Weighted   10th   30th   50th   70th   90th
Ending  Load (kW)  Load (kW)  Impact (kW) Reduction  Temp (F)
1:00    1.05       1.09       -0.04       -4.16%     80.0       -0.05  -0.04  -0.04  -0.04  -0.04
2:00    0.92       0.96       -0.04       -3.99%     78.9       -0.04  -0.04  -0.04  -0.04  -0.03
3:00    0.83       0.85       -0.02       -2.73%     77.6       -0.02  -0.02  -0.02  -0.02  -0.02
4:00    0.78       0.80       -0.02       -1.97%     75.9       -0.02  -0.02  -0.02  -0.01  -0.01
5:00    0.76       0.77       -0.01       -0.78%     74.9       -0.01  -0.01  -0.01  -0.01  0.00
6:00    0.75       0.77       -0.01       -1.97%     73.7       -0.02  -0.02  -0.01  -0.01  -0.01
7:00    0.83       0.83       -0.01       -0.97%     73.0       -0.01  -0.01  -0.01  -0.01  -0.01
8:00    0.83       0.87       -0.04       -4.52%     75.4       -0.04  -0.04  -0.04  -0.04  -0.04
9:00    0.83       0.85       -0.02       -2.98%     80.6       -0.03  -0.03  -0.02  -0.02  -0.02
10:00   0.93       0.94       -0.01       -1.13%     85.2       -0.01  -0.01  -0.01  -0.01  -0.01
11:00   1.14       1.10       0.04        3.59%      89.9       0.04   0.04   0.04   0.04   0.04
12:00   1.38       1.30       0.08        5.51%      93.6       0.07   0.08   0.08   0.08   0.08
13:00   1.65       1.54       0.11        6.70%      96.6       0.11   0.11   0.11   0.11   0.11
14:00   1.86       1.72       0.14        7.37%      98.5       0.14   0.14   0.14   0.14   0.14
15:00   2.07       1.70       0.37        17.78%     100.3      0.37   0.37   0.37   0.37   0.37
16:00   2.29       1.89       0.40        17.35%     101.3      0.39   0.40   0.40   0.40   0.40
17:00   2.50       2.08       0.42        16.64%     101.9      0.41   0.42   0.42   0.42   0.42
18:00   2.60       2.17       0.44        16.72%     101.6      0.43   0.43   0.44   0.44   0.44
19:00   2.51       2.13       0.37        14.94%     100.1      0.37   0.37   0.37   0.38   0.38
20:00   2.35       2.37       -0.02       -0.81%     97.1       -0.02  -0.02  -0.02  -0.02  -0.02
21:00   2.19       2.37       -0.18       -8.33%     93.4       -0.18  -0.18  -0.18  -0.18  -0.18
22:00   1.98       2.15       -0.17       -8.71%     89.7       -0.17  -0.17  -0.17  -0.17  -0.17
23:00   1.65       1.77       -0.12       -7.11%     86.6       -0.12  -0.12  -0.12  -0.12  -0.12
0:00    1.32       1.40       -0.08       -6.44%     83.7       -0.09  -0.09  -0.08  -0.08  -0.08

Daily totals: Reference Energy Use 35.99 kWh; Observed Energy Use 34.43 kWh; Change in Energy Use 1.56 kWh; % Daily Load Reduction 4.33%; Daily Cooling Degree Hours (Base 75) 130.9; Uncertainty-Adjusted Impact (10th through 90th percentiles) 1.56 kWh.

Table
4 - 2 summarizes the average absolute and percent reduction in energy use across the five-hour SmartRate period for each SmartDay. The table also shows the average maximum and minimum temperature for each SmartDay and the number of enrolled customers underlying each event-day estimate. It should be noted that the number of enrolled customers underlying the July SmartDay estimates (roughly 3,650) was substantially smaller than the number underlying the last six SmartDay impact estimates, which were based on more than 9,900 enrolled customers. Other differences across the event-day periods should also be noted. For example, the three-day period in July had an average maximum temperature of 109°F, while the average maximum temperature for the three-day August period was 102°F and the average for the September period was 99°F. The minimum temperature (which typically occurs around 6 am or 7 am) was also substantially lower in September than in July.

As seen in Table 4 - 2, the hourly impacts for the average customer during the three July SmartDays were substantially greater in both absolute and percentage terms than during the August or September event days. The average percent reduction across the five-hour event period on July 8th, the first event day, equaled 19.2 percent. The percent load reduction was 19.1 percent on the second July day and 22.5 percent on the third. The average reduction over the three July days was 20.3 percent (0.62 kW). During the three-day period in late August, leading up to the Labor Day weekend, the average reduction was 14.9 percent (0.42 kW), and the average during the early September period, immediately after Labor Day, was roughly the same, 14.4 percent (0.29 kW).
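The hourly entries in tables such as Table 4 - 1 are the difference between the regression-based reference load and the observed load, with the percent reduction expressed relative to the reference. A minimal sketch of that arithmetic (the helper function is ours, not FSC's; the inputs are the published loads for the five event hours, and small differences from the published impacts reflect rounding):

```python
# Sketch of the hourly impact arithmetic behind Table 4-1.

def hourly_impacts(reference_kw, observed_kw):
    """Return per-hour (impact_kw, pct_reduction) pairs."""
    rows = []
    for ref, obs in zip(reference_kw, observed_kw):
        impact = ref - obs                            # positive = load reduction
        rows.append((round(impact, 2), round(100.0 * impact / ref, 2)))
    return rows

reference = [2.07, 2.29, 2.50, 2.60, 2.51]   # kW, hours ending 15:00-19:00, Table 4-1
observed  = [1.70, 1.89, 2.08, 2.17, 2.13]   # kW, same hours
rows = hourly_impacts(reference, observed)    # first hour: 0.37 kW reduction
```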
Table 4 - 2: Load Impacts by Event Day for E-1 Customers

| Date | Day of Week | # of Enrolled Customers [23] | Max Temp (°F) | Min Temp (°F) | Avg Hourly Load (kW), 2–7 pm | Avg Load Reduction (kW), 2–7 pm | Avg % Load Reduction, 2–7 pm |
|---|---|---|---|---|---|---|---|
| 7/8 | T | 3,279 | 108 | 79 | 2.69 | 0.56 | 19.2 |
| 7/9 | W | 3,688 | 108 | 84 | 3.01 | 0.57 | 19.1 |
| 7/10 | Th | 3,977 | 111 | 84 | 3.19 | 0.72 | 22.5 |
| 8/27 | W | 9,581 | 98 | 72 | 2.16 | 0.27 | 12.4 |
| 8/28 | Th | 9,561 | 102 | 74 | 2.50 | 0.42 | 17.0 |
| 8/29 | F | 9,544 | 106 | 77 | 2.82 | 0.56 | 19.7 |
| 9/3 | W | 9,924 | 98 | 66 | 1.91 | 0.24 | 12.7 |
| 9/4 | Th | 9,913 | 100 | 67 | 2.07 | 0.30 | 14.7 |
| 9/5 | F | 9,913 | 101 | 71 | 2.16 | 0.34 | 15.7 |
| Avg | n/a | 8,758 | 102 | 73 | 2.39 | 0.40 | 16.6 |

[23] These values represent the number of enrolled customers included in the estimation, which may differ moderately from the actual number of customers enrolled due to missing data or other factors.

It should also be noted that the average reference load dropped significantly across the three event-day sequences. The September event days were significantly cooler, and had much lower reference loads, than the August days. The average reference load across the five-hour peak period in July was 3.04 kW, which was 22 percent higher than the August SmartDay average and nearly 50 percent larger than the September average.

Another interesting finding is that the third day of each three-day event period actually showed a larger percent reduction in peak-period load than either the first or second event day. The third day in each period was also the hottest and the day on which the reference load was highest. The net result of these factors is that the third day in each three-day sequence provided substantially larger absolute reductions in load than either of the prior days. For example, the absolute reduction on the third day of the July event period was roughly 25 percent greater than on the first and second days.
In other words, the often-expressed concern that load impacts will fall off on the third day of a multi-day heat wave does not appear to hold for this group of customers.

Table 4 - 3 shows the load impacts for E-1 customers segmented by size (i.e., average annual energy use). As seen, high energy users have higher percentage load impacts than do low energy users. The percent reduction in peak-period loads on SmartDays increased from about 12.4 percent for customers with average annual energy use below 5,000 kWh to 22 percent for customers with average annual usage between 10,000 and 12,500 kWh. The percent reduction dropped slightly for the two largest usage segments, but the absolute impact increased steadily across all six size strata. Customers with annual energy use greater than 15,000 kWh provided load impacts more than five times larger than did customers with annual usage below 5,000 kWh.

Table 4 - 3: Load Impacts for the Average Event Day by Customer Size

| Annual kWh | Average Daily Energy Use on SmartDays (kWh) | Percent Load Reduction During Peak Period | Absolute Load Reduction During Peak Period (kW) | # of Customers in Segment |
|---|---|---|---|---|
| <5,000 | 21.33 | 12.4% | 0.17 | 3,080 |
| 5,001 – 7,500 | 32.40 | 14.2% | 0.32 | 2,291 |
| 7,501 – 10,000 | 41.90 | 17.2% | 0.49 | 1,658 |
| 10,001 – 12,500 | 52.44 | 22.0% | 0.77 | 898 |
| 12,501 – 15,000 | 60.26 | 21.7% | 0.86 | 503 |
| >15,000 | 79.07 | 18.9% | 0.91 | 415 |

Table 4 - 4 shows the load impacts for E-1 customers segmented according to the ratio of summer to winter energy use, a crude proxy for air conditioning usage. Customers with summer usage less than 125% of winter usage had average reductions of roughly 14% and 0.25 kW. Customers with summer usage between 125% and 175% of winter usage provided load reductions averaging between 18 and 22 percent, with absolute reductions between 0.35 and 0.55 kW. The percent reduction in peak-period energy use was lower for customers with summer usage greater than 175% of winter usage.
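The size strata in Table 4 - 3 amount to binning each customer on annual energy use. A sketch of that bucketing, using the table's bin edges (the helper name and sample value are ours, not the report's):

```python
# Assigning customers to the annual-usage strata of Table 4-3.
import bisect

EDGES = [5000, 7500, 10000, 12500, 15000]          # kWh/year stratum boundaries
LABELS = ["<5,000", "5,001-7,500", "7,501-10,000",
          "10,001-12,500", "12,501-15,000", ">15,000"]

def usage_segment(annual_kwh):
    """Map annual energy use (kWh) to its Table 4-3 stratum label."""
    return LABELS[bisect.bisect_left(EDGES, annual_kwh)]

segment = usage_segment(11200)   # a hypothetical customer -> "10,001-12,500"
```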
Table 4 - 4: Load Impacts for the Average Event Day by the Ratio of Summer to Winter Electricity Use

| Ratio of Summer to Winter Usage | Average Daily Energy Use on SmartDays (kWh) | Percent Load Reduction During Peak Period | Absolute Load Reduction During Peak Period (kW) | # of Customers in Segment |
|---|---|---|---|---|
| Summer < Winter | 28.28 | 14.6% | 0.25 | 1,097 |
| Summer = 101–125% of Winter | 30.94 | 13.6% | 0.25 | 694 |
| Summer = 125–150% of Winter | 31.39 | 18.2% | 0.35 | 1,174 |
| Summer = 150–175% of Winter | 39.27 | 21.8% | 0.55 | 1,310 |
| Summer = 175–200% of Winter | 37.74 | 19.3% | 0.48 | 1,171 |
| Summer > 200% of Winter | 40.29 | 15.0% | 0.43 | 3,343 |

Figures 4 - 2 and 4 - 3 show the cumulative distribution of load reductions in percentage and absolute terms, respectively, plotted against the proportion of the participant population. These figures include both customers who were and who were not notified. As seen in both figures, roughly 23 percent of participants had load increases during the peak period on SmartDays; however, many of these customers were ones who were not successfully notified. At the other end of the distribution are customers who provided above-average load reductions. Roughly 30 percent of participants provided load reductions that exceeded the average reduction of 16.6 percent, and almost 20 percent provided load reductions that exceeded 30 percent.

Figure 4 - 2: Cumulative Distribution of % Load Reductions (Based on all customers, including those not notified)
[Figure: cumulative distribution of percent load reduction by customer versus proportion of accounts. Annotations: 23% had load increases on SmartDays; average load reduction = 16.6%; 30% had above-average load reductions; 20% had load reductions > 30%.]

As seen in Figure 4 - 3, roughly 25 percent of SmartRate participants provided load reductions that exceeded 0.5 kW, and approximately 13 percent provided average load reductions that exceeded 1 kW.
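The shares quoted from Figures 4 - 2 and 4 - 3 are simple tail proportions of the per-customer distribution of average event-period impacts. A sketch with a made-up sample (the real calculation uses one value per enrolled account):

```python
# Tail shares of a per-customer impact distribution, as annotated in
# Figures 4-2 and 4-3. `sample` below is hypothetical, not program data.

def distribution_shares(impacts_kw):
    """Share of accounts with load increases and with large reductions."""
    n = len(impacts_kw)
    return {
        "load_increase": sum(1 for x in impacts_kw if x < 0) / n,  # impact < 0
        "over_0.5_kw":   sum(1 for x in impacts_kw if x > 0.5) / n,
        "over_1.0_kw":   sum(1 for x in impacts_kw if x > 1.0) / n,
    }

sample = [-0.2, -0.1, 0.1, 0.3, 0.4, 0.6, 0.8, 1.2]
shares = distribution_shares(sample)   # 25% increases, 37.5% over 0.5 kW
```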
Figure 4 - 3: Cumulative Distribution of Average Event Load Reduction (Based on all customers, including those not notified)
[Figure: cumulative distribution of average event load reduction (kW) by customer versus proportion of accounts. Annotations: 23% had load increases on SmartDays; 25% of participants had load reductions > 0.5 kW; 13% of participants had load reductions > 1.0 kW.]

Table 4 - 5 shows the percent of customers who were not notified (either by choice or due to notification failure) as well as the percent of customers receiving notification by type. As seen, fewer customers were successfully notified during the first three events than during later events. On average, roughly 80 percent of customers were notified, with the majority receiving the notice via voice mail.

Table 4 - 5: Percent of Customers Receiving Notification by Type

| Date | Percent Notified | Percent Not Notified | E-mail Only | Voice-mail Only | E-mail and Voice-mail | Live Person Reached |
|---|---|---|---|---|---|---|
| 7/8/2008 | 64.7 | 35.3 | 13.8 | 35.8 | 6.4 | 8.7 |
| 7/9/2008 | 70.8 | 29.2 | 17.9 | 37.4 | 7.0 | 8.5 |
| 7/10/2008 | 75.3 | 24.7 | 18.6 | 41.0 | 8.2 | 7.6 |
| 8/27/2008 | 83.7 | 16.3 | 19.7 | 49.3 | 6.2 | 8.6 |
| 8/28/2008 | 85.0 | 15.0 | 19.7 | 50.9 | 6.2 | 8.3 |
| 8/29/2008 | 84.3 | 15.7 | 19.4 | 52.1 | 6.1 | 6.8 |
| 9/3/2008 | 80.2 | 19.8 | 18.6 | 49.3 | 6.0 | 6.4 |
| 9/4/2008 | 80.2 | 19.8 | 19.2 | 48.9 | 6.1 | 6.1 |
| 9/5/2008 | 80.2 | 19.8 | 18.3 | 50.0 | 6.4 | 5.4 |
| Average Event | 80.4 | 19.6 | 18.8 | 48.1 | 6.3 | 7.1 |

Table 4 - 6 shows the percent of customers providing load reductions that exceeded selected magnitudes. This table is based only on customers who were notified. For the average event, approximately 70 percent of customers provided at least some amount of load reduction. Approximately 55 percent of notified customers reduced load by more than 10 percent, and roughly 40 percent reduced load by 20 percent or more. Almost 19 percent of customers showed load reductions exceeding 50 percent of peak-period electricity use.
Table 4 - 6: Percent of Notified Customers Providing Load Reductions by Amount

Share of accounts providing load reductions greater than:

| SmartDay Date | 0% | 10% | 20% | 30% | 40% | 50% |
|---|---|---|---|---|---|---|
| 7/8/2008 | 74.7 | 61.7 | 49.4 | 38.5 | 31.9 | 26.0 |
| 7/9/2008 | 74.1 | 62.5 | 46.8 | 35.4 | 28.9 | 24.0 |
| 7/10/2008 | 76.5 | 63.6 | 50.2 | 38.2 | 30.1 | 25.0 |
| 8/27/2008 | 65.3 | 47.2 | 34.4 | 27.9 | 21.5 | 16.6 |
| 8/28/2008 | 74.1 | 55.5 | 38.1 | 27.6 | 22.4 | 17.5 |
| 8/29/2008 | 75.7 | 60.9 | 42.4 | 31.3 | 23.7 | 18.8 |
| 9/3/2008 | 65.7 | 50.8 | 38.1 | 30.5 | 23.8 | 18.6 |
| 9/4/2008 | 69.2 | 52.8 | 37.8 | 28.7 | 21.7 | 17.1 |
| 9/5/2008 | 70.1 | 53.1 | 39.3 | 28.9 | 22.0 | 17.2 |
| Average Event | 70.8 | 54.7 | 39.8 | 30.3 | 23.6 | 18.7 |

4.2. LOAD IMPACTS FOR E-8 CUSTOMERS

The E-8 tariff is closed to new enrollment, and less than 1 percent of participants in the SmartRate program are E-8 customers. Average E-8 enrollment over the nine SmartDay events equaled 82 customers. The average E-8 participant had twice the daily energy use on SmartDays of the average E-1 customer. With an average reduction of 29.3 percent across the nine SmartDays in 2008, E-8 customers provided nearly twice the load reduction in percentage terms of E-1 customers. In absolute terms, the average load reduction of 1.24 kW for E-8 customers was more than three times larger than for E-1 customers. Electronic tables showing the load reduction by hour for E-8 customers have been filed with the CPUC.

4.3. LOAD IMPACTS FOR CARE AND NON-CARE CUSTOMERS

As discussed in Section 2, CARE is a program offering significant rate discounts to qualifying low-income customers. The magnitude of the rate differential between CARE and non-CARE customers varies with the amount of electricity use. The CARE tariff has only two tiers rather than the five tiers associated with the E-1 tariff. In Tier 1, the CARE price per kWh is almost 30 percent less than for E-1 customers. For CARE customers who would otherwise be in Tier 5 on the E-1 rate, the marginal price is nearly 75 percent less than for E-1 customers.
On the other hand, for CARE customers on SmartRate, the ratio of the peak-period price on SmartDays to the price on normal weekdays is much higher than for non-CARE customers, because the SmartRate price differential is the same for both groups, 60¢/kWh. Thus, if CARE and non-CARE customers had the same price elasticity, one would expect to see much larger percent reductions in energy use on SmartDays for CARE customers than for non-CARE customers. As summarized below, just the opposite is true.

Figure 4 - 4 shows the reference and observed loads for CARE and non-CARE customers for the average SmartDay. As seen, CARE customers reduced load much less than did non-CARE customers. Indeed, the average load reduction across the five-hour event period for the nine SmartDays was 11.0 percent for CARE customers and 22.6 percent for non-CARE customers. This finding is consistent with what was seen in California's Statewide Pricing Pilot, in which the average load reduction for CARE customers was only 3 percent, compared with a load reduction of roughly 15.6 percent for non-CARE customers.24

Figure 4 - 4: Average Load Impacts for Nine SmartDays
[Figure: two panels showing reference and observed load (kW) by hour ending, for non-CARE customers (left) and CARE customers (right).]

Tables 4 - 7 and 4 - 8 show the average change in load in each hour on SmartDays for CARE and non-CARE customers, respectively. As seen in Table 4 - 7, the load reduction in the first hour of the event period was roughly 14 percent for CARE customers and decreased steadily across each of the five hours, falling below 9 percent in hour five. As seen in Table 4 - 8, for non-CARE customers the percent reduction was nearly constant across all five event hours, equaling 22.2 percent in the first hour and 21.2 percent in the fifth hour.
The load reduction in the hour ending at 6 pm had the highest percent reduction but, at 23.7 percent, was only approximately 2 percentage points larger than the lowest reduction, which occurred in the hour ending at 7 pm.

24 Stephen S. George and Ahmad Faruqui. Impact Evaluation of the California Statewide Pricing Pilot, Final Report. March 16, 2005. p. 77.

Table 4 - 7: Load Impacts by Hour for the Average CARE Customer for the Average Event Day
(The 10th–90th columns are the uncertainty-adjusted impact percentiles, in kW.)

| Hour Ending | Reference Load (kW) | Observed Load (kW) | Load Impact (kW) | % Load Reduction | Weighted Temp (°F) | 10th | 30th | 50th | 70th | 90th |
|---|---|---|---|---|---|---|---|---|---|---|
| 1:00 | 0.99 | 1.03 | -0.04 | -3.70% | 79.9 | -0.04 | -0.04 | -0.04 | -0.04 | -0.03 |
| 2:00 | 0.87 | 0.90 | -0.03 | -3.28% | 78.8 | -0.03 | -0.03 | -0.03 | -0.03 | -0.03 |
| 3:00 | 0.78 | 0.79 | -0.02 | -2.08% | 77.5 | -0.02 | -0.02 | -0.02 | -0.02 | -0.01 |
| 4:00 | 0.72 | 0.74 | -0.02 | -2.82% | 75.8 | -0.02 | -0.02 | -0.02 | -0.02 | -0.02 |
| 5:00 | 0.70 | 0.71 | -0.01 | -1.01% | 74.8 | -0.01 | -0.01 | -0.01 | -0.01 | 0.00 |
| 6:00 | 0.67 | 0.68 | -0.01 | -1.45% | 73.6 | -0.01 | -0.01 | -0.01 | -0.01 | -0.01 |
| 7:00 | 0.69 | 0.68 | 0.01 | 0.81% | 72.8 | 0.00 | 0.00 | 0.01 | 0.01 | 0.01 |
| 8:00 | 0.69 | 0.72 | -0.03 | -4.48% | 75.2 | -0.03 | -0.03 | -0.03 | -0.03 | -0.03 |
| 9:00 | 0.70 | 0.72 | -0.02 | -2.91% | 80.5 | -0.02 | -0.02 | -0.02 | -0.02 | -0.02 |
| 10:00 | 0.81 | 0.81 | 0.00 | -0.11% | 85.1 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 |
| 11:00 | 1.04 | 0.97 | 0.06 | 5.97% | 89.8 | 0.06 | 0.06 | 0.06 | 0.06 | 0.06 |
| 12:00 | 1.26 | 1.17 | 0.09 | 6.82% | 93.4 | 0.08 | 0.08 | 0.09 | 0.09 | 0.09 |
| 13:00 | 1.51 | 1.40 | 0.11 | 7.49% | 96.5 | 0.11 | 0.11 | 0.11 | 0.11 | 0.12 |
| 14:00 | 1.72 | 1.57 | 0.15 | 8.68% | 98.4 | 0.15 | 0.15 | 0.15 | 0.15 | 0.15 |
| 15:00 | 1.92 | 1.65 | 0.27 | 14.14% | 100.2 | 0.27 | 0.27 | 0.27 | 0.27 | 0.27 |
| 16:00 | 2.10 | 1.84 | 0.26 | 12.43% | 101.2 | 0.26 | 0.26 | 0.26 | 0.26 | 0.26 |
| 17:00 | 2.25 | 2.01 | 0.24 | 10.84% | 101.8 | 0.24 | 0.24 | 0.24 | 0.25 | 0.25 |
| 18:00 | 2.30 | 2.08 | 0.22 | 9.74% | 101.5 | 0.22 | 0.22 | 0.22 | 0.23 | 0.23 |
| 19:00 | 2.20 | 2.01 | 0.19 | 8.59% | 100.0 | 0.19 | 0.19 | 0.19 | 0.19 | 0.19 |
| 20:00 | 2.03 | 2.04 | -0.01 | -0.38% | 97.0 | -0.01 | -0.01 | -0.01 | -0.01 | -0.01 |
| 21:00 | 1.91 | 2.07 | -0.15 | -8.03% | 93.3 | -0.16 | -0.15 | -0.15 | -0.15 | -0.15 |
| 22:00 | 1.78 | 1.91 | -0.13 | -7.06% | 89.6 | -0.13 | -0.13 | -0.13 | -0.12 | -0.12 |
| 23:00 | 1.52 | 1.61 | -0.09 | -5.71% | 86.5 | -0.09 | -0.09 | -0.09 | -0.09 | -0.08 |
| 0:00 | 1.23 | 1.29 | -0.06 | -5.21% | 83.6 | -0.07 | -0.07 | -0.06 | -0.06 | -0.06 |

Daily totals: Reference Energy Use 32.38 kWh; Observed Energy Use 31.38 kWh; Change in Energy Use 1.00 kWh (3.08% daily load reduction); Daily Cooling Degree Hours (Base 75) 130.9; uncertainty-adjusted change percentiles (10th–90th): 1.00 kWh at all five percentiles.

Table 4 - 8: Load Impacts by Hour for the Average Non-CARE Customer for the Average Event Day
(The 10th–90th columns are the uncertainty-adjusted impact percentiles, in kW.)

| Hour Ending | Reference Load (kW) | Observed Load (kW) | Load Impact (kW) | % Load Reduction | Weighted Temp (°F) | 10th | 30th | 50th | 70th | 90th |
|---|---|---|---|---|---|---|---|---|---|---|
| 1:00 | 1.14 | 1.19 | -0.05 | -4.57% | 80.2 | -0.06 | -0.05 | -0.05 | -0.05 | -0.05 |
| 2:00 | 1.00 | 1.05 | -0.05 | -4.68% | 79.0 | -0.05 | -0.05 | -0.05 | -0.05 | -0.04 |
| 3:00 | 0.92 | 0.95 | -0.03 | -3.17% | 77.7 | -0.03 | -0.03 | -0.03 | -0.03 | -0.03 |
| 4:00 | 0.88 | 0.88 | -0.01 | -1.03% | 76.0 | -0.01 | -0.01 | -0.01 | -0.01 | -0.01 |
| 5:00 | 0.85 | 0.86 | 0.00 | -0.51% | 75.0 | -0.01 | -0.01 | 0.00 | 0.00 | 0.00 |
| 6:00 | 0.88 | 0.90 | -0.02 | -2.48% | 73.9 | -0.03 | -0.02 | -0.02 | -0.02 | -0.02 |
| 7:00 | 1.02 | 1.05 | -0.03 | -2.48% | 73.2 | -0.03 | -0.03 | -0.03 | -0.02 | -0.02 |
| 8:00 | 1.02 | 1.07 | -0.05 | -4.64% | 75.6 | -0.05 | -0.05 | -0.05 | -0.05 | -0.04 |
| 9:00 | 1.01 | 1.04 | -0.03 | -3.18% | 80.8 | -0.04 | -0.03 | -0.03 | -0.03 | -0.03 |
| 10:00 | 1.10 | 1.13 | -0.03 | -2.41% | 85.4 | -0.03 | -0.03 | -0.03 | -0.03 | -0.02 |
| 11:00 | 1.30 | 1.28 | 0.02 | 1.29% | 90.1 | 0.01 | 0.02 | 0.02 | 0.02 | 0.02 |
| 12:00 | 1.56 | 1.49 | 0.07 | 4.20% | 93.7 | 0.06 | 0.06 | 0.07 | 0.07 | 0.07 |
| 13:00 | 1.86 | 1.75 | 0.12 | 6.20% | 96.7 | 0.11 | 0.11 | 0.12 | 0.12 | 0.12 |
| 14:00 | 2.08 | 1.95 | 0.13 | 6.22% | 98.7 | 0.13 | 0.13 | 0.13 | 0.13 | 0.13 |
| 15:00 | 2.28 | 1.78 | 0.51 | 22.21% | 100.5 | 0.50 | 0.51 | 0.51 | 0.51 | 0.51 |
| 16:00 | 2.57 | 1.98 | 0.59 | 22.95% | 101.5 | 0.59 | 0.59 | 0.59 | 0.59 | 0.59 |
| 17:00 | 2.85 | 2.20 | 0.65 | 22.71% | 102.0 | 0.64 | 0.65 | 0.65 | 0.65 | 0.65 |
| 18:00 | 3.02 | 2.30 | 0.72 | 23.74% | 101.7 | 0.71 | 0.72 | 0.72 | 0.72 | 0.72 |
| 19:00 | 2.93 | 2.31 | 0.62 | 21.16% | 100.2 | 0.62 | 0.62 | 0.62 | 0.62 | 0.62 |
| 20:00 | 2.78 | 2.82 | -0.04 | -1.46% | 97.3 | -0.04 | -0.04 | -0.04 | -0.04 | -0.04 |
| 21:00 | 2.57 | 2.79 | -0.23 | -8.76% | 93.5 | -0.23 | -0.23 | -0.23 | -0.22 | -0.22 |
| 22:00 | 2.27 | 2.50 | -0.23 | -10.22% | 89.8 | -0.24 | -0.23 | -0.23 | -0.23 | -0.23 |
| 23:00 | 1.84 | 2.00 | -0.16 | -8.60% | 86.7 | -0.16 | -0.16 | -0.16 | -0.16 | -0.16 |
| 0:00 | 1.45 | 1.56 | -0.11 | -7.84% | 83.8 | -0.12 | -0.11 | -0.11 | -0.11 | -0.11 |

Daily totals: Reference Energy Use 41.17 kWh; Observed Energy Use 38.83 kWh; Change in Energy Use 2.34 kWh (5.69% daily load reduction); Daily Cooling Degree Hours (Base 75) 130.9; uncertainty-adjusted change percentiles (10th–90th): 2.34 kWh at all five percentiles.

Table 4 - 9 shows the average reduction in peak-period energy use for CARE and non-CARE customers for each event day. As seen, there was more variation in results across event days for CARE customers than for non-CARE customers. For example, the variation in kW reduction across SmartDays for CARE customers was more than threefold, from a high of 0.43 kW on July 8th to a low of 0.13 kW on September 3rd. For non-CARE customers, the difference was about 2.5 to 1, from a low of 0.40 kW, also on September 3rd, to a high of 1.06 kW on July 10th.

Table 4 - 9: Average Load Impacts by Event Day for CARE and Non-CARE Customers

| Event Day | CARE Percent Impact | CARE kW Impact | Non-CARE Percent Impact | Non-CARE kW Impact |
|---|---|---|---|---|
| July 8th | 16.6% | 0.43 | 21.1% | 0.70 |
| July 9th | 13.0% | 0.35 | 23.8% | 0.80 |
| July 10th | 13.5% | 0.38 | 29.6% | 1.06 |
| August 27th | 7.2% | 0.14 | 18.6% | 0.46 |
| August 28th | 11.4% | 0.26 | 23.6% | 0.68 |
| August 29th | 14.1% | 0.36 | 26.1% | 0.86 |
| September 3rd | 7.6% | 0.13 | 18.6% | 0.40 |
| September 4th | 9.1% | 0.17 | 21.2% | 0.49 |
| September 5th | 11.0% | 0.22 | 20.9% | 0.51 |
| Average Day | 11.0% | 0.24 | 22.6% | 0.62 |

There are several possible reasons why CARE customers were less price-responsive than non-CARE customers. One potential reason is that CARE customers use significantly less energy than do non-CARE customers. In the 12-month period prior to enrolling in SmartRate, CARE customers used roughly 30 percent less electricity than did non-CARE customers.
Other studies have shown a positive correlation between energy use and price responsiveness.25 Indeed, as was seen in Table 4 - 3, SmartRate customers with annual energy use less than 5,000 kWh were substantially less price-responsive than were customers with annual energy use greater than 10,000 kWh.

25 Ibid.

It should be noted, however, that a comparison of the reference load estimates for CARE and non-CARE customers on SmartDays shows that the percent of daily electricity use falling within the five-hour peak period from 2 pm to 7 pm was essentially the same for the two groups. Indeed, CARE customers used slightly more of their daily electricity during the peak period (32.4 percent) than did non-CARE customers (31.1 percent). This suggests that CARE customers may, at least on a percentage basis, have as much discretionary load as do non-CARE customers. As was seen earlier, survey data showed that roughly 71 percent of CARE customers had central air conditioning, whereas 92 percent of non-CARE customers did. Given that both segments had similar load shapes, it may be that CARE customers in the Bakersfield area have a much higher saturation of room air conditioning units or evaporative coolers than do non-CARE customers. If so, it may be easier to respond to day-ahead notification by adjusting a central air conditioning thermostat for the next day than by adjusting multiple room air conditioners or controlling an evaporative cooler, which may have only an on/off switch rather than a thermostat.

Another potential explanation for the difference in demand response between CARE and non-CARE customers is that many CARE customers may have enrolled primarily to obtain the early enrollment incentive rather than with the intent of responding to the SmartRate price signals (at least not during the bill protection period, when there is no downside risk to not responding).
It should also be noted that CARE customers receive a very large percent reduction in their average price of electricity when they enroll in SmartRate, regardless of whether they shift load. The 3¢/kWh credit that all customers receive for all non-peak-period hours from June through September represents almost a 30 percent reduction in the average price of electricity for CARE customers, whereas it represents only about a 20 percent reduction for a mid-Tier 3 customer's average price and closer to a 10 percent reduction for a Tier 5 customer's. Put another way, CARE customers are almost certain to be structural benefiters if they enroll in SmartRate, and many of them may be content simply to take their windfall gain and ignore the additional savings that would come from reducing load further on SmartDays. PG&E plans to conduct an analysis of the difference in price responsiveness between structural benefiters and customers who are not structural benefiters, which should shed light on this speculative explanation for the low responsiveness of CARE customers. The structural benefiter analysis will be reported in conjunction with the SmartRate ex ante load impact estimates on April 1, 2009.

Another hypothesis is that it may be more difficult for CARE customers to be properly notified when SmartDays occur. For example, there may be a higher percentage of households with English as a second language among CARE customers. In addition, the program data indicate that over 50 percent of non-CARE customers provided PG&E with multiple ways to be notified of events, whereas only 30 percent of CARE customers provided more than one point of contact. Furthermore, approximately 65 percent of CARE customers were notified of events through phone only, while approximately 45 percent of non-CARE customers relied solely on phones. In addition, fewer CARE customers received notification via e-mail, either as a primary or secondary method of contact.
Approximately 17 percent of CARE customers received event notifications via e-mail, while 36 percent of non-CARE customers did. These differences could at least partially explain why CARE customers showed more modest load reductions than did non-CARE customers on SmartDays.

One final possibility is that CARE customers often sign up for PG&E offers in disproportionate numbers relative to their representation in the general population, based on the belief that, if PG&E is offering them something new, it must be in their interest. Put another way, CARE customers may have believed that, merely by signing up, their bills would go down regardless of what they did or didn't do. Indeed, this may be true for nearly all CARE customers, given that the 3¢/kWh subsidy is equivalent to a bigger rate reduction on a percentage basis for CARE customers than it is for non-CARE customers. If CARE customers believed that SmartRate was just another example of PG&E being "on their side," they may not have taken the time to educate themselves about the additional opportunity to further reduce their bills by shifting energy use on SmartDays.

5. NON-RESIDENTIAL EX POST LOAD IMPACT ESTIMATES

This section presents load impact estimates for non-residential customers who are on PG&E's A-1 tariff and who signed up for SmartRate. Only about 210 A-1 customers participated in SmartRate in 2008. The impacts presented here are based on the entire participant population, except for the elimination of roughly 20 customers due to a high percentage of missing values, as discussed in Section 3.

Due to the small number of participants, the non-residential impact estimates presented here should be used with significant caution. This sample of customers is not representative of the non-residential (A-1) customer population either in Kern County or in PG&E's service territory as a whole.
Furthermore, as discussed in Section 3, there was a significant amount of missing interval data for some customers on event days, and the percentage of missing data was higher during the event period than during the non-event period. The combination of small sample size and missing data makes these results suggestive at best.

The small sample size also makes it even more questionable to provide impact estimates for various customer segments, such as the breakdown by the eight business types requested by the CPUC. As such, we have not presented estimates by business type in this report. Estimates have been provided to the CPUC in electronic format for four business types (offices and services, retail, institutional and schools, and other), but we strongly caution against drawing any conclusions from them. Indeed, our primary caveat for the results presented below is that they represent the best estimates available for a small, non-representative sample of customers in Kern County with a significant amount of missing data.

5.1. LOAD IMPACT ESTIMATES FOR A-1 CUSTOMERS

Figure 5 - 1 and Table 5 - 1 summarize the load impacts for the average A-1 SmartRate customer for the average event day in 2008. Similar figures are shown in Appendix B for the average A-1 customer on each of the nine event days called in 2008. The average impacts for each event day are summarized and discussed below.

As seen in Figure 5 - 1, the average load reduction for the 194 A-1 customers included in the estimating sample across the nine event days was 16.0 percent, and the absolute load reduction was 0.47 kW. The load reduction on the average event day ranged from a low of 12.8 percent between 2 and 3 pm to a high of 18.5 percent in the last event-period hour, from 5 to 6 pm. The aggregate load reduction for the 194 customers represented in the average event-day estimate is approximately 0.09 MW, or 90 kW.
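The aggregate figure quoted above is simply the per-customer average impact scaled by the number of accounts in the estimating sample:

```python
# Aggregate A-1 load reduction implied by the per-customer average impact
# (both inputs are the values reported in the text).
accounts = 194            # A-1 accounts in the estimating sample
avg_impact_kw = 0.47      # average event-window reduction per account

aggregate_kw = accounts * avg_impact_kw    # ~91 kW
aggregate_mw = aggregate_kw / 1000.0       # ~0.09 MW
```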
Of the 194 customers in the estimating sample, 138 were located in the Kern County local capacity area (LCA) and the remaining customers were classified in the Other LCA. The average reduction across the four-hour event period on the average event day was 13.9 percent (0.40 kW) for the Kern County segment and 20.9 percent (0.73 kW) for the Other LCA segment. The aggregate average reduction for the Kern County LCA was 50 kW and the aggregate for the Other LCA was 40 kW.

Table 5 - 2 shows the average absolute and percent reduction in energy use across the four-hour SmartRate period for each SmartDay. Underlying the average percent reduction for A-1 customers is a precipitous fall in impacts from the first event sequence to the last. Impacts on the first three event days in July ranged from 29 to 42 percent, significantly larger than the residential percent impacts. However, for the three-day event period in late August, leading up to the Labor Day weekend, impacts dropped significantly, ranging from a low of 9.3 percent on August 29th to a high of 14.4 percent on the prior event day. Load impacts dropped yet again over the last three-day event period, ranging from a low of just 0.7 percent to a high of 8.5 percent.

It is difficult to know what to conclude about the significant drop in load impacts for A-1 customers across the event periods. The overall average reduction in load was much larger than what was found for this customer segment in California's Statewide Pricing Pilot, which showed no statistically significant reduction for this segment in the absence of enabling technology.26 Put another way, the SPP results were much closer to what is shown here for the September event period than for July, August, or the overall average impact. On the other hand, the July results clearly show a willingness to reduce load, at least initially, even in the absence of enabling technology.
The fall in load impacts may result, in part, from a need to maintain business operations leading up to and shortly following the Labor Day holiday period, or from fatigue associated with the fact that events were called on six out of seven consecutive business days. Whether this trend is permanent will not be known until the 2009 event results are available. As indicated at the beginning of this section, we do not recommend drawing any important conclusions from this small, unrepresentative sample of non-residential customers.

Figures 5 - 2 and 5 - 3 and Table 5 - 3 contain information on the distribution of impacts across customers. As indicated in Table 5 - 3, roughly 60 percent of non-residential customers provided at least some load reduction for the average event day. About half of customers provided load impacts exceeding 10 percent, and roughly 40 percent provided load reductions exceeding 20 percent.

26 Stephen S. George, Ahmad Faruqui and John Winfield. California's Statewide Pricing Pilot: Commercial & Industrial Analysis Update. Final Report, June 28, 2006.

Figure 5 - 1: Load Impacts for the Average A-1 Customer for the Average Event Day

- Type of Results: Average Customer
- Customer Class: C&I
- Participant Category: A1
- Event Date: Average Event
- Event Notification: Day Ahead
- Event Start: 14:00
- Event End: 18:00
- Accounts for Event: 194
- Total Enrolled Accounts: 194
- Avg.
Load Reduction for Event Window: 0.47 kW
- % Load Reduction for Event Window: 16.0%

[Figure: reference and observed load (kW) by hour ending for the average A-1 customer on the average event day.]

Table 5 - 1: Load Impacts by Hour for the Average A-1 Customer for the Average Event Day
(The 10th–90th columns are the uncertainty-adjusted impact percentiles, in kW.)

| Hour Ending | Reference Load (kW) | Observed Load (kW) | Load Impact (kW) | % Load Reduction | Weighted Temp (°F) | 10th | 30th | 50th | 70th | 90th |
|---|---|---|---|---|---|---|---|---|---|---|
| 1:00 | 1.07 | 1.08 | -0.01 | -1.22% | 81.2 | -0.04 | -0.02 | -0.01 | 0.00 | 0.01 |
| 2:00 | 1.00 | 1.00 | 0.00 | -0.28% | 80.0 | -0.03 | -0.01 | 0.00 | 0.01 | 0.02 |
| 3:00 | 0.96 | 0.93 | 0.02 | 2.21% | 78.6 | 0.00 | 0.01 | 0.02 | 0.03 | 0.04 |
| 4:00 | 0.91 | 0.91 | 0.01 | 0.96% | 77.1 | -0.01 | 0.00 | 0.01 | 0.02 | 0.03 |
| 5:00 | 0.93 | 0.92 | 0.01 | 1.01% | 76.0 | -0.01 | 0.00 | 0.01 | 0.02 | 0.03 |
| 6:00 | 0.99 | 0.96 | 0.02 | 2.22% | 75.0 | 0.00 | 0.01 | 0.02 | 0.03 | 0.05 |
| 7:00 | 0.97 | 0.94 | 0.02 | 2.53% | 74.5 | 0.00 | 0.02 | 0.02 | 0.03 | 0.05 |
| 8:00 | 1.16 | 1.19 | -0.02 | -2.15% | 76.9 | -0.05 | -0.03 | -0.02 | -0.02 | 0.00 |
| 9:00 | 1.68 | 1.75 | -0.08 | -4.57% | 81.8 | -0.10 | -0.09 | -0.08 | -0.07 | -0.05 |
| 10:00 | 2.11 | 2.29 | -0.18 | -8.54% | 86.5 | -0.20 | -0.19 | -0.18 | -0.17 | -0.16 |
| 11:00 | 2.78 | 2.77 | 0.01 | 0.43% | 91.2 | -0.01 | 0.00 | 0.01 | 0.02 | 0.04 |
| 12:00 | 3.01 | 3.02 | -0.01 | -0.44% | 94.7 | -0.04 | -0.02 | -0.01 | 0.00 | 0.01 |
| 13:00 | 3.12 | 3.12 | 0.00 | -0.04% | 97.6 | -0.02 | -0.01 | 0.00 | 0.01 | 0.02 |
| 14:00 | 3.15 | 3.18 | -0.02 | -0.74% | 99.5 | -0.05 | -0.03 | -0.02 | -0.01 | 0.00 |
| 15:00 | 3.20 | 2.79 | 0.41 | 12.78% | 101.3 | 0.39 | 0.40 | 0.41 | 0.42 | 0.43 |
| 16:00 | 3.10 | 2.55 | 0.54 | 17.55% | 102.2 | 0.52 | 0.53 | 0.54 | 0.55 | 0.57 |
| 17:00 | 2.90 | 2.45 | 0.46 | 15.70% | 102.9 | 0.43 | 0.45 | 0.46 | 0.47 | 0.48 |
| 18:00 | 2.63 | 2.15 | 0.49 | 18.53% | 102.5 | 0.46 | 0.48 | 0.49 | 0.50 | 0.51 |
| 19:00 | 2.32 | 2.49 | -0.17 | -7.52% | 101.1 | -0.20 | -0.18 | -0.17 | -0.16 | -0.15 |
| 20:00 | 2.16 | 2.43 | -0.27 | -12.47% | 98.2 | -0.29 | -0.28 | -0.27 | -0.26 | -0.25 |
| 21:00 | 2.08 | 2.17 | -0.08 | -4.03% | 94.5 | -0.11 | -0.09 | -0.08 | -0.07 | -0.06 |
| 22:00 | 1.86 | 1.91 | -0.05 | -2.44% | 90.9 | -0.07 | -0.05 | -0.05 | -0.04 | -0.02 |
| 23:00 | 1.58 | 1.63 | -0.05 | -3.27% | 87.8 | -0.07 | -0.06 | -0.05 | -0.04 | -0.03 |
| 0:00 | 1.36 | 1.36 | -0.01 | -0.52% | 84.8 | -0.03 | -0.02 | -0.01 | 0.00 | 0.02 |
Change in Energy Use (kWh) 47.01 45.98 1.03 Daily Cooling % Daily Load Degree Hours (Base 75) Reduction 2.18% 130.9 10th 1.02 Uncertainty Adjusted Impact - Percentiles 30th 50th 70th 1.02 1.03 1.03 90th 1.03 42 Table 5 - 2 Load Impacts by Event Day for A-1 Customers Date Day of Week # of Enrolled Customers27 Maximum Temp (oF) Minimum Temp (oF) Average Hourly Load (kW) 2 pm to 6 pm Average Load Reduction (kW) 2 pm to 6 pm Average % Load Reduction 2 pm to 6 pm 7/8 T 185 108 79 3.03 0.86 28.5 7/9 W 185 108 84 3.19 0.96 30.2 7/10 Th 185 111 84 3.23 1.35 41.8 8/14 Th 187 98 72 2.83 0.31 11.0 8/27 W 187 102 74 3.06 0.44 14.4 8/28 Th 186 106 77 3.27 0.30 9.3 8/29 F 208 98 66 2.62 0.15 5.9 9/3 W 208 100 67 2.73 0.23 8.5 9/4 Th 208 101 71 2.78 0.02 0.7 9/5 F 194 103 74 2.96 0.47 16.0 Avg n/a 185 108 79 3.03 0.86 28.5 27 These values represent the number of enrolled customers included in the estimation, which may differ moderately from the actual number of customers enrolled due to missing data or other factors. 
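To make the arithmetic behind the event-window summaries concrete, the sketch below averages the hourly reference and observed loads over the 2 pm to 6 pm window. This is an illustration only: the report's estimates come from the regression models described in Section 3, and the helper function name is hypothetical.

```python
def event_window_impact(reference_kw, observed_kw):
    """Average kW reduction and percent reduction over an event window.

    Illustrative arithmetic only; the evaluation's actual estimates are
    regression-based, not computed from rounded table values.
    """
    avg_ref = sum(reference_kw) / len(reference_kw)
    avg_obs = sum(observed_kw) / len(observed_kw)
    impact_kw = avg_ref - avg_obs
    return impact_kw, 100.0 * impact_kw / avg_ref

# Hours ending 15:00-18:00 for the average A-1 customer (Table 5-1)
ref = [3.20, 3.10, 2.90, 2.63]
obs = [2.79, 2.55, 2.45, 2.15]
kw, pct = event_window_impact(ref, obs)
print(round(kw, 2), round(pct, 1))  # 0.47 16.0, matching Figure 5-1
```

Applied to the rounded hourly values in Table 5-1, this reproduces the 0.47 kW and 16.0% event-window figures reported for the average event day.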
Figure 5-2: Cumulative Distribution of % Load Reductions (based on all customers, including those not notified)

[Chart: cumulative distribution of percent load reduction by C&I customer; vertical axis, percent load reduction from -100 to 100; horizontal axis, proportion of accounts from 0 to 1]

Figure 5-3: Cumulative Distribution of Average Event Load Reduction (based on all customers, including those not notified)

[Chart: cumulative distribution of average event load reduction by customer; vertical axis, load reduction from -1.5 to 3.0 kW; horizontal axis, proportion of accounts from 0 to 1]

Table 5-3: Percent of Non-residential Customers Providing Load Reductions by Amount
(Share of accounts providing load reductions greater than the indicated percentage)

SmartDay Date | >0% | >10% | >20% | >30% | >40% | >50%
7/8/2008 | 71.7 | 60.6 | 50.4 | 40.9 | 39.4 | 37.8
7/8/2008 | 72.5 | 63.4 | 53.4 | 46.6 | 43.5 | 38.9
7/8/2008 | 75.4 | 67.7 | 55.4 | 49.2 | 44.6 | 44.6
8/8/2008 | 53.4 | 41.6 | 31.1 | 28.6 | 22.4 | 19.9
8/8/2008 | 58.0 | 48.1 | 40.1 | 34.0 | 29.6 | 27.8
8/8/2008 | 66.0 | 53.1 | 43.2 | 36.4 | 32.1 | 30.9
9/8/2008 | 54.8 | 41.6 | 34.3 | 30.7 | 27.1 | 24.7
9/8/2008 | 52.4 | 42.2 | 34.9 | 28.9 | 25.9 | 22.9
9/8/2008 | 54.2 | 41.6 | 31.9 | 28.3 | 24.7 | 21.7
Average Event | 61.2 | 50.1 | 40.8 | 35.2 | 31.4 | 29.1

APPENDIX A: Load Impact Estimates for E-1 Customers

This appendix contains estimates of load impacts for each hour of each event day, and for the average event day, for residential customers on the E-1 tariff. The brief table in the upper left-hand corner of each page shows the level of aggregation (e.g., all customers or the average customer) and the event day date. Below that is a table showing the number of enrolled customers underlying the estimate for the event day and the average percent and absolute change in load across the five-hour event period on that day.
The table on the right-hand side of each page shows, for each hour, the reference load, the predicted load with demand response (i.e., the observed load), the load impact (the difference between the reference and predicted load), the percent load reduction, the temperature, and the uncertainty-adjusted impact estimates for the 10th, 30th, 50th, 70th and 90th percentiles.

Tables containing similarly detailed information for other residential customer segments have been provided in electronic form to the CPUC. Specifically, tables for the average customer and for all customers combined, for the average event day and for each of the nine event days, have been provided for the following segments:

• E-8 customers;
• Customers located in each of two Local Capacity Areas, the Kern County LCA and Other;
• Customers that were notified about an event (as distinct from all customers, some of whom were not notified);
• CARE and non-CARE customers;
• Customers grouped according to the following annual energy use categories: ≤5,000 kWh; 5,001 to 7,500 kWh; 7,501 to 10,000 kWh; 10,001 to 12,500 kWh; 12,501 to 15,000 kWh; and >15,000 kWh;
• Customers grouped according to the ratio of summer to non-summer usage for the following categories: summer < winter; summer = 101 to 125% of winter; summer = 125 to 150% of winter; summer = 150 to 175% of winter; summer = 175 to 200% of winter; summer > 200% of winter.

APPENDIX B: Load Impact Estimates for A-1 Customers

This appendix contains estimates of load impacts for each hour of each event day, and for the average event day, for non-residential customers on the A-1 tariff. The brief table in the upper left-hand corner of each page shows the level of aggregation (e.g., all customers or the average customer) and the event day date.
Below that is a table showing the number of enrolled customers underlying the estimate for the event day and the average percent and absolute change in load across the event period on that day. The table on the right-hand side of each page shows, for each hour, the reference load, the predicted load with demand response (i.e., the observed load), the load impact (the difference between the reference and predicted load), the percent load reduction, the temperature, and the uncertainty-adjusted impact estimates for the 10th, 30th, 50th, 70th and 90th percentiles.

Tables containing similarly detailed information for other non-residential customer segments have been provided in electronic form to the CPUC. Specifically, tables for the average customer and for all customers combined, for the average event day and for each of the nine event days, have been provided for the following segments:

• Four business types: retail; offices and service; institutional and schools; and other. Note that the sample size was too small to provide a breakdown by the eight business types for which the CPUC requires impact estimates.
• Customers located in each of two Local Capacity Areas, the Kern County LCA and Other.
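The uncertainty-adjusted percentiles reported throughout these tables can be illustrated with a simple normal approximation: given a point estimate and its standard error, the 10th through 90th percentile bands are quantiles of a normal distribution centered on the estimate. This is only a sketch of the idea described in Section 3.1.4; the function name and the 0.02 kW standard error below are assumptions chosen for illustration, not values from the evaluation.

```python
from statistics import NormalDist

def percentile_bands(estimate_kw, std_error_kw, pcts=(10, 30, 50, 70, 90)):
    """Uncertainty-adjusted impact percentiles under a normal
    approximation to the estimation error (illustration only)."""
    dist = NormalDist(mu=estimate_kw, sigma=std_error_kw)
    return {p: round(dist.inv_cdf(p / 100), 2) for p in pcts}

# A 0.54 kW estimate with an assumed 0.02 kW standard error yields
# bands similar to the hour-ending-16:00 row of Table 5-1.
print(percentile_bands(0.54, 0.02))
```

Note how a tight standard error produces the narrow bands seen during the event window, while the wider relative bands in off-peak hours reflect impacts that are small compared with their estimation error.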