Rewarding Energy Engagement
Evaluation of Electricity Impacts from CUB Energy Saver,
a Residential Efficiency Pilot Program in the ComEd Service Territory
Prepared by:
Matthew Harding, Assistant Professor of Economics, Stanford University
Patrick McNamara, Efficiency 2.0
Abstract
This whitepaper evaluates online and printed report savings from CUB Energy Saver,
an energy efficiency pilot program created by Efficiency 2.0 and the Citizens Utility
Board (CUB) of Illinois to generate energy savings for Commonwealth Edison (ComEd)
customers. The CUB Energy Saver program engages customers with advanced energy
savings recommendations, feedback and rewards. We have analyzed the program over
a 12-month pilot period from July 2010 to July 2011. Through multiple evaluation
methods, a treatment effect of 6.01 percent is observed for online participants and 1.47
to 1.63 percent for mailer-only participants. Experimental Design and Quasi-Experimental
Design, as well as other forms of Large-Scale Data Analysis, are
explored and employed as the optimal protocol for program evaluation.
Efficiency 2.0 provided raw electricity usage and household characteristic data to
Matthew Harding, Assistant Professor of Economics, Stanford University, which was
accessed through a partnership with CUB and ComEd. All data was scrubbed of
personally identifiable information.
Table of Contents
• I. Introduction. Introduces the program by outlining its objectives, the behavioral research on which those objectives are based, and how the program will be evaluated.
• II. Program Overview. Describes how the program was implemented, including marketing and feedback.
• III. Evaluation Protocols. Sets forth the specific evaluation methods for both online (“engaged”) and mailer-only (“passive”) treatments.
• IV. Program Results. Provides a detailed analysis of both engaged and passive savings, as well as supporting process metrics.
I. Introduction
A. Program Objectives
Energy efficiency is an oft-discussed method of alleviating many significant issues at
the forefront of economics and policy today: climate change, electrical grid stability,
and consumer welfare. Not only is energy efficiency identified as cost-effective relative
to other options, but in many cases it provides a positive net present value (McKinsey,
2009), meaning that regardless of any climate problems, energy efficiency has great
potential value. Secondly, energy efficiency has the potential to reduce peak-load
concerns, which for consumers means a reduced risk of blackouts and brownouts;
because appliance efficiency affects usage 24 hours a day, it lowers baseline usage
and peak load simultaneously, reducing maximum load. Finally, consumers benefit
directly through reduced utility bills, and they can spend the extra funds however
they please.
The Illinois Citizens Utility Board and ComEd funded the CUB Energy Saver program in
order to help Illinois ratepayers save energy through information and incentives. The
CUB Energy Saver pilot was designed to test the impact on consumers' energy usage
of incentives, engagement, and feedback delivered across multiple channels, including
e-mail, direct mail, and an online portal. In addition, the pilot set out to test the
scalability of the approach in order to determine its impact in improving the reliability of
the grid and reducing pollutants and emissions.
B. Evaluation Methods
The CUB Energy Saver program is a hybrid opt-out/opt-in program, where mailer-only
participants are assigned (opt-out) and online participants actively choose to
participate (opt-in). This paper has analyzed both of these components using
Experimental Design (ED) and Quasi-Experimental Design (QED) evaluation methods to
evaluate the program’s performance. QED is a sub-variant of ED evaluation that is in all
ways similar to ED, except for the lack of true random assignment to treatment and
control groups. ED programs are experiments in which group assignment is random
and measurements are taken both before and after treatment. This allows for the
groups to be as similar as possible and accounts for any minute differences that may
exist in the variable of measurement, in this case electricity consumption. Many factors
can affect the outcome of the experiment, such as maturation, the observation effect,
and statistical regression (Sullivan, 2010). With appropriately large samples of similar
groups, these sources of bias can be assumed to have no material effect on the
outcome of the experiment.
By contrast, in QED programs, participants will opt into the program, potentially
exhibiting what is generally referred to as “self-selection bias”. Because these
households self-select into the program, they are believed to exhibit some form of bias
through characteristics like a desire to participate in so-called “green” programs or
online information portals that the rest of the population may not have access to. In
most cases the lack of an explicit control group is a matter of practicality: issues of
consumer equity mandate that certain programs be made available to all utility
customers, or budgets limit the scope of the program. Methods to control for this
potential bias include Regression Discontinuity, Non-Equivalent Control Groups, and
Interrupted Time Series, variations of which are explored in this paper as an optimal
evaluation method for this type of program. For a more detailed discussion on any of
the aforementioned methods see “Guidelines for Designing Effective Energy
Information Feedback Pilots: Research Protocols,” Section 3, pages 13 to 21.
Large-Scale Data Analysis (LSDA) is employed for supporting, or “process,” metrics
that measure non-energy aspects of the program, including online engagement. These
process metrics include time on site, e-mail open rate and page-views per visit. Time
on site is an important metric because it describes how much time someone spends
interacting with the website. E-mail open rate is a very important metric identifying how
responsive the users are to the information and indicates the level of trust and
engagement they have with the information. Additionally, the CUB Energy Saver
program provides monthly bill analysis and custom savings recommendations through
e-mail, so the open rate is a strong measure of engagement a user has with the
feedback they receive. Page-views per visit is another important metric similar to time
on site; using this information one can determine how long users are spending on
different types of information. With that and general assumptions about online reading
speed one can determine important metrics like the amount of information consumed
by the target audience.
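
As a rough illustration of how these metrics combine, the sketch below (Python, purely illustrative) converts a visit duration into an upper bound on words read. The 200-words-per-minute figure and the 227-second example are values cited elsewhere in this report, not new measurements.

```python
# Illustrative sketch: estimate an upper bound on words read during a visit
# from time on site and an assumed average online reading speed.

ASSUMED_READING_SPEED_WPM = 200  # approximate online reading speed cited later in this report

def estimated_words_read(time_on_site_seconds: float) -> float:
    """Upper-bound estimate of words read if the entire visit were spent reading."""
    return time_on_site_seconds / 60.0 * ASSUMED_READING_SPEED_WPM

# Example: the program's first-year average of 227 seconds on site corresponds
# to at most roughly 750 words read per visit.
print(round(estimated_words_read(227)))
```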
C. Behavioral Research
Erhardt et al. (2010) reviewed 57 studies on the effects of feedback on energy usage,
covering over 30 years from 7 countries. Savings ranged from 4 to 12%, with savings
increasing as feedback becomes more direct and timely. They found that “Enhanced
Billing” and “Estimated Feedback” strategies were potentially the most cost-effective
of strategies because of the low cost relative to “Real-Time” feedback opportunities
that usually require in-home hardware. In “Enhanced Billing” programs, household-specific
advice is provided using the home’s energy bills to offer custom insights for
customers, showing average savings of approximately 3.8%. The “Estimated
Feedback” programs usually included more detailed energy audits and were available
on an ongoing basis, providing two-way interaction with the household and the utility,
exhibiting 6.8% average savings.
They argue that advanced meters alone will not provide the greatest energy savings
and that “incorporating the best motivational techniques and behavioral approaches
will be important to realize optimal savings”. Additionally, they found that most of the
savings resulted from behavioral changes and low-cost upgrades like washing in cold
water and installing CFLs as opposed to investments in new equipment. On the issue
of persistence, they find that longer studies had slightly lower savings, but that the
longer programs also had substantially larger sample sizes, which could be the cause
of the more modest savings numbers. However, when conducting a within-studies
analysis they find that energy savings are persistent but also that the persistence of
savings may require continued feedback. They recommend that the best-performing
programs will make feedback “convenient, engaging and beneficial for consumers”.
Dowling and Uncles (1997) first explored the concept of rewards programs by trying to
find out if they actually were effective in achieving their intended goals. These goals
can vary from brand recognition to customer retention, with the overall objective of
improving their relationship with their customers. They found that the evidence was
somewhat mixed on the effectiveness of loyalty programs driving sales and that loyalty
programs may be a mistake in terms of cost-effectiveness if introduced into an already
competitive market. However, they note that loyalty programs that “(i) directly enhance the
product/service value proposition, or (ii) broaden the availability of the product/service,
or (iii) neutralize a competitor’s program, may be worthwhile.”
Yi and Jeon (2003) expanded on Dowling and Uncles’ research and argue that loyalty
programs are often misunderstood and/or have been misapplied. They find the effects
and value of loyalty programs are dependent upon the level of engagement the
customer has with the company. Specifically, they find that high-involvement loyalty
programs can positively influence brand loyalty but that under low-involvement
programs there is minimal effect on brand loyalty. They also addressed the differences
between immediate rewards and delayed rewards. Immediate rewards were things like
discount coupons or gift certificates, whereas delayed rewards required long-term loyalty
to reach a rewards milestone. They found that delayed rewards were only effective for
high-engagement loyalty programs and that immediate rewards were effective for both
high and low-engagement programs.
II. Program Overview
CUB Energy Saver includes four primary components:
• Direct and community marketing
• Online engagement
• Regular email feedback
• Reward points for saving energy
Participation is defined either by the receipt of direct mail (“passive participants”) or by
signing up for the website (“online participants”).
A. Program Marketing
Direct Marketing
A primary method of marketing both the web tools and the value of energy savings in
the CUB Energy Saver program was through the use of direct mail. In these mailers,
recipients found marketing information about the online rewards program and what
they would expect to find when they signed up. They were given 100 rewards points
for signing up, which made them automatically eligible for discounts like $10 gift cards
at a local restaurant or 20% off at online stores. Direct mail has been used widely as an
engagement mechanism for various consumer-facing industries, including the utility
sector. In this case, the direct mail pieces were targeted at inducing energy savings
directly and incentivizing the recipient to sign up online to achieve greater savings.
The direct mail pieces delivered to participants were also designed to induce energy
savings independent of whether the participant signed up online. Multiple types of mailer
designs were delivered and are displayed below. The first mailer was sent in October
of 2010, while the second mailer shown was delivered in March and June of 2011. The
first mailer lists energy saving actions in a table on the front, displaying both points
earned and dollars saved for each action, as well as a secondary set of actions on the
back. The front identifies specific redemption opportunities and companies that offer
rewards on the website, and the back displays a normative treatment asking the
recipient to “team up” with their community to save energy.
Within the second version, there are two variants: one that displays rewards
redemption opportunities and one that makes normative comparisons. The
rewards-oriented variant indicates how many rewards points one would receive by signing up,
as well as six different ways to redeem those points. The savings-oriented variant
outlines how much the recipient would be saving if they had signed up online with the
last mailer, and how much their “most efficient neighbors” who signed up online are
saving. Both variants tell the recipient how many of their neighbors signed up online on
the front. On the back they display customized savings recommendations and indicate
how many of their neighbors have committed to doing those actions on the site. The
only difference on the back is the text on the side of the mailer, which indicates how
much money could be saved over a year for the savings-oriented mailers or how many
points could be earned over a year for the rewards-oriented mailers.
Community Marketing
Another frequently used method of marketing in the CUB Energy Saver program is
community marketing, often through local outreach events or direct and indirect
contact with potential users via municipalities, employers, service organizations,
non-profits, and other formal and informal organizations. At these events, an outreach
coordinator will engage with potential users face-to-face, describing the benefits of the
online rewards program, such as customized energy saving tips, the ability to earn
discounts, and participating in the organization’s efforts to engage similarly situated
users.
In certain cases, outreach coordinators encourage users to sign up during the event
when Internet access is available. An outreach coordinator may also set up more
formal presentations to companies, communities or other organizations at Team
events. Teams are any group of individuals who decide to join together to save energy
as a group, competing against other teams in their area. Teams allow for greater
potential savings because of the ongoing influence of social norms outside the realm of
the web tools, as well as through the additional content on the web tools related to
teams that lets individuals see how their team is performing. Team members often
interact in the real world through the Team that brought them together to save.
Additionally, members of Teams will see their team’s standing relative to the best in
their area, which is another normative motivator. Below is an example of a Team page
for the City of Evanston employees.
B. Online Engagement
When a user signs onto the website, they are presented with different tabs to interact
with. The “Ways to Save” page outlining various energy saving options can be found
immediately below. Clicking on an action brings the user to a page describing that
action in greater detail with how-to steps and advice to assist the user in completing
the action. The page also provides dynamic fields to customize savings estimates. The
“Track Progress” page shows bar graphs outlining usage by end-use and savings to-date,
and the “Rewards” page presents the user with ways to “spend” their points
balance by redeeming them for different gift certificates or products.
C. Monthly Email Feedback
In addition to the on and off-site contact described above, users receive monthly email updates when a bill comes in, informing them whether or not they saved
electricity. There is a normative treatment in these e-mails through the thumbs up and
associated text outlining how many points they earned if they saved, or a graphic and
text explaining that they did not save. In either case they will see their current points
balance and to-date savings since joining, as well as two different energy savings
recommendations. These recommendations are adjusted seasonally for optimal
savings. If a user receives an e-mail about their May bill, they will see Summer savings
opportunities like cleaning their A/C filter and closing their blinds during the day, not
just year-round recommendations such as unplugging their coffee maker or using a
drying rack instead of a clothes dryer.
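
A minimal sketch of the seasonal adjustment described above; the month-to-season mapping, tip text, and function names are hypothetical stand-ins for the program's actual recommendation logic.

```python
# Hypothetical sketch: prefer in-season tips for the month of the bill being
# reported on, falling back to year-round recommendations.

SEASONAL_TIPS = {
    "summer": ["Clean your A/C filter", "Close your blinds during the day"],
    "winter": ["Lower your thermostat at night", "Seal drafty windows"],
}
YEAR_ROUND_TIPS = ["Unplug your coffee maker", "Use a drying rack instead of a clothes dryer"]

def season_for_month(month: int) -> str:
    # Simplified illustrative mapping: May through September treated as summer.
    return "summer" if 5 <= month <= 9 else "winter"

def tips_for_bill(bill_month: int, count: int = 2) -> list[str]:
    """Return up to `count` recommendations, listing in-season tips first."""
    pool = SEASONAL_TIPS[season_for_month(bill_month)] + YEAR_ROUND_TIPS
    return pool[:count]

print(tips_for_bill(5))  # a May bill surfaces summer opportunities first
```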
D. Rewards
Following the recommendations of Yi and Jeon (2003), rewards points are distributed
on an immediate and ongoing basis to serve as a motivator for saving energy. For
every kWh a user saves in their monthly bill, they are given 2 points, up to 250 points
per billing period. Individual savings for a given billing period are calculated by comparing
projected and actual usage. Users are notified of points through the above e-mail
on the day their bill is released, which itself happens the day after usage is read at the
meter. These rewards points do not expire and can be redeemed for 116 different
items like gift cards to local restaurants, discounts at online retailers or free magazine
subscriptions. This immediate feedback is an engagement mechanism for both high
and low engagement users; if someone wants to redeem their points immediately, they
can do so. If they want to save their points to redeem for a larger item long term, they
can do that as well. The rewards page and some sample rewards can be found
immediately below.
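
The points rule described above translates into a few lines of arithmetic. The sketch below is a minimal illustration of that rule (2 points per kWh saved, capped at 250 per billing period), with projected usage assumed to come from the program's baseline model.

```python
# Sketch of the rewards rule: 2 points per kWh saved in a billing period,
# capped at 250 points per period; savings are projected minus actual usage.

POINTS_PER_KWH = 2
POINTS_CAP_PER_PERIOD = 250

def points_earned(projected_kwh: float, actual_kwh: float) -> int:
    """Points for one billing period; no points if usage exceeded the projection."""
    savings_kwh = max(projected_kwh - actual_kwh, 0.0)
    return min(int(savings_kwh * POINTS_PER_KWH), POINTS_CAP_PER_PERIOD)

# Example: saving 80 kWh against projection earns 160 points;
# saving 200 kWh hits the 250-point cap.
print(points_earned(900, 820))  # 160
print(points_earned(900, 700))  # 250
```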
III. Evaluation Protocols
A. Impact Evaluation Overview
Online Savings
One of the challenges of program evaluation consists in finding a valid control group in
order to account for unobservable characteristics of adopters. Households which opt in
are believed to possess characteristics different from their opt-out counterparts
because their participation is not random, thereby exhibiting selection bias, which
skews the statistical evaluation. While households who don’t adopt are not influenced
by the treatment, they are arguably also different from adopting households in an
opt-in program evaluation. Some households who don’t adopt over the period of analysis
may adopt at some later stage, while others may not adopt under any circumstances.
E_T,C = Electricity usage in kWh/day for Treatment and Control groups
DAC_T,C = Daily Average Consumption for Treatment and Control groups
n = Number of households
m = Month joined
y = First month with available bill
z = Last month of program
S = Program Savings Percentage
In order to avoid these issues while evaluating savings for opt-in households, we use a
delayed treatment analysis whereby the kWh consumption over a given period for
current opt-in households is compared to that of households that opt in at a later
point. This eliminates the potential bias from the propensity of opt-in households to
engage in such programs.
The aim of this evaluation process is to measure the overall savings achieved by the
program at a given time. This measures the aggregate effect without directly estimating
the savings for any particular individual. Furthermore, the program accumulates
savings from two sources. First, already enrolled households engage in energy savings
to a degree, which may vary over time and across households. Second, the overall
savings achieved by the program vary with the number of people in the program and
the continuous enrollment of new participants contributes to the overall program
savings.
We implement this method by using the random variation in the timing of adoption. The
pre-opt-in billing data from opt-in households make up the control group for prior time
periods. We exclude the billing period in which a household joins the site, eliminating
potential bias due to pre- and post-treatment effects occurring in the same bill.
As an example, assume we have 500 people join the program in each of three months,
January, February and March. Those entering the program in January will have their
average daily use values ignored for the month of January (per the join-month exclusion
rule outlined above). In February, those who joined in January will become part of the
treatment group. Those who join in February will have their bills ignored for the month
of February, and become part of the treatment group in March. Those who join in
March will have their bills ignored for that month, and become part of the treatment
group in April. The February bills of those who join the program in March represent the
control group for those who joined the program in January and are part of the
treatment group in February. This cycle continues throughout the year, eliminating
opt-in bias by comparing households to others exhibiting the same opt-in tendency.
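
A minimal sketch of the rolling assignment just described, assuming each household record carries its join month; the function name and month encoding are illustrative, not the evaluators' implementation.

```python
# Sketch of delayed-treatment ("rolling control group") assignment: bills from
# the join month are excluded, earlier joiners are treatment, later joiners control.

def classify(join_month: int, bill_month: int) -> str | None:
    """Return 'treatment', 'control', or None (excluded) for one household bill."""
    if bill_month == join_month:
        return None          # join-month bill mixes pre- and post-treatment usage
    if bill_month > join_month:
        return "treatment"   # household has already opted in
    return "control"         # household opts in later; not yet influenced

# Example (1 = January, 2 = February, 3 = March): a January joiner's February bill
# is treatment, while a March joiner's February bill serves as its control.
print(classify(join_month=1, bill_month=2))  # treatment
print(classify(join_month=3, bill_month=2))  # control
print(classify(join_month=3, bill_month=3))  # None (excluded)
```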
Example: Wisconsin Low-Income Energy Assistance Program Evaluation
This method has been used widely elsewhere. For instance, a similar method was used
in Wisconsin in 2004 to evaluate their Department of Energy’s Low Income Public
Benefits program. The primary purpose of the analysis was to determine whether
participation in Wisconsin’s Weatherization Assistance Program (WAP) had an effect
on arrearage levels for those who were receiving financial assistance for their utility bills
through the Low Income Home Energy Assistance Program (LIHEAP). Home
weatherization work occurs at one point in time, so the onset of treatment is both
known and fixed. The evaluation challenge comes from the fact that someone could
participate in the financial assistance program one year, not participate the next, and
once again participate the following year; this leaves no single point in time defining the
onset of treatment. Participation in the low-income assistance program was for 12
months at a time, but one could enroll at any time during the calendar year for the 12
months of the program, leading to the creation of what they called a “rolling control
group” in which the makeup of the control group is dynamic on a monthly basis.
Passive Savings
For mailer-only savings evaluation, a difference-of-differences model was used to
determine the percentage savings achieved by the treatment group. This was used to
account for any small, insignificant differences that may exist between the treatment
and control group before the start of treatment. This also ensures the control group is
as identical to the treatment group as possible, so that it represents what the treatment
group would look like had they not received any mailers. The steps below outline how
savings using the difference-of-differences method were calculated.
For the difference-of-differences method, a few steps are needed in the calculation of
energy savings for Month X after program initiation. Let Y_iX denote the energy
consumption of household i in Month X.

1. For each bill, calculate the daily average consumption (DAC); this is needed to
make bills comparable since different bills can have different billing period lengths:
DAC_iX = Y_iX / (number of days in bill for Month X)

2. The percentage difference W_iX between the DAC of Month X and the DAC of the
same month last year (before program initiation), denoted Month {-X}, is calculated
for each household:
W_iX = (DAC_iX − DAC_i{-X}) / DAC_i{-X}

3. The averages, V, of these percentage differences across control and treatment
households are calculated.

4. The average percentage savings S_X in Month X is the difference of these two
averages:
S_X = V_X,C − V_X,T

5. We then calculate T, the total gross verified energy savings in kWh or therms
achieved by treatment households in that month. This is calculated by multiplying
S_X with the total energy consumption E of treatment households in Month {-X},
the baseline used to calculate the percentage savings:
T_X = S_X × E_{-X}
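
The five steps above can be read as a short calculation. The sketch below assumes each household supplies a (kWh, days) bill for Month X and for the same month one year earlier; the sign convention (control average minus treatment average yields positive savings) is an assumption consistent with the positive savings percentages reported later.

```python
# Sketch of the difference-of-differences calculation in steps 1-5.
# Each bill is a (kwh, days) pair; each household provides its Month X bill
# and its Month {-X} bill from the year before program initiation.

def dac(kwh: float, days: int) -> float:
    """Step 1: daily average consumption for one bill."""
    return kwh / days

def pct_change(bill_x, bill_prev) -> float:
    """Step 2: percentage difference W between Month X and the same month last year."""
    return (dac(*bill_x) - dac(*bill_prev)) / dac(*bill_prev)

def monthly_savings(treatment, control):
    """Steps 3-5: group averages V, savings percentage S, and gross kWh savings T."""
    v_t = sum(pct_change(x, prev) for x, prev in treatment) / len(treatment)
    v_c = sum(pct_change(x, prev) for x, prev in control) / len(control)
    s = v_c - v_t                                                  # Step 4 (assumed sign convention)
    e_baseline = sum(prev_kwh for _, (prev_kwh, _) in treatment)   # treatment kWh in Month {-X}
    return s, s * e_baseline                                       # Step 5

# Toy example: two treatment and two control households, 30-day bills.
treatment = [((850, 30), (900, 30)), ((780, 30), (820, 30))]
control = [((910, 30), (915, 30)), ((805, 30), (800, 30))]
print(monthly_savings(treatment, control))  # roughly (0.05, 90): ~5% savings, ~90 kWh
```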
There are some basic billing requirements that make a difference of differences
analysis effective. The first is that the treatment and control groups are as statistically
similar as possible before treatment so that the difference-of-differences process is
only accounting for small, insignificant variations in usage, optimally around 0.1 kWh
per day. It cannot account for significant pre-treatment variations of 0.5 kWh or more,
as differences that large are likely due to significant variations in other things like home
square footage, household members or unobservable characteristics. A second
requirement is that the treatment and control groups must be large enough to easily
test the difference in observable characteristics pre-treatment, especially usage, as
well as to assume unobservable characteristics are similar.
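
One simple way to check the first requirement is a two-sample test on pre-treatment daily usage. The sketch below uses a Welch t-test from SciPy on simulated, illustrative data rather than the program's actual billing records.

```python
# Sketch: check that treatment and control groups have statistically similar
# pre-treatment usage before applying the difference-of-differences method.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Illustrative pre-treatment daily usage (kWh/day) for the two groups.
treatment_pre = rng.normal(loc=29.4, scale=8.0, size=3000)
control_pre = rng.normal(loc=29.5, scale=8.0, size=3000)

t_stat, p_value = stats.ttest_ind(treatment_pre, control_pre, equal_var=False)
gap = abs(treatment_pre.mean() - control_pre.mean())

# A large p-value and a mean gap well under ~0.5 kWh/day suggest the groups
# are comparable enough for the analysis described above.
print(f"mean gap = {gap:.2f} kWh/day, p = {p_value:.3f}")
```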
B. Baseline Usage
Figure 1 outlines the seasonal electricity usage for the online control group through the
first year of the program, in sequence. The seasons are defined as they are in the solar
calendar, with summer beginning June 22nd and ending September 22nd, etc. The
program began in early June, but because the bill that includes the signup date is
biased and subsequently discarded, the requisite number of bills available to calculate
savings with confidence is not available until June 29th. Therefore, Summer 2010
actually covers the period from June 29th to September 22nd. At least 13 months of
bills were required for a household to be included in the evaluation. The treatment group
sample size is
2,925 households, while the control group totals 3,382 households.
Figure 1
IV. Program Results
A. Online Savings
Figure 2 below shows the 1-year daily savings estimates for the program, from the
point at which there were enough bills to calculate savings with confidence at the very
end of June. February is henceforth excluded due to irresolvable irregularities in the
billing data collected from ComEd during that billing period.
Figure 2
Figure 3 outlines the daily usage difference of differences for the treatment and control
groups, as well as the daily kWh savings by season.
Figure 3
Figure 4 outlines this more specifically with monthly savings estimates.
Figure 4
Daily, Monthly and Seasonal
With the exclusion of the unreliable February bills, weighted year-to-year savings
are 6.01%. The average daily usage for the control group was 29.4 kWh/day, with the
difference of differences being 1.77 kWh/day. This results in a 95% confidence interval
of 1.69 to 1.84 kWh/day, or 5.74% to 6.26%. These results are significant at the 99.99%
level. The above graphs illustrate that while consumption varies seasonally, significant
savings levels are exhibited year-round, with fall savings being expectedly lower than the
summer and winter. Savings in the Spring of 2011 are higher than expected, though
that may be due in part to effective marketing campaigns and public signup events that
were not present prior to then. Savings in June 2011 support this conclusion: June
2011 was milder than July or August of 2010, yet savings were comparable for both
summer periods. March 2011 is the month with the greatest savings, in part because
there were a large number of signups in February due to the aforementioned marketing
campaigns, and households often see substantial savings in their first month. February
itself is once again excluded because reliable data is unavailable.
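
The percentage figures above follow from the daily-kWh estimates. The short sketch below reproduces that arithmetic, dividing the 1.77 kWh/day difference of differences and its confidence bounds by the 29.4 kWh/day control baseline (the weighted 6.01% figure differs slightly because it is weighted across months).

```python
# Sketch: convert the difference-of-differences estimate and its 95% confidence
# bounds (kWh/day) into percentage savings relative to the control baseline.
control_baseline_kwh_per_day = 29.4
effect_kwh_per_day = 1.77
ci_low_kwh, ci_high_kwh = 1.69, 1.84

def to_pct(kwh_per_day: float) -> float:
    return 100 * kwh_per_day / control_baseline_kwh_per_day

print(f"point estimate: {to_pct(effect_kwh_per_day):.2f}%")                # ~6.0%
print(f"95% CI: {to_pct(ci_low_kwh):.2f}% to {to_pct(ci_high_kwh):.2f}%")  # ~5.7% to 6.3%
```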
B. Printed Report (or “Mailer Only”) Savings
In addition to online savings, CUB Energy Saver has also generated passive, “Mailer
Only” savings by providing custom energy insights as well as tailored savings
opportunities for households. These mailers were distributed to approximately 15,000
households in the ComEd service territory three times over the pilot period. The
households received information about the best energy-saving opportunities,
incentives for signing up online, and contextual information about their communities.
The first mailers were delivered in homes mid-October 2010, the second mid-March
2011, and the third mid-June 2011. Data is available through the end of June.
We received a random, non-personally identifiable selection of 20% of available
accounts in the ComEd billing database, as well as all retrievable accounts from the
treatment group based on name and address information. A total of 15,672 treatment
accounts were retrieved, of which 14,855 had 24 or more bills. 60,065 control accounts
within the same zip codes as the treatment group were used.
We conducted two analyses to determine passive savings. The first analysis (Figure 5)
restricts the control group by not including participants and control accounts with
missing bills. The second analysis (Figure 6) includes these accounts with missing bills.
Figure 5
Figure 5 finds a weighted treatment effect of 1.63% that is significant at the 90% level,
and Figure 6 finds a weighted treatment effect of 1.47% that is also significant at the 90% level.
The treatment group in Figure 5 is approximately 6,873 homes, while the treatment
group in Figure 6 contains 15,097. The control group in Figure 5 is just over 25,770
homes, while the control in Figure 6 is about 60,851. The analyses present more similar
estimates as the number of control bills increases over time, providing support to the
similarity and significance of the two analyses.
Figure 6
Though the monthly results are affected by billing availability, especially in the earlier
months, the trend of savings over time is mostly similar. Additionally both show
statistically significant treatment effects and improvement over time. Since the first
mailer was delivered in the middle of October, the second delivered in the middle of
March and the third delivered in the middle of June, these savings numbers generally
fit the results of similar programs, in which the first mailer has a small effect that isn’t
very sticky (“sticky” insofar as the savings effect lasts for a significant period of time
after the mailer is delivered), and savings improve with further mailers. The second mailer
did not arrive until five months after the first, and savings are expected to decrease as
the time between the first and second mailer grows. Thereafter, savings improve following
the second mailer and continue to improve with the arrival of the third mailer in June.
C. Process Metrics
There are a multitude of metrics one can use to measure the performance of a website
and how users or customers engage with it, but two of the most important ones are
‘time on site’ and ‘bounce rate’. Time on site is a measurement of how long someone
stays on a webpage reading, watching videos, filling out forms, etc. The longer the time
on site, the longer visitors are using whatever is being provided. Bounce rate is usually
very negatively correlated with time on site, as it is the percentage of those who come
onto a site and leave without visiting another page. All else being equal, the more
pages someone visits, the longer they are on the site. Time on site is a slightly more
complex metric, as different industries and types of websites will produce vastly
different metrics. Someone can spend a long time on a site because they really want to
find something but cannot; these people are motivated but frustrated, and are unlikely
to return to your site. At the same time, a short time on site does not necessarily reflect
negatively on the site; it very well may be that they’re finding what they want quickly
and easily, which will lead them to come back to the site again because they know
they can find what they need. Over time an effectively designed site will have time on
site shrink slightly through improvements designed to help users effectively navigate
the site, but those users come back again and again because they know they can find
what they need. So the best metrics for an improving site are decreasing bounce rates,
which indicate ease of navigation, and relatively stable if slightly decreased time on
site, unless the nature of the site changes significantly.
These metrics require context because some sites will by their very nature have
higher bounce rates (links to scientific journals) or longer times on site (Hulu.com, for
example). Liu et al. (2010) of Microsoft Research analyzed over 2
billion visits to over 200,000 different pages to understand time on site behavior. They
found the first 10 seconds on a page were the most important; if someone stayed 10
seconds they often stayed much longer. They observed this behavior follows a Weibull
distribution, which is traditionally used in engineering for reliability analysis.
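
To make the Weibull claim concrete, the sketch below evaluates survival probabilities for a hypothetical dwell-time distribution; the shape and scale parameters are illustrative assumptions, not values reported by Liu et al.

```python
# Sketch: dwell time modeled with a Weibull distribution. A shape parameter
# below 1 means the hazard of leaving falls over time, consistent with the
# finding that visitors who survive the first ~10 seconds tend to stay longer.
from scipy.stats import weibull_min

shape, scale = 0.7, 60.0  # hypothetical parameters (dwell time in seconds)
dwell = weibull_min(c=shape, scale=scale)

p_past_10 = dwell.sf(10)                           # P(visit lasts beyond 10 s)
p_past_60_given_10 = dwell.sf(60) / dwell.sf(10)   # P(beyond 60 s | survived 10 s)
print(f"P(stay > 10 s) = {p_past_10:.2f}")
print(f"P(stay > 60 s | stayed 10 s) = {p_past_60_given_10:.2f}")
```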
The authors also looked further for differences by site type and found page
viewing time to be highly dependent upon the nature of the site. Entertainment sites
were found to have much longer times than science and education sites.
Weinrich and Obendorf (2008) researched how much users actually read when they get
to the page they’re looking for. They found that users exhibit a “screen-and-glean”
behavior in which they quickly examine a page to see if they found what they wanted. If
they believe they did, they will read through the page to get the information they want.
They also found that while users expectedly spent more time on pages with more
words, every additional 100 words on a page led to an average of only 4.4 seconds more
time spent there. Given that the average online reading speed is approximately 200 words
per minute (WPM), reading an extra 100 words would take roughly 30 seconds, so they
find much of the additional content likely goes unread.
Figure 7
E-mail has regularly been used as a way for utility companies to save on customer
billing costs, but has not been used widely as a method of engagement with the
customer. The potential benefits are substantial. First, e-mail is a much faster method
of communication than letters or even phone calls. Second, e-mail communication
allows for quick and easy connection both to a utility’s website and to more detailed
information regarding someone’s bill and opportunities to save energy. Third,
customized billing analysis and recommendations to save energy can be sent the
moment a bill comes in rather than through time-consuming and expensive bill
generation processes. This keeps the information current and the delivery cheap.
Fourth, computer technology allows for more advanced graphics and levels of
interactivity directly in the e-mail than is possible through a paper mail piece,
increasing both the methods of direct engagement available and the overall potential
engagement with the customer. All these potential benefits outline the need to design
an e-mail based communication style that both informs and engages, so these benefits
can be realized.
Bounce Rate and Time on Site
Bounce rate and time on site are presented below in Figure 8. The bounce rate for the
program's first year was 41.85%, with no significant difference between new and
returning visitors; the gap between the two fluctuated only slightly in either direction.
However, the bounce rate dropped significantly as the program went on, averaging
53.31% in the 1st quarter, 41.36% in the 2nd quarter, 39.31% in the 3rd quarter, and
34.21% in the 4th quarter. The 4th-quarter bounce rate is roughly 64% of the 1st-quarter
level, a decline of about 36%, indicating a significant improvement in users’ ability to
find what they’re looking for.
Average time on site is the metric designed to measure how long you have someone’s
attention and can engage him or her on the information displayed. The average time on
site for the program’s first year is 227 seconds, just under 4 minutes. With bounce rate
dropping so strongly, one might expect time on site to drop as users are able to find
information more quickly. However time on site goes up and down, with an average of
215 seconds in the 1st quarter, 257 seconds in the 2nd quarter, 234 seconds in the 3rd,
and 218 in the 4th. The changes in time on site come as the site changes as well; there
were significant improvements to the site in the 2nd and 3rd quarters, including both
layout changes and a significant amount of additional content. This fits with the
divergence between new and returning visitors, as the difference between new and
returning users is only 20 and 6 seconds in the first two quarters, while it is 26 and 69
seconds for the 3rd and 4th quarters, respectively. This indicates that even as the content
on the site increases, many users are intentionally returning to the site and engaging
with the material, and suggests that they are able to do so in a more intuitive way.
Figure 8

Bounce Rate
              New Users   Returning Users   Overall
1st Quarter   54.67%      51.06%            53.31%
2nd Quarter   40.29%      42.86%            41.36%
3rd Quarter   40.26%      37.45%            39.31%
4th Quarter   34.14%      34.42%            34.21%

Time on Site (seconds)
              New Users   Returning Users   Overall
1st Quarter   208         228               215
2nd Quarter   255         261               257
3rd Quarter   225         251               234
4th Quarter   200         269               218
Email Open Rate
The vast majority of the e-mails the users receive from the CUB Energy Saver platform
are either monthly savings and rewards updates or regular newsletters, which contain
seasonal savings information and other content. The average open rate tracked in
Efficiency 2.0’s analytical tool for these e-mails is 53%. However, the actual open rate
may be significantly higher, as the software used to track open rates only identifies an
e-mail as “opened” if the images in the e-mail are downloaded. Most major e-mail
services by default do not display images for a variety of reasons. Typical e-mail open
rates vary by industry, but commercial software services which track these metrics
typically find open rates to vary between 14 and 29%. As such, an average open rate
of 53% is a strong signal of engagement with both CUB Energy Saver and the
information it provides.
CUB Energy Saver conducted a survey, in which users were asked to identify the
things they were doing to save energy since joining the site. The answers were open-ended
and respondents were given $5 Amazon.com gift cards for their responses. The
goal of the survey was to identify if users were making behavioral changes or installing
new equipment to save energy. “New equipment” can be anything from an efficient
refrigerator to insulation, while behavioral changes were defined as recurring actions of
no direct cost; the installation of CFLs was classified separately. In all, 83% of
respondents identified behavioral changes they made to save energy, frequently
actions like turning off lights, leaving the air conditioner temperature setting just a little
bit higher, and unplugging appliances. 25% of users made equipment changes in their
home, and 80% of those respondents made behavioral changes as well. 41% of
respondents said they bought CFLs for more lights or for the first time, 73% of whom
also made at least one behavioral change. While there were major changes made, such
as the installation of efficient HVAC equipment and new windows, the vast majority of
respondents reported they made at least one behavioral change in an effort to save
energy.
Users also were asked to report on the number of rebates or direct incentives they
received both before and after they joined the program. The vast majority of
respondents (79%) reported they did not receive a rebate or other incentive from
ComEd before or during program participation. Of the 21% of respondents who had
received a rebate or incentive, there were no significant differences in the levels of
incentives received before or after program participation, with just over 6% of
respondents saying they received an incentive or rebate after the program but not
before, and just under 9% reporting the opposite. An additional 6% of respondents
report receiving incentives both before and after joining CUB Energy Saver.
In an additional set of surveys, participants were asked how satisfied they were with
ComEd on a 1-7 scale before and after program participation (1 lowest satisfaction; 7
highest). One survey was distributed during the 7th month of the program, and another
in the 13th month, both to random sets of users. Satisfaction with the utility improves
upon program participation, with a 7.0% increase across utilities by the 7th program
month and a 10.6% aggregate increase by the 13th program month. The standard
deviation from the mean is relatively low at 1.09, 1.17 and 1.23 for the before, after 6
months and after 12 months. This indicates a large number of moderate increases in
satisfaction as opposed to a low number of high increases combined with a large
number reporting no increase in satisfaction.
Average satisfaction with ComEd (1-7 scale)
Before joining program   4.68
After 6 months           5.01
After 12 months          5.19
The hypothesis of increased customer satisfaction was also explored by asking
participants whether their satisfaction would change if they were no longer able to
participate in CUB Energy Saver. They were given three response options: that their
satisfaction with the utility would strongly decrease, somewhat decrease, or not
decrease at all. Approximately 69% of participants stated their satisfaction with the
utility would decrease somewhat or strongly if they were no longer able to participate.
The remaining 31% responded that their satisfaction with the utility overall would not
decrease if they were no longer able to participate in the program. Further research is
required to indicate better how this perception of decreased satisfaction changes
during different periods of the program treatment.
Conclusion
CUB Energy Saver has produced significant electricity savings during the pilot period
based on demonstrated online engagement and incentives associated with the online
and mailer-only components. The use of multiple methods of engagement has been
examined and has been found to be a valuable strategy for engaging customers
around their energy consumption as well as motivating them to undertake energy
saving actions. Further program expansion and evaluation will attempt to illuminate
savings differences within household cohorts, marketing strategies and varying
incentives to provide more detailed guidance on the drivers of energy savings.