Energy Savings Assistance (ESA)
Energy Education:
Overview of Proposed Plan
March 7, 2013
Agenda

Time     | Topic                                       | Lead
10:00 am | Welcome / Introductions / Purpose           | Carol Edwards
10:15 am | Overview of Research Plan                   | Steve Westberg
10:40 am | Task 1 Comprehensive Review                 | Valerie Richardson
11:00 am | Task 2 Contractor Interviews                | Steve Westberg
11:20 am | Task 3 Customer Qualitative In-Home Visits  | Steve Westberg
11:40 am | Task 4 Customer Quantitative Survey         | Steve Westberg
12:00 pm | Task 5 Savings Measurement                  | Valerie Richardson
12:30 pm | Lunch Break                                 |
1:15 pm  | Summary Wrap-Up / Next Steps                | Steve Westberg, Valerie Richardson
2:00 pm  | Adjourn                                     | All
HINER / KEMA Team: Key Members

Team Member        | Responsibilities                                                      | Company
Steve Westberg     | Project Manager, HINER Lead (Tasks 2-4)                               | HINER
Valerie Richardson | Assistant Project Manager, KEMA Lead (Tasks 1, 5)                     | KEMA
Rachel Schiff      | Comprehensive Review (Task 1)                                         | KEMA
Paul Caracciolo    | Field Interviews (Task 3)                                             | HINER
Luke Thelen        | Survey Programming, Quantitative Analysis, and Statistics (Tasks 2-4) | HINER
Fred Coito         | Savings Measurement                                                   | KEMA
Project Objectives

Key issues to be addressed:
• Identify best practices and potential improvements related to HOW Energy Education is delivered (e.g., format, time, etc.)
• Identify best practices and potential improvements related to WHAT materials and content are provided (e.g., relevance, value, gaps)
• Assess current and potential energy savings resulting from Energy Education
Project Components: Data Sources

Five data components for the study:

Stage | Component             | Purpose
1     | Comprehensive Review  | (1) Guidelines, training, and materials provided to contractors (and customers). (2) IOU programs and technologies, and Out-of-Area materials or practices that could be leveraged/adopted.
2     | Contractor Interviews | (1) Understand current practices, knowledge, etc. of educators. (2) Solicit ideas for improvement from those closest to the education.
3     | Customer Qualitative  | Understand and explore: (1) range of educational experiences, (2) retention of content, (3) adoption of energy efficient behaviors, and (4) unmet needs.
4     | Customer Quantitative | (1) Measure the prevalence of experiences, knowledge, behaviors, and needs across the population of ESA participants.
5     | Savings Measurement   | (1) Design the experiment. (2) Employ the method (est. completion Dec 2014).
Meeting Objectives

Component             | HOW Delivery | WHAT Content | Energy Savings
Comprehensive Review  | X            | X            | (X)
Contractor Interviews | X            | X            |
Customer Qualitative  | X            | X            |
Customer Quantitative | X            | X            | (X)
Savings Measurement   |              |              | X
Meeting Objectives
Objective 1: HOW energy education is delivered

Component             | Research Questions
Comprehensive Review  | What training has been provided to contractors? How do IOUs assess or monitor performance? What other methods of education delivery could be employed?
Contractor Interviews | How is education delivered? What differences in delivery exist? How do customers respond? What can interfere with effective delivery?
Customer Qualitative  | How has education been provided? What methods stand out? What prompts you to put learning into practice? What new methods of delivery have potential? What is missing or lacking?
Customer Quantitative | How many or what percent of customers … based on qualitative issues
Meeting Objectives
Objective 2: WHAT materials are provided

Component             | Research Questions
Comprehensive Review  | What materials are provided to contractors/customers? What is the specific content? What new content could be added?
Contractor Interviews | What resonates with customers? What do customers ignore? What do customers ask about that is not included?
Customer Qualitative  | What content is most useful? What content does not seem to apply? What issues within the home prevent adoption? What potential new content appeals? What is missing?
Customer Quantitative | How many or what percent of customers … based on qualitative issues
Meeting Objectives
Objective 3: Assess energy savings

Component             | Research Questions
Comprehensive Review  | What challenges will be encountered regarding an experimental design? What sample frame and sample units are available?
Contractor Interviews | ---
Customer Qualitative  | ---
Customer Quantitative | To what extent have energy savings practices been implemented?
Savings Measurement   | How can energy savings be measured? With what validity and reliability? What energy savings can be attributed to energy education?
Stage 1: Comprehensive Review

Purpose:
• Document what contractors currently provide regarding Energy Education
• Provide a resource for the project team to identify potential new content, delivery methods, or additional resources

Approach Overview:
• Review will include:
  • Program documentation for training and delivery practices
  • Educational materials provided to customers
  • Contractor implementation and supervisory practices
  • Third-party studies and education materials
  • Existing and planned utility programs and technologies with potential application to En Ed (e.g., UAT, SmartMeter tools, etc.)
  • Interviews with IOU managers
Stage 1: Comprehensive Review
Deliverables:
• Written summary report that will include:
1. Listing of all materials received and reviewed
2. Short description of each document or item
3. Assessment by the reviewer(s) about the quality or estimated
efficacy of each document or item
4. Listing of aspects of materials, training, etc. that appear to be gaps
(e.g., materials and processes that appear to be missing or lacking in
the existing energy education program)
Stage 2: Contractor Interviews

Purpose:
• Document, along with the Comprehensive Review:
  • What contractors currently provide regarding energy education
  • The range of differences between contractors
  • Barriers to effective education
  • Best practices
  • Ideas for improvements from the contractors’ perspective

Approach Overview:
• Contractor interviews among:
  • front-line (customer-facing) supervisors/managers
  • in-home technicians
• Qualitative: In-depth telephone interviews
• Quantitative: Web-based survey
Stage 2: Contractor Interviews

Sampling:

IOU           | Supervisor / Manager | Assessment Technician | Installation Technician
PG&E          | 1                    | 3                     | 1
SCE Only      | -                    | 2                     | -
SCG Only      | -                    | 2                     | -
SCE/SCG Joint | 1                    | 1                     | 1
SDG&E         | 1                    | 2                     | -
Total         | 3                    | 10                    | 2
Stage 2: Contractor Interviews

Discussion Topics – Managers:
• How much time is allocated during an in-home visit for energy education
• How do you monitor or ensure compliance in the field
• What do you evaluate technicians on
• How do you compensate technicians
• Do you ever receive feedback from customers about the education they have received
• Etc.
Stage 2: Contractor Interviews

Discussion Topics – Technicians:
• What training or education do you provide in homes
• What are the most and least effective topics and materials
• What barriers or problems do you encounter that interfere with the training in the home (e.g., language, householder availability and interest, topics relevant to the householder)
• What type of training did you receive prior to making field visits
• What was missing or lacking in your training or materials
• Etc.
Stage 2: Contractor Interviews
Deliverables:
• Written report that describes, for front-line (customer-facing)
supervisors/managers and technicians:
1. Awareness and knowledge about educational requirements and
content
2. Aspects of delivery, including time spent in total and on specific
content areas, method (e.g., walking around/demonstration, sitting
at table, etc.), and recipients (e.g., homeowner, other household
members)
3. Perceived obstacles or barriers to effective education
4. Ideas for educational materials or delivery improvements
Stage 3: Customer Qualitative In-Homes
Purpose:
• Compare contractor-provided information to what customers said about the
education they received to identify retention gaps
• Determine what customers have retained and put into action
• Identify gaps between what customers need and what they received
• Identify additional opportunities for new topics, delivery methods, and
resources
Stage 3: Customer Qualitative
Approach Overview:
• Collect feedback and input from customers, regarding:
• How existing training practices meet the needs of different households
• Needs that might not be addressed
• Customer motivations
• How delivery and content can be improved
• What information has been retained and put into practice
• How household Energy Education experiences differed across contractors and technicians
• What customers think about potential new education materials, content,
or resources
• 30 in-home interviews
• 6 focus groups
Stage 3: Customer Qualitative

Sampling:

IOU     | In-Home Interviews | Focus Groups
PG&E    | 12 (4 clusters)    | 2 (1 location)
SCE/SCG | 12 (4 clusters)    | 2 (1 location)
SDG&E   | 6 (2 clusters)     | 2 (1 location)
Stage 3: Customer Qualitative
Customer In-Home Interviews:
• Allows us to understand the environment as well as customer experiences
and preferences. For example:
• Should training be conducted during the walk-through, at the kitchen
table, or in front of a computer?
• Is there value in demonstrating what the customer should do to reduce
energy use?
• Is it possible to bring together all household members?
Focus Groups:
• Group discussions to:
• Brainstorm ways to improve content and delivery
• Review and provide feedback on new content, materials, and delivery
ideas
Stage 3: Customer Qualitative
Discussion Topics – In-Home Interviews:
• Expectations of the program and benefits gained from reducing energy
use, knowing about gas and electric safety, etc.
• Experience of receiving the educational information during the contractor’s
visits to your home: how was training conducted, etc.
• What you got out of the educational information or training: what did you
learn, did you know any of this previously, have you used any of this, etc.
• Barriers to implementing what you learned: habits hard to change, others
in the home not assisting, not home to take action, etc.
• What might be missing from the education: do you have any unanswered
questions, any changes you would suggest for the visit
Stage 3: Customer Qualitative

Discussion Topics – Focus Groups:
• Similar to in-home discussions, but in addition:
  • How other utility programs and tools could be adopted: which of these other programs might fit your situation, etc.
  • Review and brainstorm around new ideas for content and delivery of energy education: what do you think about this idea, etc.
Stage 4: Customer Quantitative

Purpose:
• “Validate” and quantify the qualitative findings with a telephone survey among a representative sample of ESA participants

Approach Overview:
• 500 telephone interviews among ESA participants:
  • Margin of error = ±4.4% at 95% confidence (see the arithmetic sketch below)
  • Small enough to allow for recent participants to be included in the survey
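For reference, a minimal sketch of the arithmetic behind the quoted ±4.4% figure, assuming a simple random sample of n = 500 and the most conservative proportion (p = 0.5); any weighting or design effects applied in the actual survey are not reflected here.

    import math

    # Margin of error for a proportion at 95% confidence,
    # assuming a simple random sample (no weighting or design effects).
    n = 500    # planned number of telephone interviews
    z = 1.96   # z-score for a 95% confidence level
    p = 0.5    # most conservative assumption for the proportion
    moe = z * math.sqrt(p * (1 - p) / n)
    print(f"Margin of error: +/-{moe:.1%}")   # ~ +/-4.4%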
Stage 4: Customer Quantitative

Survey Topics:
• Screening for person most involved/present during assessment visit
• Household characteristics: type of home, size, age, people in the home, etc.
• Measurement of attitudes, motivations, and barriers to reducing energy use
• Energy education: how much time spent, information provided, etc.
• Frequency of energy efficient behaviors & beliefs about success
• Satisfaction with energy education
• Interest in new content, programs, tools, etc. to enhance energy education
• Demographics
Stage 4: Customer Quantitative

Deliverables:
• Written report with percentage estimates of En Ed participants who:
  1. Have specific needs and motivations
  2. Recalled components of their energy education
  3. Are favorable (or not) about the content and delivery of the energy education
  4. Implemented or took action as a result of their energy education
  5. Are interested in and favorable toward new content and delivery ideas
Stage 5: Savings Measurement

Project Purpose:
• Decision 12-08-044 specifies a field component to measure actual energy savings attributable to energy education, including an experimental group of high-usage (200-400% of baseline) CARE participants

Approach Overview:
• Develop and implement an experimental design
  • Controlling the “treatment” that is delivered to customers and comparing energy usage to a control group to isolate savings
• Prior to August 31, 2013:
  • Design and extract treatment and control samples from the CARE population
  • Determine which energy education program elements to test
  • Establish a process for monitoring and tracking post-treatment data after customers receive treatment
• In 2014, we will analyze a minimum of 12 months of post-treatment billing history using a t-test and/or statistical modeling (see the sketch below)
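As a rough illustration of the planned comparison (the final analysis may rely on modeling rather than a simple t-test, and the file and column names below are hypothetical placeholders, not the actual study data):

    # Illustrative sketch only: compare average post-treatment usage between the
    # treatment group (customers who received the tested energy education elements)
    # and the control group. File and column names are hypothetical.
    import pandas as pd
    from scipy import stats

    bills = pd.read_csv("post_treatment_billing.csv")   # hypothetical billing extract
    treatment = bills.loc[bills["group"] == "treatment", "monthly_kwh"]
    control = bills.loc[bills["group"] == "control", "monthly_kwh"]

    # Welch's t-test (does not assume equal variances between the two groups)
    t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)
    estimated_savings = control.mean() - treatment.mean()
    print(f"Estimated savings: {estimated_savings:.1f} kWh/month (p = {p_value:.3f})")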
Stage 5: Savings Measurement

Deliverables:
• August 31, 2013:
  • The framework for the experimental design, including sample design and post-treatment tracking
• December 2014:
  • Energy savings analysis results comparing 12 months of post-treatment behavior to the control group during the same period
Updated Timeline

Date            | Task or Action
January 24      | Kick-Off Meeting
February 6      | Detailed Research Plan
March 7         | Public Workshop
February/March  | Comprehensive Review
March           | Contractor Qualitative; Experimental Design (sample design, extraction)
April           | Contractor Online Quantitative; Experimental Design (test parameters)
April           | Customers In-Home Qualitative
May             | Customer Telephone Quantitative
Mid-July        | Data Analysis and Draft Report
Mid-August      | Public Workshop
August 31, 2013 | Final Report
Summary of Comments

• Review of comments provided during this meeting
Next Steps and Action Items

• IOUs are preparing information in response to the HINER/KEMA data request
• Comprehensive Review is in progress
• Develop the Final Research Plan