The National Energy Efficiency Programmatic Best Practices Study

Understanding the Elements of Success: Findings from the National Energy Efficiency Programmatic Best Practices Study
CALMAC Meeting
December 15, 2004
Kenneth James, PG&E, Study Manager
Mike Rufo, Quantum Consulting, Prime Contractor
Project Advisory Committee
Kenneth James – PG&E
Pierre Landry – SCE
Rob Rubin – SDG&E
Jay Luboff – CPUC
Eli Kollman – CPUC
Sylvia Bender – CEC

Project Team
Mike Rufo – Quantum
Marissa Meyers – Quantum
Phil Willems – Quantum
Jane Peters – Research Into Action
Bruce Mast – Frontier Associates
Megdal & Associates
Shel Feldman Mgt Cons.
Presentation Overview
• Project Background
• Example Best Practices Study Findings
• Best Practices Website and Products
• Next Steps
Benchmarking
“Benchmarking is the process of identifying, sharing, and using best
practices to improve business processes.” Source: American
Productivity and Quality Center
"Benchmarking is simply about making comparisons with other
organizations and then learning the lessons that those comparisons
reveal". Source: The European Benchmarking Code of Conduct
Benchmarking almost always occurs as a collaborative process in
which participants share information. Typically the shared information is
about business processes with the intention of identifying excellence
and developing an understanding of how it is achieved.
Benchmarking addresses the question, “How can one improve by
learning from others?” It is inextricably tied to the concept of best
practices, also called business excellence or exemplary practices.
Best Practices
The term “best practice” refers to a business practice that, when
compared to other practices used to address a similar business
process, produces superior results.
Best practices are documented strategies and tactics employed by
successful organizations and programs. Rarely is an organization or
program "best-in-class" in every area. Our focus is not on identifying
best programs or best organizations but, rather, best practices that exist
within and across programs.
Best Practices are identified from in-depth interviews with program
managers, thorough review of program documents, analysis of
secondary sources, and comparison of program features and outcomes.
The focus of this Study is on best practices that can be generalized and
have a high likelihood of transferability to other programs within or
across program categories.
Background
• California energy crisis
• Energy Action Plan resource “loading order”
• CA knowledge management database program guidance
• New entrants into energy efficiency
• Nationally sustained database - driver of high program practices
These drivers feed into the Best Practices Database & Website, which in turn supports adapting best practices for performance improvement.
A Few Key Questions Addressed by this Best
Practices Project:
• What EE program design, implementation and
management practices are entities currently using?
• How effective are they?
• Where is there room for performance improvement?
• How will this knowledge assist in meeting the challenges
and opportunities in CA’s new EE environment?
Program Components
• Program Design: Theory, Linkages & Partnerships; Structure, Policies & Procedures
• Program Management: Project Management; Reporting & Tracking; Quality Control & Verification
• Program Implementation: Outreach, Marketing & Advertising; Participation Process; Installation & Delivery
• Program Evaluation: Evaluation & Adaptability
Program Areas
Program Area Reports (13)

On Beta Website:
• Residential
– Lighting
– HVAC
– Single-Family Comprehensive
– Multi-Family Comprehensive
– Audits
– New Construction
• Nonresidential
– Lighting/Turnkey
– Large Comprehensive

In Review:
• Nonresidential
– HVAC
– Trade Ally Training
– New Construction
• Other
– Mass Market Advertising

Still in Preparation:
– Res Appliances
Study Products: Program Area White Paper Reports

Report Content
• Summary of Findings
• Overview of Programs (5-10 each, 80+ total)
• Policy/Historic Context & Issues
• Feature Benchmarking and Best Practices
• Comparison of Outcomes
Example of Range of Programs Covered: Large Nonresidential Comprehensive Incentives Program Area
• CA’s Nonresidential Standard Performance Contract
• NYSERDA’s Energy $mart™ C/I Performance
• United Illuminating’s Energy Opportunities
• BC Hydro’s Power Smart
• Xcel Energy’s Custom Efficiency (Colorado)
• Northeast Utilities’ Custom Efficiency
• Massachusetts Electric’s Energy Initiative
• Alliant Energy’s Energy Shared Savings
• Efficiency Vermont’s Business Energy Services
• SMUD’s Commercial & Industrial Custom Retrofit
Study Products: Program Area White Paper Reports

Program Profiles
• Program Synopsis
• Program Focus
• Program Context
• Program Components
• Key Sources
• Contact Information
Sample Program Design BPs
Generally Cross-Cutting
• Articulate plan/theory that states expectations, timing, approach
• Link strategic approach to policy objectives and constraints
• Build feedback loops into program design and logic
• Do not over-promise results
• Understand local market conditions
• Conduct sufficient market research
• Maintain program design flexibility to respond to changes in market &
other factors
• Put process plan (including program management) in writing
• Tailor program to the unique needs of market sector targeted
• Define & locate hard-to-reach customers & target programs
accordingly, as appropriate
Sample Program Management BPs
Generally Cross-Cutting
• Clearly define program management responsibilities to avoid
confusion as to roles and responsibilities
• Develop and maintain clear lines of communication
• Use well-qualified engineering staff (for technical programs)
• Motivate field staff and service providers
• Maintain consistency in personnel over time
• Delegate responsibility based on risk versus reward
• Make sure at least some of the institutional memory resides in-house, not with subcontractors
• Reward high-performing staff and link performance evaluations to
tangible measures that are known in advance and developed jointly
by the manager and employee
Sample Reporting & Tracking BPs
Generally Both Program-Specific and Cross-Cutting
• Define & identify key information early in program process
• Clearly articulate data requirements for measuring program success
• Balance level of tracking planned against resource availability
• Design system to support requirements of evaluators
• Use the Internet to facilitate data entry & reporting; build in real-time
data validation systems that perform routine data quality checks
• Automate routine functions as much as is practical
• Develop electronic application processes
• Develop accurate algorithms & assumptions for savings estimates (see the sketch after this list)
• Conduct regular checks of reports to assess program performance
• Document tracking system & provide manuals for all users
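To make the validation and savings-algorithm items concrete, here is a minimal sketch (not drawn from the study) of the kind of routine data-quality check and deemed-savings calculation a tracking system might automate; the field names, functions, and values are hypothetical.

```python
# Hypothetical sketch (not from the study): a simple deemed-savings estimate
# plus a basic data-quality check of a tracking record. All field names and
# values are illustrative.

def deemed_kwh_savings(units: int, delta_watts: float, annual_hours: float) -> float:
    """Annual kWh savings = units x watts reduced per unit x operating hours / 1000."""
    return units * delta_watts * annual_hours / 1000.0

def validate_record(record: dict) -> list[str]:
    """Return a list of data-quality problems found in a tracking record."""
    problems = []
    if record.get("units", 0) <= 0:
        problems.append("units must be positive")
    if not 0 < record.get("annual_hours", 0) <= 8760:
        problems.append("annual_hours must be between 1 and 8760")
    if record.get("delta_watts", 0) <= 0:
        problems.append("delta_watts must be positive")
    return problems

record = {"units": 200, "delta_watts": 23.0, "annual_hours": 3500}
issues = validate_record(record)
if not issues:
    print(f"Estimated savings: {deemed_kwh_savings(**record):,.0f} kWh/yr")  # 16,100 kWh/yr
else:
    print("Record rejected:", issues)
```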
Sample QC & Verification BPs
Mostly Program-Specific
• Develop inspection & verification procedures during program design
• Base quality control on the program’s relationship with vendors, the number of vendors
involved, types of measures, volume, and variability of project size
• Use measure product specification in requirements & guidelines
• Require pre-inspections for large or uncertain impact projects
• Require builder/representative to be on site during inspection
• Verify accuracy of rebates, coupons, invoices to ensure the reporting
system is recording actual product installations
• Assure quality of product through independent testing procedures
• Build statistical features into the sampling protocol (see the sketch after this list)
• Treat inspection visits as partnership-building & learning events
• Ensure inspectors have plenty of hands-on construction practice
• Assess customer satisfaction with the product through evaluation
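For the sampling-protocol item, one common way to add statistical features is a sample-size calculation for verification inspections; the sketch below is illustrative only, and the 90% confidence / ±10% precision inputs are assumptions rather than study guidance.

```python
# Hypothetical sketch (not from the study): how many completed projects to
# inspect to estimate a pass rate within a target precision, using the
# standard proportion formula with a finite-population correction.
import math

def inspection_sample_size(population: int, confidence_z: float = 1.645,
                           expected_pass_rate: float = 0.5, precision: float = 0.1) -> int:
    """Projects to inspect for +/- `precision` at the given z (1.645 ~ 90% confidence)."""
    n0 = (confidence_z ** 2) * expected_pass_rate * (1 - expected_pass_rate) / precision ** 2
    n = n0 / (1 + (n0 - 1) / population)  # finite-population correction
    return math.ceil(n)

# e.g., 400 completed projects, 90% confidence, +/-10% precision
print(inspection_sample_size(400))  # about 58 inspections
```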
Program Implementation
Mostly Program-Specific
• Keep participation simple; offer a single point of contact for customers
• Develop participation strategies that are multi-pronged & inclusive
• Review & understand product availability before establishing eligibility
• Make program participation part of an existing, routine transaction
• Use Internet/electronic means to facilitate participation
• Avoid overcommitting to projects before design parameters are known
• Set incentive levels to maximize net not gross program impacts
• Use incremental costs to benchmark and limit payments
• Tie rebates for popular measures to those less likely to be considered
• Limit or exclude incentive payments to known free riders
• Tie incentives to building performance
• Offer low interest loans or financing as leverage on incentives
• Use disincentives for savings inflation for performance-based options
Program Marketing
Mostly Program-Specific
• Use the Energy Star® logo to instill consumer confidence/utility credibility
– Leverage with cities/community-based organizations & other programs
• Include adequate retail outreach & support to ensure product is stocked
& advertised & that point-of-purchase materials are accurate & clear
• Develop & disseminate case studies to showcase program projects
• Use target marketing strategies to ensure that hard-to-reach populations
are informed about available programs and options
• Use face-to-face marketing, where possible, especially for small biz
• Give builders the opportunity to participate in development of message
• Market to multiple departments within volume-builder organizations
• Provide trade allies with training & resources to enhance marketing
• Sell the customer benefits first, then energy efficiency
• Keep benefits quantifiable in economic terms; promote life-cycle cost
• Take advantage of external factors (e.g., heat waves)
Program Evaluation
Generally Both Program-Specific and Cross-Cutting
• Engage the implementation team in the evaluation process
• Create a culture whereby evaluation findings are valued and
integrated into program management
• Present actionable findings in real time and at the end of the study
• Stagger the timing of process and ex post impact tasks so that
process results are communicated on a relatively real-time basis
• Conduct impact evaluations routinely, but not necessarily annually
• Include periodic estimation of free-ridership and spillover (see the sketch after this list)
• Use regular process evaluation to provide timely and fresh results
• Periodically review & update market information on construction
practices, EE market share, measure adoption, trends
• Perform detailed market assessments particularly for MT programs
• Support program review & assessment at the most comprehensive
level possible
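For the free-ridership and spillover item, one widely used (though not the only) formulation applies a net-to-gross ratio to gross savings; the sketch below is illustrative, and the function name and figures are hypothetical.

```python
# Hypothetical sketch (not from the study): adjusting gross savings for
# free-ridership and spillover with a net-to-gross ratio. One common
# formulation: NTG = 1 - free_ridership + spillover.

def net_savings(gross_kwh: float, free_ridership: float, spillover: float) -> float:
    """Net program savings after accounting for free riders and spillover."""
    ntg = 1.0 - free_ridership + spillover
    return gross_kwh * ntg

# e.g., 10 GWh gross, 20% free-ridership, 5% spillover -> 8.5 GWh net
print(f"{net_savings(10_000_000, 0.20, 0.05):,.0f} kWh")
```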
The Best Practices Benchmarking Website
Communication of Results Model
• Top-line BP List
• BP Rationales, Whitepapers, & Gap Analysis
• Comprehensive Program Profiles, All Performance Benchmarks
• Complete Documentation of Methods, Data Collection Processes, Results
Home Page – Best Practices Website
Find A Study
Links to Program Areas
Search Results
Website Products
– Program Area Chapter Reports
• Description of Report
• Full Chapter Report (PDF)
• Executive Summary (PDF)
• Best Practices Table (PDF)
– Program Profiles
• Description of Program Profile
• Program Profile Report (PDF)
Next Steps
• Need feedback from Beta Review Group and Project Advisory Committee (PAC)
• Post-review consideration; revisions made in January
• Load final chapters onto website in January to be used for 2005-’08 Portfolio planning
• Further analysis of outcomes ($/kWh saved; see the sketch below)
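As a pointer to what the $/kWh-saved outcome metric typically involves, below is a minimal sketch of a levelized cost-of-saved-energy calculation; the discount rate, measure life, and cost figures are illustrative assumptions, not study results.

```python
# Hypothetical sketch (not from the study): levelized cost of saved energy,
# i.e., program cost annualized with a capital recovery factor and divided by
# annual kWh savings. All inputs below are illustrative assumptions.

def cost_per_kwh_saved(program_cost: float, annual_kwh: float,
                       measure_life_yrs: int, discount_rate: float) -> float:
    """Levelized $/kWh saved over the life of the installed measures."""
    r, n = discount_rate, measure_life_yrs
    crf = r * (1 + r) ** n / ((1 + r) ** n - 1)  # capital recovery factor
    return program_cost * crf / annual_kwh

# e.g., $2M program cost, 8 GWh/yr of savings, 12-year measure life, 5% discount rate
print(f"${cost_per_kwh_saved(2_000_000, 8_000_000, 12, 0.05):.3f} per kWh")  # ~$0.028
```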