Meeting Notes - Regional Technical Forum

Guidelines Development for Unit Energy Savings (UES) Measure Life
Kick off Meeting August 10, 2011
Meeting Notes by Mark Kendall edited by SERA
Attendees SERA and Cadmus: Lisa Skumatz, SERA; Juri Freeman, SERA; Tina
Jumayara, Cadmus; Aquilla Malones, Cadmus; Josh Rushton, Cadmus
Attendees RTF Subcommittee: Gillian Charles, RTF; Mark Kendall, RTF; Rich
Arneson, Tacoma; Tom Lienhard, Avista; Eugene Rosilie, Cowlitz; Adam Hadley,
RTF; David Baylon, Ecotope; Ian Doyle, Lockheed; Erin Hope, BPA
Meeting Convened: 11:06 am
Meeting Adjourned: 12:40 pm
The presentations, meeting notes and other documents for review by the public and
subcommittee will be posted at:
http://www.nwcouncil.org/energy/rtf/subcommittees/measurelife/
Agenda:
11:00 Introductions
11:10 Review of the Scope of Work and Work Plan
11:20 Character of measure life and persistence determination practice
11:50 Review of UES Measure Development Guidelines and relationship to
Measure Life Guidelines
12:10 Subcommittee discussion of expectations and considerations, next steps
The meeting convened with introductions of the contractor staff in the room and online, followed by introductions of the other RTF members, corresponding members, and interested parties.
RTF Staff Gillian Charles announced that a web site will be set up for this
Subcommittee. The PowerPoint presentations are posted to the Regional Technical
Forum Measure Life Guidelines Subcommittee web page along with the Scope of
Work for this contract. The detailed work plan and project schedule will be posted
to the web page as well. All notes summarizing the meetings will be posted after
the meetings.
RTF contract staff Mark Kendall summarized the elements of the scope of work and
the schedule for subcommittee interaction with the contractor and RTF
presentations of draft and completed materials.
Lisa Skumatz of SERA, Inc. led the presentation using two PowerPoint decks. The first was an overview of measure life guidelines work completed or underway nationwide; her discussion identified where SERA and Cadmus have been involved in developing those best practices. The second was specific to the RTF project Work Plan.
Discussion was invited throughout the presentations. Following is a summary of
Subcommittee questions and concerns.
Skumatz presented on how best practices for determining measure life are established and how they draw on verified program and measure evaluation. She identified that many terms will need to be defined for use in the RTF Guidelines, such as Expected Useful Life (EUL), Remaining Useful Life (RUL), and Replacement, and how methods of determining measure life apply across a range from fixed load reduction technologies requiring little or no commissioning and maintenance to behaviorally dependent UES measures. She addressed the use of Weibull curves for determining statistically defensible measure life; that analysis depends on data quality, which needs to be defined in the Guidelines. The data quality available or needed may vary across measures or types of measures, so assessing it also needs to be a routine procedure to assure consistency across the portfolio of measures.
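For illustration, a minimal sketch of the kind of Weibull fit Skumatz described might look like the following (assuming Python with numpy and scipy; the retention data, initial guesses, and variable names are hypothetical examples, not RTF values, and the EUL is taken here as the fitted median survival time):

import numpy as np
from scipy.optimize import curve_fit

def weibull_survival(t, scale, shape):
    # Fraction of installed measures still in place at age t (years)
    return np.exp(-(t / scale) ** shape)

# Hypothetical field retention data: measure age (years) and fraction still in place
ages = np.array([1, 2, 4, 6, 8, 10, 12])
surviving = np.array([0.99, 0.97, 0.90, 0.78, 0.62, 0.45, 0.30])

# Fit the two Weibull parameters to the observed retention points
(scale, shape), _ = curve_fit(weibull_survival, ages, surviving, p0=[10.0, 2.0])

# Median survival (S(t) = 0.5) is one common choice for the EUL estimate
eul = scale * np.log(2) ** (1.0 / shape)
print(f"scale={scale:.1f}, shape={shape:.2f}, estimated EUL={eul:.1f} years")

The data quality questions raised above map directly onto such a fit: the number of observation points, how retention was verified, and whether the sample reflects the region's climate and programs all determine how defensible the resulting EUL is.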
Skumatz discussed how measure life values will need to be defined in terms of categories, reporting format, review, and version control. She summarized the
general EUL results (and gaps) from reviews she conducted of more than 100 EUL
studies from a California project (and two related California follow-up projects) and
from a national review of EUL work / best practices, and noted that the report for
the RTF project will incorporate the quantitative results and additional values
identified through the RTF work. She discussed the role of having a documented
annotated bibliography of best practice studies on measure life that is available and
accessible in the future development of other RTF measures. She mentioned that
lots of national studies and adopted values have variations – partly (potentially) due
to variations in climate, usage, and programs, but partly because many adopted
values have been “adopted” through processes that have been consensus-based, quasi-quantitative, and less-than-fully-documented in some cases (that includes CA, NW,
NE and elsewhere). The EUL protocols in different states also vary. For instance,
for a number of years, California required EUL studies of the top 4 measures that
accounted for 50% of the energy savings (and they mandated frequencies based on
sector – 4th/5th and 9th year studies, etc.). Thus, a large number of EUL studies were
conducted in California, but it also led to repeat EULs of certain common measures,
and gaps for others that weren’t “top 4”, for example. However, it led to
considerable attention to the evaluation and verification of measure life. How much a measure contributes to the regional portfolio of savings should weigh on the data quality and measure life confidence required. Technical degradation factors of measures have not been widely studied in any consistent fashion. She noted that few to no EUL studies
had been conducted on behavioral program measures (and discussed the potential
importance of behavior on measure-based programs, with thermostats and
commissioning as examples). She also discussed the weaknesses inherent in the
trend toward simplifying EUL tables by adopting one value across a whole sector –
which buries important differences in lifetimes within the non-residential sector
(quick turnover in restaurants, vs. slow turnover in schools).
Baylon asked which studies had verifiable measure life and cost, or used a best practice methodology for determining those. We need to use those resources as indicated, if they are determined using best practices recognized by our emerging guidelines. He asked what we mean by data-sourced estimation of measure life versus consensus on life based on studies. He referred back to the earlier presentation on documenting a procedure for determining data quality.
Arneson asked if there should be measure and program impact evaluation criteria
or procedures developed to increase data and data quality around measure life. The
group was generally in support of a procedure for evaluations to improve measure
life knowledge. He mentioned that we should not just take measure life at face value and that lifetime maintenance requirements might need to be defined in order to avoid performance degradation.
Hope asked about behavioral measures and how the Guidelines will address measures whose performance relies heavily on commissioning to verify a repeatable sequence of operations against the performance expected for the measure. His concern includes measures for which the measure itself is behavioral, such as alarm systems in Energy Management Controls or variable speed drive alarms and the like. He encouraged attention to Guidelines for ongoing evaluation methods for determining measure life. His example is industrial measures where how the equipment is set up and controlled can be changed in the field; verification of performance as a UES measure over some period of time would improve confidence in measures that rely on operator behavior. He identified that the
life of the HPEM suite of industrial management measures is verified by BPA each
year for three years to determine consistency of the behaviors that yield the UES
savings.
There was general discussion regarding measure life based upon certain load
characteristics that may change over time. Some measures may be installed in a
facility that has unpredictable use or load hours that may cause significant measure
life consequences (e.g. going from one shift to three in an industrial setting).
Arneson pointed out that this also applies to business types, not just equipment use. For example, restaurants have a typical business life of between 5 and 7 years. We will not only want to qualify measure life for its technologically verifiable life, but also for its application. Does that indicate that we need more UES approved measures based upon their application, because a significantly shorter life in a different application may change the cost effectiveness?
Lienhard asked what role deferred maintenance will have in the guidelines for determining EULs and RULs, and noted that how measure life is addressed relative to the useful life of other equipment with which a measure interacts or on which it depends will be important. He mentioned that utilities are moving past one year energy savings
estimates to multiple years of crediting savings and that Guidelines should address
technical performance degradation factors.
Doyle asked about the bounds of behavioral measure life. Some measures rely on behavior for the installed systems or controls to perform and provide the UES. Some measures rely completely on behavioral input to existing controls. Some measures may be responses to out-of-tolerance adjustments, and some may require supplemental operations and maintenance costs to restore them to UES measure performance. Skumatz responded that there are significant data gaps in solely behavioral measure life and, to some extent, in how commissioning behavior required by a measure may impact measure life.
Arneson raised the issue of whether we treat UES Measure Life differently when it is replacing failed equipment, or replacing older technology in the midst of its useful
or serviceable life. Hope supported developing procedures that address the
equipment to be replaced. Discussion ensued of whether the Measure Life
Guidelines should have a method to prompt UES review in some cases, or if measure
replacement of failed equipment or still serviceable equipment is a measure life
issue or a programmatic incentive level issue. This discussion transitioned into issues of how measure life may be regarded differently with regard to codes and
standards.
Rosilie expressed concern over the variable and apparently interchangeable use of “measure” and “program” in the discussion. He requested clarity, noting that those types of distinctions will be important throughout the process.
Hadley pointed out that historically, measure life has been determined by the proponent of the measure. In some cases RTF staff based measure life upon manufacturer or other best estimates. To a lesser extent, historical measure life determination may be based on studies, but the data quality has not always been reviewed. Some measure life values rely on more or less defensible ASHRAE ratings and
some are modified from those estimated lifetimes based on field experience that is
either anecdotal or not well studied or documented.
Baylon further mentioned that although the technical life of a luminaire may be 20
years, the components may not last that long and that technology may pass the
fixture up in that time. And, in an office setting for example, there is approximately an 11-year life of lighting systems based upon remodel schedules, tenant turnover and any number of other reasons. We will want to address the field expected life as
much as the technical equipment life.
Baylon said that, to get traction on the project and to get Guidelines that are useful in the near term, there needs to be some mechanism where we use what we have, are able to define its quality, and then describe the areas of weakness in measure life determination and define what needs to be studied.
Rushton pointed out that some measure lifetimes may need to be updated over time
or as better data emerges and that Guidelines should address when an update
should occur.
Arneson pointed out that there are measures that are quite cost effective and others that have a BC ratio of just barely over one. Those that rely heavily on measure life to maintain their TRC and BC ratios should receive more attention to the accuracy of measure life determination. Some measures with a 3-year estimated life and low BC ratios can be impacted significantly by a one-year life change one way or the other.
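To illustrate the sensitivity Arneson described, a simple present-value calculation (all input values below are hypothetical examples, not RTF-adopted figures) shows how a one-year change in a short measure life can push a marginal BC ratio above or below 1.0:

# Hypothetical example: BC ratio sensitivity to measure life (all inputs illustrative)
def bc_ratio(life_years, annual_kwh=2000, avoided_cost_per_kwh=0.04,
             measure_cost=210, discount_rate=0.05):
    # Present value of avoided-cost benefits divided by the measure cost
    pv_benefits = sum(
        annual_kwh * avoided_cost_per_kwh / (1 + discount_rate) ** year
        for year in range(1, life_years + 1)
    )
    return pv_benefits / measure_cost

for life in (2, 3, 4):  # a nominal 3-year measure, one year shorter or longer
    print(f"{life}-year life: BC = {bc_ratio(life):.2f}")
# Roughly 0.71, 1.04, and 1.35 with these assumed inputs, so the measure crosses
# the cost-effectiveness threshold on a one-year change in either direction.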
Baylon mentioned that the measure life used in determining Total Resource Cost is important for long-life measures as well. Those long-life measures with BC ratios of just over 1 deserve close attention to data quality as well.
The presentation and discussion on the topic changed to discussion of the project
schedule and next steps. Skumatz reiterated their interest in completing the work
within the RTF schedule. It was noted that the timeline shared in the PowerPoint
was likely to be amended. A second subcommittee meeting will not be scheduled until the SERA and Cadmus team meets to begin preparing the first round of the Guidelines structure.
Doyle requested that the proposal from the successful contractors be made
available. Kendall noted that the work plan proposed by SERA and Cadmus will be
posted, but not the entire proposal response.
The meeting came to a close at 12:40.
Hope of BPA called RTF Staff right after the meeting to ask some clarifying questions about how the meeting proceeded, due to a brief absence during the meeting. Of concern was the usefulness of the Guidelines as they pertain to behavioral measures, which are becoming more prevalent. His understanding of the discussion at the close of the meeting added concern that behavioral measure life might not be addressed well. Following is RTF staff Kendall's response to those inquiries.
The strategy is on track to develop useful and effective measure life Guidelines. BPA
and other utility program experience with behaviorally dependent measures will be
valuable to the RTF contractor. The contractor will be conducting interviews of
program managers and evaluators, and your program M&V requirements may very well be shown to be useful in determining behavioral measure life. BPA's insights into how you assess a measure's life based upon initial testing, tuning, adjusting and balancing of systems with follow-up verification will be helpful.
As we discussed in the kick-off meeting, there will be measures for which the equipment life and performance are quite straightforward (e.g. lighting, motor horsepower reductions ...) and that will have a specific set of guidelines and a checklist for
determining measure life with high confidence. That procedure will help improve
the consistency of RTF determined measure life for a number of measures. There is
another tier of measures in which there is capital investment in equipment that is
dependent upon the sequence of operations or controls where commissioning
behavior is critical to determination of measure life. And then there are those
behavior-only measures for which the measure's life is dependent upon ongoing owner/operator behaviors and M&V. Determining measure life based upon prescriptive evaluation schedules or some other form of verification (such as the BPA industrial programs) will benefit from establishing formal procedures and conditions.