Use and Development of Outcome-based Measures in Government

Workshop:
Development and Use of
Outcome-based Measures in
Government Planning & Reporting
September 28th, 2006
Winnipeg, Manitoba
Presented by:
Manitoba Treasury Board Secretariat,
Office of the Provincial Comptroller of Manitoba, and
The International Institute for Sustainable Development (IISD)
1
Agenda
I. The Context in Manitoba [10-10:30 am]
  • Opening statements and introductions
  • Overview of government-wide departmental planning and reporting
II. Trends in Planning, Reporting and Performance Measures [10:30-11:15]
  • What are outcome-based measures and why are they important?
  • Innovative case examples
III. Sharing of Examples from the Department of Advanced Education and Literacy [11:15-noon]
  • Plenary discussions based on pre-work assignment
IV. Analytic Tools for the Development and Use of Outcome-based Measures [noon-2:15 pm]
  • Key steps in development, use and analysis of outcome-based measures
  • Group work on key steps
Closing Remarks [2:15-2:30 pm]
2
Objectives of Workshop
• Intro to developing performance measures
linked to organizational goals
• Overview of current practice elsewhere
• Understanding/discussion of issues related to
government performance measurement
reporting
• Context for new Annual Report requirements
(looking back) and new PSO requirements
(looking ahead)
3
Part I. The Big Picture
• Performance Measurement in Manitoba
• Role of performance measures in annual reporting
4
I. The Big Picture
Trends and context
Trend toward reporting on
changes in socio-economic
and environmental conditions
that matter to Manitobans
5
I. The Big Picture
2005 Discussion Document
examples: Economy
6
I. The Big Picture
2005 Discussion Document
examples: People
7
I. The Big Picture
2005 Discussion Document
examples: Community
8
I. The Big Picture
2005 Discussion Document
examples: Environment
9
I. The Big Picture
The Planning and Reporting
context
• Performance Measurement is not done in
isolation
• Identified as part of Priorities & Strategies
Overview (PSO)
• reported upon in Annual reports
• Performance Measures show whether our
plans are working
10
I. The Big Picture
Full cycle view
[Timeline diagram across Spring Yr1, Fall Yr1, Spring Yr2, Spring Yr3 and Fall Yr3: PSO (plan) for the following fiscal year, new initiatives, Estimates for the following fiscal year, Budget for the following fiscal year announced, Estimates supplement; execution of plans during the fiscal year; Annual Report.]
11
I. The Big Picture
Multiple cycles in play
[Timeline diagram, Spring 04 through Fall 06: each spring's Budget, Estimates supplement, PSO (plan), new initiatives and Estimates for the coming fiscal year (04/05, 05/06, 06/07, 07/08) overlap with the Annual Reports for completed years, so several planning and reporting cycles are in play at once.]
13
I. The Big Picture
Issues/Opportunities
re Annual Reports
• Previous guidelines did not request
measures of progress or performance
• No specific requirement to link annual
reports to larger process (Estimates
Supplement or plans)
• No specific requirement for Department
annual reports to be placed online
• New direction set in 2005 through
Reporting to Manitobans on Performance
14
I. The Big Picture
Trends and issues
in government annual reporting
• Accountability
• Who uses annual reports? (internet)
• Public expectations (of results, of reporting)
• Limited resources, demographic trends (retirements)…
• …governments need to do “more with less”
• Trend to shared, horizontal efforts
• Important to agree on how to assess, report and
use results
15
Recap: Full cycle view
[The full-cycle timeline diagram again (Spring Yr1 through Fall Yr3), with the note: Annual Reports, including measures, are connected back to plans…]
16
I. The Big Picture
Part II. Trends in Government
Planning and Reporting
A. What are outcome measures and why are they important?
  – Concepts and vocabulary
B. Who is doing this really well?
  – Oregon: Results-oriented Strategic Planning
C. What are Alberta and Saskatchewan doing?
  – Alberta: Goal-based Budgeting
  – Sask. Government Accountability Framework
17
II. Trends in Reporting
A. What are outcome measures and why are they important?
[Diagram: Our Environment, Our People, Our Economy]
de•vel•op (di•vel′əp) v.t. 1. To expand or bring out the potentialities, capabilities, etc.
18
II. Trends in Reporting
A. What are outcome measures and why are they important?
[Diagram: Our Environment, Our People, Our Economy]
"Experience has shown that a pathway to sustainability cannot be charted in advance. Rather, the pathway must be navigated through processes of learning and adaptation."
National Academy of Science 1999. Our Common Journey: A Transition Toward Sustainability.
19
II. Trends in Reporting
IISD International and National-Level Perspective
• Nations starting to develop
information systems to gauge
societal wellbeing and sustainability
The United Kingdom
Norway
20
II. Trends in Reporting
[Diagram: the National SD Strategy Process as a cycle of Vision, Priorities and Objectives, Plans, Implementation, Monitoring, and Evaluation and Improvement, linked to departmental and thematic budgets and to government accountability systems. The cycle is guided by Sustainable Development Principles (inter-generational consideration, systems thinking, multi-stakeholder participation, adaptive management) and Accountability Principles (transparency, efficiency, accountability).]
21
• Provinces and states using societal
goal and outcome-based planning
and budgeting systems
Oregon
Alberta
IISD Provincial and State Level Perspective
22
II. Trends in Reporting
• Communities creating new forms of
social infrastructure to navigate
quality of life and sustainability
Orlando
– Healthy Community Indicators Initiative
of Greater Orlando
Winnipeg
– Quality of Life Indicators System concept
being proposed by range of community
stakeholders
IISD Community-level Perspective
23
II. Trends in Reporting
Logic Model
(a hybrid from three different logic models: Canadian International Development Agency, Province of Alberta, State of Oregon)
• Key Priority Areas: contributing to society's wellbeing and sustainability objectives
• High-level Outcomes / Objectives: change in wellbeing conditions (economic, social and environmental)
• Intermediate Outcome: short to medium-term consequence of an output
• Output: the result of an activity
• Actions (Activity): processes and inputs of a policy, program, or project
24
Example Measures
• Key Priority Areas
  – Environmental: Healthy, sustainable surroundings
  – Social: Quality Jobs
• High-level Objectives (change in wellbeing conditions)
  – Environmental: Stream water quality (turbidity)
  – Social: Children entering school ready to learn
• Intermediate Outcome (short to medium-term consequence of an output)
  – Environmental: Rate of soil erosion by water
  – Social: % children enrolled in Pre-K program
• Output (the result of an activity)
  – Environmental: Kilometers of river bank with vegetation
  – Social: Demographic surveys for Head-start program completed
• Actions (processes and inputs of a policy, program or project)
  – Environmental: % of re-vegetation programs completed
  – Social: Head-start program implemented
25
Climbing the Steps
toward Performance Management
Mission/Goals: Mission statements declare the agency's long-range intent; its purpose. Although the goals expressed in a mission statement may help shape the agency's values and its organizational culture, they often are imprecise and sometimes even a bit vague.
Objectives: Objectives are unambiguous statements of the agency's performance intentions, expressed in measurable terms, usually with an implied or explicit timeframe.
Performance Measures: Performance measures indicate how much or how well the agency is doing. Ideally, they track the agency's progress toward achieving its objectives.
Analysis for Continuous Improvement: Many agencies compare this month's or this year's performance measures to those of the past. Some are beginning to make comparisons with other agencies and to begin the process of benchmarking.
(From Gov. of Alberta 1996)
26
II. Trends in Reporting
Who does this really well?
The Oregon Shines Case Study
27
Oregon Shines Case Study
Oregon Shines
Oregon's Strategic Plan
- Oregon Shines (1989)
- Updated every eight years
- Encompasses the entire state
Oregon Progress Board
- independent agency created to be
the steward of Oregon Shines
- law mandates Board to report
biennially
- chaired by governor
28
From Conrad (2005)
Oregon Shines Case Study
Vision – “Oregon Shines II”
Economy:
Quality jobs for all
Oregonians
People:
Safe, caring and
engaged communities
Environment: Healthy, sustainable
surroundings
29
From Conrad (2005)
Oregon Shines Case Study
Oregon Benchmarks
Measures for how Oregon as a whole is doing.
(Logic model levels shown: Key Priority Areas, High-level Objectives)
• Quality Jobs for All Oregonians
  – Economy (#1-17)
  – Education (#18-29)
• Engaged, Safe & Caring Communities
  – Civic Engagement (#29-38)
  – Social Support (#39-60)
  – Public Safety (#60-67)
• Healthy, Sustainable Surroundings
  – Community Development (#68-74)
  – Environment (#75-90)
30
From Conrad (2005)
Oregon Shines Case Study
Linking Government to the
Benchmarks
• Is society benefiting? → High-level Outcomes (Benchmarks)
• Are strategies working? → Intermediate Outcomes
• Is work happening? → Outputs
31
From Conrad (2005)
Oregon Shines Case Study
http://www.oregon.gov/DAS/OPB/
32
Oregon Shines Case Study
33
Oregon Shines Case Study
34
Oregon Shines Case Study
35
Oregon Shines Case Study
Oregon Shines Goals & Benchmarks (mapped to the logic model)
• Key Priority Area: Goal #1: Quality Jobs for All Oregonians
• High-level Objective: Benchmark #29
• Intermediate Outcome: Dept of Community Colleges and Workforce Development, Dept Goal #2: Oregon's workforce is well trained and has access to a wide variety of training programs (measure: percent of participants ranking WIA-funded current workforce ratings good or better)
• Output: shared best practices, encouraged quality services and conducted quality assurance review
• Actions: Workforce skill development project
36
Linking Government to the Benchmarks
[Diagram: the organization's progress, tracked by Performance Measures, contributes alongside External Influences to Oregon's progress, tracked by the Benchmarks.]
From Conrad (2005)
37
Oregon Shines Case Study
Towards Goal-based Budgeting in
Alberta
http://www.finance.gov.ab.ca/publications/measuring/measup06/index.html
38
Alberta Case Study
Key Priority
Areas for
Alberta
39
40
Alberta Case Study
41
Alberta Measures Up (mapped to the logic model)
• Key Priority Area: Goal #2: Albertans will be prepared for Lifelong Learning and Work
• High-level Objective: Employment rates of Albertans age 24-35 by highest level of education
• Intermediate Outcome: Department of Advanced Education, Dept Goal #1: High Quality Learning Opportunities for All (the learning system meets the needs of all learners, society, and the economy)
• Output: $3 billion Access to the Future endowment; a $1 billion expansion to the Alberta Heritage Scholarship Fund; $500 million expansion to the Ingenuity Fund
• Actions: Introduced Bill 1, the Access to the Future Act
42
Part III. Sharing Examples
• Sharing of departmental examples in the use of outcome-based measures
43
III. Department Examples
Manitoba Examples:
Pre-Work Discussion
• Review your pre-workshop assignment
• Brief presentations
[5 minutes + 5 min questions]
• General discussion
of issues
44
III. Department Examples
Part IV: Development and Use of
Outcome-based Measures
A. Overview of key steps in the
development and use of
outcome-based measures
B. Working through an example
• Groups of at least two (per department if possible)
45
IV. Outcome-based Measures
Logic Model
(a hybrid from three different logic models: Canadian International Development Agency, Province of Alberta, State of Oregon)
• Key Priority Areas: contributing to society's wellbeing and sustainability objectives
• High-level Objectives (Outcome): change in wellbeing conditions (economic, social and environmental)
• Intermediate Outcome: short to medium-term consequence of an output
• Output: the result of an activity
• Actions (Activity): processes and inputs of a policy, program, or project
46
Key Steps
1. Frame the logic model for your issue
2. Identify SMART measures for the logic
model
3. Understand and articulate key external
influences
4. Analyze feedback based on the SMART
measures and external influences
47
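To make these steps concrete, here is a minimal sketch (illustrative only, not part of the workshop materials; the class and field names are invented, and the target value is hypothetical) of how the worksheet on the next slide could be captured, using the river example that appears later in Part IV:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Measure:
    indicator: str        # Step 2: a SMART indicator
    target: str           # Step 2: the target it is compared against (hypothetical here)
    analysis: str = ""    # Step 4: what the latest data suggest

@dataclass
class LogicLevel:
    level: str                                                # Step 1: logic model level
    description: str                                          # Step 1: the entry for that level
    measures: List[Measure] = field(default_factory=list)    # Step 2
    influences: List[str] = field(default_factory=list)      # Step 3: external influences

# Filled in with the river-bank re-vegetation example used later in the workshop
high_level = LogicLevel(
    level="High-level Objective",
    description="Improved stream water quality",
    measures=[Measure("River water quality (turbidity)", "meets provincial guideline")],
    influences=["Quality of water flowing into MB", "Treatment plant effectiveness"],
)
print(high_level.level, "-", len(high_level.measures), "measure(s)")
```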
IV. Outcome-based Measures
[Worksheet: 1. Logic Model, 2. Measures and Targets, 3. External Influences, 4. Analysis & Feedback. For each level (Key Priority Areas, High-level Objective, Intermediate Outcome, Output, Actions), record the entry, its Measure(s), the Influences acting on it, and the Analysis.]
48
IV. Outcome-based Measures
1. Logic Model
• Key Priority Area: Safe drinking water, enjoyable recreation
• High-level Objective: Improved stream water quality
• Intermediate Outcome: Reduced soil erosion
• Output: River bank re-vegetation
• Actions: River bank re-vegetation program
49
Hybrid from three different logic models
(Canadian International Development Agency, Province of Alberta, State of Oregon)
2. Identify SMART Measures
• Specific
• Measurable
• Aggressive, yet achievable targets
• Relevant
• Time-bound
50
IV. Outcome-based Measures
Anatomy of a measure
• Indicator
• Target
• Data
• Source (e.g., Oregon Population Survey, a random sample telephone survey of Oregon households conducted in even-numbered years; margin of error +/- 1.60%)
51
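As a side note on what a quoted margin of error such as +/- 1.60% implies, here is a small worked sketch (assuming the usual normal approximation at roughly 95% confidence and the worst-case proportion p = 0.5; the survey's actual sample size is not stated on the slide):

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate margin of error for a survey proportion (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# With p = 0.5, a sample of about 3,750 households yields roughly +/- 1.6 points.
print(f"{margin_of_error(0.5, 3750):.4f}")  # ~0.0160
```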
IV. Outcome-based Measures
1. Logic Model → 2. Measures and Targets
• Key Priority Area: Safe drinking water, enjoyable recreation
• High-level Objective: Improved stream water quality – Measure: river water quality (turbidity, nitrogen concentration)
• Intermediate Outcome: Reduced soil erosion – Measure: rate of soil and rill erosion on farmland
• Output: River bank re-vegetation – Measure: kilometers of river bank with vegetation
• Actions: River bank re-vegetation program – Measure: % of re-vegetation programs complete
52
IV. Outcome-based Measures
Types of Targets
Type / Description / Example
• Policy-specific targets: determined in a political and/or technical process taking past performance and desirable outcomes into account. Example: official development assistance shall be 0.4 percent of national GNP.
• Standards: nationally and/or internationally accepted properties for procedures or environmental qualities. Example: water quality standards for a variety of uses.
• Thresholds: the value of a key variable that will elicit a fundamental and irreversible change in the behaviour of the system. Example: maximum sustainable yield of a fishery.
• Benchmark: comparison with a documented best-case performance related to the same variable within another entity or jurisdiction. Example: highest percentage of households connected to sewage system in a comparable jurisdiction.
• Principle: a broadly defined and often formally accepted rule. Example: the policy should contribute to the increase of environmental literacy.
53
STEP 3: External Influences
• Your policies, programs and projects are not the
only influence on the desired outcomes and
outputs
• Direct influences
• Other department activities, businesses, NGOs, civil society
• Other jurisdictions (provinces, countries)
• Nature (e.g., weather)
• Indirect influences
• Broader societal driving forces (e.g., demographics, markets,
consumption patterns)
54
IV. Outcome-based Measures
1. Logic Model → 2. Measures and Targets → 3. External Influences
• Key Priority Area: Safe drinking water, enjoyable recreation
• High-level Objective: Improved stream water quality – Measure: river water quality (turbidity, nitrogen concentration) – Influences: quality of water flowing into MB; treatment plant effectiveness
• Intermediate Outcome: Reduced soil erosion – Measure: rate of soil and rill erosion on farmland – Influences: zero till practices, weather, crop type
• Output: River bank re-vegetation – Measure: kilometers of river bank with vegetation – Influences: independent actions of farmers; natural growth
• Actions: River bank re-vegetation program – Measure: % of re-vegetation programs complete
55
IV. Outcome-based Measures
4. Analysis and Feedback
• What do the measures tell you about your
activities?
– Output level analysis
– Intermediate outcome level analysis
– High-level outcome analysis
• How does your understanding of external
influences help your analysis?
56
IV. Outcome-based Measures
1. Logic Model → 2. Measures and Targets → 3. External Influences → 4. Analysis & Feedback
• Key Priority Area: Safe drinking water, enjoyable recreation
• High-level Objective: Improved stream water quality – Measure: river water quality (turbidity, nitrogen concentration) – Influences: quality of water flowing into MB; treatment plant effectiveness – Analysis: water quality improving; quality of water flowing into the province has improved significantly
• Intermediate Outcome: Reduced soil erosion – Measure: rate of soil and rill erosion on farmland – Influences: zero till practices, weather, crop type – Analysis: could be due to less bank vegetation, change in crop, or a series of extreme rainfall events
• Output: River bank re-vegetation – Measure: kilometers of river bank with vegetation – Influences: independent actions of farmers; natural growth/disease – Analysis: cropland expansion could be the reason, or natural disease; need to research further
• Actions: River bank re-vegetation program – Measure: % of re-vegetation programs complete (95%)
• Feedback: Water quality is improving, but not due to the program. Other factors, such as expanded cropland and reduced zero tillage, counteracted the program.
57
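Step 4 can be illustrated with a small sketch (the data below simply echoes the river example rather than real monitoring results, and the rule of thumb is only illustrative): when the high-level measure improves while the program's own output measures do not, external influences rather than the program are the likely driver.

```python
# Each entry: (logic model level, measure, did it move in the desired direction?)
readings = [
    ("output", "km of river bank with vegetation", False),
    ("intermediate outcome", "rate of soil and rill erosion", False),
    ("high-level outcome", "river water quality", True),
]

def feedback(readings):
    """Rough analysis rule: compare movement at the outcome and output levels."""
    outcome_up = any(ok for level, _, ok in readings if level == "high-level outcome")
    outputs_up = all(ok for level, _, ok in readings if level == "output")
    if outcome_up and not outputs_up:
        return "Outcome improving but outputs lag: look to external influences."
    if outcome_up:
        return "Outcome and outputs improving: the program is plausibly contributing."
    return "Outcome not improving: revisit the strategy and its external influences."

print(feedback(readings))
```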
Exercise 1: Development of an
Outcome-based Measure
• Form your work group [during lunch]
– Find your working partner (your departmental colleague if
possible)
• Select measure to focus on [during lunch]
– With your partner select a measure from your department to
focus on for the exercise (from your workshop pre-assignment, or something else of mutual interest and value
for your reporting cycle)
• Working with your Colleague [1:00-1:45pm]
– Use the attached template as a guideline to carry out the 4
analysis steps for developing and/or using outcome-based
measures
• Summarize your analysis on an overhead sheet
• Plenary [1:45-2:15 pm]
– Two groups will be asked to share their results, each
followed by a plenary discussion
58
[The same blank worksheet, for the exercise: for each level (Key Priority Area, High-level Objective, Intermediate Outcome, Output, Action), record the entry, its Measure, the Influences, and the Analysis & Feedback.]
59
Expectations for this past round of
Department Annual Reports (2005-06)
New: Section featuring department
performance reporting
New: Central review of the new section
New: Annual Reports to be consistently
available online
60
What was required for the new
section of the Report?
• Five progress or performance measures
• The few critical indicators that illustrate
progress against desired outcomes
• Ideally, those that support key Department
priorities
• Could be drawn from previous PSOs,
Reporting to Manitobans on Performance
document, or other sources
61
Overview of Key Questions That
Needed to Be Addressed
• What is Being Measured and How? (A)
• Why is it Important to Measure? (B)
• What is the Most Recent Available Value for this Indicator? (C)
• What is the Trend over Time for this Indicator? (D)
• Comments / Recent Actions / Report Links (E)
(One row, 1. through 5., for each of the five measures.)
62
Illustrative Example
• What is being measured and how? Access to health services, by measuring the percentage of Manitobans reporting "no difficulty" in accessing health services.
• Why is it important to measure this? One key determinant of population health is its access to quality health services. This measure assesses access only.
• What is the most recent available value for this indicator? For 2003 (most recent survey data), % of Manitobans who reported "no difficulty" accessing: health information or advice: 82%; immediate care: 75%; routine care: 81%.
• What is the trend over time for this indicator? Improving. Wait times for key procedures have reduced significantly over the last five years. See page X of the report for details. The source of these data is the Statistics Canada Health Access Survey. There is no directly comparable study prior to 2003. The study will be repeated in 20xx.
• Comments / recent actions / report links: See page X of the report for a discussion of recent actions addressing access to health services.
NOTE: All information above is adapted from the Reporting to Manitobans on Performance 2005 Discussion Document, and is meant for illustrative purposes only.
63
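Purely as an illustration of the five columns (A-E), the example row above could be recorded as a simple structure; the keys are invented here and the values repeat the slide's illustrative numbers:

```python
measure_row = {
    "A_what_is_measured_and_how": "Access to health services: % of Manitobans "
                                  "reporting 'no difficulty' accessing health services",
    "B_why_important": "Access to quality health services is one key determinant "
                       "of population health; this measure assesses access only",
    "C_most_recent_value": {"year": 2003, "health information or advice": 0.82,
                            "immediate care": 0.75, "routine care": 0.81},
    "D_trend_over_time": "Improving",
    "E_comments_actions_links": "See the annual report for recent actions on access",
}
print(measure_row["C_most_recent_value"]["routine care"])
```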
Closing Statements and Next Steps
• Workshop evaluation process
• First thoughts from the group on further
capacity building?
64
References
Government of Alberta 2005-06 Annual Report – Measuring Up.
[http://www.finance.gov.ab.ca/publications/measuring/measup06/index.html]
Manitoba Provincial Sustainability Report. Government of Manitoba
[http://www.gov.mb.ca/conservation/sustainabilityreport/]
Oregon Progress Board: www.oregon.gov/DAS/OPB
Rita Conrad 2005. Results Oriented Strategic Planning: A Framework for
Developing Effective Performance Measurement. Presentation to the
Botswana Delegation, Salem, Oregon, September 29, 2005.
Rita Conrad 2005. Oregon’s Experience with Performance Reporting.
AGA’s First National Performance Management Conference. Oregon
Progress Board, November 14 2005.
Reporting to Manitobans on Performance: 2005 Discussion Document.
[http://www.gov.mb.ca/finance/mbperformance/index.html]
65