Statistics, Knowledge and Policy
OECD World Forum on Key Indicators
Palermo, 10-13 November 2004
USING INDICATORS TO ENGAGE CITIZENS:
THE OREGON PROGRESS BOARD EXPERIENCE
JEFFREY TRYENS
Jeffrey Tryens is the executive director of the Oregon Progress Board. Thanks to Rita Conrad and Zoë
Johnson for their assistance in preparing this report.
The views in this paper are those of the author and do not necessarily represent the views of the Oregon
Progress Board.
http://www.oecd.org/oecdworldforum
1. Introduction
In May 1989 Oregon's then Governor Neil Goldschmidt unveiled Oregon Shines: An Economic Strategy
for the Pacific Century, a strategic vision that recommended a series of initiatives meant to transform the
state’s economy to meet the challenges of the twenty-first century. The vision was a holistic one that
considered social and environmental health vital contributors to a healthy economy. In 1989, the state
legislature created the Oregon Progress Board (Board) to identify and monitor a set of indicators, called
Oregon Benchmarks, designed to track progress toward achieving the Oregon Shines’ vision.1
Since then the Board has issued biennial reports on the state's progress toward the goals. Unlike public-sector performance systems in other states and localities, Oregon's system has not assigned responsibility for meeting specific targets to any state agency. The benchmarks and the vision they embody are theoretically the responsibility of all Oregonians.
Chaired by the governor, the Progress Board is made up of business, community and political leaders
intended to represent the ethnic, cultural, social and economic diversity of the people of the state. The
Board has little statutory authority over state agencies and none over other sectors of society. Because its members are respected community leaders, the Board's influence comes primarily from its association with the governor and from members' standing in their respective communities.
Engaging civil society in the Oregon Shines planning process has been a key ingredient since its
inception. In its third biennial report, the Board declared “Never before has a state brought together so
many public, private and non-profit organizations to pursue a shared vision and measure progress toward
that vision.” In the fifteen years since its creation, the Progress Board has engaged civil society in many
ways. This paper will address five components of the Board’s work that best illustrate its successes and
challenges in that engagement. They are:
1. Articulate a Vision for the Future;
2. Identify What Matters;
3. Encourage Collaboration;
4. Assess Progress; and
5. Improve State Government Performance.
2. A Short History of the Oregon Progress Board
Many believe Oregon’s most noteworthy accomplishment in this arena is keeping the same strategic
vision and indicators in place for fifteen years. The Oregon Benchmarks have survived four governors
and eight legislatures. While this is a noteworthy accomplishment, the Board has had its share of ups and downs.
1 The Board's current authorizing statutes can be found at its website – www.oregon.gov/DAS/OPB.
The development and use of the benchmarks have involved civil society from their beginning. Governor
Goldschmidt personally enlisted nearly 200 business, labor, education, and government leaders to help
plan a strategy for Oregon's development over two decades. Oregon Shines outlined an economic
development strategy to: (1) transform Oregon's population into a world-class, 21st-century workforce;
(2) create an "international frame of mind" to position Oregon as the gateway to the Pacific Rim; and (3)
emphasize the comparative economic advantage of Oregon's extraordinary environmental amenities. The
benchmarks flowed out of that process.
In 1991 the state legislature reviewed and revised a package of benchmarks proposed by the Progress
Board, ultimately approving 158 indicators grouped into three categories: exceptional people, outstanding
quality of life, and diverse, robust economy. Seventeen of the 158 indicators were designated as "lead
benchmarks" representing key state problems identified in Oregon Shines.
The 14-member board is chaired by the governor with nine citizen leaders reflecting the geographic breadth and cultural complexity of the state, two legislators appointed by the legislature and two ex-officio members - a university student representing young leaders and the director of the state's Department of Administrative Services. An executive director appointed by the governor manages day-to-day board activities.
The Board’s history can be divided into three overlapping periods: high expectations; disillusionment;
and rebuilding & rethinking.2
2.1 High Expectations
When Governor Goldschmidt chose not to run for re-election in 1990, the recently issued Oregon Shines
was in danger of becoming the classic study that sits on a shelf collecting dust. However, his successor,
Governor Barbara Roberts, enthusiastically supported the project. The 1990 election was noteworthy for
another reason: Oregonians approved Measure 5, a property tax limitation initiative that shifted the
primary responsibility for funding primary and secondary education from local to state government,
leaving fewer resources for other state programs and services.
The evolution of the benchmarks and the passage of Measure 5 intersected in 1992 when Governor
Roberts prepared her 1993–95 budget. The property tax limitation caused a 15 percent cut in funding for
state programs other than education. The Governor ordered cuts in all state agency budgets but allowed
restoration of part of those cuts if agencies linked their budgets to specific, high priority benchmarks. The
Governor herself described one result: "Agencies of government who hadn't been paying enough attention
to the benchmarks suddenly took the benchmark documents, and they [the documents] became dog-eared
while those agencies searched for things in the benchmarks that applied to their work."3
2 The history section of this paper draws heavily from Achieving Better Health Outcomes: The Oregon Benchmark
Experience by Howard M. Leichter and Jeffrey Tryens.
3 Varley, Pamela. 1999. The Oregon Benchmarks Program: The Challenge of Restoring Political Support. Cambridge, Mass.: Case Program, Kennedy School of Government, Harvard University, 11.
As a result of this new importance, the Board was pressured to add benchmarks that reflected the issues of
particular segments of the population and of state agencies that felt unrepresented. In 1993 the Board
increased the number of benchmarks from 158 to 272. Consequently the Board’s 1995 report to the
legislature was so long that it required an index to find a benchmark on a particular issue. During this
period, many community collaborations oriented toward achieving benchmark targets were initiated.
2.2 Disillusionment
The 1995 legislative session was difficult for the Progress Board. Although the legislature appropriated
funding for the Board, it allowed the Board’s authorizing statute to expire. The newly elected governor,
John Kitzhaber, had to rescue the Progress Board by recreating it through executive order.
Like Tolstoy's families, each legislative critic was unhappy with the Progress Board and benchmarks in
his or her own way. Some argued that, in its desire to satisfy constituents, the Progress Board had adopted
so many benchmarks that they no longer served a useful purpose. Others said the unachievable targets left
legislators vulnerable to constituent criticism when unrealistic goals were not met. Still others called it "a
Democratic program with a Democratic agenda."4
In response, Governor Kitzhaber ordered a top-to-bottom review of the process by a 46-member blue-ribbon task force of community, business and political leaders as part of the 1996 Oregon Shines update.
After extensive community consultation, the task force recommended that: 1) the Progress Board be continued; 2) the number of benchmarks be reduced to 100 or fewer; and 3) state government take a greater leadership role in using the benchmarks.
The changes had the desired effect. The 1997 legislature voted overwhelmingly to make the Progress
Board a permanent part of state government.
2.3 Rebuilding & Rethinking
The late 1990s marked a turning point for the Board. Oregon leaders no longer needed convincing that a
comprehensive strategy was necessary to help shape Oregon’s future. While continuing to keep an eye on
Oregon’s future, the Board would have to make the benchmarks tools for public sector accountability if it
was to survive.
In 2001, the legislature moved the Progress Board from the state’s economic
development agency to the Department of Administrative Services and established new responsibilities
for guiding agency performance.
In 2002, the Board faced another setback when a severe revenue shortfall caused the legislature to rescind
all Board funding for the remainder of the 2001–03 fiscal year. Once again, gubernatorial intervention
was necessary to allow the Board to continue operating. During this fiscal crisis, over 50 community
leaders wrote to the legislature imploring them to continue funding for the Board.
4 For a more complete discussion of Republican and Democratic criticisms, see Varley 1999, 17–9.
Today, Board staff divides its time between tracking and reporting on benchmark trends and
institutionalizing a performance measure framework for state government. Funding is assured through
June of 2005 and relatively certain beyond that date. Funding for the now overdue Oregon Shines’
update, however, will have to come from non-government sources.
3. The Role of Benchmarks in Civil Society
Oregon Shines and the benchmark system embody two cherished and celebrated traditions in Oregon
politics. The first is the pride Oregonians take in their state as an innovator. Oregon was the first state to
adopt the citizen initiative and referendum process, the first to have a bottle deposit law, the only state to
establish a prioritisation process for allocating scarce health care resources for its low-income residents
and the first to legalize physician-assisted suicide.
The second tradition is reliance on participatory democracy to initiate, legitimize, or ensure citizen
oversight of public policy. Through its initiative and referendum process, citizens can pass new laws,
nullify existing laws and change the state’s constitution. Citizen involvement is also an expected part of
deliberation at all levels of government.
The development and use of the Oregon Benchmarks exemplify both traditions. From its inception the
Progress Board has engaged Oregonians in developing the benchmarks. The Board has two primary ways
to interact with the public regarding benchmarks and targets. First, community meetings are held all over
the state whenever Oregon Shines is updated. In 1996, more than 500 Oregon community leaders were
involved in the review and comment process. Second, the Board holds public hearings on the benchmarks
every two years. Until recently, the input came primarily from agency staff, experts and advocates. Using
an Internet-based format for the first time in 2004, the Board received comments and suggestions for
benchmark improvements from 275 citizens.
3.1 Articulate a Vision for the Future
At their core, the benchmarks represent the Board’s attempt to quantify the vision that is laid out in
Oregon Shines. They tell citizens what Oregon will be like if that vision becomes reality. They put meat
on the bones of phrases like “world-class, 21st-century workforce.”
In the early years, the benchmarks were primarily used to describe this preferred future. Visionary leaders
and citizens thought big thoughts and homed in on a list of indicators that would describe the preferred
future put forth in Oregon Shines. By developing future targets for the many indicators that emerged, the
Board painted a vivid, optimistic picture of the state’s future.
For the most part, the benchmarks logically flow from the vision articulated in Oregon Shines. The
Oregon Shines’ vision has clearly influenced how benchmark targets were set. As the process matured,
however, the benchmarks have tended to become the vision. Now in its fifteenth year, the state's twenty-year vision, with its inspirational phrases like "create an international frame of mind to position Oregon as
the gateway to the Pacific Rim,” has faded into the background.
Today’s civic leaders generally know that Oregon Shines exists. While acknowledging its importance,
most would be hard pressed to describe its contents beyond a few key phrases and concepts. Without the
benchmarks, Oregon Shines would probably not exist today as an instrument of change in Oregon. On the
other hand, benchmarks would be less useful in the long run if they were not tied closely to a strategy for
achieving a preferred vision for the state.
Oregon’s Experience – Using indicators to articulate a community-driven vision for the future is a
necessary part of the process.
3.2 Identify What Matters
Settling on a set of indicators that identified what mattered most in creating the Oregon Shines’ future
took over five years. Hundreds of indicators were proposed. A churning period between 1991 and 1995
saw the introduction of nearly 300 “benchmarks,” although many had neither clear definitions nor data. In
1996, the Board settled on 92 benchmarks after an extensive citizen-led review process.
Civic leaders and interested Oregonians were deeply involved in the development and refinement of
benchmarks. Board staff estimate that over 8,000 Oregonians have helped create or refine the
benchmarks.
The benchmarks are generally accepted as the single best set of measures of overall quality of life in the
state.5 They are used extensively as guideposts in many different policy arenas from child well being to
economic prosperity. Even critics would agree that benchmarks are legitimate for this purpose.
Approximately 20 percent of Oregonians say they are familiar with Oregon Benchmarks.6 As strategic
plans go, 20 percent familiarity is impressive. Unfortunately, this figure has not increased in four years.
The lack of improvement is probably a by-product of the Board’s increasing focus on state government
performance rather than community outreach.
Since their inception, benchmarks have been prioritised, but the importance of that differentiation has
diminished over time. First called "lead" and "urgent," these high-priority benchmarks played an
important role in budgeting in the early 1990s. Now called “key,” the chosen benchmarks receive little
more attention than a greater weight when the Board tallies overall grades in its biennial report to the
Oregon legislature.
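To make the weighting idea concrete, the sketch below shows one hypothetical way an overall grade might be tallied with extra weight on "key" benchmarks. The scores, weights and benchmark names are invented for illustration; the Board's actual tallying method is not reproduced here.

    # Hypothetical illustration only: extra weight on "key" benchmarks when
    # tallying an overall grade. Scores, weights and names are invented; they
    # do not reproduce the Board's actual method.
    benchmarks = [
        {"name": "key benchmark A", "score": 3.0, "key": True},
        {"name": "benchmark B",     "score": 2.0, "key": False},
        {"name": "benchmark C",     "score": 4.0, "key": False},
    ]

    KEY_WEIGHT = 2.0    # assumed extra weight for "key" benchmarks
    BASE_WEIGHT = 1.0

    weighted_sum = sum(
        b["score"] * (KEY_WEIGHT if b["key"] else BASE_WEIGHT) for b in benchmarks
    )
    total_weight = sum(KEY_WEIGHT if b["key"] else BASE_WEIGHT for b in benchmarks)

    print(f"overall weighted score: {weighted_sum / total_weight:.2f}")  # 3.00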
Oregon’s Experience – Using indicators to identify what matters has worked well.
5 Benchmarks are limited to policy-sensitive topics, so some quality of life measures, such as spiritual health or interpersonal relationships, are absent from the set.
6 Preliminary estimate from the 2004 Oregon Population Survey.
3.3 Encourage Collaboration
In the second phase of the benchmarks’ evolution, the Board encouraged the development of partnerships
that could drive change. Benchmarks became known as “magnets for collaboration” with numerous joint
efforts launched. This phrase, coined by local official (and subsequent Progress Board member) Beverly
Stein, was the battle cry of benchmark true believers during the mid-1990s. Over the years, many
community leaders have taken notice of the benchmarks, realizing that the indicators could be used to
focus disparate interests on a particular commonly held result.
Despite their economic origins, the benchmarks engendered many new collaborations around social
issues including child health, early childhood education, childcare and juvenile crime. Benchmarks were
also enlisted in structural reform efforts. Probably the most famous of those was a federal-state-local
collaboration called Oregon Option developed during the Clinton Administration. The simple premise of
this approach was that the federal government would provide regulatory relief and fiscal flexibility to
Oregon’s state and local governments in exchange for a focus on improving benchmark trends. Numerous
collaborations came out of this effort.
One of the best current examples of collaboration including benchmarks is an effort known as Partners for
Children and Families (PCF).7 A 1999 law, drafted by then-Senator and current Progress Board member
Neil Bryant, requires five state agencies to work with one another, their local counterparts and related
community organizations to develop a single comprehensive, coordinated delivery system for children
and families statewide. In its fifth year, PCF has used disaggregated benchmarks (using sub-state data) to
good effect in focusing local efforts on particular topics. While supporters would argue that it’s too early
to judge, a dispassionate observer would find little evidence that benchmark trends, either positive or
negative, have been significantly affected by PCF-driven efforts.
In searching for proof of effect, leaders of collaborations can sometimes overstate the relationship
between high level outcomes trends and those efforts. Teen pregnancy in a rural Oregon county is a case
in point. In 1991 a broad-based, community-led effort was initiated to reduce a particularly high teen
pregnancy rate. Partners ranging from conservative churches to family planning organizations worked for
three years to put new procedures in place to provide education and health-related services aimed at reducing the risk of teens becoming pregnant. By 1994, the teen pregnancy rate had been
halved to the lowest in the state. Subsequent changes in community leadership and shifting priorities
caused the teen pregnancy focus to wane and, sure enough, the teen pregnancy rate shot back up to earlier
high levels. Leaders of the collaboration and researchers commissioned by the federal government all
agreed - the collaboration was effective while it lasted.
Revisiting the issue three years later, Progress Board staff found that the teen pregnancy rate in the county
had dipped to a similar statewide low in 2001, dropping precipitously from near the state’s highest rate
the year earlier. When queried, county leaders could point to no intervention that had occurred to cause
7 See http://www.oregonpcf.org/
the reduction. With the total number of pregnancies averaging 19, the reduction was, apparently, simply
a function of small number variability. The earlier intervention no doubt had a positive effect on reducing
teen pregnancy, but the strength of that effect is unknown when the longer data series is considered.
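The arithmetic behind this caution can be sketched briefly. The simulation below is a minimal illustration rather than the Board's analysis: it assumes Poisson-distributed annual counts averaging 19 and a hypothetical teen population of 1,000, and shows how large the year-to-year swings in a computed rate can be even when nothing about the underlying conditions changes.

    import numpy as np

    # Minimal illustration (not the Board's analysis): with an annual average
    # of about 19 pregnancies, chance alone produces large swings in the
    # computed rate. The teen population of 1,000 is an assumed denominator.
    rng = np.random.default_rng(seed=0)
    AVERAGE_COUNT = 19        # approximate annual average cited above
    TEEN_POPULATION = 1_000   # hypothetical denominator for a rate per 1,000

    counts = rng.poisson(AVERAGE_COUNT, size=10)   # ten simulated years
    rates = counts / TEEN_POPULATION * 1_000       # rate per 1,000 teens

    for year, (count, rate) in enumerate(zip(counts, rates), start=1):
        print(f"year {year:2d}: {count:2d} pregnancies -> {rate:.1f} per 1,000")

    print(f"simulated range: {rates.min():.1f} to {rates.max():.1f} per 1,000, "
          "with no change in underlying conditions")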
Oregon’s Experience – Some success, but collaboration results can be difficult to determine.
3.4 Assess Progress
The Board’s “bread and butter” activity is its periodic assessment of progress. All the periodic Progress
Board reports – The Benchmark Performance Report, Oregon Benchmarks: A Progress Report on
Oregon’s Racial and Ethnic Minorities and the Oregon Benchmarks County Data Book – are aimed at
educating and informing citizens and community leaders. Press coverage is sought when reports are
released and community leaders are notified of their availability.
Different aspects of the data motivate particular groups in civil society. Advocates of particular positions,
like supporting the cause of disadvantaged groups, are particularly interested in using Board reports to
buttress their cases for more attention or more resources. Naturally, the data is more appealing when it
supports the arguments a particular group is making. Advocates inclined to “cherry pick” only the data
that supports their case have, perhaps, a harder time in Oregon because of the widely accepted legitimacy
of the benchmarks.
As the keeper of the benchmarks, the Board prides itself in presenting “just the facts.” Remarkably, the
benchmarks have remained above politics when assessments of progress are released. Citizens who care
about the state’s future trust the Board to tell the truth about Oregon trends. This perceived independence
from government influence, despite the Board’s ties to government, is key to its success in the long term.
Finding a proper forum for presenting progress assessments to the state legislature has been challenging.
No single committee or group within the legislature holds the responsibility for assessing progress over
the broad range of issues covered by the Progress Board. Informing legislative leaders about Board
findings has been a hit or miss affair for quite a few years.
While the veracity of the data is unchallenged, its real life, everyday use in decision-making is difficult to
discern. While many would argue that benchmark data is important to decision-making in the state,
evidence of systematic application of the Board’s analysis to policy making or resource allocation is in
short supply. The most common use of the data/analysis is to establish a general context within which
decisions are made.
Over the years the Board has tried different formats for assessing progress that were digestible by the
public. After using trend line-driven letter grades for three reporting cycles, the Board moved to a more
nuanced approach in 2003. Looking over a ten-year period, the Board answered the question "Is Oregon Making Progress?" The Board has found that citizens are more interested in knowing whether things are getting better or worse than in knowing exactly how Oregon is doing in relation to a
predetermined target.
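As a rough sketch of the two reporting styles described above, the following compares a target-based letter grade with a simple directional answer to "Is Oregon making progress?" over ten years of data. The thresholds and the sample series are hypothetical and do not reproduce the Board's actual grading rules.

    # Hypothetical sketch of the two reporting styles described above; the
    # thresholds and data are illustrative, not the Board's actual rules.

    def letter_grade(latest: float, target: float) -> str:
        """Grade an indicator by how close its latest value is to a preset target."""
        ratio = latest / target
        if ratio >= 1.0:
            return "A"
        if ratio >= 0.9:
            return "B"
        if ratio >= 0.8:
            return "C"
        return "D"

    def direction_of_progress(ten_year_series: list[float]) -> str:
        """Answer 'Is Oregon making progress?' from the ten-year change."""
        change = ten_year_series[-1] - ten_year_series[0]
        if change > 0:
            return "getting better"
        if change < 0:
            return "getting worse"
        return "no clear change"

    # A made-up benchmark measured annually for ten years (higher is better).
    series = [62.0, 63.0, 64.0, 64.0, 66.0, 67.0, 67.0, 68.0, 70.0, 71.0]
    print(letter_grade(series[-1], target=80.0))   # target-based grade: "C"
    print(direction_of_progress(series))           # directional answer: "getting better"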
Oregon’s Experience – The data generated by the Board is good but policy makers and opinion leaders in
the state do not fully utilize it.
3.5 Improve Government Performance
Since their inception, benchmarks have been used by agencies to improve performance. The most
persistently used avenue is the budget development process. In the early years, agencies received
preferential treatment in the budget process by showing linkages to key benchmarks.
In 1993, the Oregon legislature required state agencies to develop performance measures that linked to
the Oregon Benchmarks. Initial results were less than satisfactory. Performance measures changed
frequently and often had little relevance to benchmarks. In 2001, the legislature assigned responsibility
for performance measure development to the Progress Board. In 2002, the Board issued criteria-based
guidelines for performance measure development and reporting that were incorporated into the budget
development process.
All executive branch agencies now develop performance measures in the same manner and produce annual performance reports that document progress toward achieving performance measure targets.8
These reports are intended to inform policymakers and citizens about state agency performance. The
Progress Board is currently working with a group of citizens on developing a set of recommendations that
would make state government performance reports more “citizen-friendly.”
By requiring linkages to benchmarks and annual self-assessments, the Board hopes to encourage public
servants to “look up” to the benchmarks. In this way, agency planners and administrators must look
beyond their day-to-day worlds to consider how they are changing things for the better for Oregonians.
Today, Board staff focus on using benchmarks and performance measures to improve government
performance. The legislature’s willingness to relent on eliminating Board staff positions was predicated
on the understanding that staff would focus primarily on improving government performance. While
high-level indicators like Oregon Benchmarks are not necessary for developing a functioning
performance measurement system, they provide a real-world context that makes performance measures more meaningful to citizens.
Oregon’s Experience – Incorporating high level indicators into the state’s performance measure system
holds substantial promise for engaging civil society in improving government performance.
4. Have the Benchmarks Made a Difference?
To answer this question, benchmarks must be considered in the larger Oregon Shines' context.
Without the vision, Oregon’s indicators would not exist.
8 See http://egov.oregon.gov/DAS/OPB/GOVresults.shtml
A quantitative case cannot be made for Oregon Shines’ success. After six years of “pedal to the metal”
economic growth in the mid-1990s, the state's economy took a nosedive that caused Oregon's
unemployment rate to rise to the nation’s highest. Actions emanating from Oregon’s strategic vision did
little to soften the blow of recession. When a 2002 assessment sought evidence that the Oregon Shines’
process had positively affected health outcomes, hard evidence was inconclusive.9
Some of the key policy initiatives put into place in conjunction with Oregon Shines, like reform of
primary and secondary education, remain. Others, like a “key industries” initiative, have come and gone.
While less than hard, evidence abounds that the Oregon Shines’ process has significantly contributed to a
culture change in Oregon. Many leaders will unequivocally state that this process, and especially the Oregon Benchmarks, has made the state more results-oriented in the way it develops and implements
policy. When a cross-section of state leaders was queried about the impact of the Oregon Shines’ process
on health outcomes, respondents said the most important effect was allowing them to engage with others
from different walks of life around developing the vision and exploring ways of achieving benchmark
outcomes. Outside observers believe that the longevity and continuing vitality of the process are evidence
enough of the difference Oregon Shines and the benchmarks have made in Oregon.
Critics of the process say the benchmarks have had little meaningful impact on life in Oregon. One
legislator calls the Board a “feel good operation.” However, the promising practice of using benchmarks
to improve government performance has silenced many critics, at least temporarily.
On balance, the author believes that Oregon civil society is better off for having had the benchmarks in
place for the last 13 years. As one out-of-state observer put it, “Maybe the existence of the benchmarks
does not automatically solve your problems but at least you know how you’re doing on the things that
matter.”
Over their life, Oregon’s benchmarks have evolved from the numerical manifestation of an idealistic
future to tools for improving strategic alignment, especially for state government. Putting a set of
indicators in place that can be both inspirational and practical is a challenge that every indicator project
must address when engaging civil society.
9 Leichter, Howard M. and Jeffrey Tryens. 2002. Achieving Better Health Outcomes: The Oregon Benchmark Experience. New York, N.Y.: Milbank Memorial Fund.