So You Think There's No Money?
Securing Nonprofit Funding in Today’s Economy
presented by:
Debra Thompson, MBA, President & CEO, Strategy Solutions
Joseph Jones, Ph.D., Senior Researcher, American Institutes for Research (AIR)
Today’s Agenda
• About the Presenters
• Today's Learning Objectives
• The Challenge
• The Vision
• Where Are We Now?
• Organizational Capacity
• Product Development Process
• The Role of Innovation
• What is Program Evaluation?
• What Should We Do Now?
About Strategy Solutions:
Creating Healthy Communities
• Wealth-Creating Companies – Private Sector
• Wealth-Creating Companies – Public Sector
• Government
• Workforce Development (including Education)
• Economic Development
• Community-Based Nonprofits
• Faith-Based Nonprofits
• Healthcare
About Strategy Solutions:
Creating Healthy Organizations
• Product Development
• Manufacturing or Service Delivery
• Financial Management
• Access to Capital
• Fundraising
• Marketing
• IT / HR
• Back Office Operations
About AIR:
Mission and Vision
Mission
To conduct and apply the best behavioral and social science research and evaluation
towards improving people's lives, with a special emphasis on the disadvantaged.
Vision
Within the United States and internationally, AIR will be the preeminent organization that:
• produces improvements in education, health, and the workforce;
• addresses the needs of individuals, organizations, and communities;
• designs and advances statistical and research methods;
• causes practitioners and organizations to adopt evidence-based practices; and
• informs public understanding and policymaking by the best evidence.
About AIR:
Who is AIR?
• Based in Washington DC with over 1,500 research staff around
the world
• AIR is organized into seven program areas:
• Analysis of Longitudinal Data in Education Research
• Assessment
• Education
• Health
• Human and Social Development
• International Development, Education, and Research
• Workforce
• Since 1946, AIR has conducted thousands of program
evaluations for states, local school districts, the federal
government, private sector companies and non-profits.
What is the Goal?
Today’s Learning Objectives
• Learn how to approach program planning, outcomes measurement, and evaluation to ensure a sustainable business model
• Understand the definition of "evidence-based" and how it applies to your programs and organization
The Challenge
• Funding is not going away, but…
– more and more, funding is only available to
organizations that can “prove” that they achieve
desired outcomes
• Most nonprofit organizations lack the
sophistication to measure outcomes
How sophisticated do you feel your outcomes
measurement process is?
1. What outcomes measurement process?
2. Not very sophisticated
3. Somewhat sophisticated
4. Moderately sophisticated
5. Very sophisticated
The Vision: Healthy Organizations
The price of sustainability is generating enough cash to innovate at the rate of change of the marketplace (or ahead of it).
• Operations: Product Development; Manufacturing or Service Delivery
• Finance: Financial Management; Access to Capital; Fundraising
• Support Services: Marketing; IT / HR; Back Office Operations
Where Are We Now?
Nonprofit organizations are required to do real-time industry and product life-cycle analysis.
Required: organizational capacity building to "get to maturity" NOW…
The Non-Profit Organizational Lifecycle Model
If your organization is ANYTHING but mature (4, 5, or 7), is your business model sustainable?
[Lifecycle curve diagram: Start-Up → Grow → Adolescent → Mature → Sustain → Renew, with Stagnant, Decline, and Dissolve branches; stages numbered 1–8, with the note "Many are here" before maturity.]
Adapted from: Navigating the Organizational Lifecycle by Paul Connolly
©Strategy Solutions, 2011
What stage of the organizational lifecycle are we
currently in?
1. Start-up
2. Transitioning to adolescence
3. Declining before adolescence
4. Adolescent, striving for maturity
5. Sustaining, starting to build new curve
6. Declining before maturity
7. Actively building the new curve
8. Declining
4 Components of Capacity
• Adaptive: The ability to monitor, assess, respond to, and stimulate internal and external changes
• Management: Ensures the effective and efficient use of organizational resources
• Leadership: The ability of leadership and staff to inspire, prioritize, make decisions, provide direction, and innovate
• Technical: The ability to implement all of the key organizational functions and deliver products and services
Source: Navigating the Organizational Lifecycle: A Capacity-Building Guide for
Nonprofit Leaders by Paul Connolly
Essential Elements of Adaptive Capacity
• Needs assessments
• Organizational assessment
• Program evaluation
• Knowledge management
• Strategic planning
• Collaborations and partnerships
Technical Capacities
1. Product Development Planning
2. Outreach and Advocacy
3. Program Operations
4. Outcomes Measurement, Management AND Program Evaluation
5. Earned Income Generation
6. Information Technology
7. Finance (including access to capital)
8. Accounting (including process cost accounting)
9. Fundraising
10. Legal
11. Facilities
12. Marketing and Communications
13. Human Resources
14. Other(s)…
Limitation of Technical Capacities
• The most significant challenge today is that training and capacity building alone (and gap-filling consulting projects) will not be sufficient for nonprofits to address these technical needs.
• All nonprofits need to build capacity in EACH of
these areas on an ongoing basis to be sustainable.
Board Program Committee Job Description
1. Advise the board on industry trends and strategic challenges relating to the mission of the organization, and assess community need for programs and services.
2. Ensure that programs are developed and implemented that meet the needs of the community and achieve desired outcomes.
3. Identify key indicators for measuring the quality and success of programs.
4. Track program indicators, and monitor how well the agency achieves outcomes and impacts.
5. Plan and implement strategies to increase awareness of and participation in programs.
How strategic is our board Program Committee work (compared to this job description)?
1. What board program committee?
2. Not very strategic
3. Somewhat strategic
4. Strategic
5. Very strategic
Program Development Challenges Today
• Staffing
– Can organizations afford to designate someone as a program developer today (even as part of their job)?
• Organization/Process
– Is the program development function and process
organized in a way that actually produces a new
(and funded!) program in a reasonable period of
time?
How effective is your program development
function?
1. What program development function?
2. Not very effective
3. Somewhat effective
4. Very effective; we plan and implement new programs and get them funded very quickly
Have you read this book?
Innovation’s Missing Link
Source: The Other Side of Innovation by Vijay Govindarajan and Chris Trimble
The need to reassess organizing and planning is often overlooked.
[Diagram: Strategy splits into Ongoing Operations and Innovation; committing to an innovative idea leads through Organizing & Planning to Execution – making innovation happen.]
You can't ask the group that is in charge of today to also be in charge of tomorrow, because the urgent always squeezes out the important.
Organizing an Innovation Initiative
Project team = Dedicated Team + Shared Staff
The Dedicated Team is custom-built for the initiative.
The shared staff retains its existing responsibilities and supports the initiative.
[Diagram: the Performance Engine supplies the Shared Staff, which works in partnership with the Dedicated Team.]
Source: The Other Side of Innovation by Vijay Govindarajan and Chris Trimble
Steps for Building the Project Team
• Divide the labor: decide how responsibilities for executing the initiative will be split between the Dedicated Team and the Shared Staff.
• Assemble the dedicated team: Determine who will
serve on the Dedicated Team and how to define their
roles and responsibilities.
• Manage the partnership: Establish clear
expectations for each partner and mediate the
inevitable conflicts that will arise between the
dedicated team and shared staff.
Source: The Other Side of Innovation by Vijay Govindarajan and Chris Trimble
Executing the Initiative
The Partnership, not just the Dedicated Team, executes the initiative!
[Diagram: Shared Staff (routine tasks) and Dedicated Team (non-routine tasks) form the Partnership, working from one project plan.]
Source: The Other Side of Innovation by Vijay Govindarajan and Chris Trimble
The Challenges of the Partnership
• Challenge 1: Competition with Performance Engine leaders for scarce resources
• Challenge 2: The divided attention of the Shared Staff
• Challenge 3: Disharmony in the partnership between the innovation leader's Dedicated Team and the Shared Staff
Source: The Other Side of Innovation by Vijay Govindarajan and Chris Trimble
Performance Engine Limitations
• If the work relationships inside the
Performance Engine are inconsistent with
what is needed for a certain portion of the
innovation initiative, then that portion must
be assigned to the Dedicated Team
• You can’t ask the group that is in charge of
today to also be in charge of tomorrow,
because the urgent always squeezes out the
important.
Source: The Other Side of Innovation by Vijay Govindarajan and Chris Trimble
Product Development Process:
Stage/Gate Model
Ideation: Identify the dedicated team and shared staff who will work on the idea; create clear operational definitions of the idea/initiative
Gate 1: Initial Screen
Stage 1: Preliminary Investigation – Identify interest (through interviews or a survey); review potential revenue and costs and/or cost savings; vet concerns
Gate 2: Second Screen
Stage 2: Detailed Investigation – Detailed market and cost-benefit analysis
Gate 3: Decision on Business Case
Stage 3: Development – Detailed operations analysis and business planning
Gate 4: Post-Development Review
Stage 4: Testing and Validation – Pilot implementation, evaluation, and refinement
Gate 5: Decision to Launch
Stage 5: Full Operations and Launch
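To make the gate logic concrete, here is a minimal sketch (in Python, with hypothetical names; it is not part of the Stage/Gate source material) of the sequence above as a series of go/kill checkpoints:

```python
# A sketch of the stage/gate sequence as plain data: each gate is a go/kill
# checkpoint that guards the stage that follows it.
STAGE_GATE = [
    ("Gate 1: Initial Screen",            "Stage 1: Preliminary Investigation"),
    ("Gate 2: Second Screen",             "Stage 2: Detailed Investigation"),
    ("Gate 3: Decision on Business Case", "Stage 3: Development"),
    ("Gate 4: Post-Development Review",   "Stage 4: Testing and Validation"),
    ("Gate 5: Decision to Launch",        "Stage 5: Full Operations and Launch"),
]

def run_initiative(idea, passes_gate):
    """Advance an idea through the gates; stop at the first 'kill' decision."""
    for gate, stage in STAGE_GATE:
        if not passes_gate(idea, gate):  # go/kill decision at each gate
            return f"{idea}: killed at {gate}"
        print(f"{idea}: cleared {gate}, entering {stage}")
    return f"{idea}: launched"
```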
Product Development Process:
Ideation and Preliminary Investigation
1. Ideation
• Identification of outcomes and impact
• The evaluation process
2. Preliminary Investigation
• Literature review of evidence-based programs and best practices
• Determine the type of program the agency wishes to implement
Source: Product Development for the Service Sector
Product Development Process:
Detailed Investigation
3. Detailed Investigation – including building the business case
• Identify the human, capital, and operating resources to "deliver" the impact (and measure it)
• Direct program resources and costs
• Indirect organizational sustainability resources and costs
• Outcomes measurement
• Identify capacity-building needs and fill gaps
Source: Product Development for the Service Sector
Product Development Process:
Development, Testing and Launch
4. Development
• Calculate the "return on investment" based on the outcomes, impacts, and the "costs" to achieve the desired outcomes (see the sketch below)
• Identify funding sources and make the case to fund a demonstration project or pilot
• Develop a project plan for implementation
5. Testing and Validation (including evaluation)
6. Full Operations and Market Launch
Source: Product Development for the Service Sector
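As a rough illustration of the ROI calculation in step 4, the sketch below uses invented figures; real numbers would come from the detailed investigation in stage 3.

```python
# Hypothetical ROI sketch: all figures are invented for illustration.
direct_costs = 120_000      # program staff, supplies, service delivery
indirect_costs = 30_000     # share of overhead needed to sustain the program
monetized_impact = 210_000  # e.g., estimated "system savings" from the outcomes

total_cost = direct_costs + indirect_costs
roi = (monetized_impact - total_cost) / total_cost
print(f"Return on investment: {roi:.0%}")  # prints "Return on investment: 40%"
```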
Types of Programs
• Programming that is based in tradition, convention, personal or group beliefs, or anecdotal information (getting harder to fund)
• Good Ideas or Good Practice
• Promising Practices or Best Practices
• Evidence-Based (funders want these)
How would you rate your programs today?
1. None of our programs are evidence-based or best practices
2. Some of our programs are best practices/evidence-based
3. Most of our programs are best practices/evidence-based
4. All of our programs are best practices/evidence-based
What is program evaluation?
The systematic method for collecting, analyzing, and
using information to answer questions about projects,
policies and programs, particularly about their
effectiveness and efficiency.
Purposes of Program Evaluation
• Ensure transparency and accountability
• Measure outcomes and impact
• Form the basis of future programming
• Inform those who provide and administer programming
• Provide information concerning what works, what doesn't work, and why
Types of Evaluation
• Formative Evaluation – aims to improve the
program structure or implementation
• Summative Evaluation – assesses the merit or
worth of the program
• Developmental Evaluation – intended for situations
where it is acknowledged that the program and
organization are constantly changing
Program Evaluation Approach
Balance practicality (need-to-know, applied science) with proper methods, data collection, and analyses.
Program Evaluation Measures
Process/Output:
• Are we reaching who we should reach?
• Number of people reached
• Program strengths and weaknesses
• Feedback/satisfaction
Outcome:
• What knowledge, attitudes, behaviors, or other predetermined variables changed AS A RESULT of the program/efforts?
Impact:
• What are the long-term changes that are a result of the changes in outcome variables?
• New impact question: how does the program "save the system money"?
Program Evaluation Measures
• Measures are used to make inferences about the
program
– E.g., “Results from the survey indicate that our customers
were happy with the program”
• Garbage In = Garbage Out
– Poorly planned and designed measurement leads to invalid
inferences
– E.g., “We based our survey results on 2 questions about
how often each year 3 of our 100 most loyal customers
attended our weekly program on a yearly basis”
What is a Logic Model?
• A Logic Model is a graphic representation of:
– a program
– a series of IF, THEN statements
– a series of relationships
– the program roadmap
• A Logic Model is a tool that helps identify outcome measures
What percentage of your current programs have logic models (where the identified outcomes are actually measured)?
1. None of them
2. Some of them (less than half)
3. Majority of them (50% to less than 100%)
4. All of them (100%)
What percentage of your current programs have any type of
external evaluation of the outcomes/impact?
1. None of them
2. Some of them (less than half)
3. Majority of them (50% to less than 100%)
4. All of them (100%)
Logic Model:
Inputs, Outputs and Outcomes
• Inputs – The resources that go into the program
• Outputs – The activities the program undertakes
• Outcomes – The changes or benefits that are a result of the program
Example: HUNGER → Get Food → Eat Food → Feel Better
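One way to see the IF, THEN structure is to write the model down as data. This minimal sketch (Python; the structure and the input names are invented for illustration) chains the hunger example into IF/THEN statements:

```python
# A logic model as data, using the HUNGER example above; the "inputs" entries
# are hypothetical, added only to complete the chain.
logic_model = {
    "inputs":   ["food supply", "volunteers"],       # resources that go in
    "outputs":  ["get food", "eat food"],            # activities undertaken
    "outcomes": ["hunger relieved", "feel better"],  # changes that result
}

# Read the model as the chain of IF, THEN statements the slide describes.
chain = logic_model["inputs"] + logic_model["outputs"] + logic_model["outcomes"]
for cause, effect in zip(chain, chain[1:]):
    print(f"IF {cause} THEN {effect}")
```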
Building a Logic Model
Inputs – Program investments: what we invest
Outputs – Activities: what we do; Participation: who we reach
Outcomes – What results: short, medium, and long term
Building a Logic Model
Inputs (what we invest): Staff, Volunteers, Money, Research Base, Equipment, Technology
Outputs:
• What we do: Train, Teach, Deliver Services, Network, Build Partnerships, Assess, Facilitate
• Who we reach: Participants, Clients, Customers, Agencies, Decision Makers, Politicians, Media
Outcomes:
• Short Term (Learning): Knowledge, Awareness, Attitudes, Skills, Opinions, Aspirations, Intentions
• Medium Term (Action): Behavior, Decision Making, Policy, Social Action
• Long Term (Conditions): Health, Well-being, Economics, Civics, Environment
Example Process, Outcome and Impact Measures
(Each row reads: Process measure → Outcome measure → Impact measure)
• Resources (staff, location, program, supplies, etc.) → Knowledge (e.g., understanding the relationship between diet, physical activity, and diabetes) → Decrease in costs associated with diabetes care
• Delivery (protocol, written training process, fidelity plan, etc.) → Attitudes (empowerment, desire to change, etc.) → Decrease in hospitalization associated with diabetes
• Target audience (workplace, community, enrollment protocol, drop protocol, etc.) → Behaviors (dieting, physical activity, etc.) → Decrease in numbers of individuals diagnosed with diabetes
• Program strengths & weaknesses (what is working, what is not working, etc.) → Program strategy (revised program strategy, reallocated resources, etc.) → Improved program efficiencies and reduced costs
• Feedback (interviews, focus groups, surveys, etc.) → Communications and outreach (revised communications, updated outreach and fundraising, etc.) → Increased program participation and sustained program funding
What is Evidence-Based?
• Research shows that the program produces the expected
positive results;
• Results can be attributed to the program itself, rather than to
other extraneous factors or events;
• Evaluation is peer-reviewed by experts in the field;
• Program is “endorsed” by a federal agency or respected
research organization; and
• Program approach is documented so that it can be
implemented locally in a way aligned with the program design.
What is Evidence-Based?
Washington State Institute for Public Policy
• The Washington State Legislature created the Institute to conduct research examining the effectiveness of state programs
• In 2009, the Legislature tasked the Institute with a study to examine "a comprehensive list of programs and policies that improve . . . outcomes for children and adults in Washington and result in more cost-efficient use of public resources."
What is Evidence-Based?
Washington State Institute for Public Policy
1. Gather studies that have already been conducted on programs' effectiveness
• Consider all available studies they can locate on a topic rather than selecting only a few studies
• Conduct "meta-analysis" to determine what the weight of the evidence tells them about program effectiveness
– A statistical method to test a hypothesis based on aggregated study results
– Uses specific assumptions about the similarity between a "large number" of studies
– If assumptions are met, allows analysis without having to conduct a "new" study
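For intuition, here is a minimal fixed-effect meta-analysis sketch (Python, invented numbers): each study's effect size is weighted by the inverse of its variance, so larger, more precise studies count for more. WSIPP's actual procedures are more elaborate; this only illustrates the idea of pooling existing results rather than running a new study.

```python
# Fixed-effect (inverse-variance) pooling of hypothetical study results.
studies = [
    # (effect size, variance of the effect size)
    (0.30, 0.010),
    (0.12, 0.020),
    (0.22, 0.005),
]

weights = [1.0 / var for _, var in studies]
pooled = sum(w * es for (es, _), w in zip(studies, weights)) / sum(weights)
pooled_se = (1.0 / sum(weights)) ** 0.5
print(f"Pooled effect: {pooled:.2f} (SE {pooled_se:.2f})")  # ~0.23 (SE 0.05)
```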
What is Evidence-Based?
Washington State Institute for Public Policy
2. To be included in the review, an evaluation's research design must include control or comparison groups
• Random-assignment studies are preferred, but quasi-experimental or non-experimental studies are allowed when the comparison group is well matched to the treatment group or adequate statistical procedures are employed to guard against selection bias
• Given the expected direction of selection biases, the findings of less-than-randomized comparison-group trials are discounted by a uniform percentage
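The uniform-discount idea can be shown in two lines (hypothetical sketch; the 25% figure is an invented illustration, not WSIPP's actual value):

```python
# Scale down effect sizes from less-than-randomized designs before pooling.
def adjusted_effect(effect_size, randomized, discount=0.25):
    """Discount non-randomized findings to guard against selection bias."""
    return effect_size if randomized else effect_size * (1 - discount)

print(adjusted_effect(0.20, randomized=True))   # 0.2  (kept as reported)
print(adjusted_effect(0.20, randomized=False))  # 0.15 (discounted by 25%)
```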
What is Evidence-Based?
Washington State Institute for Public Policy
3. Prefer evaluation studies that use "real world" samples from actual programs in the field
• Evaluations of so-called "model" or "efficacy" programs are included in reviews, but the effects from these types of studies are discounted
• The presumption is that it is difficult to achieve, in actual large-scale operation, the results of model programs
• When conducting cost-benefit analyses, the statistical results of such studies are discounted by a fixed amount
What is Evidence-Based?
Washington State Institute for Public Policy
4. If the researcher of an evaluation is also the developer of the program, the results from the study are discounted.
• It is sometimes difficult to duplicate the results achieved by the highly motivated individuals who originate programs
• There may also be potential conflicts of interest if developers evaluate their own programs
AIR Case Study:
The After-School Corporation (TASC)
• Issue: The After-School Corporation (TASC) collaborated with community-based organizations (Good Shepherd Services; Committee for Hispanic Children and Families, Inc.; and Directions for Our Youth) and college partners (Fordham University; Bronx Community College; Eugenio Maria de Hostos Community College; and Mercy College) to implement "Bronx College Town," a program designed to enhance the academic, social, emotional, and physical development of underserved elementary- and middle-school-aged students by focusing on high school readiness and college and career planning. Program coordinators and stakeholders sought information to improve program outcomes and provide the basis for future funding.
• Type of Program Evaluation: Formative and Summative
• Approach: Collection of evidence through student surveys, staff surveys, teacher surveys, and observations of after-school activities. In addition, the evaluation team explored the relationship between attending the 21st CCLC sites and student achievement and grades.
• Implications: Findings were published in a report that was used to guide program modifications and as a basis for funding decisions.
AIR Case Study:
Open Society Institute (OSI)
• Issue: Reading and Writing for Critical Thinking (RWCT) is a
professional development project for teachers in countries
formerly under Soviet control. US educators provided the PD,
and the Open Society Institute (OSI) supported an evaluation
of the RWCT’s effectiveness.
• Type of Program Evaluation: Formative
• Approach: Collection of evidence through acquisition and
review of program background materials, interviews with
country coordinators, and interviews with students,
educators, and administrators familiar with RWCT.
• Implications: Findings were used to make in-country changes
to the processes used in implementing RWCT.
AIR Case Study:
Transportation Security Administration (TSA)
• Issue: The United States Government created TSA to strengthen the security of its transportation systems. TSA secures United States airports and screens all commercial airline passengers and baggage. Funding allocation was being reconsidered due to concerns about the effectiveness of TSA screening methods.
• Type of Program Evaluation: Summative
• Approach: Developed an organizing conceptual framework (logic model) collaboratively with stakeholders to include program assumptions, inputs (resources/activities), outputs, and outcomes. Metrics will be tracked to gather evidence of program effectiveness through performance measures, observations, and interviews.
• Implications: Still in progress, but initial model findings are being used to make policy and funding decisions about TSA programs.
Outcomes Measurement/ Evaluation Needs of Most
Nonprofits
• Support to continue to develop and refine the logic model(s) and the
output, outcomes and impact measures
– Management service and/or consultant support to assist
• Support to set up the “systems” to appropriately and effectively track and
monitor output, outcomes and impacts
– Requires investment in data collection systems and the incremental human
resources to run them
– Operationally define the program variables
– Finalize the “tools” (surveys, tracking and tabulation methods)
• Continually educate and ensure that the other members of the board
understand the resources and capacity required to implement this well
– Especially the Executive and Finance Committees
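To suggest what "setting up the systems" might look like at its smallest, here is a hypothetical sketch (Python; the schema and field names are invented) of the kind of record a simple outcomes-tracking system could store, with operationally defined variables and a basic tabulation:

```python
# A minimal outcomes-tracking record and tally; real systems would be richer.
from collections import Counter
from dataclasses import dataclass

@dataclass
class OutcomeRecord:
    participant_id: str
    output: str    # the activity delivered, e.g. "attended workshop"
    outcome: str   # the operationally defined change being measured
    achieved: bool # did the measured indicator meet its target?

records = [
    OutcomeRecord("p1", "attended workshop", "knowledge gain", True),
    OutcomeRecord("p2", "attended workshop", "knowledge gain", False),
]

tally = Counter(r.achieved for r in records)
print(f"{tally[True]} of {len(records)} participants achieved the outcome")
```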
Tips for Effective Program Evaluation
• Understand the program
– Identify the decision makers and their decisions
– Understand the processes and resources
• Continuous Improvement Process
– Planning
– Ongoing needs assessment
• Understand the situation around the program
– Environmental scan
– Politics happen…
• Rigorous, not rigid, evaluation methods
– Data Collection/Analysis
– Interpretation
• Reporting that informs and improves the program
Our Recommendations
1. Understand that you are not alone if you don’t understand
the technical research terms (most people don’t)
– Arrange for a board/ leadership training to learn more about it, the
implications for your agency and how to tie your outcomes to your
fundraising strategy
2. Recognize that an ongoing investment in outcomes
measurement and evaluation is now required as a cost of
doing business. This is not an option!
– If your organization “cannot afford it,” you ought to be looking at ways
to restructure your organization to lower your agency’s fixed costs or
to work together with other agencies to afford the expertise required
Our Recommendations
3. Identify the evidence-based programs and approaches that
already exist in your industry/discipline related to your client
needs and existing programs
– Compare your existing programs to the literature.
– If your organization is doing something really unique or has the ability
to create an “ideal model” of innovative programming, invest in
developing a cost/benefit analysis for that program
4. Develop logic model(s) that outline the short term, medium
term and long term outcomes as well as the cost/benefit
“impacts” of your programs
Our Recommendations
5. Identify the indicators that should be measured and
develop an outcomes and impact measurement process
– Use an appropriate research design and technical assistance from a
qualified researcher with experience in your discipline
6. Determine if your organization would benefit from a formal
external evaluation of your programs and learn how to
select an external evaluator
Questions/Discussion