Using the Guiding Principles for Evaluators to Improve Your Practice

Meredith Stocking, Atlanta-Area Evaluation Association
Note: This presentation has been adapted in part from the AEA’s Guiding Principles Training Packet.
Objectives
• Increase knowledge of the AEA Guiding Principles for Evaluators (GP)
• Analyze the Guiding Principles in a program evaluation context
• Consider how the Guiding Principles can be used to inform your evaluation practice
What are ethical problems in evaluation anyway?
• Issues of moral accountability that involve doing the “right” or “wrong” thing
• Often involve the welfare of others
• Choices between unfavorable alternatives
AEA’s Development of the Guiding Principles for Evaluators
• 1986: Founding of the American Evaluation Association
• 1993-1994: Original five Guiding Principles for Evaluators developed and ratified
• 2002-2004: GP reviewed and updated
• 2004: Revised GP endorsed through referendum of AEA membership
Format and Purpose of the Guiding Principles for Evaluators
• Format of the Guiding Principles (long version):
o 10 assumptions
o 5 principles with supporting sub-principles
o 2 pages of background
o http://www.eval.org/publications/guidingprinciples.asp
• Purpose of the Guiding Principles:
o Promote ethical evaluation practice
o Foster continuing professional development
o Stimulate discussion within and outside evaluation
Assumptions Behind the Guiding Principles for Evaluators
The Guiding Principles:
• Proactively guide everyday practice
• Cover all kinds of evaluation
• Do not apply in every situation
• Are not independent, but overlap
• Sometimes conflict
• Were developed in the context of Western cultures
Principle A: Systematic Inquiry
• Evaluators conduct systematic, data-based inquiries:
o Adhere to highest technical standards
o Explore strengths and shortcomings of evaluation questions and approaches
o Communicate approaches, methods, and limitations accurately
Principle B: Competence
• Evaluators provide competent performance to stakeholders:
o Possess appropriate skills and experience
o Demonstrate cultural competence
o Practice within limits of competence
o Continually improve competencies
Principle C: Integrity/Honesty
Evaluators display honesty and integrity and attempt to ensure them throughout the entire evaluation process:
o Negotiate honestly with clients and stakeholders
o Disclose values, interests, and conflicts of interest
o Represent methods, data, and findings accurately
o Disclose the source of the request and financial support for the evaluation
Principle D: Respect for People
Evaluators respect the security, dignity, and self-worth of all stakeholders:
o Understand the evaluation context
o Get informed consent and protect confidentiality
o Maximize benefits and minimize harm
o Foster social equity
o Respect differences among stakeholders
Principle E: Responsibilities for General and Public Welfare
Evaluators take into account general and public interests:
• Include a full range of stakeholders
• Examine assumptions and potential side effects
• Balance client and stakeholder needs
• Allow all relevant stakeholders access to findings in understandable forms
Case Study: Evaluating the Health Care Collaborative
Background: The Health Care Collaborative (HCC), a signature program of a local nonprofit
organization, is designed to increase the delivery of primary health care services to residents in
a low-income, underserved neighborhood. The HCC uses trained neighborhood residents as outreach health workers to raise residents’ awareness of health issues and to give them
options for accessing health care. Health care providers who are collaboration partners
deliver a range of services to program participants. HCC’s principal funder recently
approached the nonprofit’s board of directors to request an external evaluation of the
program. Margaret, the nonprofit’s director, is meeting with Jane, a former board member
and professor at the local university, to ask her to consider taking on the evaluation.
Margaret: Jane, it’s been such a long time! It’s great to see you! How have things been
going?
Jane: Everything’s great! I am really busy leading a large multi-site evaluation with one of my
colleagues, and still teaching a few evaluation classes. What can I do for you today?
Margaret: Well, our main funder just asked us for an external evaluation of our Health Care
Collaborative Program.
Jane: Yes, that’s a fabulous initiative. What do they want to know?
Margaret: They are saying that they need more information than the program’s reporting system
alone can provide, principally about how the neighborhood residents and program participants
view HCC, and how the program is meeting or not meeting identified service needs.
Jane: That sounds pretty straightforward. Have you issued an RFP?
Margaret: Well, you know that we always have such a positive response to the work we do, but the
program has been subject to recent criticism from some new neighborhood residents. We are really
relying on the funder to renew our grant next year, and we don’t want the results of the evaluation
to be skewed by the opinion of a vocal few. We’d really like to work with someone we trust who is
familiar with our track record.
Jane: Honestly, Margaret, you are a great friend, but I am really slammed right now. I don’t think I
can take on another project.
Margaret: What if you designed the evaluation and provided some oversight, but recruited one or
two graduate students to do the data collection and analysis?
Jane: I suppose that may be an option. Can you give me an idea of the current composition of the
community and the stakeholders that would be part of the evaluation team? I need to think about
whether I have any students that would be a good fit. It sounds like you’re going to need to get
some fairly broad community involvement given the information needs of the funder.
Margaret: Well, we only have about eight months to complete the evaluation and our budget is
fairly tight, so we may need to limit stakeholder participation on the evaluation team. The make-up
of the community has been in flux lately. The African American and Latino populations are still big,
but the neighborhood has seen a lot of East African refugees arrive as well as some immigrants from
Eastern European nations. It’s becoming more and more diverse.
A few days later: Jane is talking with one of her best graduate students, Alisa, about possibly working on the HCC evaluation. Alisa did some data analysis for Jane last semester, and Jane knows she does good work. Alisa also has the advantage of being fluent in Spanish.
Alisa: The project sounds really interesting. And I am actually looking for a subject to write my
Master’s thesis about. This could be a perfect topic! What kind of design are you thinking of?
Jane: Probably a mixed-methods approach. We could do some surveys of participants,
program staff, and health care providers. Also, one of the things they want to know is how
beneficiaries and other folks in the neighborhood feel about the program, so a few focus
groups would be good. Do you have any experience conducting focus groups?
Alisa: I could probably wing it. I took a workshop on qualitative methods one time. I mean,
it’s not like we are talking about rigorous experimental methods or anything. How difficult can
it be?
Jane: I suppose it’s a nice opportunity for you to get some practice. It’s just a small nonprofit
program, and I don’t imagine the stakes of the evaluation are too high. Besides, I don’t have
any other students with your language skills. When are you available to get started?
Alisa: I could start as soon as you’d like. But I would have to do all the data collection during
the day. I have all night classes this semester.
After data collection: Alisa and Margaret meet to discuss the initial findings and plans for data dissemination.
Alisa will report back to Jane, who is unable to make the meeting due to travel for her other evaluation
project.
Margaret: So Alisa, how did data collection go? What are you finding?
Alisa: I ended up conducting three focus groups: one for senior citizens, another for adult non-senior males, and a third for adult non-senior females. I am still in the process of analyzing the survey and focus group data, but all in all, participants are overwhelmingly positive and satisfied with your services.
And this is probably no news to you, but the program serves a disproportionate number of Hispanic adults
compared to the neighborhood’s composition. I think that it may be important to consider why other racial
and ethnic groups aren’t taking advantage of your services in greater numbers. But since we didn’t involve
non-participants in the study, we really can’t say.
Margaret: The board and the funder will be very happy to hear about the positive results. I am planning on
giving the main evaluation briefing at next month’s meeting of the HCC board, to which the funder will be
invited. We can discuss the issue of how to better engage other racial and ethnic groups with the program
staff at a later date. It’s really a programming issue and not something the funder needs to worry about.
Alisa: Really?
Margaret: Yeah. But don’t worry about it. We will definitely use the information.
Alisa: Ok. Hey, there was one more thing I wanted to ask you about. Jane and I wanted to do a
presentation on this evaluation at our national evaluation conference next month. Would that be OK with
you? Of course, we will be sure to protect the confidentiality of all of the participants.
Margaret: That’s a great idea! I am really happy that you are getting so much out of this opportunity. And I
heard from Jane that your thesis is really looking nice. I would love to get a copy of it when you are done.
Alisa: Sure thing!
Discuss!
Food for Thought
• What elements of competence are at stake? How do omissions in competence affect the ability to meet the principle of systematic inquiry? What about respect for people?
• In what ways does the evaluation address potential weaknesses or criticisms of convenience in gathering data as opposed to systematic methods?
• How would you assess the honesty and integrity of those involved in the evaluation? If you were part of this evaluation, what would you do to resolve or handle issues related to integrity?
• Is there anything problematic about the way that stakeholders were included in the evaluation activities? How does the level and kind of stakeholder involvement relate to the principles of respect for people and public interest?
• Is the evaluation reporting adequate? What about the use of findings?
• To what extent does this evaluation attempt to foster social equity?
Discussion
• What ethical dilemmas have you encountered in your own evaluation practice? How did you handle the situation?
• How can you use the Guiding Principles as you design and conduct your own evaluations?
“Not knowing what constitutes best practice is incompetence.
Knowing what best practice is, but not knowing how to achieve
it, may be inexperience. Knowingly not following best practice,
when one knows how to achieve it, is unethical.”
Smith, N. (2002). An analysis of ethical challenges in evaluation. American Journal of Evaluation, 23, 199-206.
THANK YOU!