NEW YORK UNIVERSITY
ROBERT F. WAGNER GRADUATE SCHOOL OF PUBLIC SERVICE
PADM-GP 2171/GPH-GU 2371 Section 2
Program Analysis and Evaluation -- Fall 2014
Tuesday 4:45-6:35 pm; Silver 410
Adjunct Prof Jodi F Paroff
LAST REVISED – 8/26/14
To contact instructor: Jodi.Paroff@nyu.edu, Phone: (973) 934-7214
Office hrs: Tues 2:30-4:30pm or by appointment, Puck Building Adjunct Offices
(Please use email to request an appointment in advance or to notify the instructor of any
papers left in the adjunct mailbox.)
Course Pre-requisites: You must have completed (or waived) P11.1011 (Statistical Methods)
and P11.1022 (Introduction to Public Policy). This course builds on these introductory
courses and lays the foundation for P11.2875 (Evaluation of Health and Social Programs).
Course Description and Objectives: Program evaluation is a critical part of designing and
operating effective programs. Evaluations supply information to program managers and
policymakers that can assist them in making decisions about which programs to fund,
modify, expand or discontinue. Evaluation can be an accountability tool for program
managers and funders. This course serves as an introduction to evaluation methodology
and evaluation tools commonly used to assess publicly funded programs. Students taking
this course can expect to:
• become familiar with the concepts, methods and applications of evaluation results;
• learn how to read evaluation research critically;
• understand how to use evaluation results to improve program performance; and
• be able to propose an appropriate evaluation plan to assess the implementation and
effectiveness of a program.
Course Structure: The class includes lecture, readings, and discussion. There is no
specific policy or sector focus to this course, as evaluation tools are used in all policy areas
and by public (government) and private (foundation) funders as well as by public and
private sector program managers. You are encouraged to relate the general material of the
course to your specific policy interests.
Readings: The required textbook for this course is:
Carol H. Weiss (1998). Evaluation: Methods for Studying Programs & Policies, 2nd
edition. Prentice Hall.
An optional and recommended text is:
Peter Rossi, Howard Freeman, and Mark Lipsey (2004) Evaluation: A Systematic
Approach, 7th ed. Sage Publications. (abbreviated in syllabus as “RFL”)
In addition to the required text, you will need to read one chapter from the optional
textbook and many journal articles. All of the articles are available through Bobst
electronic journals and will also be posted on NYU Classes. The chapter readings
from books are available for download directly from NYU Classes.
There is a sizable body of literature that deals with program evaluation and policy
analysis. The journal Evaluation Review (previously Evaluation Quarterly) is an especially
rich source on the subject, as is the Evaluation Studies Review Annual (Sage, more or less
annually). Evaluation Practice, Evaluation and Program Planning, New Directions for
Program Evaluation, and Journal of Policy Analysis and Management are also recommended.
There are also evaluation journals for specific fields, including Evaluation and the Health
Professions, Evaluation in Education, and Evaluation and Human Services.
Course requirements: Class preparation and participation are important for this course.
You need to read the required text and articles in advance and be prepared to participate in
class discussion. In addition to class participation, you will: write two brief memos,
complete one take-home exam, facilitate at least one small group journal discussion, create
a poster with your group for peer review, and write a final evaluation design paper. It is
expected that you are familiar with NYU’s honor code and will uphold the university’s
standards for academic honesty in all of your work.
Attendance: No grades are given for attendance, but your regular participation in reading
critiques IS factored into your class participation grade. Please notify the instructor of any
days you will be missing. You are responsible for any missed material and for meeting all
assignment deadlines during your absences.
Program Statement Memo, due Sep 30, (5%)
Students will submit a short (1 - 2 pages) description of a selected program, indicating the
problem to be addressed by the intervention, the intended beneficiaries or targets of the
program, the intended benefits, and the causal model/program theory underlying the
program. This memo is a preliminary step in writing the final design paper.
Midterm Examination:
Take-home essay style exam due Oct 21, (35%)
Measurement Memo, due Nov 18, (5%)
Using the program model developed in the first memo, students will specify the research
questions, operational definitions, and specific measures they would use in an evaluation of
the program.
OPTIONAL Evaluation Review (for extra credit) due Dec 9
It is important to be both a good evaluator and a good consumer of evaluations. Review
one of three selected evaluation articles. In 2-3 pages, summarize the type of evaluation
described, its design and methods, and write a critique of the evaluation.
Poster Sessions & Feedback due Dec 2nd, (10%)
Each group will create a poster that presents the evaluation design proposal of their final
paper/project to be submitted for peer critique. Posters will include program theory and
descriptions and outcome measures from earlier assignments, as well as design rationale,
and data collection strategies. Clarity, application of concepts discussed in class, and the
rationale behind your evaluation design choices will be emphasized.
Final Paper: Impact Evaluation Design, due Dec 15, (35%)
The final paper builds on earlier assignments. You will design a comprehensive evaluation
plan/design for your chosen program. The design proposal will focus on outcome or impact
evaluation but will include a brief section on process evaluation as well.
Relative Weight of Assignments
Midterm Exam: 35%
Final Paper: 35%
Two memos: 10%
Class Participation: 10%
Poster session: 10%
Note: the descriptions above are not sufficient on their own to complete the assignments.
More detailed instructions for each assignment will follow.
Course Schedule
Part I:
Planning and Implementation
Sep 2
Class 1: Introduction to the course and the field of program evaluation;
stakeholders; ethical considerations
• Weiss Chapter 1; Chapter 5 pp. 100-108
• Rogers, P. (2012). Introduction to Impact Evaluation. Guidance note found here:
http://www.interaction.org/document/introduction-impact-evaluation (read up to page 5)
• Markiewicz, A. (2005). A balancing act: Resolving multiple stakeholder interests in
program evaluation. Evaluation Journal of Australasia, Vol. 4, Nos. 1 & 2,
March/April, pp. 13-21.
• Read “Michael Scriven on the Differences between Evaluation and Social Science
Research” (www.hfrp.org)
• Guiding Principles for Evaluators. American Evaluation Association.
• Oakes, J.M. (2002). Risks and wrongs in social science research: an evaluator’s
guide to the IRB. (pp. 443-454)
• Optional: Sieber, J.E. and M.B. Tolich (2013). Planning Ethically Responsible
Research. Chapter 5: Journalist Ethics ≠ Social Scientist Ethics. pp. 77-92.

Sep 9
Class 2: Pre-program evaluation activities: needs assessment
• Weiss Chapter 2 (Purposes of eval)
• Watkins, R. et al. (2012). A Guide to Assessing Needs: Tools for collecting
information, making decisions, and achieving development results. Section 1:
Needs Assessment FAQ.
• Ohio State University Primer based on Altschuld & Kumar’s “Needs Assessment”
(2009).
• O’Toole, J. et al. (2013). Closing the Gap: A Needs Assessment of Medical Students
and Handoff Training. Journal of Pediatrics. (CS)
• Berberet, H.M. (2006). Putting the pieces together for queer youth: a model of
integrated assessment of need and program planning.
• http://india.blogs.nytimes.com/2012/07/22/mapping-toilets-in-a-mumbai-slum-yields-unexpected-results/?emc=eta1

Sep 16
Class 3: Explicating and assessing program theory
• Weiss Chapter 3 (Understanding the program)
• Kaplan, S.A. and Garrett, K.E. (2005). The use of logic models by community-based
initiatives.
• Chen, W. & Lin (1997). Evaluating the process and outcome of a garbage reduction
program in Taiwan. (CS)
• Proscio, Tony (2000). In Other Words. The Edna McConnell Clark Foundation.
New York, NY.
• Optional: RFL Chapter 5
• Optional: Cooksy, G. & Kelly (2001). The program logic model as an integrative
framework for a multi-method evaluation.

Sep 23
Class 4: Formative evaluation, program monitoring, and implementation analysis
• Curran, Gittelsohn, Anliker, et al (2005). Process evaluation of a store-based
environmental obesity intervention on two American Indian reservations. (CS)
• Heinz & Grant (2003). A process evaluation of a parenting group for parents with
intellectual disabilities.
• Weiss Chapter 11 (Qualitative Methods)
• Optional: Onyskiw, Harrison, Spady, & McConnan (1999). Formative evaluation of
a collaborative community-based child abuse prevention project.
• Optional: Giorgio, Kantor, Levine, & Arons (2013). Using chat and text technologies
to answer sexual and reproductive health questions: Planned Parenthood pilot
study. Journal of Medical Internet Research, Sep 20; Vol. 15 (9).
Part II:
Measuring the Impact of Programs
Sep 30
Class 5: Outcome/Impact evaluation: design, internal and external validity
• Weiss Chapter 8 (Design of evaluation, pp. 180-214)
• http://www.ted.com/talks/esther_duflo_social_experiments_to_fight_poverty.html
• DUE: Program memo & ungraded evaluation challenge (Brooklyn clinics and
HIV/AIDS identification)

Oct 7
Class 6: Outcome/Impact evaluation: randomized experimental design
• Weiss Chapter 9
• Seron, C., Ryzin, G.V., Frankel, M., & Kovath, J. (2001). The impact of legal counsel
on outcomes for poor tenants in New York City’s housing court: results of a
randomized experiment. (CS)
• Robertson, A., St. Lawrence, J., & Morse, D.T. et al (2011). Comparison of Health
Education and STD Risk Reduction Intervention for Incarcerated Adolescent
Females.
• Optional: RFL Chapter 8
• Program memo returned, full midterm question pool shared (date subject to change)

Oct 14
NYU closed for Fall Recess

Oct 21
Class 7: Outcome/Impact evaluation: quasi-experimental designs with comparison
groups
• RFL Chapter 9 pp. 265-286
• Optional: Cumberland, P., Edwards, T., Hailu, G., et al. (2008). The impact of
community level treatment and preventative interventions on trachoma prevalence
in rural Ethiopia.
• Optional: McMillan, J. (2007). Randomized Field Trials and Internal Validity: Not so
Fast My Friend.
• DUE: Midterm (email to Instructor before class).

Oct 28
Class 8: Finishing Quasi-experimental design, Quality comparisons, and
Formulating evaluation questions
• Weiss, Chapter 6 (Measures), Chapter 4 (Planning the Eval) pp. 74-82.
• Litwin, M.S. (2003). How to Assess and Interpret Survey Psychometrics, Ch. 2
(Reliability) & 3 (Validity)
• http://www.nytimes.com/2011/09/23/opinion/why-cyberbullying-rhetoric-misses-the-mark.html
• Optional: Chandrika Ismail A., et al. (2009). A model of substance abuse risk:
Adapting to the Sri Lankan context.
• Mid-terms returned to you

Nov 4
Class 9: Measurement, continued
• Lester, P. et al. (2012). Evaluation of a family-centered prevention intervention for
military children and families facing wartime deployments. (CS)
• Colosi and Dunifon (2006). "What's the Difference? 'Post then Pre' & 'Pre then
Post'." Cornell Cooperative Extension. www.human.cornell.edu
• Optional: Dufrene, R. (2000). An evaluation of a patient satisfaction survey: validity
and reliability.
• Optional: Fisher, A. Designing HIV/AIDS Intervention Studies. Chapter 6
(Operational Definitions) See:
http://www.popcouncil.org/pdfs/horizons/orhivaidshndbk.pdf

Nov 11
Class 10: Sampling
• Babbie, E. (2013). The Practice of Social Research, Chapter 7.
• Patton, M.Q. (1990). “Purposeful Sampling” in Qualitative evaluation and research
methods. (pp. 169-186). Beverly Hills, CA: Sage.
• Weiss Chapter 7, pp. 163-166.
• Optional: Bartlett III, J.E., Kotrlik, J.W., Higgins, C.C. (2001). Organizational
research: determining appropriate sample size in survey research.
• Optional: Taylor-Powell, E. (1998). Sampling. University of Wisconsin-Extension.

Nov 18
Class 11: Strengthening full coverage and reflexive designs
• Weiss, review Chapter 8 pp. 191-199
• RFL Chapter 9 pp. 289-295
• Ballart, X. & Riba, C. (1995). Impact of legislation requiring moped and motorbike
riders to wear helmets.
• Pettifor, A., Taylor, E., Nku, D., Duvall, S., Tabala, M., Mwandagalirwa, K.,
Meshnick, S., & Behets, F. (2009). Free distribution of insecticide treated bed nets
to pregnant women in Kinshasa: an effective way to achieve 80% use by women
and their newborns. (CS)
• Optional: Cook, C. (2002). The effects of skilled health attendants on reducing
maternal deaths in developing countries: testing the medical model.
• DUE: Measurement memo

Nov 25
Class 12: Ethical Considerations in Program Evaluation (continued)
• Weiss, Chapter 14 (Ethics)
• Bluestein, J. (2005). Toward a more public discussion of the ethics of federal social
program evaluation. (pp. 823-840)
• Oakes, J.M. (2002). Risks and wrongs in social science research: an evaluator’s
guide to the IRB. (pp. 460-467)
• Optional: Shaw, I.F. (2003). Ethics in qualitative research and evaluation.
• Measurement memo returned
Dec 2
Class 13: Poster Session, critique and feedback

Dec 9
Class 14: Evaluation Synthesis and Meta-Analysis
• Weiss Chapter 10, pp. 235-244 (good design)
• Webb, T.L., Joseph, J., Yardley, L., and Michie, S. (2010). Using the internet to
promote health behavior change: a systematic review and meta-analysis of the
impact of theoretical basis, use of behavior change techniques, and mode of
delivery on efficacy. (CS)
• Gansle, K.A. (2005). The effectiveness of school-based anger interventions and
programs: a meta-analysis.
• http://www.nytimes.com/2013/02/02/opinion/health-cares-trick-coin.html by Ben Goldacre
• Optional: Independent Evaluation Group. (2010). What Can We Learn from
Nutrition Impact Evaluations? Lessons from a Review of Interventions to Reduce
Child Malnutrition in Developing Countries. Washington, DC: World Bank.
• Optional: Visher, C.A., Winterfield, L., & Coggeshall, M.B. (2005). Ex-offender
employment programs and recidivism: a meta-analysis.
• OPTIONAL Evaluation review memo due (for extra credit)

Dec 16
DUE: Final Paper, no class.
Submit group paper via email as MS Word or PDF attachment.