Strategic Management & Planning in the Public Sector
PLSC 592
Summer 2014
Livonia Campus
TR 6-8:40 pm
Instructor: Barbara Patrick, Ph.D.
Office: Department of Political Science, Pray Harrold Hall
Office Hours: By Appointment
Phone: (734) 487-1453
E-Mail: bpatric1@emich.edu
COURSE OBJECTIVE:
Program evaluation is an important component of the field of public administration. This course
is designed to provide a basic understanding of public policy and program evaluation. Elements
of this process include needs assessments, the role of stakeholders, methods of evaluation and
assessment, program monitoring, performance measures, and the role of efficiency and
effectiveness in the evaluation process. The organizational, political, ethical, and cultural context
of evaluations will also be discussed. Students will conduct an evaluation of a program or policy
of their choice, provided it is approved by the instructor.
The course is also designed to introduce students to the basic elements of research design and
statistical analysis in the social sciences. The course presents statistical concepts, their
application to social science problems, and the interpretation of results. Use of the computer as
an aid to research will be emphasized throughout the course.
Class Attendance and Etiquette:
Class attendance is very important. You are allowed two absences; after the second absence, your
grade will be lowered half a letter grade for each additional unexcused absence. Repeated late
arrivals can also result in an absence: three unexcused late arrivals will be treated as one
absence. If you arrive late or must leave early, it is your responsibility to inform me. It is also
very important that you arrive on time on exam days. If you arrive after the first student to
complete the exam has left the room, you will not be allowed to take the exam. Make-up
exams are given only at the end of the semester and only with adequate documentation.
Examples of the limited circumstances under which make-ups are permitted include travel to
represent the University, a serious documented illness, or a genuine family emergency. Class
attendance on days when students present is also very important. Failure to appear on
presentation days without an excused absence will result in a grade deduction. Late arrivals on
presentation days will also affect your presentation grade.
Students with Disabilities:
If you wish to be accommodated for your disability, EMU Board of Regents policy #8.3 requires
that you first register with the Access Services Office (ASO) in room 203 King Hall. You may
contact ASO by telephone at (734) 487-2470. Students with disabilities are encouraged to
register with ASO promptly as you will only be accommodated from the date you register with
them forward. No retroactive accommodations are possible.
Academic Integrity:
Academic dishonesty, including all forms of cheating and plagiarism, will not be tolerated.
Penalties for an act of dishonesty may range from receiving a failing grade for a particular
assignment to receiving a failing grade for the entire course. In addition, you may be referred to
the Office of Student Judicial Services for discipline that can result in either suspension or
permanent dismissal. The Student Conduct Code contains detailed definitions of what
constitutes academic dishonesty, and it can be accessed online at www.emich.edu/sjs.
GRADING SCALE:
Summer Project and Presentation: 40%
Critiques: 20%
Midterm Exam: 20%
Class Assignments: 20%
Critique
Students will critically review two articles of their choosing (each worth 10% of the course
grade). The first article may be taken from the syllabus.
The second article may be taken from the syllabus or selected from an outside source of the
student’s choosing. The student must critically review an evaluation report.
The evaluation report to be critiqued must present the results of an impact or outcome evaluation
of an existing program, not an article about how to conduct surveys or research.
The two- to three-page critique of the evaluation should be prepared in the following format:
*a brief description of the focus and findings
*identification of the key evaluation questions addressed
*a brief summary of the research design and data collection method used
*a list of threats to measurement validity and reliability
Exam
The midterm exam will cover the readings and content of the course and will contribute 20% to
the course grade.
Applied Project
Members of the class will be expected to participate in a program evaluation project either alone
or with one other student during the semester. Students choosing not to participate in an
evaluation project (Option 1) or a performance management case study (Option 2) are required to
prepare an evaluation design for an actual program (Option 3). The project contributes 40% to
the course grade.
Option One
Student groups of not more than 2 may perform an evaluation for a nonprofit or public agency.
A statement of the work to be performed must be submitted to the instructor for approval. The
Statement of Work should include:
*a concise description of the evaluation questions that the primary stakeholders have identified
*a description of the methodology to be employed by the students to address the evaluation
questions
*identification of specific tasks to be accomplished
*identification of the information the agency is expected to provide to the students (e.g., contact
information for clients or other required data), along with the dates by which it will be provided
*a timeline depicting deadlines for the identified tasks
Option Two
Student groups of no more than 2 students may conduct a case study of one intergovernmental
program that focuses on the use of performance data collected on services provided. The case
study should describe how agencies analyze and use performance and related information, and
include:
*documentation illustrating how agencies analyzed goals, performance measurements, data
systems, program evaluation studies, and related information
*information on factors related to variations in program performance
*information, to the extent such information is available, on the use of resulting performance
information in efforts to improve program management, performance, and results
*information on subsequent changes in program management, performance, and results
Option Three
Students may develop a scoping and evaluation design paper for a program. You will conduct
the scoping activities but will only propose the evaluation; you are not expected to conduct the
actual evaluation itself.
Scoping the evaluation may entail collecting information on the program and on current needs
through interviews with key contacts (decision-makers, staff, etc.), and synthesizing related past
research and evaluation studies. With the focus of the evaluation identified, the project will then
involve laying out an evaluation design, data collection plan, analysis plan, and briefing and
presentation plan. Students are expected to prepare a logic model and to design and pre-test
data collection tools, e.g., surveys or interview schedules. The
design should be developed with clear awareness of the political aspects of the situation and
tailored to the needs of the agency leadership. Students are expected to research evaluations
undertaken on similar sorts of programs to offer a comparative perspective. Strategies for
encouraging the use of the resulting evaluation findings also should be discussed. The report
should have all of the components identified in the list below.
Required Elements of the Report for the Applied Project
*Introduction and Background: An introduction to the project, including the names of the team
and how/why they became involved, should be given along with a description of the scoping
activities, including a brief description of the program, and a synthesis of relevant past research
and evaluation findings. Also, cite relevant literature on the program.
*Evaluation Questions: The issues that have been identified and the specific questions that were
addressed, or that should be addressed if the project is an evaluation plan, should be provided.
*Evaluation Design: A brief summary of the design undertaken, or to be undertaken, including
the concepts and variables, the theory underlying the policy/program, etc. should be provided. A
logic model of the program/policy must be developed with clients and presented in the body of
the report with an appropriate introduction, i.e., stating what it is, how it was developed and how
it may be used by the client.
*Data Collection: The sources of data available, measures used to address the research questions,
data collection methods, and sampling procedures should be discussed. There should also be a
list of limitations to validity and reliability, as well as actions undertaken to reduce the impact of
the identified limitations.
*Data Analysis: If the project is an evaluation plan, the proposed analytic strategies should be
discussed. For completed projects, appropriate tables and figures should be constructed in
accordance with guidance given in class.
*Proposed Presentation and Utilization Plan (for Evaluation Plans): Strategies for presenting the
results to key stakeholders and decision-makers and strategies for facilitating utilization should
be provided.
*Potential Problems and Fall-back Strategies (for Evaluation Plans): Identify the potential
problems that may arise in conducting the evaluation and the strategies that should be used to
either avoid the problem or deal with its occurrence.
*Proposed Budget, Budget Narrative, and Work plan (for Evaluation Plans): Budgetary estimates
may range from specific to general depending upon the complexity of the proposed project.
*Conclusion: A brief conclusion should be provided.
I will allow students time to work on summer projects during class.
COURSE OUTLINE
Week One (May 6-8, 2014)
Tuesday
Introduction
Thursday
Introduction to Evaluation
Powell, Ronald (2006). “Evaluation Research: An Overview.” Library Trends, 55(1): 102-120.
(available online)
Levin-Rozalis, Miri (2003). “Evaluation and Research: Differences and Similarities.” The
Canadian Journal of Program Evaluation, 18(2): 1-31. (available online)
Fleischer, Dreolin and Christina Christie (2009). “Evaluation Use: Results From a Survey of
U.S. American Evaluation Association Members.” American Journal of Evaluation, 30(2): 158-75.
(available online)
Gard, Carol, Peggy Flannigan, and Maureen Cluskey (2004). “Program Evaluation: An Ongoing
Systematic Process.” Nursing Education Perspectives, 26(4): 176-9.
Taylor-Powell, Ellen, Sara Steele, and Mohammad Douglah (1996). “Planning a Program
Evaluation.” Program Development and Evaluation, University of Wisconsin Extension.
Handout: Handbook of Practical Program Evaluation. Joseph Wholey, Harry Hatry, and
Kathryn Newcomer. Jossey-Bass, 3rd Edition, 2010. Chapters 1 and 2.
Week Two (May 13-15, 2014)
Tuesday
Introduction to Evaluation…continued
Powell, Ronald (2006). “Evaluation Research: An Overview.” Library Trends, 55(1): 102-120.
(available online)
Levin-Rozalis, Miri (2003). “Evaluation and Research: Differences and Similarities.” The
Canadian Journal of Program Evaluation, 18(2): 1-31. (available online)
Fleischer, Dreolin and Christina Christie (2009). “Evaluation Use: Results From a Survey of
U.S. American Evaluation Association Members.” American Journal of Evaluation, 30(2): 158-75.
(available online)
Gard, Carol, Peggy Flannigan, and Maureen Cluskey (2004). “Program Evaluation: An Ongoing
Systematic Process.” Nursing Education Perspectives, 26(4): 176-9.
Taylor-Powell, Ellen, Sara Steele, and Mohammad Douglah (1996). “Planning a Program
Evaluation.” Program Development and Evaluation, University of Wisconsin Extension.
Handout: Handbook of Practical Program Evaluation. Joseph Wholey, Harry Hatry, and
Kathryn Newcomer. Jossey-Bass, 3rd Edition, 2010. Chapters 1 and 2.
Thursday
Needs Assessment, Logic Models, and Performance Measures
Heinrich, Carolyn (2002). “Outcomes-Based Performance Management in the Public Sector:
Implications for Government Accountability and Effectiveness.” Public Administration Review,
62(6): 712-25.
Brown, Mitchell (2012). “Enhancing and Measuring Organizational Capacity: Assessing the
Results of the US Department of Justice Rural Pilot Program Evaluation.” Public Administration
Review, 72(4): 506-515.
Weissert, Carol and Malcolm Goggin (2002). “Nonincremental Policy Change: Lessons from
Michigan’s Medicaid Managed Care Initiative.” Public Administration Review, 62(2): 206-16.
Courty, Pascal and Gerald Marschke (2007). “Making Government Accountable: Lessons from
a Federal Job Training Program.” Public Administration Review, 67(5): 904-16.
Handout: Handbook of Practical Program Evaluation. Joseph Wholey, Harry Hatry, and
Kathryn Newcomer. Jossey-Bass, 3rd Edition, 2010. Chapter 3.
Week Three (May 20-22, 2014)
Tuesday
Methods of Evaluation
Greene, Jennifer (1994). “Qualitative Program Evaluation: Practice and Promise.” In Handbook
of Qualitative Research, edited by N.K. Denzin and Y.S. Lincoln. (Google Scholar)
Loriz, Lillia and Patricia Foster (2001). “Focus Groups: Powerful Adjuncts for Program
Evaluation.” Nursing Forum, 36(3): pp. 31-36.
Lewis, Ann (1992). “Group Child Interviews as a Research Tool.” British Educational
Research Journal, 18(4): 413-421.
Cannell, Charles, Peter Miller, and Lois Oksenberg (1981). “Research on Interviewing
Techniques.” Sociological Methodology, 12: 389-437. (Google Scholar)
Rao, Vijayendra and Michael Woolcock (2003). “Integrating Qualitative and Quantitative
Approaches in Program Evaluation.” In Francois Bourguignon and Luiz Pereira da Silva (eds.).
The Impact of Economic Policies on Poverty and Income Distribution: Evaluation Techniques
and Tools. New York: Oxford University Press, pp. 165-90.
Quantitative Methods Handout
Thursday
Methods of Evaluation…continued
Greene, Jennifer (1994). “Qualitative Program Evaluation: Practice and Promise.” In Handbook
of Qualitative Research, edited by N.K. Denzin and Y.S. Lincoln. (Google Scholar)
Loriz, Lillia and Patricia Foster (2001). “Focus Groups: Powerful Adjuncts for Program
Evaluation.” Nursing Forum, 36(3): pp. 31-36.
Lewis, Ann (1992). “Group Child Interviews as a Research Tool.” British Educational
Research Journal, 18(4): 413-421.
Cannell, Charles, Peter Miller, and Lois Oksenberg (1981). “Research on Interviewing
Techniques.” Sociological Methodology, 12: 389-437. (Google Scholar)
Rao, Vijayendra and Michael Woolcock (2003). “Integrating Qualitative and Quantitative
Approaches in Program Evaluation.” In Francois Bourguignon and Luiz Pereira da Silva (eds.).
The Impact of Economic Policies on Poverty and Income Distribution: Evaluation Techniques
and Tools. New York: Oxford University Press, pp. 165-90.
Quantitative Methods Handout
Week Four (May 27-29, 2014)
Tuesday
Methods: Examples of Techniques
Battaglio, R. Paul, Jr. (2010). “Public Service Reform and Motivation: Evidence From an
Employment-at-Will Environment.” Review of Public Personnel Administration, 30(3): 341-363.
Bowman, James, Marc Gertz, and Sally Gertz (2003). “Civil Service Reform in Florida State
Government: Employee Attitudes 1 Year Later,” Review of Public Personnel Administration,
23(4): 286-304.
Howard, Joseph, Sharon Wrobel, and Keith Nitta (2010). “Implementing Changes in an Urban
School District: A Case Study of the Reorganization of the Little Rock School District.” Public
Administration Review, 70(6):934-41.
De Lancer Julnes, Patria and Derek (2011). “Strengthening Efforts to Engage the Hispanic
Community in Citizen-Driven Governance: An Assessment of Efforts in Utah.” Public
Administration Review, 71(2):221-231.
Weissert, William and Lucy Frederick (2013 Special Issue). “Pay for Performance: Can It Help
Improve Nursing Home Quality?” Public Administration Review 73 (1s):140-151.
Thursday
Ethics in Evaluation
Cousins, J. Bradley (2004). “Commentary: Minimizing Evaluation Misuse as Principled
Practice.” American Journal of Evaluation, 25(3): 391-97. (available online)
Rosas, Scott (2006). “Nonparticipant to Participant: A Methodological Perspective on Evaluator
Ethics.” American Journal of Evaluation, 27(1): 98-103. (available online)
Mello, Robin (2005). “Close Up and Personal: The Effect of a Research Relationship on an
Educational Program Evaluation.” Teachers College Record, 107(10): 2351 – 2371.
Exam Review Discussion
Week Five (June 3-5, 2014)
Tuesday
Midterm Exam
Thursday
One-on-One Project Meetings
Lab Day
Week Six (June 10-12, 2014)
Tuesday
Work on Evaluation Project
Thursday
Work on Evaluation Project
Week Seven (June 17-19, 2014)
Tuesday
Presentations
Thursday
Presentations
Projects Due
Week Eight (June 24-26, 2014)
Tuesday
Wrap Up Session
The instructor reserves the right to amend the syllabus.