Tabia Henry Akintobi, PhD, MPH
Director, Prevention Research Center
Director, Evaluation and Institutional Assessment
Research Associate Professor, Department of Community Health and Preventive Medicine
Morehouse School of Medicine

Objectives
• Detail the elements of evaluation specific to community needs assessment and program development
• Describe qualitative and quantitative data collection and analysis considerations in community contexts
• Discuss the application of evaluation frameworks in community-responsive needs assessment
• Discuss lessons learned, issues, and recommendations in community-engaged practice

Prevention Research Centers
The MSM PRC operates within the only Historically Black College or University funded among the Centers for Disease Control and Prevention's 37 Health Promotion and Disease Prevention Research Centers.

Morehouse School of Medicine Prevention Research Center (MSM PRC)
Theme: Risk Reduction and Early Detection in African American and Other Minority Communities – Coalition for Prevention Research

Where the MSM PRC Works

MSM PRC Community Coalition Board (CCB)
The MSM PRC CCB is composed of:
• Community residents
• Academic institution representatives
• Agencies within the City of Atlanta
• Neighborhood Planning Units V, X, Y and Z

Evaluation and Institutional Assessment Unit: Technical Capacity
Assessments conducted through direct or collaborative acquisition of federally funded, private, and local grants and contracts

Evaluation and Institutional Assessment Unit: Guiding Principles
• Evaluations should be participatory
  – Partnership between evaluators and stakeholders
  – Sustained ownership and involvement
• Build evaluation capacities
  – Demystify evaluation
  – Evaluation as essential to program planning, implementation, and measurement
• Both formative and summative evaluation is critical
  – Identifying best or promising practices
  – Defining success and how it is achieved
• Evaluations should lead to decision-making
  – Formulation of recommendations for programmatic improvements, practice/policy changes, or subsequent research to address identified needs

The Participatory Framework for Designing the Program and Its Evaluation
To guide the process and encourage collaboration, the framework is a cycle with stakeholders at the center:
Community Assessment → Public Health Initiative → Design the Evaluation → Collect the Data → Analyze and Interpret the Data → Report the Findings

Engaging Stakeholders
Stakeholders:
• Individuals, groups, or organizations having a significant interest in how well a program functions and/or in the health topic
• For instance, those with decision-making authority over the program, funders and sponsors, administrators and personnel, clients or intended beneficiaries (Rossi, Lipsey & Freeman, 2004)

Engaging Stakeholders
• Primary stakeholders – What they do: daily contact, request reports, major decision-makers, current program participants. Who they are: sponsors, collaborators, coalition partners, funding officials, administrators, managers, and staff.
• Secondary stakeholders – What they do: little or no daily contact, potential program participants, current partners. Who they are: clients, family members, neighborhood organizations, academic institutions, elected officials, advocacy groups, professional associations.
• Tertiary stakeholders – What they do: potential partners/funders, target population. Who they are: staff, board members, administrators, volunteers.

Engaging Stakeholders
• Identify leaders first
• Use "snowballing"
• Understand the role and importance of potential stakeholders
• To the extent possible, consider all stakeholder perspectives
• Be inclusive

Why Needs Assessment?
• Provides a systematic process to guide decisions
• Provides justification for decisions before they are made
• Is scalable for any size project, time frame, or budget
• Facilitates community engagement in defining needs, assets, and priorities
www.worldbank.org/ieg/training/TNA.ppt

Needs Assessments Help Us Avoid…
• "What we really need is training on XYZ."
• "But that is the way we have always done it here."
• Programs that are not aligned with community / funder priorities
• Answers that are simple, straightforward, acceptable, understandable… and yet wrong
www.worldbank.org/ieg/training/TNA.ppt

Needs Assessment Process
• Plan – coalition building; consultation
• Implement – data collection; describe needs / assets
• Analyze – assess data strengths/weaknesses; enabling and predisposing factors
• Prioritize – prioritize needs; develop strategies
http://www.nswphc.unsw.edu.au/pdf/VitalLinks06/Powerpoints/VanessaTraynorSarahDennis_Demystifyingneedsassessment.pdf

Planning – Considerations
• What other needs assessments have been done in this area or with the demographic group?
• What questions remain to be answered?
• What form of data collection is appropriate to answer these questions?
• What resources are available (money, skills, etc.)?
• What steering / reference groups exist or need to be established to guide the assessment?
• Who else can / should we involve?
http://www.nswphc.unsw.edu.au/pdf/VitalLinks06/Powerpoints/VanessaTraynorSarahDennis_Demystifyingneedsassessment.pdf

Needs Assessment Method and Plan Development
Driven by objective(s) that are:
• Developed through review of the literature
• Shaped by identification of the problem or issue at the local level
• May be driven by a theoretical framework
• Developed with input from stakeholders and experts

What Do You Want to Understand, Change or Measure?
• Attitudes
• Perceptions
• Preferences
• Knowledge or skills
• Behavior
• Behavioral intentions

What Do You Want to Know? Examples of Evaluation Questions
• Who needs this program?
• What is the magnitude of the need?
• What should be done to meet the need?
• What are the existing resources or capacity to meet the need?

Analysis & Next Steps – Considerations
Analysis
• Comprehensive picture
• Identify needs
• Identify contributing factors – predisposing, enabling, etc.
Prioritization
• "What are the most critical unmet needs?" (for whom, etc.)
• "What can be changed?"
• "What will it take to address the needs?" (resources, cost, strategies)
http://www.nswphc.unsw.edu.au/pdf/VitalLinks06/Powerpoints/VanessaTraynorSarahDennis_Demystifyingneedsassessment.pdf
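As a concrete, purely illustrative example of the prioritization step above, coalitions sometimes score each identified need against weighted criteria such as magnitude, changeability, and feasibility of addressing it. The sketch below assumes hypothetical needs, criteria, weights, and stakeholder ratings; it is not an MSM PRC instrument, and in practice the weights and ratings would come out of the coalition's consensus process.

```python
# Minimal weighted-scoring sketch for prioritizing assessed needs.
# Criteria, weights, needs, and 1-5 ratings are hypothetical placeholders.

CRITERIA_WEIGHTS = {"magnitude": 0.40, "changeability": 0.35, "feasibility": 0.25}

needs = {
    "Youth substance abuse": {"magnitude": 5, "changeability": 3, "feasibility": 4},
    "Vacant housing":        {"magnitude": 4, "changeability": 2, "feasibility": 2},
    "School truancy":        {"magnitude": 4, "changeability": 4, "feasibility": 3},
}

def priority_score(ratings: dict) -> float:
    """Weighted sum of stakeholder ratings for one need."""
    return sum(CRITERIA_WEIGHTS[criterion] * rating for criterion, rating in ratings.items())

# Rank needs from highest to lowest priority score.
for need, ratings in sorted(needs.items(), key=lambda kv: priority_score(kv[1]), reverse=True):
    print(f"{need}: {priority_score(ratings):.2f}")
```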
Evaluation in Context: Planning, Implementation, and Effect of the Initiative
The evaluation continuum follows the life of the initiative:
• Planning – formative evaluation
• Implementation – process monitoring and evaluation
• Effect – outcome and impact evaluation

What is Evaluation?
• A systematic process
• Involves data collection
• A process for enhancing knowledge and decision-making
Adapted from Russ-Eft and Preskill (2001)

Role and Importance of Evaluation
• Guides planning of proposed programs and activities prior to full implementation
• Monitors and documents implementation of programmatic activities
• Assesses and documents whether activities and interventions achieve desired outcomes
• Informs programmatic decisions

Evaluation Planning
Evaluation planning includes:
• Clarifying the purpose of the evaluation
• Deciding what to evaluate
• Understanding what is involved
• Choosing the evaluation type and design
• Gathering necessary resources
• Determining how to use the results

Choosing the Evaluation Design
Selection of the design should be based on:
• Results of the needs assessment
• Key program questions and performance indicators
• Resource (program and evaluation) availability
• How the data will be used
• Timeline
Adapted from Sharma, Lanum & Suarez-Balcazar (2000). A Community Needs Assessment Guide. Loyola University Chicago.

Choosing the Evaluation Design
Consider stakeholder needs:
• The information needs of key stakeholders and primary users
• How the information will be used
• Who will use it
• What kind of information will have the most credibility for the intended users

Other Evaluation Design Considerations
Consider participant characteristics:
• Literacy
• Geographic dispersion
• Cultural issues (including language)
• Accessibility
Also consider:
• Logistic and contextual constraints
• Alternative sources of information
• Time and resource constraints
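One lightweight way to keep the design criteria and stakeholder considerations above in a single working document is a structured plan record that is filled in with the coalition before data collection begins. The sketch below is a minimal illustration with assumed field names and example values; it is not an MSM PRC template.

```python
# Illustrative evaluation-plan skeleton; field names and example values are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class EvaluationPlan:
    purpose: str
    evaluation_questions: List[str]
    design: str                       # e.g., pre/post survey, focus groups, mixed methods
    indicators: List[str]
    data_sources: List[str]
    primary_users: List[str]          # stakeholders who will use the results
    intended_use: str
    timeline: str
    participant_considerations: List[str] = field(default_factory=list)

plan = EvaluationPlan(
    purpose="Assess whether the program addresses the prioritized community need",
    evaluation_questions=["Who needs this program?", "What existing resources can meet the need?"],
    design="Mixed methods: self-administered survey plus key informant interviews",
    indicators=["Proportion of youth reached", "Change in reported knowledge"],
    data_sources=["Program survey", "Secondary census data"],
    primary_users=["Community Coalition Board", "Program staff", "Funder"],
    intended_use="Programmatic improvement and reporting to the coalition board",
    timeline="Baseline, midpoint, and end of program year",
    participant_considerations=["Literacy level", "Language", "Accessibility"],
)
print(plan)
```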
The "Evidence" in Evidence-Based Interventions is the Data: Collected, Analyzed, and Interpreted
• May be primary and/or secondary
  – Primary data – data you collect as a result of your activities or research
  – Secondary data – data already collected, independent of your activities or research
• May be qualitative or quantitative
• Should be systematically collected/reviewed, analyzed, and interpreted

Data Collection
Data collection includes:
• Identifying existing data sources (secondary)
• Determining the best data collection method (focus group, survey, interview…)
• Selecting or creating data collection instruments
• Deciding on the most appropriate procedures for collecting data

Frequently Used Qualitative Data Collection Tools
In-depth interviews
• Complex subject matter and expert respondents
• Highly sensitive subject matter
• Geographically dispersed participants
• Aim to diminish peer pressure or minimize influence on responses
Focus groups
• Group interaction to stimulate richer responses
• Observation of behaviors, attitudes, and language
• Idea generation
• Pre-testing
• Evaluation of message concepts
• Problem identification

Frequently Used Quantitative Data Collection Tools: Surveys
• Self-administered surveys – independently completed by the participant
• Face-to-face survey interviews – participant completion is facilitated by a program staff person
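Whatever mix of tools is chosen, the first analytic pass is usually simple frequency counts: closed-ended survey responses tallied by item, and coded focus-group excerpts tallied by theme. The sketch below uses made-up responses and theme codes purely for illustration; it does not reflect the MSM PRC's instruments or coding scheme.

```python
# Generic tally sketch for mixed-methods data; items, responses, and theme codes are made up.
from collections import Counter

# Closed-ended responses to one hypothetical self-administered survey item.
survey_responses = ["Agree", "Agree", "Neutral", "Disagree", "Agree", "Neutral"]
print("Item 1 frequencies:", Counter(survey_responses))

# Theme codes assigned to focus-group excerpts during qualitative coding (hypothetical codes).
coded_excerpts = ["peer_pressure", "family_support", "peer_pressure", "school_climate", "family_support"]
for theme, n in Counter(coded_excerpts).most_common():
    print(f"{theme}: mentioned in {n} excerpts")
```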
Empowerment Evaluation
• Provides stakeholders with tools and skills to evaluate their program
• Ensures that the evaluation is part of the planning and management of the program (Fetterman, 2008)

Empowerment Evaluation Characteristics
• Values improvement in people, programs, and organizations to help them achieve results
• Community ownership of the design and conduct of the evaluation and implementation of the findings
• Inclusion of appropriate participants from all levels of the program, funders, and community
• Democratic participation and clear and open evaluation plans and methods
• Commitment to social justice and a fair allocation of resources, opportunities, obligations, and bargaining power
Citations: Fetterman, 2008; Wandersman, 2005; Sufian, Grunbaum, Henry Akintobi, Dozier, Eder, Jones, Mullan, Weir, & White-Cooper, 2011

Empowerment Evaluation Characteristics
• Use of community knowledge to understand the local context and to interpret results
• Use of evidence-based strategies with adaptations to the local environment and culture
• Building the capacity of program staff and participants to improve their ability to conduct their own evaluations
• Organizational learning, ensuring that programs are responsive to changes and challenges
Citations: Fetterman, 2008; Wandersman, 2005; Sufian, Grunbaum, Henry Akintobi, Dozier, Eder, Jones, Mullan, Weir, & White-Cooper, 2011

Case Study 1: PAATH-II Community Coalition
• PAATH-II Community Coalition Board stakeholders – parents, education professionals, civic leaders, and representatives from government agencies
• Funded to develop community-led approaches to address youth substance abuse and violence
• Target community – Metropolitan Atlanta ZIP code 30318

Case Study 1: PAATH-II Community Coalition
PAATH-II partnered with the MSM PRC to evaluate the coalition through a demographic profile. The needs assessment consisted of:
• Secondary data
• Primary data

Case Study 1: PAATH-II Community Coalition – Secondary Data
The MSM PRC reviewed ZIP code 30318 statistics (general demographics and other risk factors associated with violence and substance abuse), including racial background, household make-up, income level, vacant housing, public housing, surrounding prison and addiction facilities, truancy, etc.
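A secondary-data review of this kind typically assembles published indicators into a community profile that can be compared against a reference area. The sketch below is illustrative only: the indicator values are placeholders, not actual ZIP code 30318 or county statistics, and the flagging threshold is an assumption.

```python
# Illustrative secondary-data profile; all figures are placeholders, NOT actual 30318 statistics.
zip_30318 = {"vacant_housing_rate": 0.18, "median_household_income": 34000, "truancy_rate": 0.12}
county_reference = {"vacant_housing_rate": 0.10, "median_household_income": 52000, "truancy_rate": 0.07}

# Flag indicators where the target community differs notably (assumed +/-25% threshold).
for indicator, value in zip_30318.items():
    reference = county_reference[indicator]
    ratio = value / reference
    flag = "FLAG" if ratio > 1.25 or ratio < 0.75 else "ok"
    print(f"{indicator}: community={value}, reference={reference}, ratio={ratio:.2f} [{flag}]")
```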
Case Study 1: PAATH-II Community Coalition – Primary Data
Key informant interviews
• To gather insights, opinions, and best thinking from the 30318 community on substance abuse and violence prevention
• Target audience: adult stakeholders representing community organizations, healthcare, neighborhood businesses, private practice, public housing, law enforcement, and the school system; youth ages 11-16
• Recommendations led to prioritization of Youth Mentoring and Alternative Education goals, evaluation activities, and outcomes

Case Study 1: PAATH-II Community Coalition – Alternative Education Goal Activities
• Collaboration between the MSM PRC and the PAATH-II Coalition Board to develop and administer an Alternative Education Survey (AES) to capture demographics, perceptions, and recommendations among students, parents, and stakeholders familiar with alternative education/non-traditional schools in the Atlanta Public School System (APS)
• Analysis of evaluation data collected through the AES among 24 students, 22 parents, and 12 stakeholders
• Submission and presentation of the AES Report to the PAATH-II Coalition Board
• Recommendations on key findings to include in the final presentation to the Atlanta Board of Education

Participatory Evaluation
• Involves key stakeholders in evaluation design and decision-making
• Acknowledges and addresses inequities in power and voice among stakeholders

Participatory Evaluation Characteristics
• The focus is on participant ownership; the evaluation is oriented to the needs of the program stakeholders rather than the funding agency
• Participants meet to communicate and negotiate to reach a consensus on evaluation results, solve problems, and make plans to improve the program
• Input is sought and recognized from all participants; the emphasis is on identifying lessons learned to help improve program implementation and determine whether targets were met
Citations: Patton, 2008; Sufian, Grunbaum, Henry Akintobi, Dozier, Eder, Jones, Mullan, Weir, & White-Cooper, 2011

Participatory Evaluation Characteristics
• The evaluation design is flexible and determined (to the extent possible) during the group processes
• The evaluation is based on empirical data to determine what happened and why
• Stakeholders may conduct the evaluation, with an outside expert serving as a facilitator
Citations: Patton, 2008; Sufian, Grunbaum, Henry Akintobi, Dozier, Eder, Jones, Mullan, Weir, & White-Cooper, 2011

Case Study 2: 2 HYPE Abstinence Education Club
• The 2 HYPE Abstinence Education Club (2 HYPE "A" Club) is a co-educational intervention targeting African American youth ages 12-18 in Fulton, DeKalb, and Clayton counties within Metropolitan Atlanta
• The program serves youth in community-based settings, schools, and juvenile facilities, including probationary and long-term detention centers

Case Study 2: 2 HYPE Abstinence Education Club
Comprehensive approach including:
• Promotion of delayed sexual activity
• Violence prevention
• Stress reduction and understanding of abstinence benefits
• Creative arts to reinforce the abstinence curriculum
• Club activities, e.g., the Hip Hop Café (hip-hop rap, poetry, and dance performances)
• Peer educator training
• Parent workshops

Case Study 2: 2 HYPE Abstinence Education Club – Youth Participants
(Wholistic Stress Control Institute; Morehouse School of Medicine Prevention Research Center)

Case Study 2: 2 HYPE Abstinence Education Club – Needs Assessment Survey Development
• Literature review – comparable programs; tools and instruments (validity, reliability, cultural context)
• Survey pilot testing – four focus groups with 30 African American youth ages 12 to 19
• Identified trends in program data – process evaluation measures
• Evaluation advisory group

Case Study 2: 2 HYPE Abstinence Education Club – Results
• Expanded understanding of how African American youth conceptualize marriage, family, and their futures
• Evaluation implications: findings signal the need for an expanded approach to implementing and assessing programs designed to reduce adverse sexual health outcomes among African American youth in urban settings
• Survey revision
• Context for interpretation of results

Evaluation Plan Development Basics
Consider reporting/dissemination:
• Who will you share results with?
• Will you share results with your community or selected stakeholders? Who? How? When?
• Why will this reporting/dissemination activity be important?

Questions to Consider When Evaluating Community Engagement
• Are the right community members at the table? This question needs to be reassessed throughout the program or intervention because the "right community members" might change over time.
• Do the process and structure of meetings allow for all voices to be heard and equally valued? For example, where do meetings take place, at what time of day or night, and who leads the meetings? What is the mechanism for decision-making or coming to consensus? How are conflicts handled?
• How are community members involved in developing the program or intervention? Did they help conceptualize the project, establish project goals, and develop or plan the project? How did community members help ensure that the program or intervention is culturally sensitive?
Citations: CDC, 2009; Green et al., 1995; Israel et al., 1998; Sufian, Grunbaum, Henry Akintobi, Dozier, Eder, Jones, Mullan, Weir, & White-Cooper, 2011

Questions to Consider When Evaluating Community Engagement
• How are community members involved in implementing the program or intervention? Did they assist with the development of study materials or the implementation of project activities, or provide space?
• How are community members involved in program evaluation or data analysis? Did they help interpret or synthesize conclusions? Did they help develop or disseminate materials? Are they co-authors on all publications or products?
• What kind of learning has occurred, for both the community and the academics? Have community members learned about evaluation or research methods? Have academics learned about the community's health issues? Are there examples of co-learning?
Citations: CDC, 2009; Green et al., 1995; Israel et al., 1998; Sufian, Grunbaum, Henry Akintobi, Dozier, Eder, Jones, Mullan, Weir, & White-Cooper, 2011
Challenges in Community Needs Assessment
• Partnership development, relationship formation, and maintenance
• Different individual and organizational cultures
• Values – practical vs. statistical significance
• Varying understanding of the evaluation or research process

Key Steps in Community Partnership Development, Relationship Formation, and Maintenance
• Clearly define the target population
• Provide detailed descriptions of process and outcomes
• Specify the measures and instruments for data collection
• Clarify the timeline for conducting all evaluation activities

Benefits in Community Evaluation Partnership Development, Relationship Formation, and Maintenance
• Community credibility
• Increased recruitment
• Sustained retention
• Expanded funding and human resources
• Collaboration in dissemination of emerging models, best practices, and outcomes of community engagement and associated research

Community Engagement & Evaluation
• Blumenthal, D. "How do you start working with a community?" Section 4a of "Challenges in Improving Community Engagement in Research."
• Henry Akintobi, T., Goodin, L., Trammel, E., Collins, D., & Blumenthal, D. "How do you set up and maintain a community advisory board?" Section 4b of "Challenges in Improving Community Engagement in Research."
• Sufian, M., Grunbaum, J., Akintobi, T., Dozier, A., Eder, M., Jones, S., Mullan, P., Weir, C.R., & White-Cooper, S. Program Evaluation and Evaluating Community Engagement. http://www.atsdr.cdc.gov/communityengagement

Benefits in Community Evaluation Partnership Development, Relationship Formation, and Maintenance
• Akintobi, T.H., Trotter, J.C., Evans, D., Johnson, T., Laster, N., Jacobs, D., & King, T. (2011). Applications in bridging the gap: A community-campus partnership to address sexual health disparities among African-American youth in the South. Journal of Community Health, 36, 486-494. PMID: 21107895
• Mayberry, R., Daniels, P., Yancey, E., Henry Akintobi, T., Berry, J., & Clark, N. (2009). Enhancing community-based organizations' capacity for HIV/AIDS education and prevention. Evaluation and Program Planning, 32(6), 213-220. PMID: 19376579
• Mayberry, R., Daniels, P., Henry Akintobi, T., Yancey, E., Berry, J., & Clark, N. (2008). Community-based organizations' capacity to plan, implement, and evaluate success. Journal of Community Health, 33(5). PMID: 18500451
• Henry Akintobi, T., Goodin, L., Trammel, E., Collins, D., & Blumenthal, D. (2011). How do you set up and maintain a community advisory board? Section 4b of "Challenges in Improving Community Engaged Research," Clinical and Translational Science Award Community Engagement Key Function Committee Task Force on the Principles of Community Engagement (Chapter 5, Section 4b). Principles of Community Engagement, 2nd edition. Washington, DC: U.S. Department of Health and Human Services.
• Sufian, M., Grunbaum, J., Akintobi, T., Dozier, A., Eder, M., Jones, S., Mullan, P., Weir, C.R., & White-Cooper, S. (2011). Evaluating Community Engagement. Chapter 7 of Clinical and Translational Science Award Community Engagement Key Function Committee Task Force on the Principles of Community Engagement, Principles of Community Engagement, 2nd edition. Washington, DC: U.S. Department of Health and Human Services.
Lessons Learned
• Document progress
• Watch and learn
• Communication
• Patience
• Remembering your role
• Finding mutual benefit
• Community as a partner

Contact Information
Morehouse School of Medicine Prevention Research Center
720 Westview Dr., SW, Atlanta, GA 30310
Phone: 404-752-1022
Email: takintobi@msm.edu
Website: www.msm.edu/prc

Morehouse School of Medicine Prevention Research Center 2013 ©