National Trade Adjustment Assistance Community College Career Training (TAACCCT) Evaluation

Virtual Roundtable: Using Implementation Analysis for Formative Feedback and Program Improvement

March 11, 2014

Summary Notes

This webinar was the second in a series of "virtual roundtable" discussions convened to encourage TAACCCT local third-party evaluators to raise questions and share information and experiences in evaluating college- and consortium-based TAACCCT programs. It was hosted by Jobs for the Future, which provides peer learning and event coordination to the Urban Institute and its partners in support of the national TAACCCT evaluation. The webinar was open to the first 20 Round 1 and 2 TAACCCT evaluators who registered.

Facilitators:
o Randall Wilson and Dudney Sylla, Jobs for the Future

Additional Participants:
o Lauren Eyster, Urban Institute, project director, National TAACCCT Evaluation
o Shayne Spaulding, Urban Institute

Overview:

When evaluators study the process of program implementation in TAACCCT and similar initiatives, their findings can provide program managers and other stakeholders with valuable formative feedback about what is working well and where improvements can be made. This is especially vital in complex and demanding program models such as TAACCCT, where the goals include system-level changes as well as improved participant outcomes. The goal of this Virtual Roundtable was to give TAACCCT third-party evaluators an opportunity to discuss their use of formative feedback to support continuous program improvement; explain how they communicate findings to grantees; and share the results of this feedback for grantee programs.

Discussion Highlights

TAACCCT evaluators are collecting and analyzing feedback from grantee staff, students, employers and other partners, and college officials to: help grantees achieve better student outcomes; improve relationships with external partners, such as employers, and with internal partners within colleges; keep projects on track operationally; document strategic components of programs; and adjust and improve the evaluation process, among other uses.

General Lessons

o Ensure that everyone is on the same page. Formative evaluation has to be based on a clear understanding of what the grantees are trying to accomplish, especially in complex projects such as TAACCCT.
o Feedback is the soul of improvement. Regular, well-understood, and simple feedback, charted to show change over time, is most useful.
o Relationships matter. Regular, helpful interactions and face-to-face meetings – to collect data as well as to present findings – help break down barriers.
o Periodic, face-to-face presentations and discussions of findings can improve grantee understanding of, and engagement with, the evaluation.
o Employer relationships are critical – but must be ongoing.
o Consider offering formative findings in concise "briefs" or other accessible formats, as an alternative or supplement to lengthy reports.
o Ensuring confidentiality can be a challenge, particularly in smaller programs. In focus groups, stress general perceptions and issues in common; discourage participants from sharing things considered private, and offer them the option of speaking outside of the group.
o Assess potentially revealing findings case by case.
o Be mindful of the balance between "prescription" – directive feedback that potentially changes the program – and "presenting lessons," while maintaining a neutral and objective stance.
o Be prepared for resistance to feedback: those being evaluated are not always ready to hear it (or receptive to it). Steer discussions toward factual inquiry, using rubrics to compare program criteria to actual implementation.

Examples of using formative feedback for program improvement:

o A consortium of tribal colleges sought feedback about student participants' level of self-confidence in their program of study and their ability to obtain jobs in their chosen field. While expressing confidence about mastering the material in their program, students in focus groups were less confident about securing employment, particularly those who had not spent significant time away from the reservation. These results were common across the four colleges. Result: Colleges are using the findings to help bridge the gap between training and employment, and to engage employers beyond those under tribal management.
o Evaluators interviewed employers associated with a training grant for logistics jobs about their preferred methods of working with colleges. The employers initially expressed a need for greater flexibility from the colleges, which were finding it difficult to recruit employers. Later interviews revealed that employers saw the colleges as erring on the side of too much flexibility, of "being all things to all people." Result: The colleges have adopted a more tailored yet prescriptive and direct approach to their partner employers, who report that the process is now improved and more efficient.

Examples of how to conduct formative evaluation

o One TAACCCT evaluator employs the "What/So What/Now What" model with grantees. The team begins by presenting the data ("what") and leading a discussion to clarify what the data say, proceeds to a discussion of what the data mean for the grantee ("so what"), and then works with the grantee to determine action steps in response to the findings ("now what").
o An evaluator has developed a spreadsheet-based planning guide that colleges use to track their implementation against progress milestones and grant requirements, yielding rich detail about the timing and operational aspects of the project.
o Another team conducts focus groups early in a site visit, and then uses the presentation of these findings, and/or previously collected survey data, to develop and fine-tune questions for subsequent visits.