Chapter 12 Identifying and Selecting the Evaluation Questions and Criteria

OST Certificate Program, Program Evaluation Course. Fitzpatrick, Sanders, and Worthen (2004). Program Evaluation. Boston, MA: Pearson Education Inc.

Objectives for Identifying and Selecting the Evaluation Questions and Criteria

1. Identify, clarify, and select evaluation questions


Four things inform evaluation questions:

• Clarification of the evaluation purpose
• Development of the program definition
• Nature of the evaluation (formative or summative)
• Stakeholders’ needs

If important questions are overlooked or trivial questions are allowed, the following could result:

• an evaluation that has little or no payoff for the expense
• an evaluation focus that misdirects future efforts
• loss of goodwill or credibility because an audience’s important questions are omitted
• disenfranchisement of legitimate stakeholders
• unjustified conclusions

Objectives for Identifying and Selecting the Evaluation Questions and Criteria (cont’d)

2. Identify criteria that will be used to judge the object of the evaluation
• Ex: attendance or performance on a test

3. Specify the standards the object must achieve on the criteria to be considered successful
• Ex: a benchmark of 85% attendance, or a score of 68
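The relationship between criteria and standards above can be sketched as a simple check: each criterion (e.g., attendance, test performance) is paired with the standard it must reach. This is a minimal illustration; the function name and data are hypothetical, and the thresholds simply echo the slide’s examples (85% attendance, a score of 68).

```python
# Hypothetical sketch: pair each criterion with its standard and
# check whether the measured results reach those standards.

def meets_standards(results, standards):
    """Return a dict mapping each criterion to True if the measured
    result reaches the standard set for that criterion."""
    return {criterion: results.get(criterion, 0) >= threshold
            for criterion, threshold in standards.items()}

# Standards drawn from the slide's examples (assumed values).
standards = {"attendance_rate": 0.85, "test_score": 68}
results = {"attendance_rate": 0.88, "test_score": 65}

print(meets_standards(results, standards))
# → {'attendance_rate': True, 'test_score': False}
```

Here attendance meets its benchmark while the test score falls short, which is exactly the kind of per-criterion judgment the standards are meant to support.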

Two Stages of Identifying and Selecting Questions for an Evaluation

• Divergent phase: generate a “laundry list” of potentially important questions and concerns.

• Convergent phase: select the most critical questions to be addressed from the “laundry list,” from which criteria will be developed.

The Divergent Phase

Identifying Appropriate Sources of Questions and Criteria

Sources of Questions During the Divergent Phase

1. Questions, concerns, and values of stakeholders
2. Evaluation models, frameworks, and approaches
3. The literature in the field of the program
4. Professional standards, guidelines, or criteria developed or used elsewhere
5. Expert consultants
6. Evaluator’s professional judgment

Sources of Questions During the Divergent Phase (cont’d)

1. Questions, concerns, and values of stakeholders

Stakeholders are the single most important source of questions during this phase.

Interview them to determine what they would like to know about the program (questions, concerns, perceptions, ideas for change).

Including stakeholders adds validity to the study because they are program “experts.” But, the evaluator must remember that she is the evaluation expert.

Sources of Questions During the Divergent Phase (cont’d)

1. Questions, concerns, and values of stakeholders (cont’d)

Three procedural rules for using stakeholders:
1. Use them in their expertise area.
2. Consider carefully the methods you use to extract information from them.
3. Ensure equitable participation.
• Judge the thoughtfulness and importance of their questions.
• Generate all possible questions; don’t be openly judgmental.

• See Figure 12.1 on page 239 for ways to generate questions with the stakeholders.

Sources of Questions During the Divergent Phase (cont’d)

2. Evaluation models, frameworks, and approaches

• Don’t start with models to generate questions.
• But these conceptual frameworks stimulate questions that might not emerge from other sources, particularly when stakeholders focus their questions only on outcomes.
• Stakeholders often assume that outcomes are the only thing evaluations can address; using the models as a heuristic allows the evaluator to educate the stakeholder.

Sources of Questions During the Divergent Phase (cont’d)

3. The literature in the field of the program

• Literature draws the evaluator’s attention to issues that should be raised.
• Previous evaluators are useful sources: they generate questions, and they provide causative models that guide question development.
• Previous evaluations give insight into possible methods.

Sources of Questions During the Divergent Phase (cont’d)

4. Professional standards, guidelines, or criteria developed or used elsewhere

• The Program Evaluation Standards, for example, provide insight into appropriate questions and also into criteria for judging results.

5. Expert consultants

• Often evaluators are not experts in the targeted content area.
• Outside consultants can provide a more neutral and broad view.
• Their input can reflect current knowledge and practice.
• The Joint Committee recommends teams of experts for most evaluations.

Sources of Questions During the Divergent Phase (cont’d)

6. Evaluator’s professional judgment

• Prior experience with similar evaluations may inform an experienced evaluator as to which questions are going to be useful.
• Important questions may be omitted unless the evaluator raises them herself.

The Convergent Phase

Selecting the Questions, Criteria, and Issues to Be Addressed

The Convergent Phase

This phase is always necessary because:
1. There is nearly always a budget limit.
2. A study becomes increasingly complicated and hard to manage without it.
3. The attention span of the audience is limited.

Selecting the Questions

• This step must include both the evaluator and the stakeholders.

Selected Evaluation Questions Should:

• Hold the interest of key audiences: Who would use the information? Who will be affected if this evaluation question is dropped?
• Reduce present uncertainty: Would an answer to the question reduce uncertainty or provide information that is not readily available?
• Yield important information: Would the answer provide important information and have an impact on the course of events (as opposed to being “nice to know”)?


Selecting the Questions (cont’d)

Selected Evaluation Questions Should:

• Be of continuing interest: Is this question merely of passing interest, or does it focus on critical dimensions of continued interest?
• Be critical to the study’s scope and comprehensiveness: Would the scope or comprehensiveness of the evaluation be limited if this question were dropped?
• Be answerable in terms of resources: Is it feasible to answer this question given available financial and human resources, time, methods, and technology?

Selecting the Questions (cont’d)

Final Steps:

• Using a matrix is one way of selecting questions from among the laundry list (see Figure 12.2 on p. 249).
• Meet with the stakeholders/client; review the “laundry list,” selected questions, and any outstanding issues.
• Get reactions to the selected list.
• If the sponsor or client demands too much control over the selection of questions, the evaluator must decide whether the evaluation will be compromised.
• At the end of the process, there are usually between 3 and 12 final evaluation questions.
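A selection matrix of the kind referenced above can be sketched in a few lines: each candidate question from the laundry list is rated on the six selection criteria, and the totals make the trade-offs visible. This is only an illustration in the spirit of a matrix like Figure 12.2, not a reproduction of it; the questions and ratings below are invented.

```python
# Hypothetical selection matrix: rate each "laundry list" question
# 1-5 on the six selection criteria, then rank by total rating.

CRITERIA = ["audience interest", "reduces uncertainty",
            "yields important info", "continuing interest",
            "critical to scope", "answerable with resources"]

# Invented candidate questions with invented ratings, one per criterion.
laundry_list = {
    "Did attendance improve?":     [5, 4, 4, 3, 5, 5],
    "Do staff like the new logo?": [2, 1, 1, 1, 1, 5],
    "Did test scores rise?":       [5, 5, 5, 4, 5, 4],
}

# Rank questions from highest to lowest total rating.
ranked = sorted(laundry_list.items(), key=lambda kv: sum(kv[1]), reverse=True)
for question, ratings in ranked:
    print(f"{sum(ratings):2d}  {question}")
```

In practice the ratings would come from the stakeholder review meetings described above, and the low-scoring questions are the ones dropped to reach the final set of 3 to 12.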

Specifying Criteria and Standards

• Identify standards and criteria for each question that requires a final judgment.
• Having standards prior to getting results helps groups be clear, realistic, and concrete about what counts as acceptable program success.
• Criteria specify those characteristics of the program that are critical to a program’s success.
• Standards represent the level of performance a program must reach on the criteria to be considered successful.

Absolute Standards
• e.g., 80% of students must pass the state assessment

Relative Standards
• e.g., there must be a 10% improvement in the pass rate this year compared to the year before the program was implemented
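The contrast between the two kinds of standards comes down to simple arithmetic: an absolute standard compares a result against a fixed threshold, while a relative standard compares it against a baseline. The sketch below is hypothetical (invented function names and data); the thresholds echo the slide’s examples, and the 10% improvement is read here as a relative gain over the baseline rate.

```python
# Hypothetical sketch contrasting absolute vs. relative standards.

def meets_absolute(pass_rate, threshold=0.80):
    """Absolute standard: e.g., 80% of students must pass."""
    return pass_rate >= threshold

def meets_relative(pass_rate, baseline_rate, improvement=0.10):
    """Relative standard: e.g., a 10% improvement over the pass rate
    from the year before the program was implemented (interpreted
    here as a relative gain, not percentage points)."""
    return pass_rate >= baseline_rate * (1 + improvement)

print(meets_absolute(0.78))        # False: below the 80% benchmark
print(meets_relative(0.78, 0.65))  # True: 0.78 >= 0.65 * 1.10 = 0.715
```

Note that the same result can fail an absolute standard while passing a relative one, which is why the choice of standard should be settled with stakeholders before results come in.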

Specifying Criteria and Standards (cont’d)

• By now, you have a fairly good working list of questions, criteria, and standards, but remain flexible.
• What sorts of things might change over the course of an evaluation that would require new or revised questions?
  • changes in scheduling, personnel, or funding
  • unanticipated problems in program implementation
  • evaluation procedures that aren’t working
  • new critical issues that emerge

Working With Stakeholders (cont’d)

1. Commitment and Support

Managers and supervisors must:
• demonstrate that evaluation leads to more informed decisions.
• reinforce and reward employees’ use of evaluation knowledge and skills.
• integrate evaluation into the organization’s daily work (make it a part of everyone’s job).

It is important to consider whose buy-in is needed and whose name and involvement would add credibility.

Working With Stakeholders (cont’d)

2. Use Participatory and Collaborative Approaches

This provides opportunities for:
• Ensuring that all voices are heard.
• Addressing a diverse set of evaluation questions.
• Increasing the credibility of the data and results.
• Sharing the workload.
• Surfacing individuals’ values, beliefs, and assumptions about the program and evaluation.

Working With Stakeholders (cont’d)

3. Choose an Evaluator Role

Decide on a role based on:
• The evaluation context
• The role the organization will accept
• Which role is your strength
• What role will support the greatest use of evaluation findings

Strategies for Getting Buy-In

• Link evaluation work to the organization’s mission.
• Involve stakeholders throughout and communicate with them.
• Link evaluation work to management.
• Start with small projects involving only a few stakeholders, then widely disseminate findings throughout the organization.

Major Concepts and Theories from Chapter 12

• Evaluation questions help to lend focus to the evaluation and guide data selection choices.
• Criteria specify characteristics of the program that are critical to a program’s success.
• Standards point to the level of performance a program must attain for success.
• The divergent phase of question development involves all key stakeholders and results in a comprehensive list of potential evaluation questions and concerns.
• Additional sources of questions are evaluation models, existing standards, research literature, and the evaluator’s experience.
• The convergent phase involves the final selection of questions for the evaluation.