Leadership Development Evaluation Handbook

First Draft – August 2005
Building Leadership Development, Social Justice and
Social Change in Evaluation through a Pipeline
Program: A Case Study of the AEA/DU Graduate Education
Diversity Internship Program
Authors: Prisca M. Collins & Rodney K. Hopson
The Need for Pipeline Development Efforts for Training
Evaluators of Color
There is a need for increased representation of people of color
and other underrepresented groups in various disciplines, not
simply in terms of numbers but for the diversity of ideas,
intellectual discovery, and thought. Even though various
institutions and academic disciplines have adopted efforts to
diversify, these efforts have either not always included
integrating people of color (especially those who are American
born) into positions of power and influence (Stanfield, 1999),
or they have recently faced the chilling effect of court
challenges, resulting in a dilution of institutional will and
resources (The Woodrow Wilson National Fellowship Foundation,
2005).
Various professions such as teaching, nursing, medicine, and
engineering have developed and implemented pipeline efforts to
recruit and train professionals of color and also made attempts
to incorporate issues of multiculturalism in their curriculum.
Many of these pipeline efforts are aimed at increasing the
overall number of students entering those professions or, more
specifically in recent years, at increasing the number of
minority students entering fields such as teaching, engineering,
nursing, medicine, and other math- and science-related
disciplines (Griffin, 1990; JBHE, 1999; Olson, 1988; Post &
Woessner, 1987). These efforts have focused mostly on methods of
recruitment, and sometimes retention, of students in training
programs; very few professions, however, have placed any
emphasis on leadership development.
Professions such as nursing have taken the further step of
realizing that it is not sufficient for the demographics of the
professional pool to mirror those of the nation; the leadership
of the profession should reflect them as well (Washington et
al., 2004). Washington et al. suggest that truly creating a
culture of inclusion requires leadership, and that this must be
done with intention.
People of color need to be empowered to assume positions where
they can contribute to the operations and transformations of
institutions. This requires innovative ways of building a
generation of ethnic and racial minority leaders who can develop
new ways of thinking that take communities of color and
underrepresented groups into account and who can generate
forward-looking frameworks that consider issues of diversity,
social change, and social justice.
Hernez-Broome & Hughes (2004) state that there is more to
leadership development than just developing individual leaders;
one needs to take into consideration a leader’s emotional
resonance with and impact on others.
This is especially important in developing leadership pipeline
efforts for people of color. Such efforts must take into
consideration the lifetime and workplace experiences of people
of color and the challenges they encounter in balancing the
delicate issues of identity and aspiration that tend to
complicate the lives of professionals of color (Davidson &
Johnson, 2001; Washington et al., 2004).
The need to build advanced training mechanisms in
evaluation for diverse racial and ethnic groups stems from
several factors. On one hand, despite increased demands for
accountability and evaluation in the foundation and government
sectors, the number of formal graduate programs in evaluation
has declined over the last decade, and these programs are not
predicted to expand (Fitzpatrick et al., 2004). On the other
hand, when one examines the numbers of African Americans,
American Indians, Mexican Americans, and Puerto Ricans, for
instance, who receive doctorates in research-based educational
fields and in the social sciences, one notices the dearth of
doctoral degrees from a potential base of evaluation colleagues
(Frierson, 2003; Hood, 2000).
An additional purpose for increasing the number of diverse
evaluators is to build on the significance and relevance of
cultural context and competence in the field. In recent years,
there has been growing momentum to apply a cultural litmus test
to evaluation processes, standards, and use, especially in
situations where communities of color are participants and
stakeholders in evaluations (Hood, 2001; Hopson, 2003; Robinson,
Hopson, & SenGupta, 2004). Beyond increasing numbers, these and
other efforts point to a growing concern with the conceptual and
methodological contributions of those who hold oppressed
minority group status in the evaluation discipline. Hood's
(1998, 2001) attention to culturally responsive evaluation
approaches offers timely investigations and considerations for
evaluators who attempt to derive meaning from data in diverse
cultural contexts. LaFrance's (2004) work extends evaluation
orientations and frameworks in Indian country, whereby the
culturally competent evaluator seeks and uses particular
understandings, methodologies, and practices to ground the
evaluation in a tribal community context.
Inherent in the efforts to build innovative training
mechanisms that increase the number of diverse racial and ethnic
evaluators is the realization of their potential contribution of
new ideas, paradigms, and realities for the field. Frierson
(2003) suggests that schools of education play a major role both
in educating and training program evaluators and in increasing
the numbers of diverse groups in the field.
Discussions from a recent National Science Foundation workshop
on the role of minority evaluation professionals expand
potential strategies to a multiple agency/organizational
approach to developing training programs and models for
prospective minority evaluators. These models, for example,
might include innovative involvement from colleges and
universities, government agencies, and professional
organizations through mentoring, potential employment, and other
strategies (Davila, 2000).
The AEA/DU Graduate Education Diversity Internship Program as a
Leadership Development Pipeline Effort
The American Evaluation Association/Duquesne University
(AEA/DU) Graduate Education Diversity Internship Program is a
pipeline development program designed not only to increase the
number of evaluators of color but also to train evaluators of
color who have a potential to be future leaders in the
profession. The type of program participants recruited, the
context within which the program was conceived, and the design
of the program lend themselves well to producing evaluators who
will be leaders in the field. Components of leadership
development are ingrained within the program itself and in the
training of the participants.
The AEA/DU internship program has several components:
attendance and participation in evaluation seminars at Duquesne
University; attendance at professional development workshops and
sessions at the AEA annual conference; placement with a local
sponsoring agency to provide practical, hands-on evaluation
experience; matching of the interns with a facilitating mentor
and an academic advisor; and an embedded communication and
feedback system through a Blackboard (virtual classroom) website
and reflective journaling. This combination of classroom-style
training, developmental experiences and relationships, and a
formal feedback system has been described as among the most
commonly used leadership development approaches in the
literature (Busch, 2003; Campbell et al., 2003; Howe & Stubbs,
2001; Washington et al., 2004).
Even though the internship program is nine months long, the
supportive mechanisms of mentorship and the communication and
feedback system continue beyond the interns' graduation from the
program. Connaughton et al. (2003) explain that short-term
approaches to training leaders, such as two-hour or week-long
workshops, are not realistic approaches to developing leadership
capabilities. They suggest that "leadership competencies are
best developed over time through a program that fosters
personalized integration of theory and practice and that
conceives leadership as a recursive and reflective process"
(Connaughton et al., 2003, p. 46) and that utilizes highly
focused multidisciplinary approaches.
The developmental relationships between the interns and
their mentors and academic advisors are intended to be long
term, a resource the interns can tap even as they pursue their
careers in evaluation. The interns are invited to remain part of
the Blackboard communication system to share their evaluation
experiences with new cohorts of interns and to seek feedback on
projects from the internship staff and other members of the AEA
leadership who work closely with the internship program. Howe &
Stubbs (2001) reported that this type of long-term support is
essential in leadership development for creating a community of
practice among program participants that reinforces and
stabilizes the new structures the participants have developed.
The program recruits graduate students who are either in
their second year of a master's program or already enrolled in a
doctoral program and who have therefore been exposed to research
methods, have substantive knowledge about their area of
concentration, and are well positioned for professional
development in evaluation. Participants are admitted into the
program based not only on their academic qualifications but also
on recommendations from their professors highlighting the strong
personal attributes, skills, and experiences that set them apart
as potential leaders, as well as their interest in evaluation as
a career and in its application to social justice and social
change. The internship taps into and fosters any personal
interests the interns may already have in serving specific
populations or pursuing particular social justice agendas. This
type of leadership development resonates with the suggestion of
Connaughton et al. (2003) that leadership be considered a
science and an art in which leaders learn to apply available
theory and research findings in a way that is compatible with
their own personality, skills, experiences, values,
capabilities, goals, and contextual assessment.
The program was conceived following the recommendation of
the Building Diversity Initiative of the American Evaluation
Association, a critical intervention funded by the W.K. Kellogg
Foundation, to encourage the recruitment of diverse racial and
ethnic persons into evaluation and to encourage the evaluation
field to do more work in diverse cultural contexts.
Collaboration between AEA and the internship program staff is
integrated through all phases of the internship, as demonstrated
by the various committees established to provide guidance at
different stages, such as the curriculum planning committee and
the evaluation design committee. Such collaboration among
stakeholders is evident in various leadership development
programs in the literature (Busch, 2003; The David and Lucile
Packard Foundation and Bill & Melinda Gates Foundation, 2003;
Connaughton et al., 2003).
Description and Components of the AEA/DU Internship Program
The goals of the AEA/DU Graduate Education Diversity
Internship Program are to (1) expand the "pipeline," that is,
recruit students who already have basic research capacities and
substantive knowledge about their area of concentration and
extend those capacities to evaluation; (2) stimulate evaluation
thinking concerning communities and persons of color by
providing professional development training opportunities for
social science, public health, and other research graduate
students; and (3) deepen the evaluation profession's capacity to
work in racially, ethnically, and culturally diverse settings.
Students who have already been exposed to research methods and
substantive issues in their field of expertise are ripe for the
internship program because they can focus on issues specific to
evaluation and are about to join the ranks of full-time
professionals. The program builds on an existing and natural
interest among private foundations and non-profit agencies in
improving the quality and effectiveness of evaluation by
increasing the racial and ethnic diversity capacity within the
evaluation profession.
An important purpose of the internship experience is to
reflect upon the current context of evaluation and the extent to
which it is meeting the needs of communities and persons of
color.
In doing so, attention is paid to contemporary evaluation models
and issues that spur the need for an approach to evaluation that
focuses primarily on the experiences of traditionally
disenfranchised and underrepresented groups. The internship
experience argues for the importance of the social justice and
social change frameworks and approaches that have emerged in
recent years and for finding ways to use and practice these
frameworks and approaches so that they stimulate evaluation
thinking concerning communities and persons of color and deepen
the evaluation profession's capacity to work in racially,
ethnically, and culturally diverse settings.
Intern selection: Selection of graduate students of color is
guided by the following considerations:
• Currently in her/his year 2 of a master's program or year 2 or
3 of a combined master's/doctoral program;
• Can demonstrate the relevance of evaluation training in
his/her current and future work through a short essay;
• Has support from his/her academic advisor, demonstrated
through a letter and an agreement on the number of credit hours
that can be earned from the internship and applied to his/her
graduate program.
Interns are recruited through several avenues, including:
• AEA and evaluation-related institutes, listservs, and
programs;
• Professional organizations, such as the American Educational
Research Association, American Anthropological
Association/Council on Anthropology and Education, American
Public Health Association, Council on Social Work Education,
American Sociological Association, and the Society for Community
Research and Action;
• Graduate career development/awareness offices, specifically
those in minority-serving institutions;
• Select departments and schools at minority-serving
institutions and predominantly white institutions.
Interns submit applications, responding to questions about the
relevance of the internship program in their current and future
work aspirations and the relevance of their own
academic/professional backgrounds to the internship program.
Selections are made by a committee of the Advisory Board which
consists of senior AEA colleagues.
Curriculum outline: The internship staff guides the development
of the curriculum for the internship program. The curriculum
consists of the following activities: orientation and the
development of a lesson plan by the intern, with guidance from
the sponsoring agency's representative, his/her academic
advisor, and the facilitating mentor; field experience through
placement with a sponsoring agency; courses/seminars during
visits to Pittsburgh, PA, three times during the year;
participation in professional development workshops and sessions
at the annual AEA conference; and an evaluation project write-up
by the intern at the end of the program.
The success of the program depends largely on a good match
between the intern and the sponsoring agency, with adequate
support from the intern's academic advisor and a facilitating
mentor. This four-person relationship ensures that the intern
gets the necessary support during the yearlong internship
experience and that a collaborative mentoring relationship with
a facilitating mentor can be achieved. Even though the ultimate
goal of the internship is to build evaluation leadership
capacity among evaluators of color and to promote social change
and social justice, the leadership development approaches
employed are aimed at developing individual leadership
capabilities that will have a meaningful impact on the interns
at a personal level as well as on the larger evaluation
community.
Seminars and Professional Development Workshops
The seminars and professional development workshops provide the
interns with the theoretical knowledge in evaluation they need,
along with opportunities for networking and exposure to leading
scholars in the field who can serve as role models. The seminars
are designed around a theoretical framework for social justice
and social change. This framework highlights social
agenda/advocacy evaluation models that are directed at making a
difference in society by concomitantly addressing issues of
access, opportunity, and power. These evaluation models have
recently been hailed as among the best and most applicable for
21st-century evaluation in Stufflebeam's (2001) analysis of
twenty-two approaches. Examples of topics covered in the
internship include evaluation in foundations, culturally
responsive evaluations, evaluation design and planning, working
with multiple stakeholders, evaluation theories and practices,
and power and evaluation. The interns also have the opportunity
to participate in an alumni group after they complete the
internship to ensure ongoing access to resources and support.
Howe & Stubbs (2001) explain that the process of leadership
development can span many years, hence the importance of putting
in place mechanisms for long-term support to help participants
actualize the potential created by the program.
Sponsoring agency selection: The interns are matched with a
sponsoring agency based on their geographic location, evaluation
topical interests, and any other qualities of interest to the
intern, such as special populations served by the agency. The
sponsoring agency should preferably be a foundation with
in-house evaluators, a consulting firm that works with
foundations, or an established grantee that receives grants from
foundations. Agencies were selected based on the following
considerations:
• Is located within the same geographic location as the intern;
• Can provide a minimum of 240-480 hours of supervision (6-12
hours per week for a total of ten months);
• Can provide the intern with opportunities to help design,
budget, and/or implement evaluations;
• Can provide the intern with opportunities to work with
multiple stakeholders.
The agency provides the intern with opportunities to design,
budget, and implement evaluations and to work with multiple
stakeholders. The placement with sponsoring agencies allows for
action learning as the interns design and conduct evaluations in
real-life settings. This helps the interns develop interpersonal
and intrapersonal skills, problem-solving and problem-defining
skills, task-specific skills, and communication skills, which
are essential leadership skills (Busch, 2003; Connaughton et
al., 2003; Campbell et al., 2003). The interns are encouraged to
use reflective journaling throughout their field experiences and
to share their experiences with other interns and get feedback
from the internship staff through the web-based Blackboard
communication system. This facilitates the formation of a
learning community among the interns, the internship staff, and
the mentors, who exchange ideas and experiences through the
web-based communication. The interns are also required to post
weekly updates on the website on their progress with the
evaluation projects, any difficulties they may have encountered
in the field, and meetings with advisors and mentors relating to
their project.
Facilitating mentor selection: Another critical component of the
AEA/DU internship program is the collaborative mentorship
offered through facilitating mentors and academic advisors. Each
intern is matched with a facilitating mentor who is a senior
colleague in the evaluation profession, most likely an AEA
member who shares similar topical and possibly career interests
with the intern. The mentors and interns are encouraged to view
this mentoring as a way of preparing students of color to be
future evaluation leaders and to build a strong professional
culture of evaluators dedicated to improving programs that
address the needs of ethnically and racially diverse
communities. The mentor is accessible to the intern through
electronic mail, telephone, and/or in person and is someone who
can provide a minimum of 40 hours of support (at least one hour
per week for a total of ten months), from helping to resolve
conflicts between the intern and the agency to raising questions
related to culturally responsive evaluations, if necessary and
appropriate.
The interns are also matched with an academic advisor of their
choice from their own institution who can act as a guide through
their academic institution and help them balance their
schoolwork with the internship responsibilities. This type of
formal mentoring with informal components, such as voluntary
participation, choice in the matching process, and a focus on
the protégé's professional development, has been associated with
positive outcomes in career and job attitudes (Davidson &
Foster-Johnson, 2001). Recent visions of mentoring (Hargreaves &
Fullan, 2000; Kochan & Trimble, 2000; Mullen, 2000; Mullen &
Lick, 1999) suggest that mentoring initiatives promote shared
learning, responsibility, and authority where balances between
guidance and autonomy can be maintained.
The design of this internship program is consistent with
collaborative and participatory models of mentoring in which the
facilitating mentor and intern (or protégé, in this case)
function as a network to support joint inquiry and benefit from
mutual learning through feedback and group problem-solving.
Interns are encouraged to develop group efforts and activities,
such as joint publishing, collaborative presentations at AEA and
other evaluation conferences, team teaching, professional
development, and other activities.
Evaluation of the AEA/DU Graduate Education Diversity Internship
Program
The purpose of the evaluation effort is to document the
implementation process of the internship program and the impact
of the program on the interns, the organizations involved
(sponsoring agencies, universities attended by the interns, and
the AEA), and the broader communities (local communities where
the sponsoring agencies and interns are located, and the broader
national and international evaluation community). This type of
process- and outcomes-oriented evaluation is essential in
providing a comprehensive understanding of what activities are
in place to effect the desired program outcomes.
In conceptualizing the evaluation process of the internship
program, we acknowledge that even though the training approaches
are aimed at improving the individual interns' abilities to
become culturally competent evaluators and leaders in the field,
the ultimate goal is to change organizations or society. The
focus here is on individual development, since the individual is
the agent of change. The evaluation framework used here
incorporates both the evidential and evocative approaches
described by Grove et al. (2002) in order to allow the
evaluation to capture both the observable (evidential) changes
and the not-so-observable but discernible (evocative) changes in
participants, such as personal assumptions, attitudes, values,
beliefs, and vision. Like other evaluations of leadership
development programs, the evaluation of the AEA/DU internship
program seeks to identify changes at the personal,
organizational, and community/systems levels. Kirkpatrick's
(1998) four-level framework for evaluating training programs is
also implicit in the design of this evaluation. Kirkpatrick's
four levels are as follows:
• Reaction: Provides information on how the participants feel
about the program. This includes the reactions of the interns,
mentors, academic advisors, and sponsoring agencies. This
information is critical in informing the program staff as to
whether the way the program is delivered and the components of
the curriculum are meeting the needs of the immediate
stakeholders.
• Learning: This includes any change in attitudes, beliefs, and
values, as well as knowledge and skills gained by the interns.
• Behavior: This includes any change in the behavior of the
interns and may include demonstration of problem-solving
capabilities, reaction to various situations during their
practical experiences, and involvement in scholarly work.
• Results: The long-term impact of the program on the interns
and what they do, which may include engagement of the interns in
evaluation thinking concerning communities of color and in
activities that promote social change and social justice within
the field of evaluation and beyond.
Developing a Theory of Change for the AEA/DU Internship Program
In designing, implementing, and interpreting the results of
this evaluation, the unique context of this internship program
as described earlier has to be taken into consideration. The
collaborative nature of this internship program extends into the
evaluation process. The planning and design of the evaluation
were done through collaboration between the internship staff and
the AEA Diversity Internship subcommittee on design and
evaluation. The theory of change helps in identifying the
activities of the program, how these activities connect to the
short-term, intermediate, and long-term outcomes of the program,
and how they relate to changes in the individual interns, the
organizations involved, and the community at large. The theory
of change model helps look at the progression of impacts at the
individual, organizational, and community levels, as shown in
Table 1: Evaluation Plan for the AEA/DU Graduate Education
Diversity Internship Program.
Insert Table 1 here
Impact at the individual level addresses the immediate
short-term gains in the interns' level of knowledge and skills
in evaluation theory and practice. This includes the development
of intrapersonal qualities and interpersonal skills, as well as
cognitive, communication, and task-specific skills, possibly
resulting from exposure to the activities of the internship
program. In examining impact at the individual level, one
reflects upon the activities of the program using the criteria
of assessment, challenge, and support proposed by McCauley et
al. to gauge the effectiveness of the leadership development
strategies employed in this program (Beeler, 1999). These
criteria allow us to examine whether the activities provide the
interns with adequate opportunities for gradual progression in
evaluation-related knowledge and practice; whether the practical
experiences have provided them with learning opportunities that
have challenged their existing knowledge and skills, stretching
them to learn and develop new capacities; and whether the
supportive mechanisms are adequate to help the interns deal with
the difficulties they encounter during their development.
At the organizational level, one reflects upon the
intermediate outcomes related to how the interns apply the
knowledge they have gained in their projects with the sponsoring
agencies and in their academic and other professional work. This
includes a look at how the interns incorporate the knowledge
they have gained into the planning, design, and conduct of the
evaluation project, how the interns negotiate their roles as
evaluators in the field, how they define and solve the problems
they encounter in the field, and the communication skills they
demonstrate in working with the various stakeholders. In
addition, intermediate outcomes reveal their ability to take the
knowledge and skills they have gained and tailor them to a
specific context in the field, and how beneficial the interns'
work is to the sponsoring agencies. Of particular interest is
how the interns apply the concepts of conducting culturally
responsive evaluations, that is, evaluations that give voice to
all the stakeholders, even the ones whose voices have
traditionally been suppressed. Outcomes at the organizational
level also address the impact of the program on the interns'
academic work and on the evaluation profession, exploring how
the activities of the internship have enhanced the interns'
schoolwork and their engagement in scholarly work (see Table 1
for a list of outcome indicators).
Outcomes at the community level are more long term and are not
likely to be evident until after the interns have completed the
internship and their academic training and are practicing
evaluators. This could be years after graduation from the
internship, by which time the interns will have engaged in many
other forms of training and/or professional development that
could also contribute to the impact they have at the community
level. In designing evaluations of training programs such as the
AEA/DU internship, one also has to acknowledge that not all the
changes in the interns can be solely attributed to the program,
and that the further we move from the direct impact of the
knowledge and skills gained, the more difficult it becomes to
attribute the impact to the internship program. Hence Grove et
al. (2002) suggest that in evaluating leadership development
efforts one needs an objective distance and a subjective
presence to accurately assess impact.
Evaluation Methodology
Instrumentation
A mixed-method approach using surveys, semi-structured
interviews, and a focus group for data collection was employed.
Data collection instruments were developed by the internship
staff to measure the outcomes of the program. These included the
following data collection items:
• Surveys
• Focus group
• Semi-structured interviews
• Review of intern journals
• Review of intern application essays
• Blackboard website discussions
• Sponsoring agency site visit notes
Surveys were used to gather baseline information at the
beginning of the internship program and after the fall, winter,
and spring seminars. The surveys at the end of each seminar were
critical in providing immediate feedback on the interns'
satisfaction with the training activities and the knowledge
gained, allowing for continuous program improvement. A focus
group was conducted in conjunction with a survey to gather
baseline information at the beginning of the program. This was
essential to complement the information provided by the interns
in their application essays and to provide rich information on
the backgrounds, goals, and career and topical interests that
guided the matching of the interns with sponsoring agencies and
mentors. Interviews of mentors, advisors, and sponsoring agency
personnel were conducted halfway through the internship and at
its completion. The interns were required to post weekly
reflections on the internship Blackboard website to facilitate
discussions on their projects and get feedback from the
internship staff.
The interns shared the progress they made on their evaluation
projects, any challenges they encountered and their attempts at
dealing with them, posted questions, and shared with each other
valuable knowledge gained from other sources. This helped foster
a learning community for the interns and staff. The internship
coordinator conducted a site visit to each sponsoring agency to
gather additional contextual information, interview key
sponsoring agency staff and academic faculty, and foster a good
working relationship between the program and the agency.
Summary of Evaluation Findings
The following section presents a summary of findings from the
inaugural year of the internship program, with specific
attention to a description of the internship program components
achieved as well as a discussion of the outcomes and impact of
the program.
Internship Participants, 2004-2005
The first group of interns consisted of four graduate
students of color: one second-year master's student and three
doctoral students from three universities, whom the program
considered prime candidates for its leadership development goals
because of their personal and educational backgrounds. They
majored in various disciplines, including applied anthropology,
clinical psychology, educational psychology, and educational
policy. The doctoral students were in their second and third
years of their programs, and the master's student already held
another master's degree in public health; hence, all had already
attained the basic research training offered in graduate school
and acquired substantial knowledge in their academic areas of
concentration.
The students were also from diverse cultural backgrounds
(African, Caribbean, Mexican, and African American), which
brought a wealth of personal and cultural experiences to the
learning community that developed within the program. The
interns also had diverse research interests, including power
dynamics within social groups, agency and civic efforts toward
social change, the educational experiences of immigrant students
in the U.S., language and power, access to higher education for
racialized and marginalized communities, and issues surrounding
the resettlement of refugees in the U.S.
Critical design and matching of interns
The matching of interns to academic advisors, mentors, and
sponsoring agencies was an important and critical element of the
program design. All the interns were successfully matched with
mentors who are active senior AEA members with leadership roles
in the organization and who have similar research and/or career
interests. All the interns were asked to select academic
advisors from their local universities who work within the areas
of their academic program concentration. All of the interns were
matched with sponsoring agencies that work with populations of
interest to them and that address social issues in which the
intern has a research interest. All the mentors and three of the
academic advisors were identified to provide a supportive
network around the interns, provide input regarding available
resources for the internship, and help the program remain
current with events in the evaluation field. At the AEA
conference, all the interns had the opportunity to get
acquainted with the AEA board by attending a breakfast and a
dinner with the board.
The academic advisors described their role as mentoring the
intern, helping the intern coordinate his/her academic and
internship work, and supervising the intern's work. Two of the
academic advisors are also evaluators and hence were actively
involved in securing the sponsoring agency placement,
negotiating the evaluation project agreement with the agency,
and developing an evaluation plan. All the academic advisors
felt that the internship was very beneficial to the interns. Two
interns are from the same institution, and their academic
advisors felt that this allowed for more networking and
collaboration among all four. The two interns and their advisors
met regularly as a group to share ideas and discuss the interns'
progress with the internship project.
Curriculum Delivery
All the interns attended the three seminars offered. The
seminars were centered on a theoretical framework of "Evaluation
towards Social Change and Social Justice" and included topics
such as Evaluation in Context (history, paradigms, design);
Evaluation in Foundations; Program Evaluation with a
Transformative Lens; Planning Evaluation with Diversity, Power,
and Emancipatory Considerations; Multicultural Validity; and
Evaluation Influence. The seminar workshops were conducted by
various AEA members who are well known in the field for their
expertise in these topics. The seminar workshops were highly
interactive, and the interns were encouraged to reflect on what
they learned through journaling. Responses from post-seminar
surveys showed that all the interns were satisfied with the
seminar schedules, workshop format, quality of speakers,
internship staff responsiveness to their needs, and the
facilities where the seminars were conducted. The interns
continued to share their reflections on materials learned,
report progress and challenges with their evaluation projects,
and exchange ideas via the internship Blackboard website on a
weekly basis. All the interns attended the AEA annual conference
in Atlanta, GA. They each attended a professional development
session and various conference sessions and reflected on what
they learned through communication exchanges on the internship
Blackboard website after the conference.
Collaboration/Partnership Activities
The internship staff collaborated with the members of the
internship advisory committee during the selection of interns,
the hiring of the internship coordinator, curriculum planning,
and evaluation planning through three workgroups: the Design
Workgroup, the Curriculum Workgroup, and the Application Review
Workgroup. Duquesne University provided housing, internet
access, and library services for the interns during their stays
in Pittsburgh. The Duquesne library also arranged for a list of
evaluation-related library resources to be ordered for the
internship. The interns had an opportunity to meet and interact
with the Duquesne University School of Education faculty and the
university provost.
Outcomes Achieved
Table 1 provides a list of the various levels of outcomes
measured, the outcome indicators, and the instruments used to
track the outcomes.
Individual-Level Outcomes: Knowledge and Skills Gained by
Interns
Information gathered from the interns' application essays, fall
focus group, and pre-fall survey revealed that the interns hoped
to gain the following skills through the internship program:
• Learn how to plan and implement an evaluation project;
• Learn how to link evaluation theory with evaluation in the
real world;
• Learn how to conduct evaluations that address issues of
diversity, culture, and empowerment of people of color;
• Gain networking opportunities with evaluators outside their
institutions and with other students of color;
• Learn how to use their evaluations toward dissertations,
publications, and presentations.
Results from the surveys, interviews, and documentation from
intern journaling and discussions on the Blackboard website
showed reported gains in feelings of competence in applying
basic evaluation theories in the real world; understanding
critical cultural issues in program evaluation; and collecting,
analyzing, and interpreting data in a culturally responsive
manner. The interns reported "receiving concrete tools with
which to think about multicultural validity," gaining a better
understanding of "culture" that takes into consideration the
existence of in-group differences and ensures that all voices
are heard during the evaluation, and having learned to consider
themselves part of the instrumentation in the evaluation and to
pay better attention to how their perceptions can affect the
validity of their evaluation findings.
The interns reported that the placement with sponsoring
agencies afforded them the opportunity to use the data
collection skills they had learned, improve their observation
skills, learn more about conducting evaluations in the
community, further develop their interpersonal skills, learn how
to deal with diverse stakeholders, and develop negotiating
skills. When asked what had been the most meaningful aspect of
their practical experience, two of the interns reported that
working with participants they cared deeply about and hearing
their stories was gratifying. One of the interns expressed her
wish to "take the evaluation further and be able to bring social
change, and improve understanding of the issues the population
she was working with dealt with."
Interpersonal Skills Developed
The field experiences at the sponsoring agencies provided the
interns with opportunities to apply the knowledge they gained
and to demonstrate their problem-defining, problem-solving, and
communication skills. The interns shared how overwhelmed they
were by their initial experiences at the sponsoring agencies and
how they were able to step back, renegotiate their entry into
the evaluation settings, negotiate their roles, and gradually
gain rapport with the sponsoring agency staff and program
participants while dealing with difficult personalities and
multiple stakeholders with different interests. The interns'
demonstration of good problem-identifying and problem-solving
skills was also validated by the sponsoring agency supervisory
staff, who commented:
“He gets along well with others, is considerate,
knowledgeable, and has the qualities we need.”
“She is moving quite well and positioning herself to be
leading the evaluation for us”
“She is very sensitive in the groups and she understands the
directions of the conversations and our goals.”
Impact on Interns' Academic Work
All the academic advisors felt that the internship was very
beneficial to the interns' academic work and noted that all the
interns were using their projects for their dissertations or
master's projects. Other benefits included exposure to
evaluation expertise the interns would not have had under normal
academic circumstances, added confidence in the interns' ability
to do their work, collegiality and companionship from working
with other students of similar interests, and much-needed
financial support. All the academic advisors reported being part
of the interns' dissertation/thesis committees.
Impact on a Professional Level
All the interns are registered members of the AEA. Some of the
interns are already assuming leadership roles in research-
and/or evaluation-related scholarly work by participating as
members of the student editorial board of the New Directions for
Evaluation journal, serving in the Graduate Student Topical
Interest Group leadership, and presenting at national
conferences (the 1st International Congress of Qualitative
Inquiry at the University of Illinois at Urbana-Champaign and
the Ethnography in Education Forum at the University of
Pennsylvania). All the interns submitted abstracts toward a
proposal for presenting their evaluation project work at the
upcoming joint American Evaluation Association/Canadian
Evaluation Society conference in Toronto.
Organizational Level Outcomes
Most of the outcomes at this level are intermediate and
long term and will become evident beyond the first year of the
internship program. However, some of the short-term outcomes
documented are presented below.
Sponsoring agencies
All the interns developed evaluation plans for their projects
with the guidance of their academic advisors, which they
presented to their sponsoring agencies, and the agencies felt
that the interns' evaluation goals were compatible with the
agencies' evaluation goals. All the sponsoring agencies rated
the interns' interpersonal relations, dependability, judgment,
initiative, and overall quality of work as good to excellent.
The agencies reported that the interns were a good fit for
their agencies. They reported that the evaluation work the
interns were doing was very beneficial to them, and they planned
to use the evaluation reports to gain a better understanding of
their programs, for program improvement, to seek funding, and to
provide "serious, sophisticated evidence" that their programs
have merit. One of the agencies indicated that the information
from the evaluation would help inform the education system on
how to better educate individuals and that the demonstration of
the program's successes would help reduce the political barriers
their participants currently have to deal with. One agency
acknowledged that having a doctoral student evaluating their
program meant a lot to the agency staff and validated the
program.
Two of the agencies expressed gratitude over the fact that
the interns had backgrounds and/or experiences similar to those
of the participants and hence would have a better understanding
and interpretation of the participants' experiences. One of the
agencies described the placement of the intern with their agency
as "serendipitous and a blessing." One agency reported that the
intern's value to the agency went beyond the evaluation
expertise she brought: she also helped with translations on
occasions when an interpreter was not available for some of
their participants.
Interns' Academic Institutions
Some of the academic advisors mentioned the invaluable
knowledge and experience the interns gained from working closely
with and learning from some of the key leaders and theorists in
program evaluation who came to teach the seminars or served as
mentors. Some of the advisors reported passing on what the
interns shared with them to their own students and noted the
value of the interns sharing their internship experiences with
other students in classes who had not had the opportunity to
engage in hands-on evaluation work.
Another benefit to the academic institutions was the
presentations by interns at professional meetings and future
publications highlighting the universities with which the
interns are affiliated. Also, the participation of the interns
in a program aimed at promoting diversity in the field of
evaluation and highlighting issues of social justice and social
change would stimulate thinking about these issues within the
immediate departments in which the interns are enrolled and
would, hopefully, eventually draw the attention of other
institutional departments. At Duquesne University, the
publication of articles about the internship in the university
alumni newspaper and magazine and in the city of Pittsburgh
newspaper provided the university and the Pittsburgh community
with exposure to the role of evaluation in addressing issues of
diversity, social justice, and social change.
Impact on the AEA and Evaluation Community
Even though evaluation thinking relating to issues of
culture and diversity is already evident within the AEA, as
evidenced by the birth of this internship and other significant
follow-through activities incorporated as a result of the
association's Building Diversity Initiative, the successful
implementation of this internship facilitates the ongoing
dialogue and strengthens the capability of the AEA to be a
significant contributor and example in building leadership
development mechanisms for graduate students of color. At the
2004 AEA annual conference in Atlanta, the interns were
introduced to the AEA board members at two key events and had
the opportunity to interact with the board members at these and,
informally, at other events throughout the conference, thus
generating increased involvement in these issues.
Impact on the Broader Community
Through the placement of the interns with various community
organizations for practical experiences, and through
presentations at national and international conferences and
publications, the interns aim to promote discussion of and
engagement in issues related to diversity, social change, and
social justice in evaluation.
Challenges and Lessons Learned
There were a few problems related to the timing of
placements with sponsoring agencies, interactions with some
stakeholders, communication from internship staff to academic
advisors about internship requirements, and inadequate
communication between some of the interns and their mentors.
Some of the challenges or difficulties the interns reported
encountering during their placement included limited evaluation
resources at their institution, the need to reassess the
evaluation questions, seemingly overwhelming data collection,
limited access to a specific demographic group of participants
the intern was interested in, learning how to negotiate their
role and deal with conflict, and dealing with institutional
issues.
Even though some of the challenges the interns encountered
could have been alleviated through improved communication
between the internship staff, agency personnel, and academic
advisors, the project-related challenges were necessary to
expose the interns to real-life challenges and help them develop
the problem-defining and problem-solving skills necessary in
leadership. Campbell et al. (2003) explain that, for
developmental effectiveness, job assignments must be challenging
but still within the individual's capabilities, and the
individual must have available supportive mentors who can
provide perspective, constructive feedback, and
assignment-relevant expertise. The interns tapped into the
expertise of their advisors, mentors, and internship staff, and
by reflecting on their challenges on the Blackboard site they
were also able to work with the other interns to come up with
solutions.
The interns and advisors made recommendations on how to
deal with some of these challenges in the future, which included
initiating sponsoring agency placements much earlier; providing
advisors with a list of deliverables expected from the interns,
with due dates, at the beginning of the internship program in
order to facilitate the advising process; requiring advisors to
submit a brief report twice a semester updating the internship
program on the interns' progress; and having the program staff
occasionally send advisors alerts about new developments or
exciting events going on with the program.
The success of a program such as the AEA/DU internship
program depends heavily on effective coordination and on
engaging all the stakeholders in dialogue. Because the
stakeholders are geographically dispersed, collaboration and
partnership with AEA and with faculty at the students'
institutions, and their ability to be fully engaged at an early
stage, were essential to the full and successful implementation
of the program. Effective communication and coordination of
program activities were facilitated by a good understanding of
each stakeholder's role and buy-in to the goals of the program.
Having interns who had already developed basic research
skills and knowledge in their areas of concentration greatly
facilitated the selection of sponsoring agency placements, since
the interns could more effectively articulate their research
interests. This was added value to the agencies, as evidenced by
their acknowledgement of the knowledge and skills the interns
brought, the interns' independence and ability to articulate
their evaluation roles, and the added value of the interns'
personal experiences. Training culturally responsive evaluators
calls for evaluators who are able not only to understand the
cultures of their clients but also to reflect upon the impact of
their own experiences on the evaluation. The cultural diversity
of this group of interns has better positioned them to engage in
the type of evaluations they are doing and gives the agencies
confidence in the interns' ability to engage the culturally
diverse stakeholders they serve.
For an internship program aimed at developing leaders who can be
agents of social change and advocates for social justice, it was
an added benefit that, despite the small number of interns
admitted each year, there was a representation of diverse
voices, creating a community of future leaders who are aware of
multiple cultural dynamics and can be responsive to them.
The convening of all the interns at Duquesne University for
the seminars and at the annual AEA conference facilitated the
development of a learning community consisting of the interns,
the internship staff, and the many AEA leaders involved with the
internship. This provided invaluable opportunities for
networking, exchange of ideas, and formation of personal and
professional relationships that are likely to extend beyond the
length of the internship.
Summary and Implications: Future of Leadership Development
Pipeline Efforts
It is important to realize that different leadership
development programs will have different emphases. Hence, each
program should be examined on the basis of its goals, target
population, and the context/environment within which the program
is conceived and implemented (Hernez-Broome & Hughes, 2004;
Grove et al., 2002; Connaughton et al., 2003).
Evaluations of leadership development pipeline efforts are
critical in providing an understanding of the processes
currently being used and how effective those processes are in
achieving the specific leadership development outcomes a
particular effort seeks. Evaluations that are not only
outcomes-oriented but also process-oriented are invaluable in
highlighting the critical components (the leadership development
methods and experiences) essential to attaining specific
outcomes. With the influx of pipeline efforts across many
disciplines, evaluations of these programs will provide valuable
data that can be used to develop discipline-specific criteria
for judging effective leadership development programs, to
establish standards of leadership development practice, and to
inform policies and/or decision-making processes relating to the
funding of these efforts.
Evaluating pipeline development efforts such as the AEA/DU
Graduate Education Diversity Internship Program, which is aimed
at addressing issues of social change and social justice,
provides an opportunity to explore outcomes beyond the
individual-level outcomes commonly presented in most evaluations
of training programs. Since the ultimate goal of such a program
is to see change at the systems level, evaluation of this type
of program forces us to go beyond the short-term, more personal
outcomes and document the changes in organizations and
communities, thus providing information at a level that is more
likely to meet policy-makers' needs. As we seek to document
outcomes at the systems level, it becomes increasingly difficult
to attribute these outcomes solely to the program; hence the
importance of evaluation information from multiple sources to
validate individual program evaluation findings.
References
1. Beeler, J.R., Sr. "The Handbook for Leadership Development." Human Resource Development Quarterly, 1999, 10(3), 297-300.
2. Busch, J.R. "Leadership Formation: A Multimethod Evaluation Study of the Southern Tier Leadership Academy." Published dissertation, State University of New York at Binghamton, 2003.
3. Campbell, D.J., Dardis, G., and Campbell, K.M. "Enhancing Incremental Influence: A Focused Approach to Leadership Development." Journal of Leadership & Organizational Studies, 2003, 10(1), 29-44.
4. Connaughton, S.L., Lawrence, F.L., and Ruben, B.D. "Leadership Development as a Systematic and Multidisciplinary Enterprise." Journal of Education for Business, 2003, 79(1), 46-51.
5. Fitzpatrick, J.L., Sanders, J.R., and Worthen, B.R. Program Evaluation: Alternative Approaches and Practical Guidelines (3rd ed.). Boston: Allyn & Bacon, 2004.
6. Frierson, H.T. "The Importance of Increasing the Numbers of Individuals of Color to Enhance Cultural Responsiveness in Program Evaluation." In C.C. Yeakey & R. Henderson (Eds.), Surmounting All Odds: Education, Opportunity, and Society in the New Millennium. Greenwich, CT: Information Age, 2003.
7. Griffin, J.B. "Developing More Minority Mathematicians and Scientists: A New Approach." Journal of Negro Education, 1990, 59(3), 424-438.
8. Grove, J., PLP Team, and Leadership Evaluation Advisory Group (LEAG) members. The EvaluLEAD Framework, Examining Success and Meaning: A Framework for Evaluating Leadership Interventions in Global Health. Public Health Institute, 2002.
9. Hargreaves, A., and Fullan, M. "Mentoring in the New Millennium." In C.A. Mullen & W. Kealy (Eds.), New Visions of Mentoring. Theory Into Practice, 2000, 39(1), 50-56.
10. Hernez-Broome, G., and Hughes, R.L. (Center for Creative Leadership). "Leadership Development: Past, Present, and Future." Human Resource Planning, 2004.
11. Hood, S. "Responsive Evaluation Amistad Style: Perspectives of One African-American Evaluator." In R. Sullivan (Ed.), Proceedings of the Stake Symposium on Educational Evaluation. Urbana-Champaign, IL: University of Illinois at Urbana-Champaign, 1998.
12. Hood, S. "New Look at an Old Question." The Cultural Context of Educational Evaluation: The Role of Minority Evaluation Professionals. Arlington, VA: National Science Foundation, 2000.
13. Hopson, R. "Toward Participation and Liberation in Educational Evaluation: Developing Educational Pipelines to Increase Minority Evaluators." The Cultural Context of Educational Evaluation: The Role of Minority Evaluation Professionals. Arlington, VA: National Science Foundation, 2000.
14. Howe, A.C., and Stubbs, H.S. "From Science Teacher to Teacher Leader: Leadership Development as Meaning Making in a Community of Practice." Science Teacher Education, 2001, 281-297.
15. "JBHE's Survey of Colleges and Universities Taking Concrete Steps to Attract Black Students." The Journal of Blacks in Higher Education, 1999, 24, 28-31.
16. Kirkpatrick, D.L. Evaluating Training Programs: The Four Levels (2nd ed.). Berrett-Koehler Publishers, 1998.
17. Kochan, F.K., and Trimble, S.B. "From Mentoring to Co-mentoring: Establishing Collaborative Relationships." In C.A. Mullen & W. Kealy (Eds.), New Visions of Mentoring. Theory Into Practice, 2000, 39(1), 20-28.
18. LaFrance, J. "Culturally Competent Evaluation in Indian Country." In M. Thompson-Robinson, R. Hopson, & S. SenGupta (Eds.), In Search of Cultural Competence in Evaluation. New Directions for Evaluation. San Francisco, CA: Jossey-Bass, 2004.
19. Mullen, C.A. "Constructing Co-mentoring Partnerships: Walkways We Must Travel." In C.A. Mullen & W. Kealy (Eds.), New Visions of Mentoring. Theory Into Practice, 2000, 39(1), 4-11.
20. Mullen, C.A., and Lick, D.W. New Directions in Mentoring: Creating a Culture of Synergy. London: Falmer, 1999.
21. Olson, C. "Recruiting and Retaining Minority Graduate Students: A Systems Perspective." Journal of Negro Education, 1988, 57(1), 31-42.
22. Stanfield, J.H. "Slipping through the Front Door: Relevant Social Scientific Evaluation in the People of Color Century." American Journal of Evaluation, 1999, 20(3), 415-431.
23. "Guide to Evaluating Leadership Development Programs." Packard Foundation Population Program and the Bill & Melinda Gates Foundation Global Health Program, The Evaluation Forum, Seattle, WA, 2003.
24. Post, L.M., and Woessner, H. "Developing a Recruitment and Retention Support System for Minority Students in Teacher Education." Journal of Negro Education, 1987, 56(2), 203-211.
25. Stufflebeam, D.L. "Interdisciplinary PhD Programming in Evaluation." American Journal of Evaluation, 2001, 22(3), 445-455.
26. Thompson-Robinson, M., Hopson, R., and SenGupta, S. (Eds.). In Search of Cultural Competence in Evaluation. New Directions for Evaluation. San Francisco, CA: Jossey-Bass, 2004.
27. Washington, D., Erickson, J.I., and Ditomassi, M. "Mentoring the Minority Nurse Leader of Tomorrow." Nursing Administration Quarterly, 2004, 28(3), 165-169.
28. The Woodrow Wilson National Fellowship Foundation. Diversity and the Ph.D.: A Review of Efforts to Broaden Race & Ethnicity in U.S. Doctoral Education. Princeton, NJ: Author, 2005.
Table 1: Evaluation Plan for the AEA/DU Graduate Education Diversity Internship Program

Evaluation Question 1: How has the program impacted the interns?

(Impact on a personal level)

Sub-question: What knowledge and skills have the interns gained?
Outcome indicators: Scores of pretest (start of internship) and post-test (after every seminar and at end of internship).
Instruments/Tracking: Baseline knowledge and skills survey; pre-fall focus group; application essays; post-seminar surveys; sponsoring agency site visit interview notes; end-of-internship survey and focus group; end-of-internship interviews.

Sub-question: How have the interns applied the knowledge and skills they gained?
Outcome indicators: Evaluation skills and knowledge demonstrated during the planning, design, and conducting of the evaluation project with the sponsoring agency; demonstration of problem-defining, problem-solving, and communication skills during field experiences; quality of the evaluation report, examined using the evaluation report outline adopted from the OERL website; integration of
Instruments/Tracking: Evaluation project reports; sponsoring agency evaluation of intern; intern evaluation of the placement experience; program records; Blackboard communication; review of presentations and published manuscripts; sponsoring agency site visit interview notes; end-of-internship interviews.

Sub-question: To what extent did the evaluation projects incorporate culturally responsive concepts that are reflected on the Culturally Responsive Evaluation Checklist?
Outcome indicators: Review of evaluation reports using a culturally responsive evaluation checklist developed based on guidelines from the "NSF 2002 User Friendly Handbook for Project Evaluation" section.
Instruments/Tracking: Evaluation project reports; dissertation/thesis proposals (if available); onsite interviews of participants and sponsoring agency staff; end-of-internship survey and focus group; end-of-internship interviews.

(Impact on an academic level)

Sub-question: How has the program enhanced or hindered the intern's academic work? (How has it contributed to the intern's dissertation or thesis?)
Outcome indicators: Any incorporation of internship work into dissertation proposals or other class work.
Instruments/Tracking: Advisor survey; end-of-internship survey and focus group; end-of-internship interviews; review of dissertation proposal.

Sub-question: In what ways has the internship provided additional experiences that have enhanced the intern's academic work?
Outcome indicators: Conference attendance; presentation and publication opportunities; networking opportunities with other evaluators.
Instruments/Tracking: End-of-internship survey and focus group; end-of-internship interviews; dissertation/thesis proposal; presentations and/or publications; program records; post-internship follow-up surveys.

Sub-question: How has the internship provided support mechanisms or resources (mentors, advisors, program staff, evaluation resources) and shared experiences that have facilitated the interns' ability to produce scholarly work?
Outcome indicators: Publications or presentations generated from intern evaluation projects; documentation of consultation with mentors, advisors, and internship staff on scholarly work.
Instruments/Tracking: Mentor interviews; advisor interviews; end-of-internship survey and focus group; end-of-internship interviews; dissertation/thesis proposal; program records; post-internship follow-up interviews.

(Impact on a professional level)

Sub-question: What professional affiliations or networks has the intern developed as a result of the program?
Outcome indicators: Intern's listing of professional affiliations or valuable networks developed during the internship period and/or as a result of the intern's affiliation with the internship.
Instruments/Tracking: Post-AEA reflections; end-of-internship survey and focus group; end-of-internship interviews; program records.

Sub-question: How has the internship contributed to the intern's decision on the career path or evaluation interest upon graduation?
Outcome indicators: Intern's mention of specific events or experiences (such as seminar topics, coaching experiences during workshops or with mentors, or evaluation project experiences) that steered the intern to pursue further work in a particular area of evaluation.
Instruments/Tracking: End-of-internship survey and focus group; end-of-internship interviews; program records; post-internship follow-up interviews.

Sub-questions: What contributions has the intern made to the field of evaluation? What evaluation leadership roles has the intern assumed? How has the internship impacted the intern beyond the evaluation experience?
Outcome indicators: Evaluation-related publications or presentations; AEA TIG leadership roles; evaluation-related volunteer work, such as participation in a student editorial board; list of any leadership roles or participation in academic or professional activities that are not evaluation-related but where participation was somewhat influenced by the internship.
Instruments/Tracking: Post-internship interviews; end-of-internship survey and focus group; end-of-internship interviews; post-internship follow-up interviews.

Sub-question: What specific activities did the interns engage in academically and/or professionally that address issues of diversity/multiculturalism?
Outcome indicators: AEA Multicultural TIG participation; involvement in other professional, academic, or social activities that address issues of diversity and social justice; publications or presentations by interns addressing these issues; utilization of intern evaluation report findings.
Instruments/Tracking: Application essay; program records; academic advisor survey; end-of-internship survey and focus group; end-of-internship interviews; post-internship follow-up interviews.

Evaluation Question 2: How has the program impacted the sponsoring agencies?

Sub-question: To what extent were the evaluation goals of the agency met?
Outcome indicators: Agency goals as listed by the agency.
Instruments/Tracking: Sponsoring agency survey; intern evaluation of project experience; sponsoring agency application form.

Sub-question: What were the perceived benefits to the sponsoring agency of hosting an intern?
Outcome indicators: Benefits listed by agency staff.
Instruments/Tracking: Sponsoring agency survey; site visit interviews.

Sub-question: What did the interns perceive as their contribution to the sponsoring agency?
Outcome indicators: Contributions listed by intern.
Instruments/Tracking: Intern evaluation of project survey; end-of-internship interview.

Evaluation Question 3: What is the impact of the program on the broader community?

Sub-question: How has the implementation of the program stimulated any events/discussions related to issues of diversity/multiculturalism at the intern's institution or at any other organizations the interns are affiliated with, such as the AEA local affiliates?
Outcome indicators: Discussions on the Duquesne campus prompted by publications in the Duquesne and local newspapers; discussions of issues of diversity at local AEA affiliates or intern academic institutions prompted by awareness of the AEA/DU program.
Instruments/Tracking: Internship records; end-of-internship survey and focus group; end-of-internship interviews; post-internship follow-up interviews; program records.

Sub-question: What activities have the interns or program staff engaged in to promote evaluation thinking/awareness about issues of diversity/multiculturalism at the interns' institutions or other organizations they are affiliated with?
Outcome indicators: Presentations or newspaper articles highlighting the internship's focus on promoting diversity in the evaluation profession; presentation of intern evaluation findings highlighting these issues.
Instruments/Tracking: End-of-internship survey and focus group; end-of-internship interviews; program records; news articles; publications or presentations by interns.

Sub-question: Do any of the interns engage in research work that addresses issues pertaining to communities or persons of color, beyond the internship evaluation project?
Outcome indicators: Intern mention of participation in research projects pertaining to persons of color, in school or as professionals.
Instruments/Tracking: End-of-internship survey and focus group; post-internship follow-up interviews.

Sub-question: What is the perceived contribution of the internship program to the evaluation profession?
Outcome indicators: Contributions mentioned during interviews with key AEA board members, TIG leaders, and key evaluation professionals.
Instruments/Tracking: Interviews with key AEA board members (past and present), key TIG leaders, and key evaluation professionals.

(Evaluation capacity to work in culturally, racially, and ethnically diverse settings)

Sub-question: How many interns pursue an evaluation career working in racially, ethnically, or culturally diverse settings?
Outcome indicators: Number of interns working as evaluators in racially, ethnically, or culturally diverse settings.
Instruments/Tracking: Post-internship follow-up interviews.

(Creation of a pipeline of evaluators of color)

Sub-questions: How many trainees does the internship program enroll? How many of the interns pursue a career in evaluation?
Outcome indicators: Number of interns enrolled in the program; number of interns who complete the internship; number of interns who pursue a career in evaluation.
Instruments/Tracking: Post-internship follow-up interviews; program records.