American Evaluation Association Internal Scan Report to the Membership
The American Evaluation Association undertook an internal scan to learn more about its
membership during the period September 2007 to January 2008. Methods included an online
survey of the full membership, follow-up interviews, and online Q&A groups.
What follows is the basic report provided to the AEA Board of Directors in March of 2008, with
some updates based on Board feedback.
The comprehensive list of currently available reports from the scan includes:
 American Evaluation Association Internal Scan Report to the Membership
 Index of Quantitative Analysis of the 2007 AEA Member Survey
 Index of Qualitative Analysis of the 2007 AEA Member Survey
 Index of Qualitative Analysis of Interviews from the 2007-2008 AEA Internal Scan
 Index of Qualitative Analysis of Online Q&A Groups from the 2007-2008 AEA Internal Scan
 Presentation of the American Evaluation Association Internal Scan Findings
American Evaluation
Association Internal Scan
Report to the Membership
PREPARED BY
Colleen Manning, M.A.
Elizabeth Bachrach, Ph.D.
Margaret Tiedemann
Marianne E. McPherson, M.S.
Irene F. Goodman, Ed.D.
SUBMITTED TO
American Evaluation Association
Fairhaven, MA
March 2008
Revised April 2008
ACKNOWLEDGEMENTS
Goodman Research Group, Inc. (GRG) had the good fortune of collaborating on
the internal scan with a stellar AEA Board-appointed task force, including:
 Leslie Goodyear, Ph.D., AEA Board Member, Internal Scan Task Force Chair, Co-Chair of the AEA Qualitative Methods TIG, and Research Scientist, Education Development Center,
 Susan Kistler, Executive Director, AEA,
 Thomas Chapel, M.A., M.B.A., former AEA Membership Committee leader (at the time of the member survey) and Senior Health Scientist, Centers for Disease Control and Prevention,
 Thomas Schwandt, Ph.D., former AEA Board member (at the time of the member survey) and Professor, Educational Psychology, University of Illinois at Urbana-Champaign, and
 Mary Stutzman, Ph.D., Director, Florida State University Survey Research Laboratory.
We are very appreciative of their insight, support, and guidance on all aspects of
the project, and especially for their role in co-developing the member survey.
We thank the AEA office, especially Susan Kistler and Heidi Nye, for their
invaluable assistance along the way.
Thank you to the AEA committee members and others who pilot tested the
survey and took the time to provide thoughtful and useful feedback that informed
the final version.
We thank GRG staff members Peggy Vaughan, Ph.D. and Rucha Londhe, Ph.D.
and GRG intern Katie Handwerger, for their assistance coding and helping to
interpret data. Thank you, too, to Nina Grant and Jennifer Parks for their
administrative assistance.
A special thanks, as always, to GRG consultant Robert Brennan, Ed.D. for his
special brand of wisdom and moral support.
Most importantly, we thank the AEA members for their time and input during the
member survey, interviews, and/or online Q&A groups. It was a special
privilege and pleasure to learn about the experiences of our fellow evaluators and
AEA members!
TABLE OF CONTENTS
EXECUTIVE SUMMARY ............................................................................. I
INTRODUCTION .......................................................................................... 1
METHODS ..................................................................................................... 2
COMPOSITION OF THE AEA MEMBERSHIP .......................................... 5
BACKGROUND CHARACTERISTICS OF RESPONDING MEMBERS .......... 5
PATHWAYS INTO EVALUATION: ACADEMIC BACKGROUND . 7
MEMBERS’ PROFESSIONAL IDENTITIES IN EVALUATION ........ 9
MEMBERS’ EMPLOYMENT IN EVALUATION .............................. 12
NEXUS OF PROFESSIONAL IDENTITY AND EMPLOYMENT .... 12
MEMBERS’ EVALUATION-RELATED WORK ...................................... 13
TIME DEVOTED TO EVALUATION ................................................. 13
EVALUATION-RELATED WORK ..................................................... 14
CONTENT AREAS ............................................................................... 15
CONDUCTING EVALUATIONS ........................................................ 17
EVALUATION-RELATED PROFESSIONAL CHALLENGES ......... 19
HOW MEMBERS EXPLAIN THEIR EVALUATION WORK ........... 20
MEMBERS’ EXPERIENCES WITH AEA ................................................. 21
STRENGTH OF AFFILIATION ........................................................... 21
THE VALUE OF CURRENT RESOURCES ........................................ 22
ENVISIONING THE FUTURE OF AEA ............................................. 25
CONCLUSIONS .......................................................................................... 31
CONSIDERATIONS FOR USE .................................................................. 32
CONSIDERATIONS FOR LONG-TERM DATA COLLECTION ...... 34
ADDENDUM: PRESENTATION AND DISCUSSION OF FINDINGS AT WINTER 2008 AEA BOARD MEETING ................................................... 35
APPENDICES
APPENDIX A: AEA RFP ......................................................... A-1
APPENDIX B: MEMBER SURVEY ....................................... B-1
APPENDIX C: INTERVIEW PROTOCOL ............................. C-1
APPENDIX D: ONLINE Q&A GROUP PROTOCOL ............ D-1
APPENDIX E: METHODS ...................................................... E-1
EXECUTIVE SUMMARY
The American Evaluation Association (AEA) internal scan, conducted by
Goodman Research Group, Inc. (GRG) between September 2007 and January
2008, had the goal of providing the association with accurate and comprehensive
data on the membership and their professional development needs. The scan
included a web-based survey of the entire membership, and follow-up interviews
and online Q&A groups with samples of the membership.
KEY FINDINGS
AEA Composition
 A majority of responding members is female and White and resides
primarily in the United States; however, the proportions of members of
color and international members appear to be on the rise.
 A substantial proportion of brand new members are already moderately
to very experienced in evaluation.
 The primary academic backgrounds of members are Education and
Psychology.
 The most common primary professional identity among responding
members is that of evaluator; however, members wear many hats and
their external identification as evaluators depends on context and
audience.
Nature of Members’ Evaluation Work
 Members are employed in a variety of settings. While the most
frequently reported setting is college/university, the majority are
primarily employed in non-university settings.
 Next to conducting evaluations, the two most commonly practiced forms
of evaluation work are providing technical assistance and evaluation
capacity building.
 Aside from program evaluations, the only type of evaluation conducted
by a majority of responding members, the three most common types of
evaluations conducted are performance monitoring, policy evaluations,
and curricula evaluations.
 Education and health/public health are the membership’s top two content
areas. Eight in ten members work in one or both of these areas.
 Approximately two in ten U.S. members focus at least some of their
evaluation work outside the U.S.
 Key evaluation-related professional challenges include how others
(mis)understand evaluation, pressure from clients and funders to use
specific (and sometimes inappropriate) research methods, underutilized
evaluations, and, for new members in particular, sufficient guidance and
support for their evaluation work.
 Members with various primary professional identities (i.e., evaluator,
faculty, researcher) differ significantly by background characteristics,
time devoted to evaluation, types of evaluation-related work, and content
areas.
Members’ Experiences with AEA
 Most members’ strongest professional association affiliation is with
AEA. Of those who affiliate most strongly with another association,
AERA is the most frequently mentioned.
 AEA publications – American Journal of Evaluation, New Directions for
Evaluation, and the Guiding Principles for Evaluators – are the most
widely used and among the most useful of the association’s resources. A
majority of the members also find the annual meeting very useful.
Relative to other resources, the EVALTALK listserv, Topical Interest
Groups, and the electronic newsletter are considered less useful.
 Two potential new resources are endorsed highly by a majority of the
responding members: an online archive of evaluation materials (e.g.,
reports, instruments) and new live regional training opportunities. A
journal targeted to practitioners and updates on relevant public policy
issues that affect the field of evaluation are also quite popular. Generally,
the association’s least experienced members, particularly students, are
most enthusiastic about these offerings.
 There is some uncertainty among members as to what AEA’s current
role, if any, is in public conversations about evaluation policy, although
they endorse the association’s involvement in maintaining and promoting
high quality evaluations and want to be kept abreast of the progress and
status of such conversations.
 While some members view the evaluation field as tumultuous due to
ideological and methodological tensions, members generally feel secure
in the future of evaluation, recognizing that an increasing range of
organizations are expressing interest in evaluation.
INTRODUCTION
The American Evaluation Association (AEA) contracted with Goodman
Research Group, Inc. (GRG) in late July 2007 to conduct an internal scan of the
AEA membership. The Board appointed a five-person task force to work with
GRG. The overarching goal of the scan was to provide the association with
accurate and comprehensive data on the membership and their professional
development needs. The questions of interest at the outset of the scan (see AEA
RFP in Appendix A) included:
Composition of the Membership:
 What is the composition of the association’s membership?
 How experienced are members in the field of evaluation?
 What academic and non-academic preparation do members have for their
current positions?
How Members Practice Evaluation:
 How do members practice evaluation and in what fields?
 In what settings do members carry out their evaluation work?
 What are members’ job responsibilities?
 What is the nature of members’ work in particular sectors?
 What are members’ evaluation-related professional challenges?
Value of AEA Membership:
 How does AEA membership benefit members?
 What is members’ involvement in other professional associations?
 How useful are AEA’s current services and products?
 Why are members more or less satisfied with particular
products/services?
 What alternative services/products do members desire from AEA?
 How do members envision the future of AEA and the field of evaluation?
The primary audience for the scan is the AEA Board and the intended use of the
scan results is to help inform the Board’s strategic planning, in conjunction with
the results of other association initiatives and considerations (e.g., financial
considerations, capacity). AEA is also disseminating the internal scan to the
membership in a variety of ways.
Following a description of the internal scan methods, we present the results in
three main sections, beginning with the composition of the membership, turning
to members’ evaluation-related work, and finally considering members’
experiences with AEA. We then offer conclusions and considerations for use by
the AEA Board. An addendum to the report (at the end of this report) outlines
the presentation and discussion of the findings at the winter 2008 Board meeting.
METHODS
The scan was a descriptive study, meant to characterize the AEA membership,
the nature of members’ work, and their experiences with AEA. The scan
included three components (described below) that gathered both quantitative and
qualitative data.
A WEB-BASED SURVEY OF THE AEA MEMBERSHIP
The AEA Member Survey was conducted with all members, including U.S. and
international members. AEA and GRG believed it was important to survey the
full membership, rather than a sample, in order to emphasize the value AEA
places on each member’s input (noted in the AEA RFP for the internal scan).
GRG and the AEA task force co-developed the member survey and GRG pilot
tested the survey with AEA committee members and a small purposive sample of
other members. The survey consisted of 28 distinct web pages, including a
welcome page and thank you/confirmation page, and featured a number of
branching patterns. The survey primarily consisted of close-ended questions but
also included three opportunities for open-ended comments. (See Appendix B
for a paper copy of the survey.)
A total of 5,460 surveys were distributed and we received valid responses from
2,657 members, yielding a response rate of 49%. We believe the response rate
achieved in this survey is good. Nonetheless, half of the membership did not
respond, and therefore the possibility of nonresponse bias cannot be overlooked.
We took three steps to explore the possibility of nonresponse bias: 1) we
conducted a nonrespondent bias survey, 2) we investigated differences between
earlier and later responders, and 3) we compared the respondents to known data
for the AEA membership.
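For readers who wish to see the shape of the third check, the sketch below illustrates one way to compare respondent counts against known membership proportions with a chi-square goodness-of-fit test. It is a minimal sketch, not GRG's actual analysis code, and the shares and counts shown are hypothetical placeholders rather than the real AEA figures.

```python
from scipy.stats import chisquare

n_respondents = 2657
known_shares = {"female": 0.66, "male": 0.34}    # assumed membership shares (hypothetical)
observed_counts = {"female": 1766, "male": 891}  # hypothetical respondent counts

categories = list(known_shares)
expected = [known_shares[c] * n_respondents for c in categories]
observed = [observed_counts[c] for c in categories]

# A p-value at or above .05 is consistent with the respondents being
# proportionally representative of the membership on this characteristic.
chi2, p = chisquare(f_obs=observed, f_exp=expected)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```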
The results of the nonrespondent bias survey raise the possibility that stronger
affiliation with a professional association other than AEA may have been a factor
in nonresponse. This is not altogether surprising, as other studies have linked
salience of issues to response rate.1 Comparing earlier and later respondents to
the member survey, we found that earlier responders were more likely to be
White and were somewhat more likely than later respondents to be longer-term
members of AEA. However, our comparison of respondent and known
demographic data suggests the member survey respondents were proportionally
representative of the entire membership in terms of race; they also were
proportionally equivalent in terms of gender and US/international status.
1 Sheehan, K., & McMillan, S. (1999). Response variation in e-mail surveys: An exploration. Journal of Advertising Research, 39, 45-54.
The 2007 member survey response rate is slightly higher than the response rates
of the two earlier AEA member surveys of which we are aware: the 2001 AEA
member survey (44%)2 and the 2004 AEA Independent Consulting TIG member
survey (37%)3. The
response rate is also higher than the response rates of a few other professional
association member surveys we found in a cursory search of relevant
professional association web sites: 2003 APSA international membership survey
(38%)4, 2007 APHA Community Health Planning and Policy Development
Section member survey (12%)5, and 2008 APHA Statistics Section member
survey (29%)6.
Finally, in a meta-analysis exploring factors associated with higher response rates
in electronic surveys, Cook et al. (2000) reported the mean response rate for the
68 surveys reported in 49 studies was 39.6% (SD=19.6%).7 The studies included
in this meta-analysis included those published in Public Opinion Quarterly,
Journal of Marketing Research, and American Sociological Review as well as
unpublished research.
INTERVIEWS
In order to enhance the findings from the member survey, GRG conducted
follow-up interviews with a sample of 56 AEA members who responded to
the survey. Approximately half of the interviews were completed in person
at the annual AEA conference in Baltimore in November and the other half
were completed by telephone in December 2007 and January 2008.
The interview sampling plan was a stratified random sample by evaluator type
(i.e., evaluators in firms, independent contractor evaluators, evaluators in
universities, and evaluators in government) and experience in evaluation (i.e.,
number of years working in the field). The sampling is described in further detail
in Appendix E. One exclusion criterion for the interview selection was
affiliation with the AEA Board or committees. We also used quotas to limit the
number of international and Beltway area interviewees (for the in-person
interviews in Baltimore).
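The sketch below illustrates the kind of stratified random selection described above, under the assumption of hypothetical member records. It is not GRG's actual sampling code; the field names and the per-stratum target are illustrative assumptions, and the international/Beltway quotas are omitted for brevity.

```python
import random

def draw_interview_sample(members, per_stratum=4, seed=42):
    """Stratified random draw by evaluator type and experience band (illustrative)."""
    rng = random.Random(seed)
    # Exclusion criterion: drop members affiliated with the AEA Board or committees.
    eligible = [m for m in members if not m.get("board_or_committee")]
    strata = {}
    for m in eligible:
        key = (m["evaluator_type"], m["experience_band"])  # e.g., ("firm", "6-10 years")
        strata.setdefault(key, []).append(m)
    sample = []
    for pool in strata.values():
        rng.shuffle(pool)                  # random order within the stratum
        sample.extend(pool[:per_stratum])  # take up to per_stratum members per cell
    return sample
```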
GRG and the AEA task force co-developed the interview protocol and then GRG
pilot tested the protocol with a small purposive sample of members. (A copy of
the protocol is provided in Appendix C.)
2 Unpublished data from the AEA office.
3 Jarosewich, T., Essenmacher, V. L., Lynch, C. O., Williams, J. E., & Doino-Ingersoll, J. (2006). Independent consulting topical interest group: 2004 industry survey. New Directions for Evaluation, 111, 9-21.
4 Retrieved from http://www.apsanet.org/imgtest/survey.pdf
5 Retrieved from http://www.apha.org/NR/rdonlyres/01EB89FB-FEF6-4E8F-A7F95F1E0DE2CF61/0/chppd_2007_2.pdf
6 Retrieved from http://www.apha.org/NR/rdonlyres/9983F55B-B29A-465C-AFA6269272210411/0/StatSurveyHighlights08_Website.pdf
7 Cook, C., Heath, F., & Thompson, R. L. (2000). A meta-analysis of response rates in web- or Internet-based surveys. Educational and Psychological Measurement, 60(6), 821-836.
ONLINE Q&A GROUPS
To further explore themes of interest arising from the scan (and in a cost-effective way), GRG conducted three online Q&A groups: one with new evaluators, one with moderately experienced evaluators in firms (i.e., with 6-10 years of experience in evaluation), and one with experienced independent contractor evaluators (i.e., 11-15 years of experience). We explored the same three topics in each group: professional identity in evaluation, evaluation-related professional challenges, and AEA’s role in evaluation policy. We used a semi-structured protocol, developed in consultation with the task force. (A copy of the protocol is provided in Appendix D.)
We assigned every member in each of the strata a random number and initially
invited 20 from each group (the maximum number of participants we desired per
group). (Our exclusion criteria were “in an AEA leadership position” and
“participated in an internal scan interview.”) As we received declinations (or no
response), we invited the next member from our random numbers table.
Eventually, we exhausted that table, so, ultimately, every member of the strata
had received an invitation. Thus, the online Q&A group participants should be
viewed as a self-selected sample.
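As an illustration of the rolling-invitation procedure just described, the sketch below invites up to 20 members of a stratum in random order and replaces each declination or non-response with the next member until the list is exhausted. This is a minimal sketch: the member records and the accepts() outcome are illustrative assumptions, not GRG's actual recruitment code.

```python
import random

def invite_until_filled(stratum_members, accepts, group_max=20, seed=7):
    """Invite group_max members at random; replace each decliner with the next member."""
    rng = random.Random(seed)
    order = list(stratum_members)
    rng.shuffle(order)                     # stands in for the random-numbers table
    queue = iter(order)
    pending = [m for m, _ in zip(queue, range(group_max))]  # initial invitations
    participants = []
    while pending:
        member = pending.pop(0)
        if accepts(member):                # declination or no response -> False
            participants.append(member)
        else:
            nxt = next(queue, None)        # invite the next member in random order
            if nxt is not None:
                pending.append(nxt)
    return participants                    # may stay under group_max once the list is exhausted
```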
Thirty-two members contributed to the groups. On average, each participant
posted three responses over the course of one week, with the level of response
decreasing over the Q&A period. Despite lower than expected participation, this
cost-effective data collection method did stimulate some creative thinking among
members that we believed was worth mining for insights.
DATA ANALYSIS
Survey data were imported into SPSS, where analyses included frequencies,
crosstabs (with appropriate statistical tests), and nonparametric statistical
tests. Where we comment on group differences in the report, they are
statistically significant at the p<.05 level.
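As an illustration of the kind of crosstab test behind these group comparisons (the actual analyses were run in SPSS), the sketch below applies a chi-square test of independence to a hypothetical identity-by-degree contingency table. The counts are loosely back-derived from percentages reported later in this report and are shown for illustration only.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Rows: professional identity (evaluator, faculty); columns: doctorate (yes, no).
# Hypothetical counts for illustration, not the actual survey data file.
table = np.array([[642, 669],
                  [356,  39]])

chi2, p, dof, expected = chi2_contingency(table)
if p < .05:
    print(f"Group difference is statistically significant (chi2 = {chi2:.1f}, p = {p:.4f})")
```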
The qualitative data from the survey, interviews and Q&A groups were
analyzed inductively, allowing for emergent themes. We analyzed the
qualitative data in three phases – as we completed the survey, interviews, and
Q&A groups, respectively, and our approach was to analyze the data by
question.
Appendix E provides more details on the internal scan methods, including survey
procedures, a breakdown of the survey response rate over the time it was
conducted, steps that we took to minimize survey nonresponse, the data from the
three steps to explore the possibility of nonresponse bias (mentioned above), the
full sampling plan for the interviews, online Q&A group procedures, and level of
participation in the online Q&A groups.
COMPOSITION OF THE AEA MEMBERSHIP
BACKGROUND CHARACTERISTICS OF RESPONDING
MEMBERS
A majority of responding members is female and White and resides primarily in
the United States. About half of responding members are in their 40s or 50s and
one-third are younger than 40. About half of respondents have doctorates and
most of the remaining respondents have Master’s degrees (as their highest level
of education). Responding members have a range of experience in evaluation,
with one-third being newer evaluators with less than five years of experience in
the field. About two-thirds of respondents have been members of AEA for four
years or less. (See Table 1.)
Table 1
Background Characteristics of Responding Members

Gender (n=2,637)
  Female: 67%
  Male: 33%
Race/ethnicity a (n=2,620)
  White: 73%
  Black or African American: 7%
  Asian: 5%
  Hispanic or Latino: 5%
  American Indian or Alaskan Native: 2%
  Native Hawaiian or Other Pacific Islander: <1%
  International member: 8%
  Chose not to respond: 3%
  Other: 2%
Primary residence (n=2,648)
  United States: 86%
  Other: 14%
Age range (n=2,619)
  20s or 30s: 33%
  40s: 24%
  50s: 29%
  60s or older: 14%
Highest degree (n=2,537)
  Doctorate: 52%
  Master’s: 42%
  Bachelor’s: 7%
Years experience in evaluation (n=2,652)
  Less than 5 years: 33%
  6-10 years: 24%
  11-15 years: 16%
  16 or more years: 27%
Length of membership (n=2,633) b
  Less than 1 year: 21%
  1-4 years: 44%
  5 or more years: 36%

a 159 respondents selected more than one response.
b Respondents self-reported their years of membership; therefore, the possibility of measurement error cannot be discounted. Response choices were <1 year, 1-2 years, 3-4 years, 5-6 years, 7-8 years, 9-10 years, and 10+ years. We then separated respondents into the three categories shown. These same three categories were used in the AEA 2001 Member Survey to distinguish brand new members, from those with a shorter-term commitment, from those with a longer-term commitment.
NOTE: Due to rounding, percentages may not total 100.
Experience and Education of New Members
Generally, respondents who are longer-term members of AEA have more
experience in the evaluation field and have more education (i.e., they are more
likely to have doctorates) than newer members. However, it is worth noting that
among the newest responding members (those who have been members for less
than one year):
 38% are moderately to very experienced in evaluation (with six or more
years of experience) and
 33% of these new members have doctorates.
Gender, Race, and Education of International Members
The background characteristics of international members, specifically their
gender, race, and education, differ from those of U.S. members. As seen in Figure 1,
compared to U.S.-based members, higher percentages of those who reside
outside the United States are male and people of color, and a lower percentage of
those living outside the U.S. have doctorates.
Figure 1
Comparison of Gender, Race, and Education of U.S. versus International Residents
[Bar chart comparing members residing in the U.S. with those residing outside the U.S. on three characteristics: percentage male, percentage people of color, and percentage holding doctorates.]
6
Trends in Composition of AEA Membership over Time
Using total years of membership as a proxy for historic intervals, we observe that
the proportions of members who are female, people of color, and international are on the rise.
As illustrated in Figure 2, compared to those with intermediate- and longer-term
commitments to the association, higher percentages of new members are female,
people of color, and reside outside the United States.8
Figure 2
Trends in Gender, Racial, and International Composition of AEA Membership over Time
[Bar chart showing, by total years of membership (5+ years, 1-4 years, less than 1 year), the percentages of members who are female, people of color, and residing outside the U.S.]
PATHWAYS INTO EVALUATION: ACADEMIC
BACKGROUND
As a way to understand members’ academic preparation for evaluation,
respondents checked off all of the degrees they hold and the fields in which they
hold those degrees. (From these data, we computed each respondent’s highest
degree, shown in Table 1).
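As a minimal sketch of that computation, the function below derives a highest degree from a check-all-that-apply list of degrees. The rank order and labels are assumptions for illustration, not the survey's actual coding scheme.

```python
DEGREE_RANK = {"Bachelor's": 1, "Master's": 2, "Doctorate": 3}  # assumed ordering

def highest_degree(degrees_held):
    """Return the highest-ranked degree a respondent checked, or None if none apply."""
    held = [d for d in degrees_held if d in DEGREE_RANK]
    return max(held, key=DEGREE_RANK.get) if held else None

# e.g., highest_degree(["Bachelor's", "Master's"]) -> "Master's"
```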
Bachelor’s Degree
Ninety percent (n=2,389) of participants indicated that they hold a Bachelor’s
degree, with Psychology, by far, the most common Bachelor’s degree field
(26% of those who indicated a field). Other commonly indicated fields
included Sociology and Education. (See Table 2.) Additionally, 9% of
respondents indicated holding a second Bachelor’s degree (n=235), most
commonly in Education (n=47) or Psychology (n=29).
8 The increase in the percentage of females is supported by AEA membership data. According to the AEA office, the percentage of females in the association rose from 60% in 2001 to 66% in 2008. Among new members, the percentage of females climbed from 60% in 1996 to 71% in 2007.
Table 2
Bachelor’s Degree Field, Most Common Responses

  Psychology: 26%
  Sociology: 9%
  Education: 8%
  Other fields not listed (each field <4%): 58%

n = 2,389. Percentages include both first and second Bachelor’s degrees. Due to rounding, percentages may not total 100.
Master’s Degree
Eighty-seven percent (n=2,298) of responding members indicated holding a Master’s degree and described the field in which they hold that degree; Education and Psychology topped the list. (See Table 3.) More than one-tenth of survey respondents (13%, n=351) indicated having a second Master’s degree, most commonly in Education (n=36), Health/Public health (n=33), Public policy/public administration (n=33), or Business and management (n=26).
Table 3
Master’s Degree Field, Most Common Responses

  Psychology: 15%
  Education: 14%
  Health/Public health: 10%
  Public policy/public administration: 9%
  Other fields not listed (each field <7%): 50%

n = 2,289. Percentages include both first and second Master’s degrees. Due to rounding, percentages may not total 100.
Doctorate
Just over one-half of members indicated that they hold a doctoral degree
(54%, n=1,422) and described the field of that degree. Once again,
Education and Psychology were the most common degree fields. (See Table
4). In addition, 27 participants indicated holding a second doctorate. The
most common field listed was Evaluation (19% of dual-doctorate holders).
Table 4
Doctorate Field, Most Common Responses

  Education: 23%
  Psychology: 18%
  Educational psychology: 10%
  Evaluation: 10%
  Sociology: 8%
  Health/Public health: 6%
  Other fields not listed (each field <5%): 25%

n = 1,422. Percentages include both first and second doctoral degrees.
MEMBERS’ PROFESSIONAL IDENTITIES IN EVALUATION
The results of the member survey show four overarching member identities, in
response to the close-ended question, “Many of us wear more than one hat in the
evaluation field. What is currently your primary professional identity in the
evaluation field?” Not surprisingly, the most common professional identity among
responding members was that of evaluator; nearly half of the respondents
identified primarily as evaluators. The other key identities included university
faculty members, researchers, and students. The four identities of evaluator,
faculty, researcher, and student account cumulatively for 85% of the responding
members. (See Table 5.)
Table 5
Primary Professional Identity in the Evaluation Field

  Evaluator (in any capacity): 49%
  College or university faculty member or instructor: 15%
  Researcher: 14%
  Student involved in evaluation (paid or unpaid): 7%
  Other (each identity <2%): 15%

n = 2,655
Aside from these predominant identity groups, the membership comprises myriad
smaller groups, including those who choose to describe their professional identity
using their job titles (e.g., research assistant, research associate, project manager),
those who identify according to how they are employed (e.g., consultants,
university employees), trainers, retirees, and others.
A Closer Look at Professional Identity
The online Q&A groups further explored the topic of professional identity.
Participation in the groups was lower than expected; however, the available
responses help us understand why members do or do not (or would or would not)
identify themselves as evaluators. Two factors seem especially important to new
members: experience and perceived competence.
New members, in particular, use phrases such as “don’t have enough
experience,” and “too new to the field” in commenting on why they may not
identify themselves as evaluators. They also refer to their competence using
expressions such as “don’t know enough” or “don’t feel able to offer competent
evaluation services to someone yet.” A few new evaluators point to lack of
certification in evaluation as a factor holding them back from identifying as an
evaluator.
A common theme among more experienced members is that they “end up
wearing many hats” in their jobs; thus, the label “evaluator” does not feel
inclusive enough for them, as one member explains, “Describing myself solely as
an evaluator can be limiting in the work I do. Evaluation is a key part of that
work, but I have expanded my consulting to include fund development, research
design, survey development, marketing analysis, training and presentations, etc.”
However, perhaps even more important than their self-reference is their
consideration of what the term means to others, primarily potential clients. For
some, the broader term “consultant” is a common sense marketing strategy. In
this way, the discussion revealed a chameleon-like quality among members, who
identify themselves differently depending on the context and/or audience. One
member relayed that she uses the terms evaluator and consultant interchangeably,
but as she unpacked her thoughts in the discussion realized that she tends to “use
evaluator when I want to stress function and consultant when I want to stress
business relationship.”
Finally, another opinion is that evaluation (still) has a negative or ambiguous
connotation to potential clients and consumers of evaluation. This causes some
members to shy away from identifying themselves (at least to others) as
evaluators. One member, who prefers the term “consultant evaluator” to
“evaluator,” explains that one of the advantages in this is “altering the
perceptions of those I am working with, so that we are working together toward a
goal, rather than me being there to judge them or their work.”
Background Differences by Professional Identity
Compared to other members, a far higher percentage of faculty members have
doctorates; 90% of the faculty members have doctorates, compared to 60% of
researchers and 49% of evaluators.9
9 As expected, none of the responding members who identified as students had doctorates.
The faculty group has a higher percentage of males than do other member
groups, so there is more of a gender balance; the female/male percentage split for
faculty is 54% females/46% males, compared to 67%/33% for researchers,
71%/29% for evaluators, and 74%/26% for students.
Among members who identify as evaluators, faculty, or researchers, faculty
members are the most experienced in the evaluation field and researchers are the
least experienced.10 For example, 41% of faculty members have more than 16
years of experience in the field, while only 29% of evaluators and 21% of
researchers have more than 16 years of experience. Researchers also are younger
than evaluators and faculty and, as a group, are newer to AEA.
Faculty Members’ Academic Appointments
Those respondents who indicated a faculty identity in evaluation completed a
separate set of questions about their academic appointments. Most faculty
members hold their primary appointment in an Education department, followed
by equal numbers in Psychology and Health/Public health departments. A
majority are in full-time tenured or tenure-track positions, most commonly as full
professors. Table 6 displays those results.
Table 6
Faculty Members’ Academic Appointments

Department appointment (n=395)
  Education: 124 (31%)
  Psychology: 51 (13%)
  Health/Public Health: 51 (13%)
  Educational Psychology: 24 (6%)
  Social Work: 24 (6%)
  Public Policy: 19 (5%)
  Other (each department <5%): 102 (26%)
Level of courses (n=369)
  Graduate only: 209 (57%)
  Graduate and undergraduate: 118 (32%)
  Undergraduate only: 42 (11%)
Appointment (n=391)
  Tenured or tenure-track full professor: 122 (31%)
  Tenured or tenure-track associate professor: 95 (24%)
  Tenured or tenure-track assistant professor: 69 (18%)
  Nontenure-track position: 73 (19%)
  Other: 32 (8%)
Status (n=396)
  Full-time: 367 (93%)
  Part-time: 29 (7%)
10 The vast majority (85%) of students had less than five years of evaluation experience, and only 1 percent had 16 or more years.
MEMBERS’ EMPLOYMENT IN EVALUATION
The highest percentage of members is employed at colleges or universities.
Other key employment settings include research, evaluation, and/or consulting
firms; independent contracting; local, state, or federal government; and
non-profit organizations.
(See Table 7.)
Table 7
Primary Employment in Evaluation

  Employee of a college/university: 29%
  Employee of a research, evaluation, and/or consulting firm: 19%
  Self-employed independent contractor a: 16%
  Employee of a local, state, or federal government: 12%
  Non-profit organization: 7%
  Student involved in evaluation (paid or unpaid): 6%
  Other b: 11%

n = 2,649
a Independent contractors work primarily in non-profit companies/agencies (59%) and state or local government (45%). They also are contracted by colleges/universities (38%), non-profit research firms (33%), foundations (33%), federal government (27%), for-profit research firms (24%), and for-profit companies (16%).
b Some of the other places of employment mentioned by respondents included foundations, schools, and international organizations.
NEXUS OF PROFESSIONAL IDENTITY AND EMPLOYMENT
Respondents who identified as evaluators were distributed primarily among
firms, independent contractors, universities, government, and non-profits. These
five subgroups account cumulatively for 89% of the members who identify as
evaluators.
In contrast, the vast majority of respondents who identified as faculty members
were employed in evaluation in universities,11 while a small percentage were
employed in evaluation as independent contractors.
Respondents who identified as researchers were primarily and evenly distributed
between universities and firms. (See Table 8.)
11 In addition to the evaluation-related work they do at their colleges/universities, 74% of faculty are contracted to do evaluation-related work in other settings. Of those who practice evaluation outside their universities, the highest percentage (37%) are contracted to do work in state or local governments.
Table 8
Primary Employment in Evaluation by Key Professional Identity Groups

Evaluator (n=1,311)
  Employee of a research, evaluation, and/or consulting firm: 25%
  Self-employed independent contractor: 22%
  Employee of a college/university: 18%
  Employee of local, state, or federal government: 14%
  Employee of a non-profit: 10%
  Other: 11%
Faculty (n=395)
  Employee of a college/university: 79%
  Self-employed independent contractor: 13%
  Other: 8%
Researcher (n=358)
  Employee of a college/university: 33%
  Employee of a research, evaluation, and/or consulting firm: 31%
  Employee of local, state, or federal government: 12%
  Self-employed independent contractor: 8%
  Other: 16%
Student (n=194)
  Student involved in evaluation (paid or unpaid): 73%
  Employee of a college/university: 10%
  Other: 17%
MEMBERS’ EVALUATION-RELATED WORK
TIME DEVOTED TO EVALUATION
Overall, AEA members devote more than half of their work time to evaluation.
However, as Figure 3 shows, members who identify as evaluators spend
considerably more time on evaluation than do faculty, researchers, and students.
While this is not very surprising, the contrast in time devoted to evaluation is
nonetheless remarkable.
Figure 3
Percentage of Members who Spend More than 75% Work/Study Time on Evaluation by Professional Identity
[Bar chart comparing evaluators, researchers, students, and faculty; evaluators show by far the highest percentage.]
Evaluators’ work settings shape the amount of time they devote to evaluation. Of
note, evaluators in non-profits spend less of their work time on evaluation than
do other types of evaluators. Evaluators in non-profits also are less experienced in
evaluation compared to their counterparts in other employment settings: 45%
of evaluators in non-profits have less than five years of experience in
evaluation, compared to 35% of university-based evaluators, 30% of evaluators
in government, 27% of evaluators in firms, and 14% of independent contractor
evaluators.
EVALUATION-RELATED WORK
Nearly all members are involved in conducting evaluations, yet only 8% of them
focus exclusively on this form of evaluation-related work. Most practice other
evaluation-related work, as presented in Table 9. Next to conducting evaluations,
the two most commonly practiced forms of evaluation-related work are:
 providing technical assistance and
 evaluation capacity building.
Table 9
Types of Evaluation-Related Work in Which Members Engage

  Conducting evaluations (in any capacity): 91%
  Technical assistance: 57%
  Evaluation capacity building: 54%
  Training others in evaluation: 48%
  Writing about evaluation: 45%
  Planning/contracting for evaluations that others conduct: 39%
  Teaching evaluation: 30%

n = 2,645
Evaluators in nonprofits – more than other types of evaluators – focus on
evaluation capacity building (71%) and training others in evaluation (61%).
In the time they devote to evaluation, compared to other members, more of the
faculty teach (74%), train (60%), and write about evaluation (60%). Similarly,
among those who identify as evaluators, those employed in universities do more
writing (50%) than those employed in other settings.
As expected, students have far less experience in all forms of evaluation-related
work than do other members. Of note, 33% of students are not involved (in any
capacity) in conducting evaluations.
CONTENT AREAS
The most common area in which AEA members (62%) do their evaluation-related work is education, though disaggregating education areas (e.g., K-12,
higher education) puts health/public health at the top of the list. A majority of
members (81%) work in one or both of these two areas (education or health);
nearly one-quarter (22%) work in both areas.
Following health/public health, the membership’s top content areas (defined as
more than 20 percent of members doing work in that area) are K-12 education,
non-profits, government, youth development, evaluation methods, higher
education, public policy, human services, child care/early childhood education,
and adult education. (See Table 10.) Most members traverse a range of content
areas; however, 16% of members focus exclusively in one content area.
Table 10
Top Areas in Which Respondents Engage in Evaluation-Related Work

  Health/Public health: 41%
  K-12 education: 37%
  Non-profits: 34%
  Government: 29%
  Youth development: 27%
  Evaluation methods: 27%
  Higher education: 26%
  Public policy/Public administration: 24%
  Human services: 23%
  Child care/early childhood education: 21%
  Adult education: 21%

n = 2,637
Health is an area of particular interest to the association at this point in time. Of
members doing work in health/public health:
 43% do work related to nonprofits,
 37% work in the area of government,
 34% do work in human services,
 34% do work in youth development,
 30% work in K-12 education,
 30% do work related to evaluation methods,
 30% do public policy/public administration work,
 28% work with special needs populations, and
 27% work in child care/early childhood education.
Higher percentages of faculty (39%) than other types of members are engaged in
work related to evaluation methods (and evaluation theory, not shown in Table
10), as well as higher education (38%). More of those (44%) who identify as
evaluators and are based in universities also do evaluation-related work in higher
education.
Finally, compared to other types of members, a greater proportion of those who
identify primarily as researchers do evaluation work related to policy (31%).
Evaluation-Related Work Focused Outside the United States
We asked members what percent of their evaluation work and/or study had a
focus outside the U.S. Table 11 shows these results. About two in ten U.S.
members focus at least some of their evaluation outside the U.S. and, when
looking across all members, approximately three in ten are doing some of their
evaluation work in the international context. While the majority of members
who reside outside the U.S. also focus their evaluation work outside the U.S.,
about three in ten of them have at least some U.S. evaluation focus.
Table 11
Evaluation-Related Work Focused Outside the U.S.

% evaluation            Reside in U.S.   Reside outside U.S.   Total
focused outside U.S.    (n=2,247)        (n=369)               (n=2,616)
All                     3%               69%                   13%
76%-99%                 3%               16%                   5%
51%-75%                 1%               2%                    1%
26%-50%                 3%               3%                    3%
1%-25%                  14%              4%                    12%
None                    77%              6%                    67%
CONDUCTING EVALUATIONS
The vast majority (91%) of members had conducted evaluations in the year prior
to the survey (fall 2006-fall 2007). Faculty and students account for most of
those who had not.
Role in Conducting Evaluations
As indicated in Table 12, when involved in conducting evaluations, most
members are either managers/coordinators (i.e., they primarily carry out the day-to-day evaluation activities) or supervisors/directors (i.e., they provide
leadership/oversight).
Table 12
Typical Role in Conducting Evaluations

  Manager or coordinator of evaluations: 32%
  Supervisor/director of evaluations: 31%
  Specialist or consultant on evaluations: 14%
  Work primarily on own to carry out all evaluation activities: 14%
  Assistant on evaluations: 6%
  Other: 4%

n = 2,394
Role is related to level of experience in evaluation. As members gain more
experience they shift from primarily carrying out the day-to-day activities related
to evaluations (managers) to providing leadership or oversight of evaluations in
which they are involved (supervisors/directors).
Among the five key types of evaluators identified earlier in the report – those in
firms, universities, government, nonprofits, and independent contractors –
independent contractors account for the majority (65%) of those who work
primarily on their own when conducting evaluations. The decision or ability to
become an independent contractor is likely associated with experience in
evaluation, for as shown in Figure 4, independent contractors are more
experienced than are other types of evaluators. They are also longer-term
members of AEA than are other evaluators.
Figure 4
Percentage of Evaluators with More than 10 Years of Experience by Type of Evaluator
[Bar chart comparing independent contractors, evaluators in firms, evaluators in government, evaluators in universities, and evaluators in nonprofits; independent contractors show the highest percentage.]
Types of Evaluations Conducted
Members conduct a variety of types of evaluation, as displayed in Table 13.
Nearly all members carry out program evaluations. Aside from program
evaluation, there is no single type of evaluation undertaken by a majority of
members, though the most common types are:
 performance monitoring,
 policy evaluations, and
 curricula evaluations.
Performance monitoring is more prominent in the work of government evaluators
(48%) compared to other evaluators. Additionally, compared to others, more of
the faculty (33%) conduct curricula evaluations.
Table 13
Types of Evaluations Conducted by Members

  Program evaluations: 95%
  Performance auditing/monitoring/reviewing: 34%
  Policy evaluations: 30%
  Curricula evaluations: 25%
  Evaluation of research: 20%
  Consumer evaluations: 10%
  Student/trainee evaluations: 9%
  Personnel evaluations: 7%
  Product evaluations: 6%

n = 2,405
EVALUATION-RELATED PROFESSIONAL CHALLENGES
While some members commented on evaluation-related professional challenges
in response to the survey’s open-ended questions, the challenges were explored
directly in the online Q&A groups. Despite limited data from the groups, four
themes surfaced that warrant mention:
The Misunderstood Evaluation/Evaluator
Most commonly, members face challenges related to how others – clients, other
consumers, funders, and even colleagues – understand evaluation. The greatest
of these challenges seems to be misunderstanding, even fear, regarding the
purposes of evaluation. One member remembered a recent evaluation this way,
“My biggest challenges were related to lack of knowledge about evaluation on
the part of grantees, their fear that the evaluation would be used by the funder
against them somehow, and the difficulty in establishing trust with them.”
The Methodological Pressure-Cooker
Another common client-related challenge is pressure to use a specific research
method when the method is not necessarily appropriate or sensitive to the
program being evaluated. This issue was raised primarily by independent
contractors (and we know from survey results that independent contractors often
work on government-funded projects).
One independent contractor gave an account of “increasing
pressure/expectations to use research methods (e.g., control groups, matched
comparison studies) when interventions are not well defined and program
budgets are limited,” continuing with, “Many of my clients hear about research
methods but do not understand (nor can they afford) the programmatic
implications and requirements of these methods.” A fellow independent
contractor feels the pressure more from funders, saying, “Increasingly, funders
are dictating evaluation tools (and not necessarily ones that are reliable and
valid) that are not sensitive to local program activities. In addition, funders
increasingly require evaluators to measure program success by a single measure
– students’ performance on a standardized test.”
Underutilized Evaluations
Once evaluation results are in, another challenge is supporting and encouraging
clients’ use of the findings. Evaluators often have concerns that findings will not
be used to modify or improve upon the program or future programs as clients
move forward with their work.
Where to Turn for Guidance and Support
A challenge more pertinent to new evaluators is needing advice or guidance to
support their evaluation work and professional development, yet not having a
go-to person or mentor to fill that role. One new member described her need for
“a continuous back and forth with someone about a particular project” to help
her “sort out the lessons learned and formulate next steps.”
Similarly, more experienced evaluators would like to be able to reference the
work of other evaluators to help them make decisions about methods,
instruments, and reporting, and they find it difficult to access recent and relevant
material. Lacking time for evaluation-related professional development is
another major challenge.
HOW MEMBERS EXPLAIN THEIR EVALUATION WORK
One of the overarching aims of the internal scan was to paint a picture of AEA
members’ professional worlds, including the nature of their evaluation-related
work. Together with the AEA internal scan task force, we identified six major
dimensions of members’ evaluation worlds, as described in this report, including
their primary professional identity in the evaluation field, their primary place of
employment or involvement in evaluation, the types of evaluation-related work
in which they are engaged, the topical areas in which they do their evaluation-related work, the percent of their work or study time they devote to evaluation,
and their role in conducting evaluations.
Still, we worried that we may not have covered all the bases, especially as we
traded stories with one another about how we sometimes struggled to describe
our professional lives to others. So, we decided to include (in fact start the
survey with) the open-ended question, “Imagine you are out to dinner with other
AEA members, and each member is taking a minute or two to describe his or her
evaluation-related work and/or study. It’s your turn; what would you say?”
Our initial assessment of this question has been guided by practicality. We have
coded and analyzed a stratified random sample (by type of evaluator and
experience in evaluation) of 240 of the 2,505 responses to this question. Our
initial quest has been to understand what dimensions appear to be important to
members in describing their evaluation-related work. A more intensive analysis
of this question, using an iterative, emic approach, is being undertaken and will
be reported prior to (and at) the 2008 annual meeting.
Based on the initial analysis conducted, the most important aspects of their
evaluation-related work for members appear to be the types of evaluation work
they do and the topical areas in which they do it, both areas explored in other
survey questions. Some aspects of evaluation-related work members commented
on that were not explored in other parts of the survey were the types of clients
with whom they work (as well as funding sources for their work), their most
frequent or preferred methods of or approaches to evaluation, specific projects
that define their work, and their job titles. (See Table 14.)
Table 14
Dimensions Along which Members Describe their Evaluation Work

  Types of Evaluation-Related Work: 59%
  Topical Areas: 53%
  Place of Employment: 33%
  Clients: 29%
  Roles/Responsibilities: 23%
  Methods/Approaches: 23%
  Professional Identity: 20%
  Projects: 15%
  Funding Sources: 13%
  Title: 8%
  International Work: 5%
  Work Outside Evaluation: 3%
  Time Devoted to Evaluation: 2%

n = 240
MEMBERS’ EXPERIENCES WITH AEA
STRENGTH OF AFFILIATION
In order to better understand members’ professional worlds and the professional
context in which AEA operates as an association, we investigated members’
involvement in other professional associations and the strength of their affiliation
with AEA.
Thirty-two percent of responding members do not indicate belonging to any
professional association other than AEA. Of the 68% who belong to at least one
other association, the highest numbers belong to:
 American Educational Research Association (n=589),
 American Public Health Association (n=274), and
 American Psychological Association (n=245).
Regarding the association with which they feel the strongest affiliation, 45% of
members (including those who do not indicate belonging to another association)
choose AEA, 30% indicate their strongest affiliation is with another association,
and 25% indicate they do not affiliate strongly with any professional
association.12 Of the three in ten respondents who affiliate most strongly with
another association, AERA is the most frequently mentioned, with 18% of
respondents indicating it as their professional home.
12 Of those who do belong to more than one association, 41% affiliate most strongly with another association, 37% with AEA, and 22% have no strong affiliation.
The highest percentages of evaluators (54%) and students (44%) affiliate most
strongly with AEA, while the highest percentages of faculty (49%) and
researchers (39%) affiliate most strongly with another professional association.
Longer-term members also affiliate more strongly with AEA than newer
members.
THE VALUE OF CURRENT RESOURCES
In order to make informed decisions about the services it offers its members,
AEA wants to know more about how useful the association’s current products,
services, and groups are to members in their evaluation-related work.
Awareness and Use of Resources
Overall, members are aware of the services, products, publications, and groups
offered by AEA. For the most part, lack of awareness is explained by new
membership (less than one year) in the association; however, one resource that is
less familiar across the membership is the journal, Evaluation and the Health
Professions (available to members through the AEA website); 23% of the
membership is not aware of it.
AEA publications – AJE, NDE, and the Guiding Principles for Evaluators – are
the most widely used of the association’s resources. The online journal,
Evaluation and the Health Professions, and the AEA/CDC Summer Evaluation
Institute are the least used resources.
With the exception of the online journal Evaluation and the Health Professions,
length of membership in the association is related to use of all AEA resources,
with the longer-term members using more resources than the newest members.
(See Table 15.)
Table 15
Access/Use of AEA Products, Services, Groups, and Journals by Length of Membership

                                                             <1 yr a   1-5 yrs b   5+ yrs c   Total d
American Journal of Evaluation                                69%       94%         98%        90%
New Directions for Evaluation                                 51%       80%         95%        79%
Guiding Principles for Evaluators                             52%       77%         89%        76%
Resources available through the AEA website                   55%       70%         77%        69%
AEA annual meeting                                            21%       63%         86%        63%
Topical Interest Groups (TIGs)                                17%       53%         77%        54%
AEA listserv, EVALTALK                                        26%       45%         65%        48%
Professional development workshops at the annual meeting     13%       43%         60%        43%
AEA electronic newsletter                                     25%       44%         50%        42%
Evaluation Review                                             29%       36%         49%        39%
AEA-recognized local or regional affiliate                     9%       25%         40%        27%
Evaluation and the Health Professions                         18%       17%         18%        18%
AEA/CDC Summer Evaluation Institute                           10%       17%         20%        17%

a n = 519-534; b n = 1,088-1,133; c n = 904-935; d n = 2,511-2,592
NOTE: Those who have not accessed/used resources include those who were not aware of the resources as well as those who were aware of the resources but had not used them.
There are a number of differences in use of resources by professional identity:
 Researchers are less likely than are evaluators and faculty to have used
the Guiding Principles for Evaluators.
 Faculty members participate in Topical Interest Groups at a higher rate
than do evaluators, and evaluators participate at a higher rate than do
researchers.
 Faculty members are more likely than others to have read the two
journals that are accessible online and less likely than others to have
read AEA’s electronic newsletters.
There also are differences by type of evaluator:
 Independent contractors are more likely than others to have used the
Guiding Principles for Evaluators and EVALTALK, and to have
participated in TIGs.
 Evaluators in firms are somewhat less likely than other evaluators to
have used resources available through the AEA website.
 Evaluators in government and in non-profits are more likely than
independent contractors and those in firms and universities to have
attended the AEA/CDC Summer Institute.
 Evaluators in government also are more likely than others to have read
Evaluation and the Health Professions.
Perceived Usefulness of Resources
According to the member survey, the most useful resources for members are the
association’s official publications and its annual meeting, including the
professional development workshops that are held during the meeting.
EVALTALK, TIGs, and the electronic newsletter are considered less useful.
Table 16 shows the percentage of survey respondents who were aware of and
used each AEA resource and gave the resource the top rating of very useful to
them in their evaluation-related work. Most of those who did not assign a very
useful rating gave a rating of somewhat useful; only a small percentage of
respondents found each resource not at all useful.
Table 16
Perceived Usefulness of AEA Products, Services, Groups, and Journals

                                                                        Very useful
AEA annual meeting (n = 1,638)                                              56%
New Directions for Evaluation (n = 2,076)                                   55%
AEA/CDC Summer Evaluation Institute (n = 427)                               53%
American Journal of Evaluation (n = 2,357)                                  52%
Guiding Principles for Evaluators (n = 1,997)                               50%
Professional development workshops at the annual meeting (n = 1,112)        50%
Evaluation Review (n = 1,007)                                               40%
Resources available through the AEA website (n = 1,813)                     35%
Evaluation and the Health Professions (n = 451)                             35%
AEA-recognized local or regional affiliate (n = 706)                        32%
AEA listserv, EVALTALK (n = 1,248)                                          25%
Topical Interest Groups (TIGs) (n = 1,409)                                  24%
AEA electronic newsletter (n = 1,089)                                       15%
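The percentages in Table 16 are conditional on awareness and use: the denominator for each resource is the set of respondents who gave it any usefulness rating. A hedged sketch of that computation, continuing the hypothetical DataFrame and coding from the earlier sketch:

```python
# Minimal sketch using the hypothetical df and USED set from the
# earlier sketch. The denominator matches Table 16: respondents who
# were aware of and accessed/used the resource.
def very_useful_pct(df, resource):
    users = df[df[resource].isin(USED)]  # rated the resource at all
    pct = round((users[resource] == "Very useful").mean() * 100)
    return len(users), pct

n, pct = very_useful_pct(df, "AEA annual meeting")
print(f"AEA annual meeting (n = {n:,}): {pct}% very useful")
```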
Members’ survey and interview comments help explain the value of the top-rated
resources.
Annual Meeting
During interviews, members were quick to point out that AEA allows them
to network with other evaluators, exchange ideas, and develop a sense of
community. Although TIGs were rated as a less useful resource on the survey,
many interviewees mentioned that they helped facilitate this community and
allowed individuals to connect directly with others who have interests or
areas of expertise similar to their own. As interviews revealed, this
community of colleagues is especially important to individuals who may be
the only evaluators in their organization or their immediate geographical
area.
In addition, in final comments on the member survey, members emphasized
the friendliness of AEA, often comparing it favorably to other
professional conferences, with exclamations such as, “AEA conferences are miles
beyond other professional conferences I've attended in terms of take-home
messages and friendly support/problem solving from other members!”
A related theme mentioned by survey and interview informants alike was access
to well-known evaluators. As one survey respondent wrote, “I love that the
"experts" in the field are so easily accessible at the annual meeting. I try to take
advantage of that.”
As much as members benefit from the annual meeting, they have decidedly
mixed reviews about the presentations, as evidenced in comments on the member
survey and in the interviews. While some members feel sessions are generally of
high quality, others find presentations lacking in consistency, substance, and/or
form. Among other improvements, some members would like to see presenters
bring papers in addition to their slides. Some also hope to see conference
proceedings in the future, especially as the annual meeting grows in size and it
becomes harder for members to attend all the sessions in which they are
interested.
Journals
Access to essential readings and journals was the part of AEA membership
most commonly cited as valuable during interviews with members. They
used words such as “useful” and “reflective” to describe the publications,
and one member said, “The journals alone make the membership worth it.”
Interviewed members explained that they rely on the literature to stay current
in the field and to learn about the newest evaluation approaches. Newer
members rely on the journals as guides to the field and its infrastructure.
Practitioners also rely on this resource when doing literature reviews, looking
for similar studies, and finding background information to share with clients.
ENVISIONING THE FUTURE OF AEA
Desirability of New Services/Products
Respondents to the member survey rated the desirability of 14 potential new or
enhanced products and services from AEA, a list provided by the internal scan
task force and considered feasible to implement.
Two resources were endorsed highly by a majority of the responding members:
 an online archive of evaluation materials (e.g., reports, instruments) and
 new live regional training opportunities.
A journal targeted to practitioners and updates on relevant public policy issues
that affect the field of evaluation are also quite popular; different groups of
members, however, have significantly different opinions about the journal
targeted to practitioners. There are also significant differences among members
in the popularity of an online archive and local trainings. Generally, the less
experienced members, particularly students, are more interested in these
offerings than are more experienced members, such as those who play
supervisory roles when conducting evaluations.
Table 17 shows the percentage of respondents that found each product or service
highly desirable (the top rating) and is followed by members’ explanations of
some of their ratings.
Table 17
Desirability of Enhanced/New AEA Products and Services

                                                                          Highly desirable
Online archive of evaluation materials (n = 2,582)                              65%
New training opportunities offered live in your region (n = 2,545)              52%
Journal targeted to practitioners (n = 2,540)                                   47%
Updates on relevant public policy issues that affect the evaluation
  field (n = 2,548)                                                             42%
DVD/CD-ROM of training materials (n = 2,543)                                    34%
Training via web-based delivery that is pre-recorded (n = 2,564)                30%
Professional mentoring (n = 2,537)                                              28%
Expanded training opportunities offered live at the Annual Meeting
  (n = 2,518)                                                                   27%
Hardcopy self-study texts (n = 2,544)                                           27%
Training via web-based delivery offered in real time (n = 2,551)                21%
Expanded training opportunities offered live at the Summer Institute
  (n = 2,475)                                                                   18%
Videotaped lectures/speeches (n = 2,482)                                        18%
Evaluation blog (n = 2,522)                                                     12%
Training via teleconferences (n = 2,529)                                        11%
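Where the text above reports significantly different opinions across member groups, a standard way to test such a difference is a chi-square test on the rating-by-group contingency table. The sketch below is illustrative only, not the report's analysis code; it reuses the hypothetical export from the earlier sketches, and the experience-level and rating column names are assumptions.

```python
# Minimal sketch, assuming the same hypothetical export as above plus
# an experience-level column coded from survey item 31. Tests whether
# desirability ratings for one proposed resource differ by experience.
import pandas as pd
from scipy.stats import chi2_contingency

df = pd.read_csv("member_survey.csv")  # same hypothetical export

ratings = pd.crosstab(df["experience_level"],
                      df["Online archive of evaluation materials"])
chi2, p, dof, _ = chi2_contingency(ratings)
print(f"chi-square({dof}) = {chi2:.1f}, p = {p:.4f}")
```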
After rating the desirability of potential resources, members responded to an
open-ended question asking if there were any other new or enhanced products or
services AEA could offer members like them. One-fifth of survey respondents
answered this question, and 13% of them wrote that there was nothing else they
desired from AEA. However, some of those who did respond provided follow-up
comments regarding the resources we asked them to rate, and these comments
help us understand what drove some of the quantitative findings.
Online Archive
Overall, members are most enthusiastic about AEA investing resources in an
online archive of evaluation materials. In follow-up comments, members likened
the idea of an online archive to an “AEA Google,” “a virtual reference library,”
or “a clearinghouse.” In addition to searching for reports and instruments, it
appears members may be interested in using such an archive to help them
network with other members. Members also suggested organizing the archive
by project, subject area, or type of evaluation. One of the
capacity considerations for such an undertaking would be the need to update the
archives regularly.
Regional Trainings
Members are in favor of expanded local or regional training opportunities
because of limited travel funding within their organizations, because they simply
do not have time to travel to take advantage of other offerings (e.g., professional
development at the annual meeting, the AEA/CDC Summer Institute), and
because it would allow them to meet other members in their geographic area.
Several survey respondents mentioned the scarcity of training and/or networking
opportunities in their area. Members also want an easy method of finding out
about such opportunities.
Some members suggested providing Evaluation Institute classes during (or
around) the annual meeting, especially since the professional development
workshops at the annual meeting do not count toward a certificate in evaluation.
Journal for Practitioners
The interest in a journal for practitioners is likely driven by the finding that some
members find AJE and NDE “very academic.” To offer another, albeit minority
view, a handful of members were confused by a question polling their interest in
a journal targeted to practitioners because, as one member put it, “Both AJE and
NDE include many articles relevant to practitioners.” This alternate viewpoint
is important because it calls attention to the need to focus on how a potential
new journal would differ from those already in circulation.
Evaluation Policy
We explored AEA’s role in evaluation policy in the online Q&A groups.
Compared to the Q&A exchanges on other topics, members had less to say about
AEA’s potential involvement in conversations about evaluation policy. One
reason for this may have been that they were not sure what the association is
already doing in this area. Along these lines, it was interesting to note that none
of the participants mentioned the newly formed Evaluation Policy Task Force,
though this initiative was featured at the annual meeting and there has been
communication about it to the members from the AEA president and the AEA
office.
The online Q&A envisioning AEA’s involvement in maintaining and promoting
high quality evaluations included commentary on what AEA might do as well as
how the association might do it. Some individual ideas about what AEA should
do with evaluation decision makers included providing information to funders,
the private sector, and evaluators on best practices in evaluation, facilitating
discussions among relevant parties, and sharing related content with other
professional associations (e.g., AERA).
As far as how AEA should approach evaluation consumers, funders, and decision
makers, while responding participants seemed to feel AEA should be the
“authority on evaluation,” there was a belief that the association should
“facilitate conversation, but not be prescriptive.” Finally, a member put forth
the idea that, “It will take more than just AEA, the association; members have to
do it, too.”
Some ideas from members for conveying the progress and status of such
conversations to the membership included adding the topic to the annual
meeting agenda and providing print and online updates.
Future of Evaluation
All of the AEA members interviewed were asked to reflect on where they see the
field of evaluation going and what they think it will be like in the next 10 to
years. Nearly all were able to comment on this.
Many members (just over one third) spoke about evaluation being here to stay.
Of these, a few expressed that they had not always felt so certain about that. As
one representative interviewee said, “I’m confident it’s still going to be around …
I didn't always feel that way, especially when there were funding crises.” There
was consensus that evaluation consumers are interested in more information
about their programs and products, and that, “Increasingly, there is
acknowledgement around evaluation.”
In addition to continued growth of the field, nearly 10% of interviewees
forecasted continued growth in the number of evaluators. A couple of these
members noted that the demographics of the evaluator community may change
soon, as experienced evaluators begin to retire (and this prediction is supported
by survey findings presented earlier).
A substantial portion of members (more than three quarters of those interviewed)
expressed the notion that society, including both those who conduct and those
who use evaluation, currently reflects a “range of evaluation literacy.”
Interviewees went on to describe how evaluation literacy can or will increase.
Several members described the current state of the evaluation field as
“tumultuous,” referring mostly to “methodological tensions,” such as the
emphasis on quantitative versus qualitative methods for data collection and
analysis. Members generally feel that finding a way to integrate multiple
perspectives into evaluation practice will enhance the overall quality and
usefulness of evaluation, believing for instance that, “Basic research and
evaluation can be complementary.”
About one quarter of interviewed members believe that more programs and
organizations are becoming “evaluation savvy.” On the positive side, some
interviewed members contend that more organizations understand the value of
evaluation, have come to view evaluation more positively, and recognize that
there is value to varied approaches beyond “the gold standard” of Randomized
Controlled Trials (RCTs). A less prevalent perspective, expressed by a handful
of members, was the concern that more organizations will believe they
understand and can conduct evaluation themselves internally. External
evaluation, in this view, may be phased out or, because evaluation cuts across
different content areas, it may be “subsumed into other fields.”
A few members expect that a trend toward more formal opportunities for training
will arise; universities and graduate programs will continue to offer courses and
programs in evaluation.
A couple of the interviewees also mentioned that decisions will increasingly
be driven by changes in available funding, which may lead to a more targeted
approach to deciding which programs are evaluated in the future. Organizations
may not have the luxury to contract for multiple evaluations. Rather, they may
have to be selective about where they can devote their resources.
Advice to AEA to Prepare for the Future
Following up on members’ expectations and hopes for the future of evaluation,
interviewees were asked to share advice they would give to AEA leadership to
develop the association for the future they described. Interviewees suggested
ways they believe AEA can and should respond as the field continues to grow.
More than three quarters of those interviewed described ways that AEA can and
should promote evaluation and the association, as well as communicate and
disseminate information about evaluation, both to society at large (i.e., including
policy-makers and current and prospective users of evaluation) and to association
members.
The idea of promoting evaluation was mentioned by approximately one-third of
those interviewed and crossed several domains, including making more people
aware of AEA and all that it offers, making the association’s policies and
procedures accessible and transparent so that members can become involved in
different ways, and encouraging more collaboration across TIGs as well as
between individual members. The following is a representative sample of the
various suggestions for increasing visibility both within and outside of the
association.
“AEA could improve outreach to other policy areas, particularly health
and human services.”
“Finding ways to promote evaluation in diverse communities. Help make
evaluation more accessible to folks.”
“AEA needs to help universities put evaluation on par with other
research.”
“Simple way to learn about each TIG; co-sponsoring TIGs; have more
communication across TIGs. Right now it's a missed opportunity.”
“At the business meeting, there were questions about how to be more
involved. There should be more policies, documentation, more
transparency about how to do it.”
“Become more visible in programs that don’t have evaluation degree
programs, such as family services, an area that really is going to need to
use and understand evaluation.”
In keeping with this call for clarity around ways for members to get involved
with AEA, just over one-quarter of those interviewed listed ways they would like
to see the association provide more deliberately tailored opportunities for
members, including the types and formats of resources and professional
development. Members’ specific suggestions ranged from one member who
wanted the association to “think about people who enter evaluation as their
second career” to another member who felt “AEA should adapt their schedule
and approaches to younger learners who are more savvy about information
technology.”
Beyond designing workshops around experience and skill level, a few members
suggested providing opportunities that accommodate varying budgets. For
example, offering online training or mentoring may allow more participation
among evaluators who do not have the time or the funding to travel to the annual
conference for several days. In fact, one quarter of the members who were
interviewed by phone (i.e., did not attend the 2007 Annual Meeting) explained
they were interested in attending each year, but could not afford to do so.
About one quarter of those interviewed made a range of suggestions
regarding the various ways AEA can communicate and disseminate valuable
information. In terms of evaluation-related policy and potential uses, AEA can
present information and make “reasoned arguments” about what evaluation is,
how it can be used, and why it is helpful. AEA can be a resource provider to help
promote “a common language” about and standards for high quality evaluation
and can help to convey the message that evaluators adhere to such standards.
Nearly one quarter of interviewees discussed wanting “more ways to be able to
keep abreast of developments/advancements” in evaluation and suggested ways
that AEA can keep members apprised, including, as mentioned by a few different
interviewees, the increasing use of technology for conducting evaluations and
presenting findings. One member suggested that AEA “act as a clearinghouse
for people and for products,” including uploading all of the presentations after
the conference, and including links to other resources, as well as to people with
common interests. Again, updates can be posted online as well as shared through in-person training and development opportunities.
A small number of interviewees suggested that AEA work toward higher quality
and more consistency among conference presentations across TIGs and with
respect to the information that TIGs provide to their memberships. One member
suggested establishing, “some new or different criteria for selecting
presentations,” so that there are not so many presentations with such low
attendance.
CONCLUSIONS
AEA undertook the internal scan to learn more about the composition of its
membership, most importantly how they practice evaluation and in what fields.
The following conclusions emerge from the scan.
AEA is a diverse association. It includes not only members who identify as
evaluators but also members who identify themselves in the evaluation field as
faculty, researchers, and students, who work in a variety of settings, with
different emphases, support, and resources for their evaluation work. There is no
single content area in which a majority of members focus. Further, members
differ in their backgrounds, their actual evaluation-related work, and in terms of
what they need from AEA.
While education and psychology are the primary academic paths into evaluation,
there is no majority pathway into the field. Nor is there a majority discipline in
which faculty members teach about evaluation.
However, the AEA membership appears more practitioner-oriented than
academic-oriented. The number of members whose primary professional
identity in the field is evaluator is more than three times that of members who
identify as faculty. In addition, while the largest percentage of members is
employed by universities, more than three-quarters are employed in a variety of
non-academic settings.
AEA is changing demographically. While the internal scan did not involve
exploring trends in AEA statistics over time, some of the data collected indicate
the demographic composition of the membership is changing. If the trends by
length of membership continue, the association can expect the already female
dominated membership to grow and can also anticipate an even more
international and racially diverse membership.
AEA is attracting both new and experienced evaluators and researchers.
New members face the challenges of learning a new practice as well as learning
how to launch a career in evaluation. It is less clear what attracts more
experienced evaluators to AEA as new members, how they are challenged in
their evaluation-related work, and what they expect or hope the association will
provide. This is an area for further inquiry.
Technical assistance and capacity building are burgeoning forms of
evaluation-related work. While most members’ evaluation-related work
consists of conducting program evaluations, many also engage in technical
assistance and capacity building. This indicates that evaluation is making its way
into an increasing number of organizations via AEA members.
Experience in the evaluation field drives members’ interests in AEA
products and services as well as their professional development needs.
Generally, across the range of potential resources AEA could offer, less
experienced members are more apt than very experienced members to
endorse developing them.
The value of AEA membership as expressed by its members matches the
association’s stated benefits. AEA’s products and services are among the key
ways that members develop professionally. The journals are viewed as useful,
even as the idea of a practitioner journal is endorsed. Most of all, members
appreciate the friendly community of practice that AEA offers.
CONSIDERATIONS FOR USE
There are a number of considerations for use of the internal scan by the Board.
These are expressed primarily as needs of members that surfaced as strategic
issues meriting further discussion.
Members are in favor of the development of an online archive of evaluation
materials and of new regional training opportunities. If AEA were to pursue
either of these, the association should consider first developing vision
statements describing how these resources could be developed to benefit
members and then perhaps conducting follow-up needs assessments with
samples of the membership to better understand their specific needs related
to such products.
The idea of an online archive of evaluation materials is consistent with AEA’s
goal of being a primary source for evaluation information. Among the first steps
for a member-accessed repository would be to establish mechanisms to allow
evaluators to make their resources available and make decisions about how such
an archive would be organized and indexed.
The association also would want to consider whether and how to coordinate such
an effort with other resources that offer access to evaluation materials. In
addition, note that the websites of many evaluation firms provide access to their
evaluation reports. Finally, the primary goal of AEA is to promote high quality
evaluations, so, ideally, an online archive would have some sort of quality
control indicator (e.g., one member suggested user reviews).
Local or regional meetings (as well as web-based training) are desirable
given the challenges of time away from work to travel and the cost. Local
meetings also would allow for networking and community development among
members, thus keeping the member-friendly feel of the association.
There is sufficient interest among members to suggest that AEA consider
taking the next step in providing members with regular or systematic
updates on public policy issues that affect the field of evaluation. Some of
this may be coordinated smoothly with the recently launched Evaluation Policy
Task Force. As is the case with the EPTF, the scope of such updates would only
be feasible if limited to particular fields. Education and health are the clear
choices, as these are the areas in which members most commonly engage in their
evaluation-related work. This is another area in which the association would
need to consider its increasing international constituency and their involvement
in these efforts.
It appears the association’s international membership is on the rise, which
also will change the face of the association. International members made
impassioned pleas on their member surveys regarding the unique challenges they
face doing evaluation-related work in other countries. To the extent the
proportion of international members increases, their concerns will be a
consideration in policy conversations.
The idea of a journal for practitioners warrants further exploration. The
most important next question is how such a journal would be distinguished from
the current publications (AJE and NDE). It is also important for a next step to
explore (more fully than the internal scan allowed) how the academic-practitioner
tension is understood or felt by the members, especially considering that some
members believe the current journals already are practitioner-oriented.
AEA is delivering value to its members and at the same time there is a need
to keep improving some of the current products and services. One area for
improvement is review of conference proposals. The internal scan surfaced some
dismay over the quality of the annual meeting sessions. It has been some time
since the association polled its members regarding criteria for accepting
proposals for the annual meeting. It may be time to do that again. Any efforts to
improve the quality of public presentations of evaluations also may be timely
given the association’s efforts to engage policymakers and make the case for the
value of evaluation.
Members also would like to be more meaningfully involved in their TIGs. Some
members voiced disappointment with the leadership and structure of the TIGs.
Given the importance of TIGs in the organizational structure, this should be a
priority area for the Board. More meaningful participation would allow members
to play a role in the growth of the association.
Members place a great deal of importance upon the friendliness of the
association. At present, the size of the association, its annual meeting, and the
visibility and accessibility of its leaders provide members with an emotional
hook they do not experience in other professional associations to which they
belong. A key consideration is managing the growth of the association in a
manner that allows it to continue to excel in this way. This view of the
association by its members can serve as a vision for AEA.
CONSIDERATIONS FOR LONG-TERM DATA COLLECTION
Finally, as a recent newsletter noted, AEA is growing by leaps and bounds. This
growth is mirrored by the growth in the field of evaluation itself. In addition, the
perceptions of AEA and the results of the member survey indicate that the
membership has evolved over time. This growth and change point to the need
for creating and sustaining an internal or external mechanism for data collection
on the membership.
There are important financial and capacity considerations for long-term data
collection. However, a first step in considering long-term data collection would
be to define its purpose. This may be threefold:
1. To measure trends in membership over time (e.g., who the members are,
what the members do, what the members want and need)
2. To allow for feedback on new initiatives and input on potential initiatives
3. To allow for interactive participation. The membership clearly
appreciated the opportunity the internal scan provided them to give input
to the association.
ADDENDUM: PRESENTATION AND
DISCUSSION OF FINDINGS AT WINTER 2008
AEA BOARD MEETING
GRG submitted a previous version of this report to the AEA Board in January
2008, and then presented select findings from the scan to the AEA Board at their
winter 2008 meeting (February 29 and March 1, 2008). During the presentation,
we highlighted five “stories” from the scan, the stories of:
 New members in AEA
 Bridging academic and practitioner concerns about AEA products,
services, and groups, focusing on profiles of the primary professional
identities of faculty and evaluator
 The “many hats” worn by AEA members in their professional worlds
 AEA and policy – members’ interest in receiving updates from AEA on
public policy issues that affect evaluation as well as members’ thoughts
on AEA’s role in public conversations on public policy and evaluation
policy
 Members’ desires for accessible and local AEA or AEA-affiliate
activities (responding to the challenges of lack of funding and time to
travel for professional development and networking)
In a half-hour discussion following the presentation, the Board and GRG
considered a range of scan findings (from the report and the presentation) and
their implications for the association. Perhaps the most discussed topic was that
of academics and practitioners. There are two important points to make from this
discussion. First, while GRG, the AEA Board, and AEA members frequently use
these terms (academic and practitioner), their definitions are not perfectly clear.
As one Board member pointed out, the majority of members who identify as
faculty in universities conduct evaluations; they are practicing evaluators. On the
other hand, one member survey respondent expressed an uncertainty about the
term practitioner (which other members may also feel) when she offered this
comment on a survey question: “I didn't know what you meant by ‘practitioner’
in the response ‘journal for practitioners’.” A second important moment in our
discussion was when a Board member suggested that we frame this issue as
“academic-practitioner opportunities” rather than “concerns.”
Other strategic opportunities or actions discussed by the Board (with GRG)
included systematic outreach to new members, defensible conference proposal
review criteria, and the role of local affiliates in regional training. The Board
continued to discuss the scan as it related to other items on their agenda (for
which GRG was not present). AEA will be making a presentation on the scan
findings and the association’s response to and use of the findings at the annual
meeting.
Other areas of interest to Board members during our discussion included, but
were not limited to:
 More information about the nonrespondent bias survey
 Differences between “academics” and “practitioners” in their
professional development needs and their interest in public policy
updates
 More information about U.S. respondents of color and why they appear
more interested than White members in public policy updates
 Proportion of members in private sector employment
 Nature of student work
 More information about members’ interest in certification
 How/why new members with experience join the association
 More feedback on the operations of TIGs and EVALTALK
 A profile of faculty who are new to teaching evaluation
GRG made a second presentation to the Board the following day, which
addressed some of these areas of interest. The two presentations have been
integrated and are available to members on the AEA website. Other Board
interests, as well as the interests of AEA committee members who are reviewing
the report, will be addressed through further analysis in the coming months.
APPENDIX A: AEA RFP
AEA Colleagues,
Please take a moment to review and disseminate the RFP below. We hope you will consider
responding.
----------------------------
American Evaluation Association (AEA) Request for Proposals
Deadline: Thursday, July 12, 2007
The American Evaluation Association is in the midst of a strategic planning process. As part of the input
to this process, we are in need of more accurate and comprehensive data on the nature of the membership.
The composition of the association’s membership has changed in recent years, as has the context in which
our members practice. In addition, evaluation is practiced in many ways in many fields. We wish, through
this project, to document these changes and identify potential responses, as well as to use the information
to anticipate and shape the future of the association and the field. To learn more about AEA, go to
www.eval.org.
Conceptually, we wish to be able to paint a picture of our members’ professional world in a way that
informs our decisions about the services we offer our members and the ways we contribute to supporting
their needs in evaluation practices. Towards that end, we are issuing this request for proposals to gather
data from our members both via an online survey as well as via follow-up interviews or discussion
groups.
Ultimately, AEA wishes to build a long-term relationship with a contractor to collect data from our
membership. While this RFP is strictly for the work outlined below, ideally the value and quality of the
work and relationship would lead to a long-term contract to undertake annual data collection.
Key activities:
 Survey: We wish to conduct a focused online census of our approximately 5200 members. We
feel that it is important to include the full membership, rather than a sample, in order to
emphasize the value we place on each member’s input. We have available email addresses for
over 99% of the membership and our regular bounce rate is less than 2%. The contractor will
have access to our members’ contact information. This online survey may need to be
supplemented by follow-up with a sample via alternate methods and we seek your plan for
maximizing the response rate and the reliability and validity of results.
 Follow-up Interviews or Discussion Groups: We wish to expand our understanding of the data
from the survey, including but not limited to a better understanding of (a) conditions and
requirements of members’ evaluation work in specific practice areas, e.g., public health,
technology, (b) interests and needs of members who are new to the association, (c) the value of
AEA’s services/benefits including its publications, and ideas for other communications strategies
and publications, and (d) members’ ideas about a vision of the future for AEA. We anticipate that
a series of group discussions, or individual interviews, could be held at our annual conference in
November in Baltimore, but welcome recommendations as to other approaches.
 Reporting and Guidance: The November report would be a short update, focusing on progress
and lessons learned to date. The February report would be more in-depth, including time spent
with the Board as part of its strategic planning process, focusing on what was learned from the
research as well as recommendations for ongoing data collection.
Obligations of the Contractor:
 Create a research plan designed to maximize the response rate, and the reliability and validity of
results
 Co-develop online survey instrument and protocol with AEA task force
 Pilot test, administer, and analyze survey, including administering the invitation and follow-up
process
 Co-develop interview (whether individual or group) instrument and protocol with AEA task force
 Pilot test, administer, and analyze interviews, including administering the invitation and
follow-up process
 Develop working documents for use by and with Task Force including preliminary data analyses
 Develop interim report, with recommendations, for use by Board at its November 07 meeting
 Develop and present final report, with recommendations, for use by Board at its February 08
meeting
 Recommendations for types, timelines, questions, and procedures for long-term data collection
Timeline:
 Thursday, July 12: Due date for RFP responses to AEA office via email (AEA)
 Thursday, July 19: Selected contractor notified (AEA)
 Thursday, August 16: Survey instrument finalized (Contractor & AEA)
 Thursday, August 30: Survey pilot testing and revisions complete (Contractor & AEA)
 Thursday, September 6: Survey deployed (Contractor)
 Thursday, September 27: Initial survey analysis to inform interview/discussion invitations
(Contractor)
 Thursday, October 11: Interview/discussion invitees identified (AEA & Contractor)
 Thursday, October 11: Survey deployment complete (Contractor)
 Tuesday, October 18: Interim written report prepared for Board (Contractor)
 Thursday, October 25: Interview/discussion invitations complete (Contractor)
 Thursday, October 25: Interview/discussion group draft protocol prepared (Contractor and AEA)
 Thursday, October 25: Survey analysis complete (Contractor)
 Thursday, November 15: Survey interpretation complete (AEA & Contractor)
 November/December: Interviews/discussions in progress (Contractor)
 Thursday, January 3, 2008: Interview/discussion analysis complete (Contractor)
 Thursday, January 10, 2008: Interview/discussion interpretation complete (AEA and Contractor)
 Thursday, January 17, 2008: Final written report for Board complete (Contractor)
 Friday, February ??, 2008: Final live presentation to Board at its winter Board meeting
(Contractor)
Survey: For purposes of budget estimation, bidders should assume that the first wave of the survey will
be internet based and will be a full census of AEA’s approximately 5200 members. Most questions on
the survey will be closed-ended, although employing a variety of formats. Survey questions will focus on
members’:
 Satisfaction with and use of current AEA services/products
 Desires for alternative AEA services/products
 Perception of benefits derived from AEA membership
 Experience level, workplace, sector and job responsibilities (evaluation & non)*
 Extent of work in international, health, and policy arenas
 Academic and non-academic preparation for their current position
 Needs (skills, training) for career advancement
 Involvement in other professional associations
* The language developed for these ideally would be usable for long-term data collection on AEA’s
membership applications and elsewhere. Throughout the process, we wish to keep an eye towards
developing strategies for long-term data collection that can help the association
paint a realistic picture of its membership’s professional milieu.
Interviews: For purposes of budget estimation, bidders should assume that the AEA Task Force will
identify approximately six domains of interest based on the needs of the AEA Board and the preliminary
survey analysis, e.g. members working in public health, new members, or members who do not attend the
annual conference. For each of these six domains, we will seek follow-up information regarding
members’:
 Nature of work in particular sectors
 Evaluation-related professional challenges
 Understanding of satisfaction and use patterns with particular products/services
 Future-vision for AEA
Approximately half of AEA’s membership attends the annual conference, to be held this year November
7-11 in Baltimore, Maryland. This convention provides a possible opportunity for live group or individual
interviews, and limited space for interviews is available if desired.
Proposal: Please include the following items in your proposal, sent via email by close of business on
Thursday, July 12, to AEA Executive Director Susan Kistler at susan@eval.org.
1. Capacity: Describe the capacity of your firm and any key personnel who will be working on this
project. In particular, include information regarding your experience developing, deploying, and
analyzing surveys and interviews/discussions.
2. Draft Plan: Provide an outline of your plan for implementing the above research, with an eye
towards developing strategies for long-term data collection, including detailed recommendations
regarding the follow-up interviews (type, location, extent, sampling, etc.). Include in this section
your proposed process for maximizing the survey response rate, your anticipated response rate,
and the ways in which you will account for, and minimize the effect of, non-respondent bias in
your reporting.
3. Anonymity and Confidentiality: Identify your process for survey respondent tracking and for
maintaining respondent confidentiality, as appropriate, throughout the process.
4. Budget: Include a budget, broken down by key categories, and reflecting all expenses related to
the project including travel, deployment costs, communication costs, etc. The total estimated
budget for this project is between $10,000 and $15,000, inclusive of all costs.
5. References: Provide at least three references, with contact information, at other professional
associations for which you have recently undertaken parallel work.
Ownership: Copyright of all work products, including surveys, protocols, data, and reports, must reside
with AEA.
Questions: Please contact AEA Executive Director, Susan Kistler, at susan@eval.org or 1-508-748-3326.
APPENDIX B: MEMBER SURVEY
AEA Member Survey 2007
Greetings members! This survey is part of AEA's effort to develop a more detailed and
comprehensive picture of its members' experiences and strategically plan for the future.
The survey includes broad questions about your evaluation-related work, as well as more specific
questions about your experience conducting evaluations. We also want your input on ideas for
enhanced or new AEA products/services.
We estimate it will take about 10 minutes to complete the survey. It is not possible to save your
work, so you must complete the survey in one sitting. However, the survey will not time out, so
you may take as much time as you need - as long as you do not exit from the survey page.
To begin the survey, please enter the ID number provided in your email invitation, then click on
the "Begin Survey" button. As you move from page to page, it's extremely important to use
the "Back" and "Next" buttons at the bottom of the survey page. Please, do NOT use your
browser's buttons; if you do, your valuable responses will be lost.
Thank you in advance for your time and input!
ID # [text box]
Your Work in Evaluation
1. Before we ask more specific (close-ended) questions, we invite you to tell us about your
evaluation-related work using your own words.
Imagine you are out to dinner with other AEA members, and each member is taking a
minute or two to describe his or her evaluation-related work and/or study. It's your turn;
what would you say?
[text box]
Your Primary Identity in Evaluation
2. Many of us wear more than one hat in the evaluation field. What is currently your
primary professional identity in the evaluation field? (Select only one.)
 Evaluator (in any capacity)
 College or university faculty member or instructor [branch to 9]
 Researcher
 Trainer
 Student involved in evaluation (paid or unpaid)
 Unemployed or currently seeking employment
 Retired and no longer active in the evaluation field [skip to 20]
 Retired but still active in the evaluation field in some way(s)
 Other; if other, please describe: [text box]
Your Evaluation-Related Work
Now, we have some broad questions about your evaluation-related work.
* In this survey, we are defining evaluation-related work as any work in the evaluation
field, including the types of evaluation-related work listed below.
3. Which of these types of evaluation-related work do you do? (Check all that apply.)
 Conducting evaluations (in any capacity, including supervising)
 Writing about evaluation
 Teaching evaluation
 Evaluation capacity building
 Technical assistance
 Training others in evaluation
 Planning/contracting for evaluations (that others conduct)
 Student in evaluation
 Other; if other, please describe: [text box]
Your Primary Employment/Involvement in Evaluation
4. Considering only your evaluation-related work, how are you primarily employed or
involved in evaluation? (If you are unemployed or retired, please select your most recent
employment/involvement in evaluation.)
I am (or was) primarily employed or involved in evaluation as a/an: (Select only one.)
 Employee of a research, evaluation, and/or consulting firm
 Employee of a company in business or industry
 Employee of a college/university [skip to 6]
 Self-employed, independent contractor [skip to 7]
 Employee of a local or state government [skip to 8]
 Employee of the federal government [skip to 8]
 Employee of a foundation [skip to 8]
 Student involved in evaluation (paid or unpaid) [skip to 8]
 I am not employed or involved in evaluation-related work. [skip to 8]
 Other; if other, please describe: [text box] [skip to 8]
Your Firm or Company
5. In what type of firm or company are you primarily employed to do your evaluation-related work? (Select only one.)
 For-profit
 Non-profit
 Other; if other, please describe: [text box]
[skip to 8]
Your Evaluation Contract Work
6. Other than the evaluation-related work you do at your college/university, by what other
types of organizations, if any, are you contracted to do evaluation-related work? (Check all
that apply.)
 For-profit research, evaluation, and consulting firms
 Non-profit research, evaluation, and consulting firms
 Other for-profit companies/agencies
 Other non-profit companies/agencies
 Federal government
 State or local government
 Foundations
 I'm not contracted by any other organizations to do evaluation-related work.
 Other; if other, please describe: [text box]
[skip to 8]
Your Evaluation Contract Work
7. In what types of organizations are you contracted to do your evaluation-related work?
(Check all that apply.)
 For-profit research, evaluation, and consulting firms
 Non-profit research, evaluation, and consulting firms
 Other for-profit companies/agencies
 Other non-profit companies/agencies
 Colleges/universities
 Federal government
 State or local government
 Foundations
 Other; if other, please describe: [text box]
Your Areas of Evaluation-Related Work
We want to develop a more detailed and accurate picture of the areas in which AEA
members do their evaluation-related work.
8. In which areas do you do your evaluation-related work? (Check all that apply.)
 Adult education
 Arts and culture
 Business and industry
 Child care/early childhood education
 Disaster/Emergency management
 Educational technologies
 Environmental programs
 Evaluation methods
 Evaluation theory
 Foundations
 Government
 Health/Public health
 Higher education
 Human development
 Human resources
 Human services
 Indigenous peoples
 Information systems
 International/Cross cultural
 K-12 education
 Law/Criminal justice
 Lesbian, gay, bisexual and transgender issues
 Media
 Medicine
 Non-profits
 Organizational behavior
 Public policy/Public administration
 Science, technology, engineering, math (STEM)
 Social work
 Special needs populations
 Workforce/Economic development
 Youth development
 Other; if other, please describe: [text box]
Your Academic Appointment
9. In what type of department do you hold your primary appointment? (Select only one.
Note that a response is available for more than one primary appointment, and you may use the
text box to describe.)
 Anthropology
 Business/management
 Computer science
 Economics
 Education
 Educational psychology
 Environmental science
 Evaluation
 Government
 Health/Public Health
 Human resources
 Human services
 International relations/International development
 Medicine
 Law/Criminal justice
 Organizational behavior
 Physical sciences
 Political science
 Psychology [branch to 13]
 Public policy/Public administration
 Social work
 Sociology
 Statistics
 More than one primary appointment
 Other; if other or more than one primary appointment, please describe: [text box]
10. What level of courses do you teach? (Check all that apply.)
 Undergraduate
 Graduate
11. Which best describes your primary appointment? (Select only one.)
 Tenured full professor
 Tenured or tenure-track associate professor
 Tenure-track assistant professor
 Nontenure track position
 Other; if other, please describe: [text box]
12. Is your position full-time or part-time? (Select only one.)
 Full-time
 Part-time
[back to 3]
Your Subfield in Psychology
13. What is your subfield within psychology? (Select only one.)
 Clinical psychology
 Cognitive psychology
 Counseling psychology
 Developmental psychology
 Educational psychology
 Health psychology
 Industrial and organizational psychology
 Quantitative psychology
 School psychology
 Social psychology
 Other; if other, please describe: [text box]
[back to 10]
Percent Time Devoted to Evaluation
14. In the last year, approximately what percent of your work and/or study was devoted to
evaluation-related work? (Select only one.)
 None
 Between 1% and 25%
 Between 26% and 50%
 Between 51% and 75%
 Between 76% and 99%
 All
15. Considering only your evaluation-related work and/or study in the last year,
approximately what percent of this work/study had a focus outside the United States?
(Select only one.)
 None
 Between 1% and 25%
 Between 26% and 50%
 Between 51% and 75%
 Between 76% and 99%
 All
Your Experience Conducting Evaluations
16. Considering only your evaluation-related work and/or study in the last year, did any of
the work/study involve actually conducting evaluations?
* By conducting evaluations we mean any role in designing and/or implementing
evaluations, including supervising evaluations.
 Yes
 No [skip to 21]
Your Experience Conducting Evaluations
Now, we'd like you to think more specifically about your experience conducting
evaluations.
17. Considering only your evaluation-related work and/or study in the last year,
approximately what percent of this work/study involved conducting evaluations, including
supervising evaluations? (Select only one.)
 None
 Between 1% and 25%
 Between 26% and 50%
 Between 51% and 75%
 Between 76% and 99%
 All
18. Which of the following best describes your typical role in conducting evaluations?
(Select only one.)
 Supervisor/director of evaluations (provide leadership/oversight but others primarily
carry out the evaluations)
 Manager or coordinator of evaluations (primarily carry out the day-to-day evaluation
activities)
 Assistant on evaluations (provide generalized support for the day-to-day evaluation
activities)
 Specialist or consultant on evaluations (provide expertise or fulfill a specialized function)
 I work primarily on my own to carry out all evaluation activities.
 I do not play a role in conducting evaluations.
 Other; if other, please describe: [text box]
19. Which of the following types of evaluations do you conduct? (Check all that apply.)
 Curricula evaluations
 Consumer evaluations
 Performance auditing/monitoring/reviewing
 Personnel evaluations
 Product evaluations
 Program evaluations
 Policy evaluations
 Evaluation of research
 Student/trainee evaluations
 I do not conduct evaluations.
 Other; if other, please describe: [text box]
[skip to 21]
Benefits of Your AEA Membership
20. Please take a moment to describe how AEA membership benefits you: [text box]
Your Academic Background
21. We also want to document the various academic backgrounds of AEA members. At
which levels have you received degrees, and in what fields? (Check all that apply, and
indicate field. Note that additional responses are available if you have more than one Masters
or Doctoral degree.)
 Bachelor's or equivalent completed degree
Indicate field; please select a field [see last page for list of fields used for each item]
If other, please describe: [text box]
 Second Bachelor's or equivalent completed degree
Indicate field; please select a field
If other, please describe: [text box]
 Master's or equivalent completed degree
Indicate field; please select a field
If other, please describe: [text box]
 Second Master's or equivalent completed degree
Indicate field; please select a field
If other, please describe: [text box]
 Doctorate or equivalent completed degree
Indicate field; please select a field
If other, please describe: [text box]
 Second Doctorate or equivalent completed degree
Indicate field; please select a field
If other, please describe: [text box]
22. Have you received specialized training in evaluation that led to a certificate of some
kind? (Select only one.)
 Yes
 Currently receiving training that will lead to certificate
 No [skip to 24]
Your Certificate in Evaluation
23. What is the full name of the program and/or institution from which you received, or
expect to receive, your certificate? [text box]
Your Affiliation with Professional Associations
Knowing more about your affiliation with professional associations other than AEA will
help us better understand your professional world and the professional context in which we
operate as an association.
24. To which other national or international professional associations do you currently
belong? (For each professional association to which you belong, please indicate if you have
ever been a Board or committee member in that association.)
                                                                Current   Board or
                                                                member    committee member
Academy of Human Resource Development (AHRD)                                
American Association for Public Opinion Research (AAPOR)                    
American Economic Association                                               
American Educational Research Association (AERA)                            
American Political Science Association (APSA)                               
American Psychological Association (APA)                                    
American Public Health Association (APHA)                                   
American Society for Training and Development (ASTD)                        
American Sociological Association (ASA)                                     
American Statistical Association (ASA)                                      
Association for Public Policy Analysis and Management (APPAM)               
National Legislative Program Evaluation Society (NLPES)                     
Society for Social Work and Research (SSWR)                                 
Other international evaluation association                                  
Other(s); if other(s), please list: [text box]                              
25. Which statement is most true for you? (Select only one.)
 AEA is the professional association with which I most strongly affiliate. [skip to 27]
 I most strongly affiliate with a professional association other than AEA.
 I do not strongly affiliate with any professional association. [skip to 27]
Your Strongest Affiliation
26. With which professional association do you most strongly affiliate? (Select only one.)
 Academy of Human Resource Development (AHRD)
 American Association for Public Opinion Research (AAPOR)
 American Economic Association
 American Educational Research Association (AERA)
 American Political Science Association (APSA)
 American Psychological Association (APA)
 American Public Health Association (APHA)
 American Society for Training and Development (ASTD)
 American Sociological Association (ASA)
 American Statistical Association (ASA)
 Association for Public Policy Analysis and Management (APPAM)
 National Legislative Program Evaluation Society (NLPES)
 Society for Social Work and Research (SSWR)
 Other; if other, please describe: [text box]
Your Satisfaction with AEA: Usefulness of Products, Services and Groups
AEA wants to make informed decisions about the services it offers its members. We want
to know how helpful AEA is to you now and how the association might support you in the
future.
27. Through AEA, members have access to a variety of products, services, and groups.
How useful have each of the following resources been to you in your evaluation-related
work and/or study? (Select only one answer for each item.)
                                                            Not    Haven't   Not     Somewhat  Very
                                                            aware  accessed  useful  useful    useful
Guiding Principles for Evaluators                                                     
Resources available through the AEA website                                           
AEA annual meeting                                                                     
Professional development workshops at the annual meeting                              
AEA/CDC Summer Evaluation Institute                                                    
AEA listserv, EVALTALK                                                                 
AEA electronic newsletter                                                              
Topical Interest Groups (TIGs)                                                         
AEA-recognized local or regional affiliate                                             
Your Satisfaction with AEA: Usefulness of Journals
28. AEA members receive subscriptions or electronic access to the four journals listed
below. How useful is the material in each journal to you in your evaluation-related work
and/or study? (Select only one answer for each item.)
                                          Not    Haven't   Not     Somewhat  Very
                                          aware  accessed  useful  useful    useful
American Journal of Evaluation                                       
New Directions for Evaluation                                        
Evaluation Review                                                    
Evaluation and the Health Professions                                
Desirability of Enhanced/New AEA Products and Services
29. If AEA were to invest resources in new products and services for its members, how
would you rate the desirability of each of the following products or services? (Select only
one answer for each item.)
                                                                    Not at all  Slightly   Moderately  Highly
                                                                    desirable   desirable  desirable   desirable
Training via web-based delivery that is pre-recorded                                        
Training via web-based delivery offered in real time                                        
Training via teleconferences                                                                
Expanded training opportunities offered live at the Annual Meeting                          
Expanded training opportunities offered live at the Summer Institute                        
New training opportunities offered live in your region                                      
Videotaped lectures/speeches                                                                
DVD/CD-ROM of training materials                                                            
Hardcopy self-study texts                                                                   
Online archive of evaluation materials (reports, instruments, etc.)                         
Journal targeted to practitioners                                                           
Evaluation blog                                                                             
Professional mentoring                                                                      
Updates on relevant public policy issues that affect the evaluation field                   
30. Are there any other products or services (enhanced or new) that you would like AEA to
offer members like you? [text box]
Your Background
Finally, we have just a few background questions.
31. How many years of experience do you have in the evaluation field? (Select only one.)
 Less than 2 years
 2-5 years
 6-10 years
 11-15 years
 16-20 years
 More than 20 years
32. How many total years have you been a member of AEA? (Select only one.)
 Less than 1 year
 1-2 years
 3-4 years
 5-6 years
 7-8 years
 9-10 years
 More than 10 years; if more than 10 years, how many years? [text box]
33. Are you:
 Male
 Female
34. What is your age range? (Select only one.)
 19 or younger
 I'm in my 20s
 I'm in my 30s
 I'm in my 40s
 I'm in my 50s
 I'm in my 60s
 I'm in my 70s
 I'm in my 80s
 I'm in my 90s
 Choose not to respond
35. Do you currently reside primarily in the United States?
 Yes
 No
Your Race/Ethnicity
36. The following categories are used by the U.S. federal government to collect data on
race and ethnicity. Which category best describes you, or are you best described as an
international member, or in some other way(s)? (Check all that apply, and/or use the "other"
box to write in another description.)
 American Indian or Alaska Native
 Asian
 Black or African American
 Hispanic or Latino
 Native Hawaiian or Other Pacific Islander
 White
 International member
 Choose not to respond
 Other; if other, please describe: [text box]
Your Final Comments
37. What else would you like us to know about your work and/or study, about AEA, or
about this survey that you have not had the opportunity to share? We value your input.
[text box]
Thank you!
We appreciate your taking the time to help AEA!
Confirmation and More Information
Your response has been submitted. Thank you again for your time in helping AEA!
For more information about AEA please visit: www.eval.org
For more information about Goodman Research Group, Inc. please visit: www.grginc.com
List of Fields Used in Academic Background Section
 Anthropology
 Business and management
 Child development
 Computer science
 Economics
 Education
 Educational psychology
 English
 Environmental science
 Evaluation
 Government
 Health/Public health
 Human development
 Human resources
 Human services
 Information systems
 International relations/international development
 Law/criminal justice
 Mathematics
 Medicine
 Organizational behavior
 Philosophy
 Physical science
 Political science
 Psychology
 Public policy/public administration
 Social work
 Sociology
 Statistics
 Other
APPENDIX C: INTERVIEW PROTOCOL
AEA MEMBER INTERVIEWER GUIDELINES
We are collecting qualitative data through brief but in-depth, open-ended interviews with AEA
members. The interview has 9 main questions and is designed to take 20 minutes.
Because of limited time, and because several people are conducting the interviews, the interview
is standardized. That said, some flexibility in probing and exploring certain topics in greater
depth is permitted.
Interview objectives:
 To understand the interviewee’s experiences in his/her evaluation world
o To learn more about the nature of their evaluation work
o To gain perspective on how they think and feel about their evaluation work
 To understand the interviewee’s perspective on AEA
o To gain perspective on why and in what ways members value certain AEA
resources (both existing resources and ideas for new/enhanced resources)
Basic Guidelines:
 Enjoy the interview!
 Stick to the questions.
 Use probes to get more depth/detail.
 Be supportive. Let the interviewee know how the interview is progressing.
 Observe while interviewing. Be sensitive to how the interviewee responds to the
questions.
 Maintain control of the interview.
 Take notes.
 Tape record.
 Elaborate on notes after the interview.
If circumstances are such that you have less than 20 minutes with an interviewee, or the
interview is going long, the critical questions to ask are 2, 3, 5, and 7.
AEA MEMBER INTERVIEW PROTOCOL
Interviewee Name:
Date and Time:
Introductory Remarks [2 minutes]
Thank you for taking the time to talk with me today.
The interview should take about 20 minutes to complete. If there are any questions you’d rather
not answer, please let me know and we’ll move on.
The interview will be confidential; we will not identify you by name in our report or in any
conversations with other people.
If you don’t mind, I’d like to tape record our conversation so I don’t miss anything. Nobody
outside the GRG research team will have access to the tape. (Get verbal consent.) Thank you.
I know from the database that you work at [organization name] and you’ve been working in the
field of evaluation for about [range from survey].
Experiences in Evaluation [10 minutes]
1. I’m interested in hearing about your pathway into evaluation. When and why did you
first consider evaluation as a professional activity? [3 minutes]
Probe for some detail:
- When was that?
- Where were you working or studying at that time?
2. Now I’d like to hear more about the work you do now. What is your primary
responsibility in evaluation or as an evaluator? [4 minutes]
Probe for some detail:
- If I shadowed you on a typical day at work, what kinds of evaluation “tasks” would I
observe you doing?
- Are there others in your organization also doing evaluation work, or do you work as a
sole evaluator?
3. What are the key ways you learn about recent developments or advances related to
your evaluation work? [3 minutes]
Probe for detail:
- What are the supports and/or limitations of your workplace in helping you do that?
- How about formal evaluation coursework? Conferences (AEA or other)?
Training/workshops (AEA or other)? Self-initiated study? Learning from others in the
evaluation field?
Perspective on AEA [3 minutes]
4. How long have you been a member of AEA?
5. People belong to AEA for a variety of reasons. What is the value of membership for
you? [3 minutes]
- Would you share an example or two of how you have made use of AEA products or
services? (Refer to the list of AEA resources, including the conference.)
 How did that go?
Future of Evaluation [4 minutes]
6. Do you see yourself in the field of evaluation in 5-10 years?
Probe:
- Where do you see yourself going as an evaluator?
7. I’m also interested in where you see the field of evaluation going. What do you think it
will be like in 10-15 years?
8. What advice would you give AEA leadership about how to develop the association for
that future?
Wrap Up [1 minute]
9. Is there anything else you’d like to share about your experiences in evaluation or about
AEA?
You said a lot of important things that were just the kinds of things we were looking for. Thank
you so much for your time. Enjoy the rest of the conference!
APPENDIX D: ONLINE Q&A GROUP PROTOCOL
Online Q&A Groups Protocol
Three different groups of AEA members will participate in three separate online Q&A groups:
1) New members (new both to AEA and to evaluation)
2) Independent contractors
3) Evaluators working in firms
Building on the data we’ve obtained thus far from the member survey and interviews, Q&A
groups will obtain additional targeted feedback about members’:
1) Professional identity in evaluation
2) Evaluation-related professional challenges and how they are addressed
3) Perceptions of how AEA is and can be involved in evaluation policy and the
advancement of high quality evaluations
Introductory information for group participants:
This group follows up on the recent AEA member survey and interviews with a sample of
members. We’re inviting responses from you to strengthen our understanding of AEA members’
experiences in evaluation, including your professional identity in evaluation and the
evaluation-related professional challenges you may face and how those are addressed. We also are
interested in your opinions about the role AEA may play in evaluation policy. We will post a few
different questions over the next five days and will ask you to respond to those questions as well
as to responses from others in the group.
Under separate cover, we are sending group participants detailed instructions for participation
in the groups and responses to frequently asked questions. We also will be telling participants
that they will participate in a group with other evaluators who, depending on the group, are new
to evaluation, are independent contractors, or work in firms.
Question Area 1: Professional Identity
First, we’re interested in learning more about members’ professional identities. Evaluators come
from a variety of educational paths and background training, and evaluation itself pulls in
content and methodology from various disciplines. Some AEA members identify themselves
primarily as evaluators, and some identify themselves in other ways. How do you refer to
yourself primarily – as an evaluator or in some other way?
Potential follow up questions:
In describing your work, do you differentiate “who you are” from “what you do?”
What background and disciplines do you draw upon in your evaluation work?
What are the characteristics/elements of your work that make that your primary identity?
What factors are necessary to identify yourself as an evaluator?
If not mentioned, we’ll probe specifically on the pros and cons of self-identifying as an
evaluator, and on the characteristics perceived as necessary for one to be called an evaluator.
Question Area 2: Professional Issues and Resources
What are the biggest challenges you face in your daily evaluation-related work, and what
information do you find yourself seeking out (or are you in need of) to help respond to those
challenges?
Potential follow-up questions:
- Are there certain issues or challenges that come up regularly?
- Where and how do you get those needs met?
- What kinds of resources could/should AEA offer that would help address those issues?
If not mentioned, we’ll probe specifically about issues such as: having the need for evaluation,
or the quality of one’s work, questioned; a lack of technical support or collegiality in the daily
work environment; client relations; getting direction from people who don’t understand
evaluation; etc.
Question Area 3: Evaluation Policy
As the field of evaluation expands to include a wide range of consumers, funding sources, and
decision-makers, how do you envision AEA’s involvement in conversations intended to maintain
and promote high quality evaluations?
Potential follow-up questions:
- What do decision makers need to know about evaluation?
- How should AEA involve members and their expertise in these conversations?
- How should AEA convey progress and the status of such conversations to the
membership?
- How might that affect (help) you in your continued evaluation work?
If not mentioned, we’ll probe more specifically about whether participants believe AEA should
influence evaluation policy, as well as whether and how it already does.
APPENDIX E: METHODS
INTERNAL SCAN METHODS
This appendix provides a detailed description of the internal scan methods. To enable its use as a
standalone document, we repeat some information provided in the methods section of the report.
The scan was a descriptive study, meant to characterize the AEA membership, the nature of members’ work,
and their experiences with AEA. The scan included three components (described below) that gathered both
quantitative and qualitative data.
Web-Based Survey of the AEA Membership
Sampling Plan
The AEA Member Survey was conducted with all members, including U.S. and international members. AEA
and GRG believed it was important to survey the full membership, rather than a sample, in order to emphasize
the value AEA places on each member’s input (noted in the AEA RFP for the internal scan). All members in
the AEA membership database (provided to GRG by the AEA office within days of the survey launch) were
sent an email invitation to the survey. Ten members without email addresses received telephone invitations to
take the survey.
Description of the Survey
GRG and the AEA task force co-developed the member survey and GRG pilot tested the survey with AEA
committee members and a small purposive sample of other members. The survey consisted of 28 distinct web
pages, including a welcome page and thank you/confirmation page, and featured a number of branching
patterns. The survey primarily consisted of close-ended questions but also included three opportunities for
open-ended comments.
Steps to Minimize Nonresponse
We took a number of steps to reduce nonresponse and nonresponse bias:
 We sent three reminder emails to nonrespondents and worked with the AEA office to send an
encouragement email on behalf of the AEA President.
 We personalized our email invitations, using members’ first names (e.g., “Dear Jane”). (A
sketch of this kind of personalized merge follows this list.)
 Our email invitation identified our firm and the purpose of the study, the survey’s benefits to
AEA (and salience to them as members), the length of the survey, and a statement of
confidentiality. The invitation included a link to the survey and a unique respondent ID
number. Except for the ID number, the invitations were exactly the same for all members.
 We pilot tested the survey to improve the experience of taking it and to reduce measurement
error, revising question wording as appropriate. Note:
o We do not believe there were any major technological incompatibilities in accessing the
survey. We had only a handful of communications from members who had issues accessing
the survey.
o The first survey question was of obvious interest to many respondents but may have been
associated with some nonresponse, as it was an open-ended question that was not as easily
answered as a close-ended one. We had one nonrespondent write to tell us this was the
reason for their nonresponse.
 The survey was aesthetically pleasing and professional looking. The survey form incorporated
the AEA logo and design elements in AEA’s color. It also included a progress bar on each page
so that respondents always knew how close to the end of the survey they were. The survey was
designed and administered using Remark Web Survey Software 3, Professional version (for more
information visit http://www.gravic.com/remark/websurvey).
 The survey required only moderate navigational controls. Some pages required respondents to
scroll down to complete questions; however, the survey was designed so that respondents could
see all answer choices to individual questions without having to scroll up and down. On each
page, the respondent had to click a button to advance to the next question. Note:
o For each drop-down menu, there were a small number of respondents who did not follow the
visible instruction to “Please select an answer.”
 The only required question was the respondent’s survey ID number.
 The survey included branching, so that respondents skipped or received only those follow-up
questions we believed were relevant to them.
 We offered an incentive: all respondents were included in a drawing for ten $50.00 online gift
certificates to Amazon.com.
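For illustration only, the kind of personalized mail merge described above can be sketched in a few
lines of Python; the file name, column names, and template below are hypothetical stand-ins, not
GRG’s actual tooling:

    # Hypothetical sketch: build personalized invitations from a CSV
    # with columns first_name, email, respondent_id (assumed names).
    import csv

    TEMPLATE = (
        "Dear {first_name},\n\n"
        "You are invited to take the AEA member survey:\n"
        "{survey_url}?id={respondent_id}\n"
    )

    def build_invitations(path, survey_url):
        invitations = []
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                body = TEMPLATE.format(survey_url=survey_url, **row)
                invitations.append((row["email"], body))
        return invitations

Each message differs only in the greeting and the unique ID, mirroring the invitations described
above.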
Response Rate and Steps to Understand Nonresponse Bias
The survey launched on September 17 and closed on October 10, 2007. During those three weeks, a
total of four reminders were sent to nonrespondents, including an encouragement email on behalf of
the AEA President. A total of 5,460 surveys were distributed and we received valid responses from
2,657 members, yielding a response rate of 49%. (The number of respondents who began but did not
complete the survey is unknown.) Table D-1 provides the daily and cumulative response to the
survey; spikes in daily response correspond to the dates of reminder emails. (A worked check of
the table’s arithmetic follows it.)
Table D-1
AEA Member Survey Response by Date

Date          # Respondents   Cumulative Response   Response Rate
9/17/2007     669             669                   12.25%
9/18/2007     234             903                   16.54%
9/19/2007     73              976                   17.88%
9/20/2007     58              1034                  18.94%
9/21/2007     33              1067                  19.54%
9/22/2007     9               1076                  19.71%
9/23/2007     18              1094                  20.04%
9/24/2007     459             1553                  28.44%
9/25/2007     95              1648                  30.18%
9/26/2007     37              1685                  30.86%
9/27/2007     15              1700                  31.14%
9/28/2007     14              1714                  31.39%
9/29/2007     4               1718                  31.47%
9/30/2007     6               1724                  31.58%
10/1/2007     274             1998                  36.59%
10/2/2007     143             2141                  39.21%
10/3/2007     87              2228                  40.81%
10/4/2007     32              2260                  41.39%
10/5/2007     23              2283                  41.81%
10/6/2007     10              2293                  42.00%
10/7/2007     6               2299                  42.11%
10/8/2007     12              2311                  42.33%
10/9/2007     268             2579                  47.23%
10/10/2007    78              2657                  48.66%
Total         2657            2657                  49%
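The arithmetic behind Table D-1 is easy to verify. A minimal sketch (Python, used here purely as a
calculator) reproduces the cumulative response rates from the daily counts:

    # Recompute cumulative response rates from Table D-1's daily counts.
    daily_counts = [669, 234, 73, 58, 33, 9, 18, 459]  # first eight days
    invited = 5460  # total surveys distributed

    cumulative = 0
    for count in daily_counts:
        cumulative += count
        print(f"{cumulative} responses, rate {cumulative / invited:.2%}")
    # First line: 669/5,460 = 12.25%; carried through all 24 days, the
    # series ends at 2,657/5,460 = 48.66%, reported as 49% after rounding.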
The 2007 member survey response rate is slightly higher than the response rates of the two prior
AEA member surveys of which we are aware: the 2001 AEA member survey (44%)[13] and the 2004 AEA
Independent Consulting TIG member survey (37%)[14]. The response rate is also higher than the
response rates of a few other professional association member surveys we found in a cursory search
of relevant professional association web sites: the 2003 APSA international membership survey
(38%)[15], the 2007 APHA Community Health Planning and Policy Development Section member survey
(12%)[16], and the 2008 APHA Statistics Section member survey (29%)[17]. Finally, in a
meta-analysis exploring factors associated with higher response rates in electronic surveys, Cook
et al. (2000) reported that the mean response rate for the 68 surveys reported in 49 studies was
39.6% (SD=19.6%).[18] The studies included in this meta-analysis included those published in Public
Opinion Quarterly, Journal of Marketing Research, and American Sociological Review, as well as
unpublished research.
In this context, we believe the response rate achieved in this survey is good. Nonetheless, half of the
membership did not respond, and therefore the possibility of nonresponse bias cannot be overlooked. We
took three steps to explore the possibility of nonresponse bias: 1) we conducted a nonrespondent bias survey,
2) we investigated differences between earlier and later responders, and 3) we compared the respondents to
known data for the AEA membership.
Comparing earlier and later respondents to the member survey, we found that earlier respondents
were more likely to be White and somewhat more likely than later respondents to be longer-term
members of AEA. Our comparison of respondent demographics with known membership data also suggests
that respondents may be skewed slightly toward intermediate-term members, with slightly lower
proportional representation of brand-new members. However, the comparison suggests the member
survey respondents were proportionally representative of the entire membership in terms of race,
gender, and US/international status.
Nonrespondent Bias Survey
After the survey closed, GRG conducted a non-respondent bias survey (also web-based) with a random
sample of 200 non-respondents to investigate differences between respondents and non-respondents; 52
members responded, for a response rate of 26%. Survey non-respondents and respondents were compared to
each other in five key areas: professional identity (see Table D-2), primary employment in evaluation (see
Table D-3), education (see Table D-4), affiliation with AEA (see Table D-5) and years of experience in the
evaluation field (see Table D-6).
We recognize that the response rate to the nonrespondent bias survey is low and thus we are limited
in generalizing our findings to all nonrespondents. Nevertheless, the results of the survey suggest
that stronger affiliation with a professional association other than AEA may have been a factor in
nonresponse. This is not altogether surprising, as other studies have linked the salience of a
survey’s topic to response rates.[19] Only slight differences were found in the other areas of
comparison, and the majority of those who completed the nonrespondent bias survey were highly
satisfied with AEA (see Table D-7). A worked sketch of one such respondent/nonrespondent
comparison follows Table D-7.
[13] Unpublished data from the AEA office.
[14] Jarosewich, T., Essenmacher, V. L., Lynch, C. O., Williams, J. E., & Doino-Ingersoll, J.
(2006). Independent consulting topical interest group: 2004 industry survey. New Directions for
Evaluation, 111, 9-21.
[15] Retrieved from http://www.apsanet.org/imgtest/survey.pdf
[16] Retrieved from http://www.apha.org/NR/rdonlyres/01EB89FB-FEF6-4E8F-A7F9-5F1E0DE2CF61/0/chppd_2007_2.pdf
[17] Retrieved from http://www.apha.org/NR/rdonlyres/9983F55B-B29A-465C-AFA6-269272210411/0/StatSurveyHighlights08_Website.pdf
[18] Cook, C., Heath, F., & Thompson, R. L. (2000). A meta-analysis of response rates in web- or
Internet-based surveys. Educational and Psychological Measurement, 60(6), 821-836.
[19] Sheehan, K., & McMillan, S. (1999). Response variation in e-mail surveys: An exploration.
Journal of Advertising Research, 39, 45-54.
Table D-2
Primary Professional Identity of Respondents and Nonrespondents

Primary Professional Identity                          Respondents   Nonrespondents
                                                       (n=2,655)     (n=52)
Evaluator (in any capacity)                            49%           44%
College or university faculty member or instructor    15%           19%
Researcher                                             14%           10%
Student involved in evaluation (paid or unpaid)        7%            6%
Trainer                                                1%            4%
Retired but still active in the evaluation field      1%            2%
Unemployed or currently seeking employment             1%            2%
Retired and no longer active in the evaluation field   <1%           0%
Other                                                  11%           14%

Table D-3
Primary Employment of Respondents and Nonrespondents

Primary Employment                                         Respondents   Nonrespondents
                                                           (n=2,649)     (n=52)
Employee of a college/university                           29%           29%
Employee of a research, evaluation and/or consulting firm  19%           27%
Self-employed independent contractor                       16%           15%
Employee of a local, state, or federal government          12%           10%
Non-profit organization                                    7%            4%
Student involved in evaluation (paid or unpaid)            6%            4%
Other                                                      11%           11%
Table D-4
Educational Level of Respondents and Nonrespondents

Highest Level of Education   Respondents   Nonrespondents
                             (n=2,537)     (n=52)
Doctorate                    52%           52%
Master’s                     42%           39%
Bachelor’s                   7%            8%
Other                        0%            2%
Table D-5
AEA Affiliation of Respondents and Nonrespondents

Affiliation with AEA                                          Respondents   Nonrespondents
                                                              (n=2,602)     (n=52)
Affiliate most strongly with AEA                              45%           35%
Affiliate most strongly with other professional association   30%           48%
No strong affiliation with any professional association       25%           17%
Table D-6
Evaluation Experience of Respondents and Nonrespondents

Experience in Evaluation   Respondents   Nonrespondents
                           (n=2,652)     (n=50)
Less than 5 years          33%           38%
6-10 years                 24%           20%
11-15 years                16%           12%
16 or more years           27%           30%
Table D-7
Nonrespondent Satisfaction with AEA Products, Services, and Benefits

Satisfaction Level        Nonrespondents (n=50)
Extremely satisfied       10%
Very satisfied            58%
Somewhat satisfied        28%
Only a little satisfied   4%
Not at all satisfied      0%
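As an illustration of how such comparisons can be tested, the sketch below applies a chi-square
goodness-of-fit test to the affiliation distribution in Table D-5, with nonrespondent counts
reconstructed approximately from the reported percentages. This is a hedged example in Python with
scipy, not the analysis GRG actually ran:

    # Nonrespondent affiliation counts reconstructed from Table D-5
    # (n=52): AEA 35%, other association 48%, no strong affiliation 17%.
    from scipy.stats import chisquare

    observed = [18, 25, 9]                 # approximate counts, sum = 52
    respondent_props = [0.45, 0.30, 0.25]  # respondent distribution
    expected = [p * sum(observed) for p in respondent_props]

    stat, p_value = chisquare(observed, f_exp=expected)
    print(f"chi-square = {stat:.2f}, p = {p_value:.3f}")

On these reconstructed counts the difference is significant at p<.05, consistent with the
interpretation that stronger affiliation elsewhere may have contributed to nonresponse.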
Differences in Earlier and Later Respondents to the Member Survey
We compared respondents on a number of characteristics by time of response (i.e., whether they
responded without a reminder, after one reminder, after two reminders, or after three reminders). There
were no differences by gender, age, highest degree, experience in evaluation, primary professional
identity in evaluation, or strength of affiliation with AEA. Earlier respondents were more likely to be
White (see Table D-8) and were somewhat more likely than later respondents to be longer-term members
of AEA (see Table D-9).
Table D-8
Response by Race/Ethnicity

Race/Ethnicity                              No reminder   After 1 reminder   After 2 reminders   After 3 reminders   Total
White                                       80%           74%                70%                 61%                 74%
Black or African American                   4%            8%                 8%                  11%                 7%
Asian                                       4%            5%                 7%                  7%                  5%
Latino/Hispanic                             3%            3%                 3%                  7%                  3%
American Indian or Alaskan Native           1%            1%                 2%                  2%                  1%
Native Hawaiian or Other Pacific Islander   <1%           <3%                <1%                 <1%                 <1%
Biracial/multiracial                        2%            3%                 2%                  4%                  3%
International                               4%            5%                 7%                  7%                  5%
Other                                       1%            2%                 2%                  2%                  2%

Table D-9
Response by Years of AEA Membership

Years of Membership   No reminder   After 1 reminder   After 2 reminders   After 3 reminders   Total
Less than 1 year      20%           21%                24%                 18%                 21%
1-4 years             41%           45%                46%                 46%                 44%
5+ years              40%           34%                30%                 36%                 36%
Comparison of Respondents to Known Data for AEA Membership
We compared demographic data on our respondents to known data for the membership. The membership
data is from Winter 2008. This investigation suggests the member survey respondents were proportionally
representative of the entire membership in terms of gender, race, and US/international status. The percentage
of brand new members (less than one year) responding to the survey appears slightly lower than the
population statistic, while the percentage of intermediate-term members (1-4 years) appears slightly higher;
however, this may have to do with how respondents defined and self-reported “less than one year” and “1-4
years.” See Table D-10.
Table D-10
Gender, Race, and Country of Respondents and AEA Membership

Category        Group                                        Respondents   Membership
Gender          Female                                       67%           66%
                Male                                         33%           34%
Race            White                                        73%           77%
                Black or African American                    7%            8%
                Asian                                        5%            5%
                Latino or Hispanic                           5%            4%
                Native Hawaiian or Other Pacific Islander    <1%           <1%
Country         USA                                          86%           87%
                Non-US                                       14%           13%
Membership(a)   Less than 1 year                             21%           26%
                1-4 years                                    44%           38%
                5+ years                                     36%           36%

(a) The AEA data on membership were run on April 27, 2007; “less than one year” was defined as
joined on or before April 27, 2007 and “1-4 years” was defined as joined on or before April 27,
2003. On the member survey, respondents self-reported their years of membership; therefore, the
possibility of measurement error cannot be discounted.
Survey Analysis
Survey data were imported into SPSS, where analyses included frequencies, crosstabs (with appropriate
statistical tests), and nonparametric statistical tests. Where we comment on group differences in the report,
they are statistically significant at the p<.05 level.
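The analyses themselves were run in SPSS, whose syntax is not reproduced here. As a rough
stand-in, the sketch below shows the kind of crosstab chi-square test implied above, in Python with
scipy and made-up counts:

    # Illustrative crosstab (contingency table) test at the p<.05 level.
    import numpy as np
    from scipy.stats import chi2_contingency

    # Rows: two hypothetical member groups; columns: satisfied / not.
    table = np.array([[120, 30],
                      [ 90, 60]])

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
    print("significant at p < .05" if p < 0.05 else "not significant")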
INTERVIEWS
In order to enhance the findings from the member survey, GRG conducted follow-up interviews with a
sample of 56 AEA members who responded to the survey. GRG and the AEA task force co-developed the
interview protocol and then GRG pilot tested the protocol with a small purposive sample of members.
Approximately half of the interviews were completed in person at the annual AEA conference in
Baltimore in November 2007, and the other half were completed by telephone in December 2007 and
January 2008.
The interview sampling plan was a stratified random sample by evaluator type and experience. In
order to have a sufficient number of people within each stratum to inform our areas of inquiry, we
used an equal allocation sampling strategy (sketched below). One exclusion criterion for the
interview selection was affiliation with the AEA Board or committees. We also used quotas to limit
the number of international and Beltway-area interviewees (for the in-person interviews in
Baltimore).
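A minimal sketch of equal-allocation stratified sampling with the exclusion criterion above, in
Python; the member records and field names are hypothetical:

    import random

    def equal_allocation_sample(members, per_cell=4, seed=2007):
        """Draw up to per_cell members at random from each
        evaluator-type x experience-band cell."""
        rng = random.Random(seed)
        cells = {}
        for m in members:
            if m.get("aea_board_or_committee"):  # exclusion criterion
                continue
            key = (m["evaluator_type"], m["experience_band"])
            cells.setdefault(key, []).append(m)
        sample = []
        for cell_members in cells.values():
            k = min(per_cell, len(cell_members))
            sample.extend(rng.sample(cell_members, k))
        return sample

With four evaluator types and four experience bands, per_cell=4 yields the 64-interview plan shown
in Table D-11.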
Because of the distribution of the other variables that were of interest but not included in the
stratification plan, the plan resulted in a sample that included evaluators with Master’s degrees
and those with doctorates, evaluators who take on different roles in conducting evaluations, and
evaluators who practice in different substantive areas, including the most common areas of
health/public health and K-12 education. The tables that follow show the sampling plan (Table
D-11), the breakdown of sample strata for the member survey respondent population (Table D-12),
and the number of completed interviews per stratum (Table D-13).
Table D-11
Internal Scan Sampling Plan for Interviews

Experience        Evaluators in    Independent     Evaluators in   Evaluators in   Total
                  research firms   contractor      universities    government
                                   evaluators
5 or fewer yrs.   n=4              n=4             n=4             n=4             n=16
6-10 yrs.         n=4              n=4             n=4             n=4             n=16
11-15 yrs.        n=4              n=4             n=4             n=4             n=16
16 or more yrs.   n=4              n=4             n=4             n=4             n=16
Total             n=16             n=16            n=16            n=16            n=64

Table D-12
Type of Evaluator by Evaluation Experience in Member Survey Respondents

Experience        Evaluators in    Independent     Evaluators in   Evaluators in   Total
                  research firms   contractor      universities    government
                                   evaluators
5 or fewer yrs.   n=87             n=40            n=82            n=55            n=264
6-10 yrs.         n=83             n=70            n=56            n=51            n=260
11-15 yrs.        n=59             n=64            n=37            n=26            n=186
16 or more yrs.   n=95             n=112           n=59            n=51            n=317
Total             n=324            n=286           n=234           n=183           n=1,027

Table D-13
Completed Interviews by Strata

Experience        Evaluators in    Independent     Evaluators in   Evaluators in   Total
                  research firms   contractor      universities    government
                                   evaluators
5 or fewer yrs.   n=5              n=4             n=3             n=4             n=16
6-10 yrs.         n=1              n=4             n=3             n=3             n=11
11-15 yrs.        n=3              n=3             n=4             n=1             n=11
16 or more yrs.   n=5              n=5             n=4             n=4             n=18
Total             n=14             n=16            n=14            n=12            n=56
ONLINE Q&A GROUPS
To further explore themes of interest arising from the scan, GRG conducted three online Q&A groups, one
group with new evaluators, one with moderately experienced evaluators in firms (i.e., with 6-10 years of
experience in evaluation), and one with experienced independent contractor evaluators (i.e., 11-15 years of
experience). We explored the same three topics in each group: professional identity in evaluation,
evaluation-related professional challenges, and AEA’s role in evaluation policy. We used a semi-structured
protocol, developed in consultation with the task force.
We assigned every member in each of the strata a random number and initially invited the first 20
from each group (the maximum number of participants we desired per group). (Our exclusion criteria
were being in an AEA leadership position and having participated in an internal scan interview.)
As we received declinations (or no response), we invited the next member from our random-number
list. Eventually we exhausted that list, so, ultimately, every member of the strata had received
an invitation; the online Q&A group participants should therefore be viewed as a self-selected
sample.
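The invitation procedure amounts to working down a randomly ordered list within each stratum; a
short sketch in Python, with hypothetical data structures:

    import random

    def invitation_order(stratum_members, seed=42):
        """Return the stratum in a random order; invite the first 20,
        then continue down the list as declinations accumulate."""
        order = list(stratum_members)
        random.Random(seed).shuffle(order)
        return order

Shuffling the whole stratum once up front is equivalent to drawing from the random-number table
described above.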
We conducted the bulletin board style forums asynchronously over one week using vBulletin (for more
information visit http://www.vbulletin.com). Participants registered for the forums and received instructions
for participation (including answers to frequently asked questions) in advance. A Senior Research Associate
from GRG monitored all three forums, posting one kick-off question for each topic and one follow-up
question reflecting on the responses and encouraging more response.
Table D-14 below shows the levels of participation in the groups. On average, each participant
posted three responses over the course of the week. Most posts were responses to the questions
posed by the GRG monitor, although there were a few instances of participants commenting on each
other’s responses or responding directly to a question posed by another participant. As the table
shows, the level of response decreased over the Q&A period.
Table D-14
Participation in AEA Internal Scan Online Q&A Groups

Forum       # participants/total   Total # posts   # posts on professional identity   # posts on challenges   # posts on policy
New         n=13/14                n=44            n=25                                n=10                    n=9
6-10 yrs.   n=10/14                n=28            n=12                                n=11                    n=4
11-15 yrs.  n=9/11                 n=31            n=13                                n=11                    n=7
Total       n=32/39                n=103           n=50                                n=32                    n=20
The qualitative data from the survey, interviews, and Q&A groups were analyzed inductively,
allowing for emergent themes. We analyzed the qualitative data in three phases, as we completed
the survey, interviews, and Q&A groups, respectively; our approach was to analyze the data by
question.