NOMINATION COVER PAGE AND INDEX

This document, which serves as a cover page, and the items indexed in Part B below, comprise the
nomination packet.
A-I. NOMINEE INFORMATION
Name: Stewart I. Donaldson
Title: Dean & Professor
Affiliation: Claremont Graduate University
Email: stewart.donaldson@cgu.edu
Phone: (909) 607-9001
A-II. NOMINATOR INFORMATION (If different from the nominee)
Name: Michael Quinn Patton
Title: Founder & Director
Affiliation: Utilization-Focused Evaluation
Email: mqpatton@prodigy.net
Phone: (612) 590-2919
PART B: INDEX OF SUBMITTED MATERIALS
All items in the nomination packet should be compiled in the order listed below with this document as the first page.
B-I. NOMINATION JUSTIFICATION STATEMENT
B-II. LETTERS OF SUPPORT
(Please list. Maximum of five letters)
     Letter Written By        Submission Status
1.   MELVIN MARK              Included in this packet [X]
2.   MICHAEL SCRIVEN          Included in this packet [X]
3.   HUEY CHEN                Included in this packet [X]
4.   CHRISTINA CHRISTIE       Included in this packet [X]
5.   DALE BERGER              Included in this packet [X]
B-III. SUPPORTING DOCUMENTATION
i. Vita or Resume of the Nominee (Maximum of ten pages)
ii. Reports/Publications (List report titles in the order that they appear in the nomination packet.
Maximum of three reports/publications)
1. Donaldson, S. I., & Lipsey, M.W. (2006). Roles for theory in contemporary evaluation practice:
Developing practical knowledge. In I. Shaw, J. Greene, & M. Mark (Eds.), The Handbook of Evaluation:
Policies, Programs, and Practices (pp. 56-75). London: Sage. (More than 50 citations, Google Scholar)
2. Donaldson, S. I., Gooler, L. E., & Scriven, M. (2002). Strategies for managing evaluation anxiety: Toward
a psychology of program evaluation. American Journal of Evaluation, 23(3), 261-273. (Consistently in
top 20 most-cited AJE articles: http://aje.sagepub.com/reports/most-cited)
3. Donaldson, S. I., Graham, J. W., & Hansen, W. B. (1994). Testing the generalizability of intervening
mechanism theories: Understanding the effects of school-based substance use prevention interventions.
Journal of Behavioral Medicine, 17, 195-216. (Approximately 200 citations, Google Scholar).
B-I. NOMINATION JUSTIFICATION STATEMENT
Prepared and submitted by Michael Quinn Patton
It is a pleasure to nominate Stewart I. Donaldson for the American Evaluation Association’s Paul F. Lazarsfeld
Award for contributions to evaluation theory. Probably the most familiar of Stewart’s contributions to
evaluation theory came in his 2003, 2007, 2008, 2011, and 2013 evaluation books, a 2011 issue of New
Directions for Evaluation (NDE), and a number of subsequent chapters and articles on theory-driven
evaluations, improving evaluation evidence and conclusions, exploring the intersections of applied psychology
and evaluation, emerging practices in international development evaluation, and future directions in evaluation.
Stewart has contributed to evaluation theory over a long period of time, in important ways, with publications
and presentations on a wide array of topics.
Theory-driven evaluations
Donaldson has spent the past two decades evolving, expanding, and improving theory-driven evaluations.
Originally influenced by the writings of Rossi, Chen, and Weiss, he has developed a contemporary approach he
refers to as Program Theory-Driven Evaluation Science. Theory-driven evaluation was originally put forth as a way to integrate the hard-won lessons of evaluation practice over several decades. It was an attempt to incorporate the strengths of other evaluation approaches in an effort to synthesize and integrate the field. Simply
stated, Donaldson’s contribution and advancement of theory-driven evaluation involves using a conceptual
framework of the evaluand in its context (sometimes described as program theory, theories of change, logic
models, systems frameworks or the like) for tailoring specific evaluation designs and methods to answer the key
theory-based evaluation questions. Donaldson has described the contemporary version of this approach in detail
and has provided very concrete examples of program theory driven evaluations in practice in his influential and
widely cited book:
Donaldson, S. I. (2007). Program theory-driven evaluation science: Strategies and
applications. Mahwah, NJ: Erlbaum.
This book endures as one of the most significant contributions to theory-driven evaluation in the last decade. It is
particularly effective in addressing the integration of theory and practice, and providing concrete evaluation
examples of how such integration enhances both the quality and utility of evaluations. Moreover, Donaldson
makes a compelling case that the integration of theory and practice is what makes evaluation a scientific endeavor.
Leading up to and laying the foundation for his major theory-driven evaluation science book were several
important articles that address the theory-practice connection.
 Donaldson, S. I., & Lipsey, M.W. (2006). Roles for theory in contemporary evaluation practice:
Developing practical knowledge. In I. Shaw, J. Greene, & M. Mark (Eds.), The Handbook of
Evaluation: Policies, Programs, and Practices (pp. 56-75). London: Sage.
 Donaldson, S.I. (2005). Using program theory-driven evaluation science to crack the Da Vinci
Code. New Directions for Evaluation, 106, 65-84.
 Donaldson, S. I., & Gooler, L. E. (2003). Theory-driven evaluation in action: Lessons from a $20
million statewide work and health initiative. Evaluation and Program Planning, 26, 355-366.
 Donaldson, S. I., & Gooler, L. E. (2002). Theory-driven evaluation of the Work and Health
Initiative: A focus on Winning New Jobs. American Journal of Evaluation, 23(3), 341-346.
 Fitzpatrick, J. (2002). Dialog with Stewart Donaldson (about the theory-driven evaluation of the
Work and Health Initiative). American Journal of Evaluation, 23(3), 347-365. (Interview in
Exemplars Section).
Social psychological theory and evaluation practice
Throughout his writings, Donaldson has demonstrated the meaningfulness and fundamental wisdom of Kurt
Lewin’s proposition that "Nothing is as practical as a good theory." Kurt Lewin, as the founder of modern social
and organizational psychology, pioneered attention to the theory-practice connection. Donaldson has brought that
perspective to evaluation and, in doing so, has contributed significantly to our understanding of the integral nature
of the theory-practice linkage. Important works in this regard include:
 Mark, M., Donaldson, S. I., & Campbell, B. (Eds.) (2011). Social psychology and evaluation.
New York, NY: Guilford.
 Donaldson, S. I., & Crano, W. C. (2011). Theory-driven evaluation science and applied social
psychology: Exploring the intersection. In M. M. Mark, S. I. Donaldson, & B. Campbell
(Eds.), Social psychology and evaluation. New York, NY: Guilford.
Later in this statement I’ll return to the linkage Donaldson has forged between social psychology and evaluation.
First, I’d like to highlight other ways in which he has connected theory and practice in his evaluation contributions.
Applying theory in evaluation design and practice
Donaldson has published many empirical evaluations and conceptual frameworks to illustrate the value of theory-driven evaluations for advancing substantive knowledge in various domains. Examples of these publications
include:

Donaldson, S. I., Graham, J. W., & Hansen, W. B. (1994). Testing the generalizability of intervening
mechanism theories: Understanding the effects of school-based substance use prevention interventions.
Journal of Behavioral Medicine, 17, 195-216.

Donaldson, S. I., Graham, J. W., Piccinin, A. M., & Hansen, W. B. (1995). Resistance-skills training and
onset of alcohol use: Evidence for beneficial and potentially harmful effects in public schools and in private
Catholic schools. Health Psychology, 14, 291-300.

Donaldson, S. I. (1995). Worksite health promotion: A theory-driven, empirically based perspective. In L.
R. Murphy, J. J. Hurrell, S. L. Sauter, & G. P. Keita (Eds.). Job stress interventions (pp. 73-90).
Washington, DC: American Psychological Association.

Donaldson, S. I., Sussman, S., MacKinnon, D. P., Severson, H. H., Glyn, T., Murray, D. M., & Stone, E. J.
(1996). Drug abuse prevention programming: Do we know what content works? American Behavioral
Scientist, 39, 868-883.

Donaldson, S. I., Thomas, C. W., Graham, J. W., Au, J., & Hansen, W. B. (2000). Verifying drug
prevention program effects using reciprocal best friend reports. Journal of Behavioral Medicine, 23(6),
221-234.

Donaldson, S. I. (2002). High-potential mediators of drug-abuse prevention program effects. In W. D.
Crano & Burgoon, M. (Eds.), Mass media and drug prevention: Classic and contemporary theories and
research (pp. 215-230). Mahwah, NJ: Erlbaum.

Donaldson, S.I. (2004). Using professional evaluation to improve the effectiveness of nonprofit
organizations. In R.E. Riggio & S. Smith Orr (Eds.), Improving leadership in nonprofit organizations. San
Francisco, CA: Jossey-Bass.
Theory, methodology, and high quality evidence:
Toward improving evaluation evidence & conclusions
A consistent theme in Donaldson’s writings is the role of theory in guiding methodological decision-making and
strengthening evidence-based conclusions. Rather than treating theory as an issue unto itself, he is especially
skilled and astute at connecting theory to broad issues, like evidence-informed practice. Examples of his
writings that address improving evaluation evidence and conclusions include:
 Donaldson, S. I., Christie, C. A., & Mark, M. (Eds.) (2008). What counts as credible evidence in
applied research and evaluation practice? Newbury Park, CA: Sage.
 Chen, H.T., Donaldson, S. I., & M. M. Mark (Eds.) (2011). Advancing validity in outcome evaluation:
Theory & practice. New Directions for Evaluation. San Francisco, Jossey-Bass.
One of the central challenges in all of Donaldson’s practical evaluation work has been gathering credible and
actionable evidence that supports and leads to accurate evaluative conclusions. His passion for improving the
methods we use to gather evidence is evident throughout a large portion of his written contributions to evaluation
theory and practice. For example, differential attrition was a big problem Donaldson faced when analyzing data
from the large prevention trials described above. Working closely with John Graham and other colleagues, he
conducted a series of studies and published two influential articles to illustrate how to deal with the common
problems of missing data and differential attrition in order to arrive at sound evaluative conclusions.
 Graham, J. W., & Donaldson, S. I. (1993). Evaluating interventions with differential attrition: The
importance of nonresponse mechanisms and use of follow-up data. Journal of Applied Psychology, 78,
119-128.
 Graham, J. W., Hofer, S. M., Donaldson, S. I., MacKinnon, D. P., & Schafer, J. L. (1997). Analysis with
missing data in prevention research. In K. J. Bryant, M. Windle, & S. G. West (Eds.), The science of
prevention: Methodological advances from alcohol and substance abuse research (pp. 325-366).
Washington, DC: APA.
Two other very serious problems in some areas of evaluation research are self-report and mono-method bias.
With funding from both the National Institute of Mental Health and the National Institute on Alcohol Abuse and
Alcoholism, Donaldson conducted several studies to shed light on how to prevent and deal with self-report and
mono-method bias in organizational and evaluation research. In his 2002 article, he published a conceptual framework based on sound empirical evidence that has become particularly influential (approximately 220 citations in Google Scholar) for understanding this challenge to collecting credible evidence.

Donaldson, S. I., & Grant-Vallone, E. J. (2002). Understanding self-report bias in organizational behavior
research. Journal of Business and Psychology, 17(2), 245-262.

Graham, J. W., Collins, N. L., Donaldson, S. I., & Hansen, W. B. (1993). Understanding and controlling
for response bias: Confirmatory factor analysis of multitrait-multimethod data. In R. Steyer, K. F.
Wender, & K. F. Widaman (Eds.), Psychometric methodology (pp. 585-590). Stuttgart and New York: Gustav Fischer Verlag.

Mersman, J. L., & Donaldson, S. I. (2000). Factors affecting the convergence of self-peer ratings on
contextual and task performance. Human Performance, 13(3), 299-322.

Donaldson, S. I., Thomas, C. W., Graham, J. W., Au, J., & Hansen, W. B. (2000). Verifying drug
prevention program effects using reciprocal best friend reports. Journal of Behavioral Medicine, 23(6),
221-234.
Perhaps Donaldson's most visible contributions to improving evaluation evidence in recent times have been his 2008 book and 2011 New Directions for Evaluation volume, which address one of the thorniest issues challenging the evaluation field today: should randomized controlled trials (RCTs) be considered the gold standard for impact evaluation? He addressed and framed this contentious issue as a critical example of the intersection of
theory and practice:
 Donaldson, S. I., Christie, C. A., & Mark, M. (Eds.) (2008). What counts as credible evidence in applied
research and evaluation practice? Newbury Park, CA: Sage.
 Chen, H.T., Donaldson, S. I., & M. M. Mark (Eds.) (2011). Advancing validity in outcome evaluation:
Theory & practice. New Directions for Evaluation. San Francisco, Jossey-Bass.
Some authors in these volumes make strong arguments for experimental evidence as the gold standard, while
others make strong arguments against privileging RCTs in impact evaluation. This first volume has been widely
cited and has received much acclaim in the evaluation community. Sage Publications has recently contracted with
Donaldson and his colleagues to do a second volume that broadens the debate and discussion to what counts as
credible and actionable evidence in evaluation practice. The New Directions for Evaluation volume with Huey
Chen and Mel Mark promises to further illuminate the challenging nature of this issue and to advance
understanding of validity in contemporary evaluation practice. A listing of additional written contributions to this topic is provided below.

Donaldson, S.I., & Christie, C.A. (2005). The 2004 Claremont Debate: Lipsey versus Scriven. Determining
causality in program evaluation and applied research: Should experimental evidence be the gold standard?
Journal of Multidisciplinary Evaluation, 3, 60-77.
 Donaldson, S. I. (2008). In search of the blueprint for an evidence-based global society. In S. I. Donaldson,
C. A. Christie, & M. M. Mark (Eds.), What counts as credible evidence in applied research and evaluation
practice? Newbury Park, CA: Sage.
 Donaldson, S. I. (2008). A practitioner's guide for gathering credible evidence. In S. I. Donaldson, C. A.
Christie, & M. M. Mark (Eds.), What counts as credible evidence in applied research and evaluation
practice? Newbury Park, CA: Sage.
 Chen, H. T., Donaldson, S. I., & Mark, M. M. (2011). Validity frameworks for outcome evaluation. New
Directions for Evaluation, 130, 5-16.
 Gargani, J., & Donaldson, S. I. (2011). What works for whom, where, why, for what, and when?: Using
evaluation evidence to take action in local contexts. New Directions for Evaluation, 130, 17-30.
Finally, Donaldson has made additional contributions to improving evaluation methods and approaches, ranging from understanding how to conduct mediator and moderator analysis to improve programs and using meta-analysis to strengthen program designs, to facilitating debate and discussion about the strengths and challenges of utilization-focused and empowerment evaluation.

Donaldson, S. I. (2001). Mediator and moderator analysis in program development. In S. Sussman (Ed.),
Handbook of program development for health behavior research (pp. 470-496). Newbury Park, CA: Sage.

Donaldson, S. I., Street, G., Sussman, S., & Tobler, N. (2001). Using meta-analyses to improve the design
of interventions. In S. Sussman (Ed.), Handbook of program development for health behavior research
(pp. 449-466). Newbury Park, CA: Sage.

Donaldson, S.I., Patton, M.Q., Fetterman, D., & Scriven, M. (2010). The 2009 Claremont Debates: The
promise and pitfalls of utilization-focused and empowerment evaluation. Journal of Multidisciplinary
Evaluation, 6 (13), 15-57.
Developing and deepening the intersection of applied psychology and evaluation
Donaldson is a well-known leader in the area of applied scientific psychology. In addition to leading one of the
largest and most successful graduate programs in this field for more than a decade, he has written extensively, making contributions that advance the field. For example, in 2006 he published a book that documented the rise of
applied psychology and explored a wide range of application areas and career opportunities.

Donaldson, S. I., Berger, D. E., & Pezdek, K. (Eds.) (2006). Applied psychology: New frontiers and
rewarding careers. Mahwah, NJ: Erlbaum.
In this volume Donaldson & Berger (2006) analyzed U.S. Department of Education data and showed that
psychology had been extraordinarily successful at recruiting the next generation of social scientists into the
discipline. Furthermore, they revealed that there had been a rapid rise in the number of Ph.D. level psychologists
working outside traditional mental health service professions and the university.

Donaldson, S.I., & Berger, D.E. (2006). The rise and promise of applied psychology in the 21st century.
In S.I. Donaldson, D.E. Berger, & K. Pezdek (Eds.), Applied psychology: New frontiers and rewarding
careers. Mahwah, NJ: Erlbaum.
One new area that showed particularly strong growth was evaluation. Donaldson & Christie (2006) contributed a
chapter to the volume illustrating the intersection of applied psychology and evaluation, and describing a wide
range of career opportunities for psychologists interested in program, policy, and organizational evaluation. Their
chapter has been quite popular with students seeking information about career opportunities in the discipline and
profession of evaluation.

Donaldson, S.I., & Christie, C.A. (2006). Emerging career opportunities in the transdiscipline of evaluation
science. In S. I. Donaldson, D. E. Berger, & K. Pezdek (Eds.), Applied psychology: New frontiers and
rewarding careers. Mahwah, NJ: Erlbaum.
Several years earlier Donaldson and his colleagues explored the intersection of psychology and evaluation in their
widely cited American Journal of Evaluation (2002) article analyzing the antecedents and consequences of
excessive evaluation anxiety. This article provided evaluators with 17 practical strategies for preventing and
managing evaluation anxiety in program evaluation practice. This work illustrated that more than technical skills
are required to conduct high quality evaluations and set the stage for broader discussions of the psychology of
evaluation. This contribution has consistently been in the top 20 of the most-cited AJE articles (http://aje.sagepub.com/reports/most-cited).

Donaldson, S. I., Gooler, L. E., & Scriven, M. (2002). Strategies for managing evaluation anxiety: Toward
a psychology of program evaluation. American Journal of Evaluation, 23(3), 261-273.
Donaldson has since made many more significant contributions aimed at expanding the intersection of psychology
and evaluation. In 2008, he and Hallie Preskill illustrated how professional evaluation and the new positive
psychology movement could contribute to improving the evidence base for career development programs.

Preskill, H., & Donaldson, S.I. (2008). Improving the evidence base for career development programs:
Making use of the evaluation profession and positive psychology movement. Advances in Developing
Human Resources, 10(1), 104-121.
In 2011, Donaldson further explored the intersection of positive psychology and evaluation in a book with
colleagues Mihaly Csikszentmihalyi and Jeanne Nakamura.

Donaldson, S. I., Csikszentmihalyi, M., & Nakamura, J. (Eds.) (2011). Applied positive psychology:
Improving everyday life, health, schools, work, and society. New York, NY: Routledge Academic.
He specifically contributed chapters to this volume that illustrated the importance of using knowledge about
evaluation theory and practice to develop positive psychology interventions to improve health, education, work,
and public policy.

Donaldson, S. I. (2011). What works, if anything, in applied positive psychology. In S.I. Donaldson, M.
Csikszentmihalyi, and J. Nakamura (Eds.) Applied Positive Psychology: Improving Everyday Life, Health,
Schools, Work, and Society. New York, NY: Routledge Academic.

Donaldson, S. I. (2011). A practitioner's guide for applying the science of Positive psychology. In S.I.
Donaldson, M. Csikszentmihalyi, and J. Nakamura (Eds.), Applied Positive Psychology: Improving
Everyday Life, Health, Schools, Work, and Society. New York, NY: Routledge Academic.
Finally, with AEA colleagues Melvin Mark and Bernadette Campbell, Donaldson published a second book in 2011
focused on learning from and expanding the intersection of psychology and evaluation. This volume focused
specifically on improving social psychology and program/policy evaluation.

Mark, M., Donaldson, S. I., & Campbell, B. (Eds.) (2011). Social psychology and evaluation. New York,
NY: Guilford.

Mark, M. M., Donaldson, S. I., & Campbell, B. (2011). The past, the present, and possible futures for social
psychology and evaluation. In M. Mark, S. I. Donaldson, & B. Campbell (Eds.), Social psychology and
evaluation. New York, NY: Guilford.

Mark, M. M., Donaldson, S. I., & Campbell, B. (2011). Social psychology and evaluation: Building a
better future. In M. Mark, S. I. Donaldson, & B. Campbell (Eds.), Social psychology and evaluation. New
York, NY: Guilford.
In addition, he and social psychologist William Crano contributed a chapter exploring the intersection of theory-driven evaluation and applied social psychology.

Donaldson, S. I., & Crano, W. C. (2011). Theory-driven evaluation science and applied social psychology:
Exploring the intersection. In M. M. Mark, S. I. Donaldson, & B. Campbell (Eds.), Social psychology
and evaluation. New York, NY: Guilford.
The chapters in this volume provide many good examples of how social psychology can be improved by
evaluation, and how evaluation can be improved by social psychology theory and research. It calls for more intersections of this kind to be explored in the future and provides an example of the value and power of such explorations. The book promises to make important contributions to the future of both social psychology and evaluation theory and research.
Elucidating the importance and contributions of theory-practice integration
to international development evaluation
Donaldson received a $350,000 grant from the Rockefeller Foundation in 2010 to build evaluation capacity in the
growing field of international development evaluation. This work has substantially extended his impact on cross-cultural evaluation. He has focused his efforts primarily on three notable projects.
First, he assembled a group of world-class experts to reflect on many years of experience, successes, and failures in development evaluation in Asia and Africa, and on recent work supported by the Rockefeller Foundation on Rethinking, Reshaping, and Reforming Evaluation. These leading thinkers contributed chapters that explored concepts, frameworks, and ideas that promise to improve international development evaluation's influence, its ability to respond to the challenges of the 21st century, and its capacity to play a meaningful role in social and economic transformation.

Donaldson, S. I., Azzam, T. A., & Conner, R. (Eds.) (2013). Emerging practices in international
development evaluation. Greenwich, CT: Information Age.

Donaldson, S. I., Azzam, T., & Conner, R. (2013). Searching for good practices in international development
evaluation. In S. I. Donaldson, T. A. Azzam, & R. Conner (Eds.), Emerging practices in international
development evaluation. Greenwich, CT: Information Age.

Donaldson, S. I., Azzam, T., & Conner, R. (2013). Future directions for international development
evaluations. In S. I. Donaldson, T. A. Azzam, & R. Conner (Eds.), Emerging practices in international
development evaluation. Greenwich, CT: Information Age.
Second, in collaboration with UNICEF, Rockefeller and many other international development partner
organizations, he and colleague Marco Segone designed and hosted an extensive webinar series to teach evaluation practitioners working in developing countries about cutting-edge evaluation theory and practice issues. The series provided more than 40 webinars, which were attended by more than 3,330 participants from more than 100 countries.
Finally, Donaldson and Segone developed an e-learning series on (1) equity-focused evaluations, (2) national
evaluation capacity development for country-led monitoring and evaluation systems, and (3) Donaldson’s latest
book on Emerging Practices in International Development Evaluation. To date, more than 12,000 participants
from 170 countries have registered to participate in the series. This effort is part of a larger initiative called EvalPartners, which aspires to enhance the capacities of Civil Society Organizations (CSOs) - notably Voluntary Organizations for Professional Evaluators (VOPEs) - to influence policy makers, other key stakeholders, and public opinion so that public policies are evidence-informed and support equitable development processes and results. This effort has gained considerable momentum in the last 4 months and now has 34 key organizational partners. Some of the analyses from this work revealed that there are now more than 120 VOPEs
with more than 35,000 members. Donaldson is on the Advisory Board of EvalPartners and remains active in
helping develop these networks to improve evaluation theory across both developing and developed countries. He
has recently made presentations about his international development evaluation work at the European Evaluation
Society Meeting in Helsinki, the American Evaluation Association Conference in Minnesota, Malmo University in
Sweden, the Danish Evaluation Institute in Copenhagen, Oregon Program Evaluators Network in Portland, and the
International Development Evaluation Global Assembly in Barbados.
Donaldson’s evaluation work in international and cross-cultural evaluation and research, and his recent work as
Co-Director of AEA’s Graduate Education Diversity Internship program have inspired him to think more deeply
about the role of culture in evaluation theory and practice. He has recently provided a number of trainings and webinars on AEA's cultural competency statement as part of the GEDI program, and has given presentations on related topics at many conferences around the world. He and his former student Katrina Bledsoe were recently invited to contribute a chapter based on their recent presentation at the Culturally Responsive Evaluation and
Assessment Conference in Chicago.
 Bledsoe, K, & Donaldson, S. I. (forthcoming). Culturally responsive theory-driven evaluations. In S.
Hood, R. Hopson, K. Obeidat, & H. Frierson (Eds.), Continuing the journey to reposition culture and
cultural context in evaluation theory and practice. Greenwich, CT: Information Age.
As evident throughout his writings, whatever the topic, the lens through which he addresses the topic is the
integration of theory and practice.
Applications of theory-driven evaluation science
From the beginning of his distinguished career, Donaldson’s theory-driven evaluation science work has been
grounded in rigorous empirical work. For example, he received funding from the National Institute of Alcohol
Abuse and Alcoholism to apply theory-driven evaluation principles to the evaluation of large-scale prevention
programs. This work led to important empirical contributions that advanced knowledge in the substantive area of
school-based prevention and illustrated how theory-driven evaluation can pay off in the evaluation of large-scale prevention trials.
More specifically, Donaldson and his colleagues identified the mechanisms that explain why popular programs like
DARE both fail and succeed. In the 1994 article, “just say no” strategies were found ineffective while social
norms were predictive of the onset of substance use. A second theory-driven evaluation published in 1995 isolated
the harmful effects (unintended consequences or side effects) of “just say no” or resistance skills training. These
contributions were based on the analysis of four waves of data collected from approximately 12,000 students, and
were very influential in the public debate as well as the literature, with more than 350 citations to date.

Donaldson, S. I., Graham, J. W., & Hansen, W. B. (1994). Testing the generalizability of intervening
mechanism theories: Understanding the effects of school-based substance use prevention interventions.
Journal of Behavioral Medicine, 17, 195-216.

Donaldson, S. I., Graham, J. W., Piccinin, A. M., & Hansen, W. B. (1995). Resistance-skills training and
onset of alcohol use: Evidence for beneficial and potentially harmful effects in public schools and in private
Catholic schools. Health Psychology, 14, 291-300.
Donaldson continued to apply theory-driven evaluation principles to this substantive area and made a number of
subsequent written contributions.

Donaldson, S. I. (1995). Peer influence on adolescent drug use: A perspective from the trenches of
experimental evaluation research. American Psychologist, 50, 801-802.

Donaldson, S. I., Sussman, S., MacKinnon, D. P., Severson, H. H., Glyn, T., Murray, D. M., & Stone, E. J.
(1996). Drug abuse prevention programming: Do we know what content works? American Behavioral
Scientist, 39, 868-883.

Au, J., & Donaldson, S. I. (2000). Social influences as explanations for substance use differences among
Asian-American and European-American adolescents. Journal of Psychoactive Drugs, 32(1), 15-23.

Donaldson, S. I., Thomas, C. W., Graham, J. W., Au, J., & Hansen, W. B. (2000). Verifying drug
prevention program effects using reciprocal best friend reports. Journal of Behavioral Medicine, 23(6),
221-234.

Donaldson, S. I. (2002). High-potential mediators of drug-abuse prevention program effects. In W. D.
Crano & Burgoon, M. (Eds.), Mass media and drug prevention: Classic and contemporary theories and
research (pp. 215-230). Mahwah, NJ: Erlbaum.
Theory-driven evaluation for understanding the intersection of organizational effectiveness and
employee well-being.
Donaldson has also made a wide range of substantive contributions to understanding the intersection of
organizational effectiveness and employee well-being. He has received funding from a number of agencies and foundations to conduct theory-driven evaluations in this context. For example, the National Institute of Mental Health funded him to apply Donald Campbell's MTMM matrix to understand a range of measurement issues important for determining the effectiveness of workplace health promotion interventions. The California
Wellness Foundation funded two large theory-driven evaluations to determine the effectiveness of investments
in improving the lives of thousands of California workers. A sample of the body of written work that has
influenced both his evaluation theory and workplace health promotion work is listed below.
 Donaldson, S. I. (1993). Effects of lifestyle and stress on the employee and organization: Implications for
promoting health at work. Anxiety, Stress, and Coping, 6, 155-177.
 Donaldson, S. I. (1995). Worksite health promotion: A theory-driven, empirically based perspective. In L.
R. Murphy, J. J. Hurrell, S. L. Sauter, & G. P. Keita (Eds.). Job stress interventions (pp. 73-90).
Washington, DC: American Psychological Association.
 Donaldson, S. I., & Blanchard, A. L. (1995). The seven health practices, well-being, and performance at
work: Evidence for the value of reaching small and underserved worksites. Preventive Medicine, 24, 270-277.
 Donaldson, S. I., Dent, C. W., Sussman, S., Severson, H. H., & Stoddard, J. L. (1996). The organizational
implications of smokeless tobacco use in the Lumber Mill Industry. Addictive Behaviors, 21, 259-267.
 Donaldson, S. I., & Klien, D. (1997). Creating healthful work environments for ethnically diverse
employees working in small and medium-sized businesses: A non-profit industry/community/university
collaboration model. Employee Assistance Quarterly, 13, 17-32.
 Donaldson, S. I., Gooler, L. E., & Weiss, R. (1998). Promoting health and well-being through work:
Science and practice. In X. B. Arriaga & S. Oskamp (Eds.), Addressing community problems: Research
and intervention (pp. 160-194). Newbury Park: Sage.
 Donaldson, S. I., & Weiss, R. (1998). Health, well-being, and organizational effectiveness in the virtual
workplace. In M. Igbaria, & M. Tan (Eds.), The virtual workplace (pp. 24-44). Harrisburg, PA: Idea
Group Publishing.
 Donaldson, S. I., Sussman, S., Dent, C. W., Severson, H. H., & Stoddard, J. L. (1999). Health behavior,
quality of work life, and organizational effectiveness in the Lumber Industry. Health Education and
Behavior, 26(4), 579-594.
 Donaldson, S. I., Ensher, E. A., & Grant-Vallone, E. J. (2000). Longitudinal examination of mentoring
relationships on organizational commitment and citizenship behavior. Journal of Career Development,
26(4), 233-248.
 Kent, D. R., Donaldson, S. I., Smith, P., & Wyrick, P. (2000). Evaluating criminal justice programs
designed to reduce crime by targeting repeat gang offenders. Evaluation & Program Planning, 23(1), 113-122.
 Grant-Vallone, E. J., & Donaldson, S. I. (2001). Consequences of work-family conflict on employee well-being over time. Work and Stress, 15(3), 214-226.
 Ensher, E. A., Grant-Vallone, E. J., & Donaldson, S. I. (2001). Effects of perceived discrimination on
organizational citizenship behavior, job satisfaction, and organizational commitment. Human Resource
Development Quarterly, 12(1), 53-72.
 Donaldson, S. I., & Gooler, L. E. (2002). Summary of the evaluation of The California Wellness
Foundation's $20 million Work and Health Initiative. Institute for Organizational and Program Evaluation
Research, Claremont Graduate University (www.cgu.edu/sbos).
 Donaldson, S.I. (2004). Using professional evaluation to improve the effectiveness of nonprofit
organizations. In R.E. Riggio & S. Smith Orr (Eds.), Improving leadership in nonprofit organizations. San
Francisco, CA: Jossey-Bass.
 Donaldson, S.I., & Bligh, M. (2006). Rewarding careers: Applying positive psychological science to improve
quality of work life and organizational effectiveness. In S.I. Donaldson, D.E. Berger, & K. Pezdek (Eds.),
Applied psychology: New frontiers and rewarding careers. Mahwah, NJ: Erlbaum.
 Donaldson, S. I., & Ko, I. (2010). Positive organizational psychology, behavior, and scholarship: A review
of the emerging literature and evidence base. Journal of Positive Psychology, 5 (3), 177-191.
 Ko, I., & Donaldson, S. I. (2011). Applied positive organizational psychology: The state of the science and
practice. In S.I. Donaldson, M. Csikszentmihalyi, and J. Nakamura (Eds.) Applied Positive Psychology:
Improving Everyday Life, Health, Schools, Work, and Society. New York, NY: Routledge Academic.
 Donaldson, S.I., & Dollwet, M. (2013). Taming the waves and horses of positive organizational
psychology. Advances in Positive Organizational Psychology, 1, 1-21.
Broad and diverse applications of theory-driven evaluation
While the above published contributions illustrate two substantive areas where Donaldson’s theory-driven
evaluations have been influential, he has also conducted theory-driven evaluations and refined the approach in
a much wider range of settings and substantive areas. For example, Donaldson’s theory-driven evaluation
work has been funded by The National Institute of Mental Health; The National Institute on Alcohol Abuse and
Alcoholism; National Science Foundation; U.S. Department of Education; National Office of Justice Programs;
Office of Juvenile Justice Planning; National Institute of Allergy and Infectious Diseases; Center for Substance
Abuse Prevention; Riverside County Department of Mental Health; State of California Tobacco-Related
Disease Research Program; First 5 Los Angeles; The David and Lucile Packard Foundation; The Rockefeller
Foundation; The California Wellness Foundation; The Howard Hughes Foundation; The Hillcrest Foundation;
The Weingart Foundation; The Robert Ellis Simon Foundation; The Irvine Foundation; The Fletcher Jones
Foundation; The John Randolph Haynes and Dora Haynes Foundation; Commonwealth Capital Partners, L.P.; and Kaiser Permanente. Finally, while Donaldson has been deeply engaged in the practice of theory-driven evaluations, he has simultaneously been using these grounded experiences to evolve and improve theory-driven evaluation as a contemporary theory of evaluation practice. The contributions below illustrate his efforts to reflect upon his empirical work and challenges in evaluation practice to help evaluators improve their understanding of how best to provide high-quality evaluations.

Fitzpatrick, J. (2002). Dialog with Stewart Donaldson (about the theory-driven evaluation of the Work and
Health Initiative). American Journal of Evaluation, 23(3), 347-365. (Interview in Exemplars Section).

Donaldson, S. I., & Gooler, L. E. (2002). Theory-driven evaluation of the Work and Health Initiative: A
focus on Winning New Jobs. American Journal of Evaluation, 23(3), 341-346.

Donaldson, S. I., & Gooler, L. E. (2003). Theory-driven evaluation in action: Lessons from a $20 million
statewide work and health initiative. Evaluation and Program Planning, 26, 355-366.

Donaldson, S.I. (2004). Using professional evaluation to improve the effectiveness of nonprofit
organizations. In R.E. Riggio & S. Smith Orr (Eds.), Improving leadership in nonprofit organizations. San
Francisco, CA: Jossey-Bass.
Articulating future directions for evaluation theory and practice
Donaldson has been a leader in facilitating thinking about the future of evaluation theory and practice. In 2001, he
was invited to write an article for the American Journal of Evaluation about what evaluation might become in the
year 2010. He focused on a topic that received considerable attention, and to a certain extent, much of his vision of
evaluation expanding in positive directions and taking on the role of a helping profession has been realized.

Donaldson, S. I. (2001). Overcoming our negative reputation: Evaluation becomes known as a helping
profession. American Journal of Evaluation, 22(3), 355-361.
In 2002, he invited a well-known group of evaluation theorists and practitioners to the Claremont Colleges to
present their visions for evaluation theory and practice in the new millennium. This work has been influential and
has been used extensively in the training of new evaluators.

Donaldson, S. I., & Scriven, M. (Eds.) (2003). Evaluating social programs and problems: Visions for the
new millennium. Mahwah, NJ: Erlbaum.
In addition, he provided his vision for how evaluation should be practiced in the new millennium, and summarized
and integrated the diverse visions for the future of evaluation that were presented.

Donaldson, S. I. (2003). Theory-driven program evaluation in the new millennium. In S. I. Donaldson & M.
Scriven (Eds.), Evaluating social programs and problems: Visions for the new millennium (pp. 109-141).
Mahwah, NJ: Erlbaum.

Donaldson, S. I., & Scriven, M. (2003). Diverse visions for evaluation in the new millennium: Should
we integrate or embrace diversity? In S. I. Donaldson & M. Scriven (Eds.), Evaluating social programs
and problems: Visions for the new millennium (pp. 3-16). Mahwah, NJ: Erlbaum.
Finally, Donaldson organized a symposium on the Future of Evaluation in Society to honor his colleague
Michael Scriven. The presentations were written up as chapters that will appear in a forthcoming volume
focused on future directions in evaluation. The Future of Evaluation in Society has contributions from
Scriven, Patton, Stake, House, Stufflebeam, Mark, Greene, Kirkhart, and Christie. These authors have made
substantial contributions which highlight and honor Scriven’s major contributions to the transdiscipline and
practice of evaluation, and set the course for a better future for evaluation in societies across the globe.
 Donaldson, S. I. (Ed.) (2013, forthcoming). The future of evaluation in society: A tribute to Michael
Scriven. Greenwich, CT: Information Age.
 Donaldson, S. I. (2013, forthcoming). Connecting evaluation’s past with its future: A tribute to
Michael Scriven. In S. I. Donaldson (Ed.), The future of evaluation in society: A tribute to Michael
Scriven. Greenwich, CT: Information Age.
As a contributor to this volume myself, I know how hard Stewart has worked to bring it to fruition.
Citation Counts and Indices
The preceding pages have described the substance of Stewart Donaldson’s contributions as an evaluation
theorist. My perspective as a close observer of the field is that his contributions have been hugely influential.
Another indicator of his influence can be gleaned through citation counts and indices.
Below are citation counts and citation indices for Stewart Donaldson from Google Scholar (which has a
broader coverage than other sources for citation information, including for example, books and NDE). As of
June 1, 2013, Stewart has 2,521 citations and has averaged approximately 300 citations a year over the last 3
years.
                All       Since 2007
Citations       2521      1407
h-index         23        18
i10-index       39        29
Paraphrasing from Google Scholar, the h-index gives the largest number h such that h publications have at least
h citations. The second column has the "recent" version of this metric, which is the largest number h such that h
publications have at least h new citations in the last 5 years. In the last 5 years, Stewart has had 18 publications
that have received at least 18 citations. The i10-index is the number of publications with at least 10 citations.
The second column has the "recent" version of this metric, which is the number of publications that have
received at least 10 new citations in the last 5 years. In Stewart’s case, 29 of his publications have had 10 or
more new citations in the last 5 years. Overall, the pattern of citations reveals an influence on the literature that
is both deep and wide.
Teaching Evaluation Theory
Stewart Donaldson’s influence as a theorist has come not just from his writings, but through his longstanding
commitment to make theory-driven evaluation the core of his teaching. This section reviews how he has
contributed to evaluation through teaching theory and the theory-practice connection.
He has engaged in an extraordinary amount of teaching, training, mentoring, and evaluation educational
program design and leadership. Here are examples:
 Stewart has designed and leads the evaluation doctoral, master's, and certificate programs at Claremont
Graduate University that focus on evaluation theory and practice. He has taught many courses on evaluation
theory to students in these programs.
 He developed a very successful annual summer professional development workshop series in evaluation
and applied research methods at Claremont Graduate University.
 He has mentored and supervised more than 200 graduate students and professionals working on doctoral
dissertations, masters theses, certificate practicum projects, and evaluation grants and contracts at the
Claremont Evaluation Center.
 He designed, leads, and teaches seminars on evaluation theory in the Certificate in the Advanced Study
of Evaluation Program (a distance education program for working professionals) at Claremont
Graduate University.
 He has also taught numerous evaluation workshops and made hundreds of presentations (including many
keynotes, invited presentations, and AEA presidential strand sessions) on, or related to, evaluation
theory.
 He has taught professional development workshops on and related to evaluation theory for more than 10
years at the Centers for Disease Control/American Evaluation Association Summer Institute.
 He leads AEA’s Graduate Education Diversity Internship Program (GEDI), having served on the AEA
Board 2010-2012.
This nomination statement addresses each of these kinds of contributions. Collectively, they add to an already
compelling case for selecting Stewart Donaldson for the 2013 AEA Lazarsfeld Award.
Commitment to teaching theory
Evaluation theory has been one of Professor Donaldson’s main teaching and writing foci since he joined the
faculty at Claremont Graduate University more than 25 years ago. His written work on evaluation theory is
often inspired and refined by his teaching and practice of evaluation. He has developed a number of evaluation
education and training programs that emphasize training and research on evaluation theory. He has also taught
thousands of graduate students and practicing evaluators about evaluation theory and the diverse theoretical
approaches central to the vibrant transdiscipline of evaluation.
His teaching influence is both national and global. He has taught many workshops and made presentations on and
related to evaluation theory at the American Evaluation Association, European Evaluation Society, Canadian
Evaluation Society, Australasian Evaluation Society, International Development Evaluation Association, The
Evaluators Institute, and AEA Local Affiliates such as the Hawaii Pacific Evaluation Association, Oregon
Program Evaluators Network, Washington DC Evaluators, Arkansas Group of Evaluators, Midwestern Evaluation
Network, Southern California Evaluation Association and the like.
During the past three years, Donaldson has done similar evaluation theory related professional development work
in the international development evaluation field. With funding from the Rockefeller Foundation and in partnership with UNICEF, he developed evaluation webinars and an e-learning program on evaluation theory related
topics that have reached over 12,000 participants from approximately 175 countries.
In addition to classroom, workshop, and online teaching about evaluation theory, Donaldson has served as a
mentor for hundreds of diverse graduate students and practitioners working on projects bridging the evaluation
theory and practice divide. For example, Donaldson has served as chair or committee member on more than 50
completed doctoral dissertations (with more than 10 currently in progress). Many of these dissertations have
focused on topics that have advanced knowledge about evaluation theory and practice. I asked him to provide me
a list of these dissertations to include in this nomination justification because they reflect his mentoring influence
as an evaluation theorist. Particularly noteworthy, and included in the sample below, is the current positions of
these former students. So, below is a selected list of evaluation related doctoral dissertations by former students
now active in the field of evaluation.
 The Role of Method in Treatment Effect Estimates: Evidence from Psychological, Behavioral and
Educational Meta-Analyses. David Wilson, 1995: currently Professor & Chair, George Mason University.
 A Meta-Analysis of the Efficacy of Various Antecedent Behaviors, Characteristics, and Experiences for
Predicting Later Violent Behavior, James Derzon, 1996: currently Senior Evaluation Specialist, Battelle
Centers for Public Health.
 The Meta-Learning Organization: A Model and Methodology for Evaluating Organizational Learning
Capacity, Jane Davidson, 2001: currently Director, Real Evaluation Ltd., author of two major evaluation books, and co-director of the influential blog, Genuine Evaluation.
 Assessing Motivational Response Bias: The Influence of Individual, Item, and Situational Characteristics
on the Measurement of Self-Reported Health Indicators, Craig Thomas, 2007: currently Director, CDC
Division of Public Health Performance Improvement.
 Using Appreciative Inquiry to Build Evaluation Capacity at Three Non-Profit Organizations, Shanelle
Boyle, 2009: currently Research Associate, EVALCORP.
 Estimating the Effects of Teacher Certification on the Academic Achievement of Exceptional High School
Students in North Carolina, Bianca Montrosse, 2009: currently Assistant Professor, Western Carolina
University
 Evaluation Practice and Technology, Vanessa Jamieson, 2009: currently Senior Researcher, MarketCast.
 Understanding the Impact of Process Use and Use of Evaluation Findings on Program Participants,
Rachel Lopez, 2010: currently Evaluation Specialist, Skati Rising.
 Theory Building through Praxis Discourse: A Theory-and-Practice Informed Model of Transformative
Participatory Evaluation, Michael Harnar, 2012, currently Senior Research & Evaluation Associate,
Mosaic Network, Inc.
 Clarifying the Connections: Evaluation Capacity and Intended Outcomes, Leslie Fierro, 2012: currently
Evaluation Specialist, Deloitte Consulting
Student Testimonial
Katrina Bledsoe is another of Donaldson’s former graduate students. Her dissertation in 2003 was on
Effectiveness of Drug Prevention Programs Designed for Adolescents of Color: A Meta-analysis. She is
currently Research Scientist and Senior Evaluator at the Education Development Center – and active in AEA.
She recently served a 3-year term on the AEA Board, served on the task force that developed AEA's Statement on Cultural Competence in Evaluation, and co-directed AEA's Graduate Education Diversity Internship program with Donaldson from 2011 to 2012. Because I know Katrina through our AEA connection, I asked
her if she’d care to offer a student’s perspective in support of this nomination. She was enthusiastic about
doing so and wrote me the following:
I have known Dr. Donaldson for close to 20 years, first as a graduate student and now as a
colleague and collaborator. He was instrumental in my entry into a career in evaluation. Without
having taken his signature class on Theory-driven Evaluation (TDE), I can say with absolute
confidence that I likely would not have discovered the field. Dr. Donaldson’s enthusiasm for the
course was unbridled; he taught and facilitated with exuberance. The class was engaging, infused
with simulated role-plays with possible clients, evaluators, and contextual situations. Through that
class, the world of evaluation was opened to me, and the use of one of my signature approaches,
theory-driven evaluation, was solidified. As I continued to work with Dr. Donaldson I was exposed
not only to TDE, but to other approaches and often, to the theorists who developed them.
He also introduced me to the American Evaluation Association (AEA). I attended my first
conference in 1995, attending the first joint Canadian Evaluation Society and AEA conference in
Vancouver, British Columbia. Throughout my doctoral training Dr. Donaldson provided
opportunities to conduct numerous evaluations, many times using the TDE approach. By the time I
finished my doctorate I had been practicing evaluation for several years---and I left Claremont
Graduate University an experienced evaluator.
Over the years, I have since come to use participatory approaches in addition to TDE but have
found Donaldson’s three step approach to conducting TDE -- (1) developing program impact
theory; (2) formulating and prioritizing evaluation questions; and (3) answering said evaluation
questions -- to be extremely useful in explaining the evaluation process to clients, communities, and
consumers. Dr. Donaldson continues to evolve his thinking about the approach, especially in light of
the continually changing needs of programs, communities, and societies. For instance, he has
worked and continues to work with international communities in adapting the TDE approach to
those unique contexts. At present, Dr. Donaldson and I are writing a chapter
exploring the cultural responsiveness of TDE. I applaud him for taking a bold step in understanding
how the approach is inherently culturally responsive in discerning the theories and mechanisms that
drive the programs that serve diverse communities.
Donaldson has mentored more than 100 diverse students on MA Thesis projects, Certificate of Advanced Study
of Evaluation Practicum Projects, and Graduate Evaluation Diversity Internship (GEDI) projects on topics
related to evaluation theory, and has employed and mentored numerous students who have worked on his
funded evaluation projects through the Claremont Evaluation Center.
Much of his teaching and mentoring work includes the use of his written work on evaluation theory. For
example, when Michael Scriven joined the faculty at Claremont Graduate University, he was quite displeased with the Foundations of Program Evaluation text (Shadish, Cook, & Leviton, 1991) that was being used to teach Claremont graduate students about evaluation theory. Scriven's main objection was that he thought it would be better to have evaluation theorists describe their own theories or visions for how evaluation should be practiced, rather than having textbook authors influenced by the Campbellian experimental tradition present their framework for organizing the field. Donaldson and Scriven set out to develop a volume by inviting a diverse
group of evaluation theorists to Claremont to discuss and debate evaluation theory. Each presenter developed a
chapter and much of the discourse was captured in:
Donaldson, S. I., & Scriven, M. (Eds.) (2003). Evaluating social programs and problems: Visions for the new millennium. Mahwah, NJ: Erlbaum.
This volume has been used in the Claremont programs and in evaluation theory courses all over the world for
more than a decade. Donaldson’s own chapter contributions included:

Donaldson, S. I. (2003). Theory-driven program evaluation in the new millennium. In S. I. Donaldson & M.
Scriven (Eds.), Evaluating social programs and problems: Visions for the new millennium (pp. 109-141).
Mahwah, NJ: Erlbaum.

Donaldson, S. I., & Scriven, M. (2003). Diverse visions for evaluation in the new millennium: Should we
integrate or embrace diversity? In S. I. Donaldson & M. Scriven (Eds.), Evaluating social programs and
problems: Visions for the new millennium (pp. 3-16). Mahwah, NJ: Erlbaum.
Evaluating evaluation training
A related area where his work has been influential is in understanding the nature and contributions of university-based evaluation training programs. He and his graduate student John LaVelle analyzed previous surveys
conducted with representatives of university-based evaluation programs and found this work had some serious
methodological limitations. Based on these findings, they introduced a new method for gaining quality data about
these programs. They published their findings in the American Journal of Evaluation in 2010 showing a more
accurate picture of the past and present from 1980-2008, and discussed the future of university based evaluation
education.

LaVelle, J., & Donaldson, S. I. (2010). University-based evaluation training programs in the United
States 1980-2008: An empirical examination. American Journal of Evaluation, 31 (1), 9-23.
This work has been extended in LaVelle's doctoral dissertation, in which he is, under Donaldson's supervision, analyzing the curriculum and a range of other important characteristics of university-based doctoral, master's, certification, and professional development programs throughout the world. They have recently been invited to share their findings in a proposed issue of New Directions for Evaluation.

LaVelle, J., & Donaldson, S. I. (under review). The state of evaluation education and training. In J. W.
Altschuld & M. Engle (Eds.), Accreditation, certification, and credentialing: Whither goes the American
Evaluation Association. New Directions for Evaluation.
Donaldson has also been pushing the boundaries of effective teaching and training in the online environment for applied psychology and evaluation. He has pioneered the development of distance education programs in evaluation, online evaluation courses and workshops, and a variety of capacity development efforts. He has recently coauthored a book providing lessons learned and strategies for success for teaching in an online environment.

Neff, K. & Donaldson, S. I. (2013). Teaching psychology online: Tips and techniques for success.
London: Psychology Press.
What makes these teaching and training contributions relevant to the Lazarsfeld nomination is that attention to evaluation theory, and its critical relevance to informing practice, is deeply embedded in and interwoven throughout them.
Conclusion
When I first approached Stewart Donaldson about submitting this nomination, I had only a general overview and
awareness of his contributions to evaluation theory. I had read and used his book on Program theory-driven
evaluation science as well as his New Directions for Evaluation and AJE writings. But I confess that I had no
appreciation for the extensiveness of his contributions and the broad range of areas in which he has contributed to
theory-practice integration. As an author and full-time evaluation professional, I have learned a great deal in
preparing this justification statement – and I come away convinced that Stewart Donaldson is probably our field’s
most prolific theorist.
In the midst of preparing this statement, I had a conversation with Marv Alkin, a prior Lazarsfeld recipient. He had
heard that I was nominating Stewart and wanted to provide a supportive letter. He said: “Frankly, it is quite
surprising to me, given the enormous contributions that Stewart has made, that he has not already been a recipient.”
I explained that I would have nominated Stewart sooner, but he has been serving as an AEA Board member (2010-2012) and it is generally viewed as a bit unseemly for a current Board member to receive an AEA award. So this is
the first year in some time that Stewart could be nominated. I then explained that we had already reached the
permitted limit of five supporting letters, but Marv insisted that he wanted to go on the record strongly supporting
Stewart’s nomination and added that he hoped the nomination would emphasize three particular areas in which
Stewart has made important theoretical contributions. “Clearly, his work in examining the nature of credible
evidence has been an important influence within the field. Second, Stewart's contributions to understanding
program theory are enormously important. His work on what he refers to as ‘program theory-driven evaluation
science’ has offered an accessible method and procedure for integrating program theory in the evaluation process.
Finally, on a more general level, Stewart has attempted to integrate social psychology approaches and theories into
evaluation practice by emphasizing how theoretical work in social psychology can be used to evaluate
programs and policies.”
The five supporting letters included in this nomination package provide further evidence of the high esteem in
which Stewart is held by colleagues. Three of the letters are from previous Lazarsfeld Award recipients:
Huey Chen, Michael Scriven, and Mel Mark. As a former Lazarsfeld Award recipient myself, I join these
distinguished colleagues in believing that Stewart Donaldson has honored the field with his commitments and
contributions and it is now appropriate that the profession honor him by recognizing those commitments and
contributions.