Direct and Indirect Assessment Methods

Assessment Methods
Office of Institutional Research, Planning, and Assessment
Neal F. McBride, Ed.D., Ph.D.
Associate Provost
How would you assess these SLOs?
Graduates are able to critique
a brief draft essay, pointing out
grammatical, spelling, and
punctuation errors and offering
appropriate suggestions to
correct the identified deficiencies
In a “capstone course” during the
final semester prior to graduation,
students are required to critique a
supplied essay containing predetermined
errors; the critiques are evaluated by a
three-person faculty panel (criterion:
appropriate suggestions to remediate
90% of the errors)
Senior undergraduate psychology
majors perform above the national
average on the GRE Psychology
Subject Test
GRE Psychology Subject Test;
completed during the senior year,
required for graduation. Compare
average GRE Psychology Subject
Test scores with average scores of
all examinees nationwide
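The quantitative comparison described above can be sketched in a few lines of Python. The cohort scores and the national mean below are made-up placeholders for illustration, not real GRE data:

```python
from statistics import mean

# Hypothetical scaled scores for a graduating cohort (illustrative only)
program_scores = [620, 580, 710, 650, 600, 690, 640]

# Placeholder value; substitute the published national average for the test year
NATIONAL_MEAN = 614

program_mean = mean(program_scores)
meets_target = program_mean > NATIONAL_MEAN

print(f"Program mean: {program_mean:.1f} vs. national mean: {NATIONAL_MEAN}")
print("SLO met" if meets_target else "SLO not met")
```

With a real cohort, a significance test would be more defensible than a raw comparison of means, since a small program's average can drift above or below the national figure by chance.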
Assessment Methods
Assessment methods are ways to
ascertain (“measure”) student achievement
levels associated with stated student
learning outcomes (SLOs)
“Outcome” is a generic term for goals,
objectives, and/or aims
Basis for Selecting
Appropriate Assessment
A specific assessment method (or methods) is selected for
a specific outcome. . . “How do I ‘measure’ this outcome?”
Assessment Methods
Assessment methods include both direct
and indirect approaches. . . We’ll define
these terms in a few minutes.
First, let’s explore a few criteria or
considerations to keep in mind as you
select appropriate assessment methods...
Qualitative Versus Quantitative Methods
Qualitative assessment: collects data that do not
lend themselves to quantitative methods but rather to
interpretive criteria; “data” or evidence are often
representative words, pictures, descriptions,
examples of artistic performance, etc.
Quantitative assessment: collects representative
data that are numerical and lend themselves to
numerical summary or statistical analysis
Programs are free to select assessment methods
appropriate to their discipline or service.... choices
must be valid and reliable
Valid and Reliable Methods
Valid: The method is appropriate to the academic
discipline and measures what it is designed to measure
Reliable: The method yields consistent data each time
it is used, and persons using the method are
consistent in implementing the method and
interpreting the data
Basic Aim: “defensible methods”
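One common way to check the "consistency" half of reliability is simple percent agreement between raters. The two rating vectors below are invented for this sketch:

```python
# Two hypothetical faculty raters score the same ten essays on a
# 4-point rubric; all values are made up for illustration.
rater_a = [3, 4, 2, 3, 1, 4, 3, 2, 4, 3]
rater_b = [3, 4, 2, 2, 1, 4, 3, 2, 3, 3]

# Percent agreement: the fraction of cases where both raters
# assigned the same score.
agreements = sum(a == b for a, b in zip(rater_a, rater_b))
percent_agreement = agreements / len(rater_a)

print(f"Percent agreement: {percent_agreement:.0%}")
```

Percent agreement is only a rough screen; chance-corrected statistics such as Cohen's kappa are generally preferred for formal reporting.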
Locus of Assessment
Embedded assessment - “measurement” strategies
included as part of the requirements within existing
courses, internships, or other learning experiences–
“double duty” assessment; e.g., “critical assignments”
Ancillary assessment - “measurement” strategies
added on or in addition to requirements within existing
courses, internships, or other learning experiences–
“additional duty” assessment
Sources for Finding
Assessment Methods
 Professional associations and organizations
 Other programs/departments at CBU
 Similar programs/departments at other universities
 Published Resources
Dunn, D. S., Mehrotra, C. M., & Halonen, J. S. (2004). Measuring
Up: Educational Assessment Challenges and Practices for
Psychology. Washington, DC: APA.
 Web... In general or for your specific area
 Literature search by a professional librarian
 Personal experience – yours or your colleagues’
Here are some questions to consider carefully:
 Does it “fit” the SLO?
 Did the faculty or student services staff select the
method and are they willing to participate in its use?
 Will all students in the program or provided service
be included in the assessment (ideally, yes) or a
sample of students (maybe)?
 How much time is required to complete the
assessment method? Determine how this affects
faculty, staff, and students
 When and where will the assessment be administered?
 Are there financial costs? Are program and/or university
resources available?
 Is the method used at one point in time (cross-sectional
method) or utilized with students over several points in
time (longitudinal method)?
 Do the program faculty/staff have the skills and/or
knowledge necessary to use the method and analyze the
resulting data?
 Most importantly... WHO is responsible for making certain
the assessment is accomplished?
Ideally… as you write or rewrite SLOs, keep in
mind the question: “What method(s) can I use to
assess this SLO?”
Why is this tip potentially useful?
Direct Methods
Direct assessment methods are “measurement”
strategies that require students to actively demonstrate
achievement levels related to institutional and
program-specific learning outcomes
Direct assessment methods focus on collecting
evidence on student learning or achievement directly
from students, using work they submit (assignment,
exam, term paper, etc.) or by observing them as they
demonstrate learned behaviors, attitudes, skills, or values
Direct Methods: Examples
Capstone or Senior-Level projects, papers, presentations,
performances, portfolios, or research evaluated by faculty or
external review teams... effective as assessment tools when
the student work is evaluated in a standard manner, focusing
on student achievement of program-level outcomes
Exams - locally developed comprehensive exams,
entry-to-program exams, national standardized exams,
certification or licensure exams, or professional exams
Internship or Practicum - evaluations of student knowledge
and skills from internship supervisors, faculty overseers, or
from student participants themselves. This may include written
evaluations from supervisors focused on specific knowledge or
skills or evaluation of student final reports or presentations from
internship experiences.
Direct Methods, continued
Portfolios (hard-copy or web-based) - reviewed by faculty
members from the program, faculty members from outside the
program, professionals, visiting scholars, or industrial boards
Professional Jurors or Evaluators to evaluate student
projects, papers, portfolios, exhibits, performances, or recitals
Intercollegiate Competitions - useful for assessment when
students are asked to demonstrate knowledge or skills related
to the expected learning outcomes within appropriate programs
Course assessments - these are projects, assignments, or
exam questions that directly link to program-level expected
learning outcomes and are scored using established criteria;
common assignments may be included in multiple sections
taught by various professors (assuming prior agreement)
Direct Methods: Advantages
 Require students to actively demonstrate
knowledge, attitudes, and/or skills
 Provide data to directly measure achievement of
expected learning outcomes
 Demand less abstract interpretation
 Usually “easier” to administer
Direct methods are always our first choice;
indirect methods support
but cannot replace direct methods
Achievement Levels or Criteria
 Rarely does every student achieve all SLOs
completely (100%); nor can we expect this
 What “level” of achievement is acceptable?
Identified in the “OPlan”
 Rubrics recognize varying achievement levels
 Rubrics are a scoring method or technique
appropriate to many assessment methods
A Rubric Example
Criterion: Correctly analyzes research data
1 – Limits analysis to correct basic descriptive statistics
2 – Selects and executes correct basic statistical tests
3 – Selects, articulates, and executes an inferential
statistical analysis
4 – Selects, articulates, and executes the statistical
analysis suitable to the research question
Excellent resource:
Stevens, D. D., & Levi, A. J. (2005). Introduction to Rubrics.
Sterling, VA: Stylus.
CBU utilizes 4-point rubrics, with the specific level
criteria appropriate to the outcome in question
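Once rubric scores are collected, reporting often reduces to the proportion of students at or above a target level. The scores and the level-3 target below are assumptions for this sketch, not CBU policy:

```python
# Hypothetical 4-point rubric scores for 12 seniors' data-analysis projects
scores = [4, 3, 2, 3, 4, 3, 1, 3, 4, 2, 3, 3]

TARGET_LEVEL = 3  # assumed acceptable achievement level for this sketch

# Proportion of students scoring at or above the target level
proportion_at_target = sum(s >= TARGET_LEVEL for s in scores) / len(scores)

print(f"{proportion_at_target:.0%} of students reached level {TARGET_LEVEL} or above")
```

The program would then compare this proportion against the acceptable achievement level stated in its "OPlan."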
Guidelines for Implementing
Embedded, Direct Assessment
 Link class assignments to both SLOs and course objectives
 If multiple sections of the same course exist and the
intent is to aggregate data across sections, ensure that
the assessment is the same in all sections (same
assignment and grading process)
 Make certain faculty collaboration underpins
assessment across multiple course sections
 Tell students which assignment(s) is being used for
SLO assessment as well as course assessment…Why?
Indirect Methods
Methods requiring the faculty and student life staff to
infer actual student abilities, knowledge, and values
rather than observing direct evidence of learning or achievement
Indirect assessment is gathering information through
means other than looking at actual samples of student
work... e.g., surveys, exit interviews, and focus groups
Indirect methods provide perceptions of students,
faculty, or other people (often alumni or employers)
who are interested in the program, service, or institution
Indirect methods expand on or confirm what is
discovered after first using direct methods
Indirect Methods, Continued
Exit interviews and Student Surveys - to provide
meaningful assessment information, exit interviews
and/or student surveys should focus on students’
perceived learning (knowledge, skills, abilities) as
well as students’ satisfaction with their learning
experiences, including such things as internships,
participation in research, independent projects,
numbers of papers written or oral presentations
given, and familiarity with discipline tools
Indirect Methods, Continued
Faculty Surveys aimed at getting feedback about
faculty perceptions of student knowledge, skills,
values, academic experiences, etc.
Alumni Surveys aimed at evaluating perceptions of
knowledge, skills, and values gained while studying
in a particular program. . . surveys frequently target
alumni who are 1 and 5 years post-graduation and
include program-specific questions
Indirect Methods, Continued
Surveys of Employers / Recruiters aimed at
evaluating specific competencies, skills, or outcomes
Tracking Student Data related to enrollment,
persistence, and performance... may include
graduation rates, enrollment trends, transcript analysis
(tracking what courses students take and when they
take them), and tracking student academic
performance overall and in particular courses
Indirect Methods, Continued
External Reviewers provide peer review of academic
programs, and the method is widely accepted in
assessing curricular sequences, course
development and delivery, as well as faculty
effectiveness. . . using external reviewers is a way to
assess whether student achievement reflects the
standards set forth in student learning and capacity
outcomes. . . skilled external reviewers can be
instrumental in identifying program strengths and
weaknesses leading to substantial curricular and
structural changes and improvements
Indirect Methods, Continued
Curriculum and syllabus analysis – Examining
whether the courses and other academic
experiences are related to the stated outcomes...
often accomplished in a chart or “map.”
Syllabus analysis is an especially useful technique
when multiple sections of a course are offered by a
variety of instructors. . . provides assurance that
each section covers essential points without
prescribing the specific teaching methods used in
helping the students learn the outcomes
Indirect Methods, Continued
Keeping records or observing students' use of
facilities and services... data can be correlated with
test scores and/or course grades
Example: Logs maintained by students or staff
members documenting time spent on course work,
interactions with faculty and other students,
internships, nature and frequency of library use,
computer labs, etc.
Advantages of Indirect Methods
Relatively easy to administer
Provide clues about what could/should be
assessed directly
Able to flesh out subjective areas that direct
assessments cannot capture
Particularly useful for ascertaining values and attitudes
Surveys can be given to many respondents at
the same time
Indirect Methods Advantages, Continued
 Surveys are useful for gathering information from alumni,
employers, and graduate program representatives
Exit interviews and focus groups allow questioning
students face-to-face; exploring and clarifying answers is
done more easily
External reviewers can bring objectivity to assessment
and answer questions the program or department wants
answered or questions based on discipline-specific
national standards
Disadvantages of Indirect Methods
Indirect methods provide only impressions and
opinions, not “hard” evidence on learning
Impressions and opinions may change over time and
with additional experience
Respondents may tell you what they think you want to hear
Survey return rates are often low and, consequently,
not representative
Indirect Methods Disadvantages, Continued
 You cannot assume those who did not respond would
have responded in the same way as those who did respond
 Exit interviews take considerable time to complete
 Focus groups usually involve a limited number of
respondents who are not representative
Unless the faculty agree upon the questions asked
during exit interviews and focus groups, there may not
be consistency in responses
Suggestions for Implementing
Indirect, Ancillary Assessment
 Use “purposeful samples” when it is not possible to
include all students (which is always the first choice)
 Offer incentives to participants
 Anticipate low turn-out and therefore over-recruit
 Plan logistics and question design carefully (e.g.,
surveys, interviews, focus groups)
 Train group moderators and survey interviewers
Implementation Suggestions, Continued
 Consider using web-based or telephone as well
as face-to-face interviews or focus groups
 Set time limits for focus groups and interviews
 Develop and provide very careful, explicit instructions
 Be mindful of FERPA regulations when using
archival records
 Only use archival records that are relevant to
specific outcomes
Implementing Assessment in General
 Capitalize on what you are already doing
 Integrate embedded assessment as much as possible
 Schedule ancillary assessment during regular class
times or times when students are present
 Make assessment a graduation requirement
 Plan an “assessment day”
 Seek to make assessment a routine activity within
your curriculum or student services programs
Strategy Combinations
Depending on the specific SLO, there are four
assessment strategies or frames:
 Embedded, direct assessment
 Embedded, indirect assessment
 Ancillary, direct assessment
 Ancillary, indirect assessment
REMEMBER: There is more than one way to assess any given SLO!
It’s your choice as long as it is valid and reliable.