PowerPoint Presentation (Banta, 2003)

[Figure: the assessment cycle – Plan, Implement, Assess, Improve]
Outcomes Assessment
The process of providing credible evidence of the outcomes of higher education, undertaken for the purpose of improving programs and services within the institution.
– Banta, T. W.
ASSESSMENT . . .
“a rich conversation about student learning informed by data.”
– Ted Marchese, AAHE
Assessment of Individual Student Development
Assessment of basic skills for use in advising
• Placement
• Counseling
Periodic review of performance with detailed feedback
End-of-program certification of competence
• Licensing exams
• External examiners
• CLAST
Key Results of Individual Assessment
 Faculty can assign grades
 Students learn their own strengths and weaknesses
 Students become self-assessors
A Second Look
Across students
Across sections
Across courses
 Where is learning satisfactory?
 What needs to be retaught?
 Which approaches produce the most learning for which students?
Group Assessment Activities
• Classroom assignments, tests, projects
• Questionnaires for students, graduates, employers
• Interviews, focus groups
• Program completion and placement
• Awards/recognition for graduates
• Monitoring of success in graduate school
• Monitoring of success on the job
Use of Results of Group Assessment
• Program improvement
• Institutional and/or state peer review
• Regional and/or national accreditation
Some Purposes of Assessment
1. Students learn content
2. Students assess own strengths
3. Faculty improve instruction
4. Institutions improve programs/services
5. Institutions demonstrate accountability
Outcomes Assessment Requires Collaboration
 In setting expected program outcomes
 In developing the sequence of learning experiences (curriculum)
 In choosing measures
 In interpreting assessment findings
 In making responsive improvements
Barriers to Collaboration in the Academy
1. Graduate schools prepare specialists
2. Departments hire specialists
3. Much of our scholarship is conducted alone
4. Promotion and tenure favor individual achievements; interdisciplinary work is harder to evaluate
Campus Interest in Assessment
WHAT WORKS in . . .
 increasing student retention?
 general education?
 use of technology in instruction?
 curriculum in the major?
Good assessment is good research . . .
 An important question
 An approach to answer the question
 Data collection
 Analysis
 Report
– Gary R. Pike (2000)
To Foster Collaboration
 Name interdisciplinary committees
 Read and discuss current literature on learning/assessment
 Attend conferences together
 Bring experts to campus
 Share good practices
 Work together on learning communities
Most Faculty Are Not Trained as Teachers
FACULTY DEVELOPMENT Can Help Instructors:
 Write clear objectives for student learning in courses and curricula
 Individualize instruction using a variety of methods and materials
 Ask questions that make students active learners
 Develop assessment tools that test higher-order intellectual skills
Taxonomy of Educational Objectives
(Bloom and Others, 1956)
Cognitive domain categories, with sample verbs for outcomes:
Knowledge – identifies, defines, describes
Comprehension – explains, summarizes, classifies
Application – demonstrates, computes, solves
Analysis – differentiates, diagrams, estimates
Synthesis – creates, formulates, revises
Evaluation – criticizes, compares, concludes
Organizing for Assessment
Goal               Course   Measure     Findings   Uses
Write                       Portfolio
Speak                       Speech
Think                       Test
Find Information            Project
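To make the planning matrix concrete, here is a minimal sketch (in Python, with hypothetical names; the slides do not prescribe any implementation) of one goal-to-use row as a record:

from dataclasses import dataclass

@dataclass
class AssessmentPlanRow:
    """One row of the planning matrix: a goal and how it is assessed."""
    goal: str            # e.g. "Write"
    course: str = ""     # course(s) where the goal is addressed
    measure: str = ""    # e.g. "Portfolio"
    findings: str = ""   # filled in after assessment takes place
    uses: str = ""       # improvements to be made based on findings

plan = [
    AssessmentPlanRow(goal="Write", measure="Portfolio"),
    AssessmentPlanRow(goal="Speak", measure="Speech"),
    AssessmentPlanRow(goal="Think", measure="Test"),
    AssessmentPlanRow(goal="Find Information", measure="Project"),
]

# Flag rows that still lack a course assignment or findings.
for row in plan:
    if not row.course or not row.findings:
        print(f"{row.goal}: plan incomplete")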
Some General Education Objectives
 Differentiate between fact and opinion
 Gather, analyze, and interpret data
 Apply ethical principles to local, national, global issues
 Communicate ideas effectively in writing
Learning Outcomes in Science
1. Define and explain basic principles, concepts, and theories of science
2. Identify characteristics that distinguish math and science from each other and from other ways of obtaining knowledge
3. Illustrate how developments in science can raise ethical issues
4. Solve theoretical or experimental problems in science
5. Evaluate the validity and limitations of theories and scientific claims in interpreting experimental results
6. Evaluate scientific arguments at a level encountered by informed citizens
Critical Assessment Questions
1. What should a major know and be able to do?
2. What curriculum experiences promote student attainment of this knowledge and these skills?
3. Are these experiences taking place?
4. How do we know students are attaining the knowledge and the skills?
Planning for Learning and Assessment
1. What general outcome are you seeking?
2. How would you know it (the outcome) if you saw it? (What will the student know or be able to do?)
3. How will you help students learn it? (in class or out of class)
4. How could you measure each of the desired behaviors listed in #2?
5. What are the assessment findings?
6. What improvement might be based on assessment findings?
Some Assessment History
1970 – Alverno, NE Missouri
1979 – Tennessee
1985 – VA, NJ, CO
1998 – HE Amendments – Accreditors
Purposes for Assessment
 Accountability – to satisfy external stakeholders
 Improvement – to make things better internally
Some external impetus is necessary to initiate outcomes assessment in higher education.
Organizational Levels for Assessment
National
Regional
State
Campus
College
Discipline
Classroom
Student
Licensing/Certification Tests
• National Teacher Exam
  • Common and specialty areas
• Engineer in Training Exam
• NCLEX in Nursing
• CPA exam in Accounting
• Bar exam in Law
• NCARB exam in Architecture
• Board exams in Medicine, Social Work, Planning
Major Field Achievement Tests
from Educational Testing Service, Princeton, New Jersey
Biology
Chemistry
Computer Science
Economics
Education
Engineering
Geology
History
Literature in English
Mathematics
Music
Physics
Political Science
Psychology
Sociology
Definitions and Assessment Methods for Critical Thinking, Problem Solving, and Writing
by T. Dary Erwin, James Madison University
for the National Postsecondary Education Cooperative
(U.S. Dept. of Education, National Center for Education Statistics)
Student Outcomes Pilot Cognitive Working Group, Washington, DC, 1998
Website: nces02.ed.gov/evaltests
Are Standardized Tests the Answer?
 Not available in many fields
 Do not measure all that is taught
 Usually assess knowledge, not performance
 May be standardized on an unrepresentative norm group
 Provide few, if any, subscores
 Do not indicate why scores are low
Start with Measures You Have
 Assignments in courses
 Course exams
 Work performance
 Records of progress through the curriculum
Primary Trait Scoring
Assigns scores to attributes (traits) of a task
STEPS
 Identify traits necessary for success in the assignment
 Compose a scale or rubric giving clear definition to each point
 Grade using the rubric
Can Develop a Research Paper
Traits, each rated Outstanding, Acceptable, or Unacceptable:
1. Narrows and defines topic
2. Produces bibliography
3. Develops outline
4. Produces first draft
5. Produces final draft
6. Presents oral defense
Bibliography
Outstanding – References current, appropriately cited, representative, relevant
Acceptable – References mostly current, few citation errors, coverage adequate, mostly relevant
Unacceptable – No references, or references containing many citation-format errors, inadequate coverage, or irrelevant material
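As an illustration only, a minimal Python sketch of how a primary trait rubric such as the one above might be encoded and applied; the function names and the 0–2 numeric scoring are assumptions, not part of the original slides:

# Hypothetical sketch: encoding a primary-trait rubric and scoring one trait.
RATINGS = ("Unacceptable", "Acceptable", "Outstanding")

rubric = {
    "Bibliography": {
        "Outstanding": "References current, appropriately cited, representative, relevant",
        "Acceptable": "References mostly current, few citation errors, coverage adequate",
        "Unacceptable": "No references, many citation errors, inadequate or irrelevant coverage",
    },
    # ...one entry per trait: topic, outline, drafts, oral defense
}

def score(trait: str, rating: str) -> int:
    """Map a qualitative rating to a numeric score (0, 1, or 2)."""
    if trait not in rubric or rating not in RATINGS:
        raise ValueError(f"Unknown trait or rating: {trait}, {rating}")
    return RATINGS.index(rating)

# Example: one student's paper, rated trait by trait.
paper_ratings = {"Bibliography": "Acceptable"}
total = sum(score(t, r) for t, r in paper_ratings.items())
print(total)  # 1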
Mapping Course Outcomes to Program Outcomes
[Table: program outcomes 1–7 in rows against Course 1, Course 2, and Course 3 in columns; check marks show which courses address which outcomes.]
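A curriculum map like this is easy to represent as a data structure. The sketch below (Python; the specific check-mark placements are invented for illustration, since the original table does not survive extraction) shows a simple coverage check:

# Hypothetical curriculum map: which courses address which program outcomes.
# The specific marks are illustrative, not taken from the slide.
outcomes = range(1, 8)  # program outcomes 1..7
curriculum_map = {
    "Course 1": {1, 2, 4},
    "Course 2": {1, 3, 5, 7},
    "Course 3": {2, 6, 7},
}

# Coverage check: every program outcome should be addressed by some course.
covered = set().union(*curriculum_map.values())
for o in outcomes:
    if o not in covered:
        print(f"Outcome {o} is not addressed by any course")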
Sophomore Competence in Mathematics
(Multiple-choice responses & supporting work)
Score 3 – Clear conceptual understanding, consistent notation, logical formulation, complete solution
Score 2 – Adequate understanding, careless errors, some logic missing, incomplete solution
Score 1 – Inadequate understanding, procedural errors, logical steps missing, poor or no response
Score 0 – Problem not attempted or conceptual understanding totally lacking
Ball State University
Assessment in Sociology and Anthropology
Focus groups of graduating students
 Given a scenario appropriate to the discipline, a faculty facilitator asks questions related to outcomes faculty have identified in 3 areas: concepts, theory, methods.
 2 faculty observers use a 0–3 scale to rate each student on each question
 GROUP scores are discussed by all faculty
– Murphy & Goreham, North Dakota State University
Journal Evaluation
Each criterion is rated Well done, Satisfactory, or Unsatisfactory:
1. Entries accurately and vividly record objective observations of site experiences (events, people, actions, setting)
2. Entries convincingly record subjective responses to site experience (thoughts, emotions, values, judgments)
3. Entries effectively analyze/evaluate your experiences (find insights, patterns, meaning, causes, effects)
Direct Measures of Learning
Assignments, exams, projects, papers
Indirect Measures
Questionnaires, inventories, interviews
- Did the course cover these objectives?
- How much did your knowledge increase?
- Did the teaching method(s) help you learn?
- Did the assignments help you learn?
Fast Feedback
(at end of every class)
 Most important thing learned
 Muddiest point
 Helpfulness of advance reading assignments for the day’s work in class
 Suggestions for improving class / assignments
– Bateman & Roberts, Graduate School of Business, University of Chicago
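As a hedged illustration, a few lines of Python showing how fast-feedback responses might be tallied so the “muddiest point” guides the next class; all field names here are hypothetical, not from the slides:

from collections import Counter

# Hypothetical fast-feedback responses collected at the end of one class.
responses = [
    {"learned": "rubric design", "muddiest": "norm groups"},
    {"learned": "primary trait scoring", "muddiest": "norm groups"},
    {"learned": "curriculum mapping", "muddiest": "subscores"},
]

# The most frequent "muddiest point" tells the instructor what to reteach.
muddiest = Counter(r["muddiest"] for r in responses)
print(muddiest.most_common(1))  # [('norm groups', 2)]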
Student Suggestions for Improvement
 Install a portable microphone
 Increase type size on transparencies
 Leave lights on when using projector
 Don’t cover assigned reading in detail
 Provide more examples in class
College Student Experiences Questionnaire (4th Edition)
SCALES
 Computer and information technology
 Course learning
 Writing experience
 Experience with faculty
 Art, music, theater
 Campus facilities
 Clubs and organizations
 Personal experiences
 Student acquaintances
 Scientific and quantitative experiences
 Conversations
 The college environment
 Estimate of gains
College Student Experiences Questionnaire (sample item)
Library Experience
 used library as quiet place to study
 used online catalogue
 asked librarian for help
 read reserve book
 used indexes to journal articles
 developed bibliography
 found interesting material by browsing
 looked for further references cited
 used specialized bibliographies
 read document other authors cited
Self-Reports
 How time is spent
• Studying (IUB)
• In all activities (Miami)
 Social interactions
 Diaries, journals
 Portfolios
Assessing Student Growth
The Portfolio – Some Examples of Content
• Course assignments
• Research papers
• Materials from group projects
• Artistic productions
• Self-reflective essays (self-assessment)
• Correspondence
• Taped presentations
Student Electronic Portfolio
 Students take responsibility for demonstrating core skills
 Unique individual skills and achievements can be emphasized
 Multi-media opportunities extend possibilities
 Metacognitive thinking is enhanced through reflection on contents
– Sharon J. Hamilton, IUPUI
Using Electronic Assessment Methods
We can
 Track progress in assignments
 Evaluate contributions to group projects
 Conduct immediate process checks to evaluate instruction
 Assess the quality of written work
Faculty-Developed Exam in Religious Studies
Components:
1. Identification of topic for comprehensive paper in senior seminar
    faculty critique
2. Development of bibliography for paper
    faculty critique
3. Development of outline for paper
    faculty critique
4. Writing of first draft of paper
    faculty critique
5. Writing of final paper
    faculty critique according to a set of guidelines
    critique by external consultants using the same guidelines
University of Tennessee, Knoxville
Authentic Assessment
at Southern Illinois University - Edwardsville
 Business - Case Study Analysis with Memo
 Education - Professional Portfolio
 Psychology - Poster on Research Project
 Engineering - Senior Design Project
 Nursing - Plan of Care for Patient
Responses to Assessment
at Southern Illinois University - Edwardsville
 Business - More case studies and research
 Education - More practice in classroom management
 Psychology - Curriculum change in statistics
 Engineering - More practice in writing and speaking
 Nursing - Simulation lab with computerized patients
In a Comprehensive Assessment Program...
INVOLVE
 Students
 Faculty
 Student Affairs Staff
 Administrators
 Graduates
 Employers
Guidance from Alumni
 Alumni surveys emphasized that graduates valued skills in writing, speaking, working collaboratively, and information literacy
 Now the Faculty Senate’s General Education Committee has developed 5 learning elements, at least 3 of which must be integrated in any course approved for general education
– Michael Dooris, Penn State University
Involving Employers
Combination of survey and focus groups for employers of business graduates
 Identified skills, knowledge, personality attributes sought by employers
 Encouraged faculty to make curriculum changes
 Motivated students to develop needed skills
 Strengthened ties among faculty, students, employers
– Kretovics & McCambridge, Colorado State University
Colorado State University College of Business
Curriculum changes based on employer suggestions:
 1 credit added to Business Communications for team training and more presentations
 Ethics & social responsibility now discussed in intro courses
 New Intro to Business course emphasizing career decision-making
 More teamwork, oral & written communication, problem-solving in Management survey courses
– Kretovics & McCambridge
Longwood College
In 1989 – MFAT scores at 35th percentile; “a marginal program”
In 1991 – New dean engaged faculty in assessment and continuous improvement
In 1998 – MFAT scores at 96th percentile; satisfaction of students and faculty ranked 2nd of 7 peers; AACSB accreditation with highest rating
Building a Scholarship of Assessment
– Banta & Associates, Jossey-Bass Publishers, April 2002
The Scholarship of Assessment Involves
 basing assessment studies on relevant theory/practice
 gathering evidence
 developing a summary of findings
 sharing findings with the assessment community
Some Research Traditions Underlying Assessment
 Program evaluation
 Organizational change and development
 Cognitive psychology
 Student development
 Measurement
 Informatics
Assessment Methods
Improve instruments to measure
 content knowledge at more complex levels
 affective development
 effects of educational interventions
 changes in learning over time
Organizational Behavior & Development
 How can assessment be combined with other systemic changes to improve teaching & learning?
 What patterns of organizational behavior promote and sustain assessment?
 What methods of providing and managing assessment information are most effective?
 Which public policy initiatives are most effective in promoting improvement on campuses?
Targets for Research on Engaging Faculty
 How can we determine the interests and commitments of stakeholders?
 How should we educate stakeholders for choosing methods?
 How can we reduce costs and maximize assessment’s benefits?
 What ethical principles should guide our work?
Derived from Michael Quinn Patton’s Utilization-Focused Evaluation (1997)
Success Factors
1. Committed leadership
2. Collaboration between faculty and student affairs leaders
3. Teamwork in planning and implementation
4. Supportive campus climate: concern for students, continuous improvement
5. Involvement in design of assessment
6. Results effectively communicated
7. Conscientious follow-up
8. Persistence
The Future
 Need for evidence of accountability will increase
 More faculty will recognize benefits of assessment
 More electronic assessment methods will be developed
 More sharing of assessment methods will take place
 Faculty will learn more about learning, and student learning will improve