Assessing First-Year Seminars
First-Year Assessment Conference
San Antonio, TX
October 13, 2008
Dan Friedman, Ph.D.
Director, University 101
University of South Carolina
Agenda
1. What is Assessment?
2. Assessment Lenses
3. Who & What to Assess?
   a) I-E-O model applied
   b) Goals v. outcomes
   c) Formulating learning outcomes
4. How to Assess
   a) Direct v. Indirect Measures
   b) Elective v. Required Courses
   c) Assessing Pedagogies
5. Sharing & Utilizing the Results
6. Final Advice
Faith-Based?

“Estimates of college quality are essentially ‘faith-based,’ insofar as we have little direct evidence of how any given school contributes to students’ learning.”

Richard Hersch (2005), Atlantic Monthly
What is Assessment?
Assessment Defined
- Any effort to gather, analyze, and interpret evidence which describes program effectiveness. (Upcraft and Schuh, 1996)
- An ongoing process aimed at understanding and improving _______. (Thomas Angelo)
Assessment Cycle
Identify Outcomes → Gather Evidence → Interpret Evidence → Implement Change → (back to Identify Outcomes)
Maki, P. (2004)
Two Types of Assessment
1) Summative – used to make a judgment about the efficacy of a program
2) Formative – used to provide feedback in order to foster improvement
Word of Caution
Assessment only allows us to make inferences about our programs, not to draw absolute truths.
The Prescription (Rx)

Rx for Assessing a 1st-Year Seminar
- Relevance: Content (doing the right things)
- Excellence: Effectiveness (doing things right)
Assessment Lenses
Multiple Lenses of Assessment
- Standards based
- Peer referenced
- Longitudinal
- Value added
Suskie, L. (2004)
Hypothetical Scenario
If Dan made a 55 on some sort of exam, how did he do?
- We need more information!
- We need a lens to help us make a judgment.
Lens 1: Standards Based (local or external)
Key Question: How do results compare to some internal or external standard?
Example:
- Dan made a 55.
- A score of 45 is considered proficient.
- 80% of students at our institution scored above a 45.
Is that good?
Lens 2: Peer Referenced (benchmarking)
Key Question: How do we compare with our peers? Gives a sense of relative standing.
Example:
- 80% of students at our institution scored above a 45.
- For our peer group, 90% scored above a 45.
Lens 3: Longitudinal
Key Question: Are we getting better?
Example:
- 80% of students at our institution scored above a 45.
- But 3 years ago, only 60% scored above a 45.
- That shows great improvement. Is it due to our efforts? Maybe we just admitted better students!
Lens 4: Value Added
Key Question: Are our students improving?
Example:
- Dan scored a 35 when he first took the test as a freshman. After three years of college, Dan scored a 55.
- The proficiency level of the freshman class was 40%. Three years later, 70% of the same cohort were proficient.
Astin’s Value-Added I-E-O Model
Inputs (I) → Environments (E) → Outcomes (O)
“Outputs must always be evaluated in terms of inputs.”
Astin, A. (1991)
Common Mistakes
- Just looking at inputs
- Just looking at environments
- Just looking at outcomes
- Looking at E-O only (no control for inputs)
Summary of Value Added
- Outputs must always be evaluated in terms of inputs.
- It is the only way to “know” the impact an environment (treatment) had on an outcome.
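One way to make the I-E-O logic concrete is a regression that estimates the environment effect while holding inputs constant. The sketch below is illustrative only: the data file and column names (fy_gpa, took_fys, hs_gpa, sat_total) are assumptions, not the presenter’s actual analysis.

```python
# Hypothetical sketch of Astin's I-E-O logic: estimate the effect of an
# environment variable (first-year seminar participation) on an outcome
# (first-year GPA) while controlling for inputs (high school GPA, SAT).
# The data file and column names are assumptions for illustration only.
import pandas as pd
import statsmodels.formula.api as smf

students = pd.read_csv("first_year_cohort.csv")  # hypothetical cohort file

# Outcome ~ Environment + Inputs
model = smf.ols("fy_gpa ~ took_fys + hs_gpa + sat_total", data=students).fit()
print(model.summary())

# The coefficient on took_fys estimates the seminar's contribution after
# input differences between enrollees and non-enrollees are accounted for.
```

The same frame carries over to other outcomes (retention, engagement scores) by swapping the left-hand side of the formula.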
Who & What to Assess
Inputs
An input is any pre-enrollment variable regarding our students that could conceivably impact the outcome.
What are our inputs?
- Academic preparedness (high school performance, SAT scores, etc.)
- Demographics (gender, race, parental education, etc.)
- Attitudes & behaviors
- Motivation
  - Expectations regarding level of engagement in college
  - Study habits
Sources of Input Data
- Admissions
- Registrar
- Institutional Research
- Surveys
  - College Student Inventory (CSI)
  - College Student Expectations Questionnaire (CSXQ)
  - Beginning College Survey of Student Engagement (BCSSE)
  - Freshman Survey (Higher Education Research Institute – UCLA)
  - Survey of Entering Student Engagement (SENSE) – for community colleges
http://www.sc.edu/fye/resources/assessment/typology.html
Environment
- The environment = the intervention or treatment.
- Is the FYS really just one treatment? What are the individual variables in a FYS that could contribute to our outcomes?
Environment
Individual factors comprising a first-year seminar that could contribute to outcomes:
- Small class size
- Out-of-class engagement
- Faculty-student interaction
- Peer connections
- Use of peer leader
- Specific content
  - Time management
  - Academic skill development
Outcomes
- Academic outcomes
  - Grades, persistence, graduation
  - Writing
- Personal development outcomes
  - Social, emotional, ethical, physical
  - Attitudinal & behavioral
    - Satisfaction
    - Engagement in learning experience
    - Time management
- Cognitive outcomes
  - Knowledge of specific content (wellness, campus policies, school history, etc.)
We should measure our specific program goals & outcomes!

Goal: general, broad, and abstract
- Ex: Help students achieve academic success

Outcome: specific and concrete
- Ex: Students will strengthen their note-taking skills.
Learning Outcomes (aka Objectives)
A statement that “identifies what students should be able to demonstrate or represent or produce as a result of what and how they have learned at the institution or in a program” (p. 61).
Maki, P.L. (2004). Assessing for Learning.
A good learning outcome is…
- Observable – action words – what should students be able to DO
- Focused on outcomes – what students should be able to do after the course (“as a result of this course, students should…”)
- Clear – no fuzzy terms (“appreciate”!)
- Written with active verbs (create, develop, evaluate, apply, identify, formulate, etc.)
Maki, P.L. (2004). Assessing for Learning.
Examples
As a result of this course, students should be able to:
- Locate and evaluate electronic information in the university’s library.
- Identify appropriate campus resources.
- Articulate the purpose of general education.
Evidence of Learning
- What evidence is necessary to sufficiently infer that a student has met or achieved a specific outcome?
- Ex: Students will strengthen their note-taking skills. What does this look like?
- Need to develop standards, criteria, metrics, etc.
How to Assess?
Direct v. Indirect Measures
Indirect Measure
An indirect measure is something a student might tell you he or she has gained, learned, experienced, etc.
- Aka: self-reported data
- Ex: surveys, interviews, focus groups, etc.
- Use existing data to every extent possible.
Survey Examples for Indirect Measures
- College Student Experiences Questionnaire (CSEQ)
- National Survey of Student Engagement (NSSE)
- Community College Survey of Student Engagement (CCSSE)
- Your First College Year (YFCY)
- First-Year Initiative Survey (FYI)
http://nrc.fye.sc.edu/resources/survey/search/index.php
Qualitative Examples for Indirect Measures
- Interviews
- Focus groups
- Advisory council
Direct Measures
A direct measure is tangible evidence about a student’s ability, performance, experience, etc.
- Ex: performances (papers), common assignments, tests, etc.
Ways to Assess Direct Measures
- Course embedded (essays, assignments, etc.)
- Portfolios (electronic or hard copy)
  - Writing sample at beginning of course v. end of course
- Pre- and post-testing on locally developed tests (of knowledge or skills); see the sketch below
- National tests
http://www.sc.edu/fye/resources/assessment/typology.html
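As a rough illustration of the pre/post idea, a paired comparison of locally developed test scores might look like the following. This is a minimal sketch only: the file name and column names (pre_score, post_score) are assumptions, not an existing instrument.

```python
# Hypothetical pre/post comparison for a locally developed test.
# The CSV file and column names (pre_score, post_score) are assumptions.
import pandas as pd
from scipy import stats

scores = pd.read_csv("fys_pre_post_scores.csv")  # one row per student

gain = scores["post_score"] - scores["pre_score"]
print(f"Mean gain: {gain.mean():.1f} points")

# Paired t-test: did students, on average, improve from pre to post?
t_stat, p_value = stats.ttest_rel(scores["post_score"], scores["pre_score"])
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```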
Challenges with the Value-Added Approach
- Motivation (for direct measures): How do we ensure students take the assessment seriously? Is there a hook?
- Is growth due to our interventions? How do you control for all the variables that could influence the outcomes?
Making Comparisons
- For ELECTIVE courses – compare with students who did not enroll (control group).
- For REQUIRED courses – can only compare with peer institutions (benchmarking) or with prior years (longitudinal); see the sketch below.
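For a required course compared longitudinally, one simple check is whether the proficiency rate has changed significantly across cohorts. The sketch below assumes invented cohort sizes and counts that echo the Lens 3 example (80% now vs. 60% three years ago).

```python
# Hypothetical longitudinal comparison for a required course:
# proportion proficient this year vs. three years ago.
# Cohort sizes and counts are invented for illustration.
from statsmodels.stats.proportion import proportions_ztest

proficient = [800, 600]     # students scoring above 45: this year, 3 years ago
cohort_size = [1000, 1000]  # entering cohort sizes (80% vs. 60% proficient)

z_stat, p_value = proportions_ztest(proficient, cohort_size)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
# A small p-value suggests the change is unlikely to be chance alone, but it
# still cannot say whether the seminar (vs. admissions changes) caused it.
```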
Other Considerations
- Do all types of students and sub-populations experience or benefit from the course in the same way?
- Disaggregate data by sub-populations (see the sketch after this list). Ex:
  - Minority
  - First-generation
  - Gender
  - Ability level
- When looking at GPAs, it might be wise to factor out the FYS grade.
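Disaggregation itself is straightforward once the data are assembled. A minimal sketch follows, assuming a hypothetical cohort file with the column names shown; the grouping variables would be whatever the institution actually tracks.

```python
# Hypothetical disaggregation of first-year outcomes by sub-population.
# File and column names (first_gen, fy_gpa, retained, etc.) are assumptions.
import pandas as pd

students = pd.read_csv("first_year_cohort.csv")

# Compare mean GPA and retention rate for each sub-population of interest.
# (A GPA recomputed without the FYS grade could be added as another column.)
for group_var in ["first_gen", "gender", "minority_status"]:
    summary = students.groupby(group_var)[["fy_gpa", "retained"]].mean()
    print(f"\nBy {group_var}:\n{summary.round(2)}")
```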
Sharing & Utilizing the Results

“You Can’t Fatten a Pig by Weighing It”
– T. Angelo
Ways to Share Results
- Host a forum to process what the data mean
- Standing assessment committee for the FYS
- Newsletters
- Website
- Chain of command
Final Advice
- There is no perfect assessment plan (it’s a series of compromises)
- Assessment raises more questions than it answers
- Not every goal needs to be assessed every year
- Avoid over-surveying your students
- Utilize existing data, when possible
- Communicate assessment findings to stakeholders and participants (you can’t fatten a pig by weighing it)
Contact Information
Dr. Dan Friedman
1728 College Street
Columbia, SC 29208
friedman@sc.edu
(803) 777-9506