UNIVERSITY OF LOUISVILLE
DIVISION OF STUDENT AFFAIRS
Assessment Plan 2008-2010
OVERVIEW
The Division of Student Affairs at the University of Louisville is committed to assessment, defined as efforts
“to gather, analyze, and interpret evidence which describes [educational program] effectiveness” (Schuh &
Upcraft, 2001). The needs of the Division regarding assessment currently fall into three main areas:
GOAL 1. Conduct, evaluation, and implementation of departmental program reviews on an ongoing,
cyclical basis. The Division of Student Affairs has a thorough, existing cycle for departmental program
reviews completed by the Dean of Students and department Directors (Refer to Appendix B for program
review schedule). Therefore, new assessment efforts during 2008-2010 will focus on the following two goals.
GOAL 2. Foster connections and better integrate Student Affairs outcomes with University (I2A
critical thinking) outcomes. Aligning goals at various university levels, including those at the program,
department, Student Affairs Division, and larger institutional levels, presents an ongoing challenge. The
present plan will specifically address the link between Student Affairs department goals and the Ideas to
Action (I2A) Quality Enhancement Plan (QEP) through the creation of a Collaborative Learning Community
(CLC) workgroup.
GOAL 3. Develop the overall assessment knowledge and skill base of Student Affairs staff using the 13
content areas outlined by the ASK standards. Development of knowledge, skills, and dispositions critical
to the performance of meaningful and useful higher education assessment is essential for Student Affairs
staff. The present plan will outline a series of educational workshops to develop these competencies.
GOAL 2. Foster Connections and Better Integrate Student Affairs Outcomes
with University (I2A Critical Thinking) Outcomes.
Summary
Developed through U of L’s SACS accreditation process, I2A calls upon Student Affairs to demonstrate how
we foster critical thinking skills through co-curricular and extra-curricular programs. Towards this end,
Student Affairs and Undergraduate Affairs are partnering with I2A facilitators on a Collaborative Learning
Community (CLC) work group. A voluntary cohort of 1-2 staff members from several Student Affairs
departments will meet monthly during the 2008-2009 year with I2A and Undergraduate Affairs to align
departmental and I2A priorities by designing, implementing, and assessing a project to support their program
goals. Specifically, the CLC will help facilitate use of the Paul-Elder critical thinking model in the work of
Student Affairs to help staff more intentionally describe, foster and assess the thinking skills they desire to
see students demonstrate.
I2A administrative staff will direct the CLC initiative. The CLC Working Group Proposal outlines the
following participant responsibilities for 2008-2009:
1. Learn to use the Paul-Elder model to design or revise an SLO, program goal, or other dimension of your
work to promote students’ critical thinking and design and use an appropriate assessment tool in support
of the adaptation. The adaptation must utilize language from the Paul-Elder model.
2. Draft a plan to make critical thinking an explicit, rather than tacit, outcome of your work and the specific
SLO, goal, or other dimension of your work.
3. Devise at least one formative assessment tool to gain informal feedback from students about their
experiences, needs and benefits of the new or revised strategy and/or SLO, and feed that information
back into your planning (mid-semester questionnaire, short reflective paper, etc.).
4. Create an assessment plan that utilizes an existing or new rubric or other assessment tool; collaborators
may wish to design their own assessment tool with assistance from the I2A team.
5. Compile a brief reflection/report in Spring 2009.
During the 2009-2010 year, it is anticipated that the initial CLC members will attend the ASK-based
workshops, while a new cohort of participants, having completed the workshops, will opt into the CLC.
In addition, the initial CLC group members will share assessment data and conclusions with the Division.
Outcomes
2008-2009
In the CLC Working Group Proposal, I2A administrative staff established the following outcomes for the
CLC:
1. Identify key areas for strategic collaboration in order to meet shared outcomes and goals between I2A
and departments within Student Affairs and Undergraduate Affairs. This will support the transformation
of the curricular, co-curricular and extracurricular experience for undergraduate students.
2. Identify and support Student Affairs and Undergraduate Affairs staff members who are willing to align
I2A priorities (critical thinking) with their departmental priorities by designing, implementing and
assessing a project to support their own program goals.
3. Create a small cohort of Student Affairs & Undergraduate Affairs staff who will meet monthly
throughout the 08-09 academic year (September – April) with I2A Team members to gain guidance and
share strategies for their collaborative projects. Staff will attend working sessions on a monthly basis,
complete shared readings and participate in training to support the projects.
4. Collect and share formal and informal data about student learning and staff development related to
critical thinking that will be used to facilitate the Ideas to Action initiative and meet the outcomes of
Student Affairs & Undergraduate Affairs.
5. Facilitate additional training and support across Student Affairs & Undergraduate Affairs using the Paul-Elder critical thinking model.
2009-2010
Next year’s outcomes for the CLC will be determined by I2A administrative staff and CLC members based
upon assessment of progress on 2008-2009 outcomes.
Assessment
The five CLC outcomes will be assessed by I2A administrative staff, and Student Affairs staff participating
in the CLC group will be encouraged to share perceptions of its effectiveness with I2A leaders and with other
Student Affairs staff.
Leadership
I2A administrative staff will direct the CLC initiative, with ongoing input and direction from CLC members.
Communication
The goal is for CLC members to prepare and submit a proposal on CLC workgroup outcomes during summer
of 2009 for presentation at national conferences of professional Student Affairs organizations (such as ACPA
and NASPA) occurring spring 2010.
GOAL 3. Develop the Overall Assessment Knowledge and Skill Base of Student Affairs Staff
Using the 13 Content Areas Outlined by the ASK Standards.
Summary
In 2006, the ACPA Commission for Assessment for Student Development (CASD) developed the
Assessment Skills and Knowledge (ASK) standards which attempt to “describe what [professionals] need to
know” (p.4) to perform assessment. While the ASK standards do not address specific levels of skill
proficiency required, they represent a first step in the articulation of a foundational knowledge base for
assessment. According to Dr. Gavin Henning, Chair of the ACPA Commission on Assessment and
Evaluation, the Commission is working on Beginning, Intermediate, and Advanced competencies within
each standard, but it will be some time before they are available. The ASK standards will be adopted to
guide assessment education within the division. (Refer to Appendix C for summary of ASK standards).
The goal of improving the assessment knowledge and skill base of Student Affairs staff will be addressed in
the coming 2008-2009 year through a monthly series of Cardinal Learning Assessment Workshops
(CLAWS) to upgrade skills along the ASK content standards. Each 1½-2 hour session will be scheduled
from roughly noon until 1:30 or 2 pm. The CLAWS are open to
Divisional staff members and interns, to graduate students studying College Student Personnel and Higher
Education Administration, and to any interested community members working in higher education. While
representatives from each department not participating in the CLC will be asked to attend, CLC members are
also encouraged to attend, as the assessment knowledge reviewed in the CLAWS workshops would reinforce
CLC goals.
During the 2009-2010 year, it is anticipated that CLC members from the prior year would attend the
ASK-based workshops, while staff having participated in the 2008-2009 ASK workshops would participate
in the CLC group. The learning assessment workshop series will be repeated in 2009-2010, but topics and
content will be altered based upon a staff needs assessment.
Overview of Workshop Schedule and Topics
Refer to Appendix A for more detail on workshop outcomes. Topics and content may be slightly altered due
to staff feedback from the Needs Assessment.
September 2008: Assessment Fundamentals (ASK 1, 13)
October 2008: Articulating Learning and Development Outcomes (ASK 2)
November 2008: Overview of Quantitative Methods (ASK 3-7)
January 2009: Overview of Qualitative Methods: Interviews and Focus Groups (ASK 3-7)
February 2009: Using and Communicating Assessment Results (ASK 7, 11)
March 2009: Benchmarking and Program Review (ASK 8, 9)
April 2009: Assessment Ethics and Politics (ASK 10, 12)
Outcomes
2008-2009
1. Increase Student Affairs members’ skills and knowledge in assessment content areas (ASK standards)
from fall 2008 to fall 2009 using self-ratings on ACPA’s ASK Needs Assessment.
2009-2010
1. Increase Student Affairs members’ skills and knowledge in assessment content areas (ASK standards)
from fall 2009 to fall 2010 using self-ratings on ACPA’s ASK Needs Assessment.
Assessment
The ACPA ASK Standards Needs Assessment will be used to gauge self-rated levels of staff knowledge in
ASK content areas. Ratings range from “no experience” to “accomplished” and can be completed in about
one minute. This assessment will be administered Division-wide through email in fall 2008 and then again in
fall 2009. Results following the 2008-2009 workshop series will be utilized to tailor training in future years
to areas indicating the most need for development. (See Appendix D for ACPA ASK Standards Needs
Assessment).
Assessments of the monthly workshops will also be completed and used to tailor future trainings.
Leadership
The workshop series planning will be initiated by the summer 2008 doctoral intern under the direction of
Becky Clark, with ongoing implementation during the 2008-2009 and 2009-2010 academic years by future
doctoral and graduate interns within the College Student Personnel and Higher Education Administration
programs. Results of the fall 2008 ASK Needs Assessment will also be used to determine if current staff
have advanced knowledge in content areas and therefore might be invited to co-present.
Communication
A formal report of learning assessment workshop outcomes will be communicated at least annually and will
include:
1) Findings of the ASK Needs Assessment, and
2) An evaluation of individual workshop assessments.
The report may be presented at the summer Student Affairs Academies in 2009 and 2010, at Division staff
meetings, and at local, regional, or national conferences of professional Student Affairs organizations.
APPENDIX A: WORKSHOP PLAN DETAIL
Workshop outcomes for September-February modeled after the University of Maryland-Baltimore County
(UMBC) 2006-2007 Workshop Series designed by Susan C. Martin, Ed.D., Assessment and Research
Coordinator for Student Affairs.
September 2008: Assessment Fundamentals
Outcomes. Participants will be able to:
1. Review the year-long Assessment Workshop Series, CLC, and relationship to assessment plan
2. Understand several models for assessment: ASK Content standards (ACPA), the Assessment Cycle
(Maki), and the assessment continuum (Henning)
3. Understand the relationship among mission statements, goals or objectives, and outcomes
4. Identify objectives of programs in participant’s area of responsibility
5. Map department programs onto Division of Student Affairs and institutional goals
Potential program presenter(s): Becky Clark, intern(s)
October 2008: Articulating Learning and Development Outcomes
Outcomes. Participants will be able to:
1. Differentiate learning outcomes from mission and goals
2. Name types of outcome statements, necessary components of outcome statements, and criteria to judge
outcome statements
3. Identify primary domains of learning (affective, cognitive, physical) and theories applicable to at least one
area (Bloom’s taxonomy)
4. Practice writing and critiquing outcome statements
Program presenter(s): ECPY or ELFH faculty—Amy Hirschy? Outside presenter (review recent conference
presentations?)
November 2008: Overview of Quantitative Methods
Outcomes. Participants will be able to:
1. List at least 3 major considerations when choosing an assessment method
2. Differentiate between direct and indirect evidence
3. Discuss philosophical differences between qualitative and quantitative approaches
4. Name several guidelines that should be followed when selecting a survey instrument
5. Name several important considerations of survey research
Program presenter(s): Faculty from ELFH, Psychology or Sociology? Institutional Research? ELFH survey
professor?
January 2009: Overview of Qualitative Methods: Interviews and Focus Groups
Outcomes. Participants will be able to:
1. Identify philosophical assumptions of qualitative research
2. Explain the types and purpose of interviews
3. Describe the steps of the focus group process
4. List examples of how interviews and/or focus groups might be a preferred assessment method
5. Explain several measures that should be taken to ensure quality of data and findings from interviews and
focus groups
Program presenter(s): faculty teaching use of focus groups? Staff who have experience?
February 2009: Using and Communicating Assessment Results
Outcomes. Participants will be able to:
1. List several reasons for documenting assessment projects.
2. Explain three formats used for reporting assessment findings.
3. Describe issues to consider when choosing the reporting format.
4. Identify stakeholders and potential audiences, and how levels of detail and sensitivity may vary among
each.
Program presenter(s): ??
March 2009: Benchmarking and Program Review
Outcomes. Participants will be able to:
1. Identify national, regional, or local sources of benchmarking data
2. Review CAS standards and learn how to apply them to improve program services
Program presenter(s): EBI?, Staff having used CAS recently?
April 2009: Assessment Ethics and Politics
Outcomes. Participants will be able to:
1. List several common barriers to performing assessment and incorporating results into practice
2. Identify contextual/institutional factors that contribute to the need for assessment.
3. Recognize risks of assessment results.
4. Explain the purpose of an IRB and when approval is required for assessment
5. Describe the basic tenets of FERPA and how they apply to assessment
6. Give examples of how findings might be communicated to respect confidentiality/anonymity.
Program presenter(s): someone from Registrar (FERPA)?, IRB staff??
APPENDIX B: STUDENT AFFAIRS DEPARTMENTAL PROGRAM REVIEW SCHEDULE
Current as of July, 2008
Student Affairs Departmental Program Review
Student Affairs departmental assessment is conducted on a regular, cyclical basis. Departments assess their
effectiveness in one of two ways:
• By engaging a committee consisting of faculty, staff, and students to conduct a departmental
self-assessment utilizing the CAS Professional Standards for Higher Education; or
• By inviting a professional from outside the university with expertise in their area to perform a review of
outcomes, programs, and services. These reports are utilized to develop departmental goals and objectives
as well as to evaluate current outcomes, programs, and services.
Departmental student learning outcomes may be evaluated as part of the CAS criterion.
Department | Assessment Schedule | Completed
Intramural and Recreational Sports | Fall 2001 | Fall 2001
Housing and Residence Life - Programs | Fall 2002 | Fall 2002
Housing and Residence Life - Financial/Facilities | Spring 2004 | Spring 2004
Career Development Center | Fall 2003 | Spring 2004
Commuter Student Service (ACCESS) | Spring 2004 | Spring 2004
Student Activities Center and Programs | Spring 2007 | Spring 2007
International Service Learning Program | Spring 2007 | Fall 2007
Greek Life | Fall 2007 | Fall 2007
Service Learning | Fall 2007 | Fall 2007
Intramural and Recreational Sports | Fall 2007 | Fall 2007
Disability Resource Center | Spring 2008 | Spring 2008
Judicial Services | Spring 2008 | Spring 2008
Student Leadership Program | Summer 2008 |
Counseling Center | Summer 2008 |
Housing and Residence Life | Fall 2008 |
National Student Exchange Program | Fall 2008 |
Career Development Center | Spring 2009 |
Student Affairs VPSA | Spring 2009 |
Recognized Student Organizations | Spring 2009 |
Off Campus Student Services | Spring 2009 |
APPENDIX C: OVERVIEW OF ASK STANDARDS (ACPA, 2006)
FOUNDATIONAL ISSUES (1-2)
1. Assessment design
• Map (connect) program goals to division and institution
• Identify assumptions related to assessment
• Determine types of assessment: summative vs. formative
2. Articulating learning and development outcomes
• Articulate student learning and development goals and their related outcomes
• Determine the degree to which educational practice contributes to the intended outcomes
TOOLS AND TECHNIQUES (3-9)
3. Selection of data collection and management methods
• Identify types of data/information, including quantitative and qualitative data, and their respective
advantages and disadvantages
• Identify direct and indirect methods of assessment
4. Assessment instruments
• Identify strengths and weaknesses of established instruments
• Develop rubrics
• Review accessibility and inclusiveness of instruments
5. Surveys used for assessment purposes
• Create and evaluate a rigorous survey, considering wording, format, administration, and response rate
• Use sampling statistics
6. Interviews and focus groups used for assessment purposes
• Organize and conduct individual and focus group interviews, considering participant recruitment and
selection, logistics, structure, rapport, nuances of discussion, and note-taking
• Develop questions
• Select and train moderators
7. Analysis
• Analyze and interpret quantitative data using uni- and multivariate statistics, with appropriate software
• Analyze and interpret qualitative data using appropriate software, establishing rigor, trustworthiness, and
authenticity
• Interpret data for technical and non-technical audiences, distinguishing between statistical and practical
significance
8. Benchmarking
• Identify national, regional, or local sources of benchmarking data, or create them when they do not exist
in a specific functional area
• Use benchmarking data for strategic planning purposes
9. Program review and evaluation
• Use CAS or other program standards to review and improve program services
ADVANCED ISSUES (10-13)
10. Assessment ethics
• Understand the role of the Institutional Review Board
• Determine how assessment respects confidentiality and/or anonymity of participants
• Apply FERPA to assessment
11. Effective reporting and use of results
• Develop effective written and visual reports of findings, considering audience and stakeholders
• Apply results to program services, and question underlying assumptions
12. Politics of assessment
• Determine political risks of results and audiences likely to be adversely affected
• Identify, recognize, and overcome barriers to performing assessment
13. Assessment education
• Educate others about assessment goals, needs, and techniques
APPENDIX D: ACPA ASK STANDARDS NEEDS ASSESSMENT
© ACPA 2007 – College Student Educators International
ACPA ASK Standards Needs Assessment
Demographics
a. Gender
Woman
Man
Transgender
b. Number of years in student affairs as a full-time professional
0-5
6-10
11-15
16-20
21 or more
c. Highest degree
Associate
Bachelor's
Master's
Professional (MD, JD)
Doctorate
Other
d. Primary functional area of your position (e.g., student activities, judicial affairs, and so on)
_______________________________________________________________________
Needs Assessment
For each of the following ASK content standards, indicate your level of skill and knowledge by circling the
appropriate number on the right.
Use the following scale: (1) No experience; (2) Beginner; (3) Intermediate; (4) Accomplished
1. Assessment Design    1  2  3  4
2. Articulating Learning and Development Outcomes    1  2  3  4
3. Selection of Data Collection and Management Methods    1  2  3  4
4. Assessment Instruments    1  2  3  4
5. Surveys Used for Assessment Purposes    1  2  3  4
6. Interviews and Focus Groups Used for Assessment Purposes    1  2  3  4
7. Analysis    1  2  3  4
8. Benchmarking    1  2  3  4
9. Program Review and Evaluation    1  2  3  4
10. Assessment Ethics    1  2  3  4
11. Effective Reporting and Use of Results    1  2  3  4
12. Politics of Assessment    1  2  3  4
13. Assessment Education    1  2  3  4
Rank Needs and Identify Preferred Learning Methods
In the section below, please do two things: (1) identify the three areas in which you most want or need to
develop and (2) identify the learning method that would best help you to acquire those areas of skill and
knowledge.
First, please identify the three areas in which you would like to gain more skills and knowledge by filling in
the ASK Standard number in each of the questions below (please select only one standard and one learning
method per line):
(a) In which standard do you most want or need to gain more skill/knowledge?
ASK Standard number __________ Preferred Learning Method __________
(b) In which standard do you next most want or need to gain more skill/knowledge?
ASK Standard number __________ Preferred Learning Method __________
(c) Which of the standards describes a third area in which you want or need to gain more
skill/knowledge?
ASK Standard number __________ Preferred Learning Method __________
Second, use the lettered learning methods below to tell us how you would like to gain further skills and
knowledge in each of the areas you’ve identified above. For each of the areas of need above, fill in the
Learning Method line using the options below. Please select the ONE learning method you would find most
useful in acquiring further knowledge and skill for each standard.
A. Individually, on my own
B. Individually, but through a structured experience (e.g., interactive electronic program)
C. Through a mentoring relationship
D. Topic-specific workshops at national conference
E. Topic-specific workshops at regional conference
F. Topic-specific workshops on my own campus
G. A multi-day educational assessment conference
H. Training videos
I. Teleconferences
J. Webinars (a one-time, interactive discussion in real time)
K. Discussion groups (e.g., listservs, webboards)
L. Short e-Learning courses (3-4 weeks in length)
M. Administrative shadowing (observation of another professional over time)
N. Administrative exchange programs (working at another institution for a short period of time)
O. Site visits to other institutions
P. Academic course
Q. Others __________________________________
Thank you for your help!