Assessment Toolkit


University of Central Florida

Assessment Toolkit: Measuring What Matters in Student Development and Enrollment Services

© 2006 Krist, Atwell, & Poisel

Dr. Ron Atwell

Ms. Pam Rea

Dr. Mark Allen Poisel

International Assessment and Retention Conference, June 9, 2007

Agenda

 Introduction and Premises

 Development and Purpose of Department Performance Review Process

 Mission Statements for Departments

 Identify a department mission statement

 Functional Areas

 Develop a Plan for a Functional Area

 Description and Purpose

 Outcomes

 Measures

 Conclusion and Questions


University of Central Florida

Fast Facts

 Established in 1963 (first classes in 1968), Metropolitan Research University

 Grown from 1,948 to 46,848 students in 38 years (Fall 2006 numbers)

 39,661 undergrads and 7,187 grads

 Ten academic colleges and 12 regional campus sites

 89% of lower division and 67% of upper division students are full-time

 Fall 2006 FTICs Enrolled: 4,131; Transfers: 4,006

 Summer 2006 FTICs Enrolled Fall 2006: 2,545

 Average SAT Total: 1201; Average H.S. GPA: 3.7

 Fall 2006 FT FTIC Retention Rate: 83%


Introduction and Premises

 Nichols’ assessment model (1995)

 Assessment:

 formative: focus on continuous quality improvement

 summative: one-time, evaluative

 addresses academic and student support areas

 Tinto (1993); Pascarella & Terenzini (2005)

 success = total college experience

 Upcraft and Schuh (1995)

 Student:

 use and demand

 needs

 satisfaction

 campus environment and cultures

 outcomes

 Institution:

 benchmarking

 nationally acceptable standards


Exercise 1: What is Assessment?

 Minute paper:

 On the top of a piece of paper, write the components of good assessment.

 On the bottom of the page, write what assessment should not include.


Program Assessment Is

 formative: designed to collect information that can be used for improvement

 ongoing

OR

 summative: takes a picture of where you are today

 contributes to resource allocation

 infused into regular operations

 clear and understandable

 comprehensive

 measures your primary functions or activities

 cost-effective

 time

 money


Program Assessment Is Not

 used to evaluate individual staff or faculty

 used to evaluate individual students

 a solution

 It is a fact-finding mission.

 a replacement for good management and leadership

 an analysis of operations or processes

 could indicate a need for this kind of analysis

 something done by one person


Issues in Effective Assessment

 High level administrative support

 Mission driven

 Resource allocation

 Assessment support: SDES, OEAS, IR

 Culture of assessment:

 motivation

 use of assessment results

 experience


Objectives of Department Performance Review

 Conduct comprehensive review of the department

 Develop historical perspective of department

 Identify primary areas for improvement:

 services offered

 processes used

 Discuss programs with division directors, vice presidents, other administrators

 Complement accrediting efforts


Department Performance Review: Components of Self-Study

 Examine elements related to the key programs, services, or activities of the department

 Evaluation of:

 Centrality

 Quality

 Demand

 Distinctiveness & competitive advantage

 Cost


Department Performance Review: Components of Self-Study

Individual departments within a division:

 develop recommendations: strengths, weaknesses, opportunities

 reach resource decisions: eliminate, reduce, maintain, or enhance


Phases of a Department Performance Review

 Department self-study and department-level information completed by the department

 Review and recommendations completed by the unit head

 Optional: review of the program completed by an external consultant

 Review of the unit head's summary and the department review by appropriate administrators


Centrality of Department

 Department Mission

 Strategic planning goals of the department

 Alignment of department mission with the university mission and vision

 Alignment of department mission with the division mission and vision


Functional Areas of a Department

 Review the most important functions (activities, services, or programs) that you said your department provides

 Department Performance Reviews are an in-depth examination of the primary functions of your department


Statement of Purpose and Description

“The purpose of (the functional area: program, activity, or service) is to (describe the reason for the functional area) by providing (identify what this functional area does) to (your stakeholders).”

*NOTE: The order of the pieces of the purpose and description may vary from the structure above.


Example: SDES Administrative Services

The purpose of the technology support area is to improve the quality of university operations and student success by providing the Vice President, SDES unit and department heads, and SDES staff with overall management and leadership in all technology-related items.


Example: Career Services and Experiential Learning

The employer liaison functional area of CSEL is responsible for all contact with prospective employers of our graduates. To accomplish this, we coordinate job fairs, interviews, and web posting of positions available.


Function Purpose and Description: Rate the Examples

 The purpose of the Student Union information services is to provide information.

 The purpose of the Multicultural Student Center training program is to equip student leaders to foster diversity in their organizations by providing specifically designed educational programs.

 The purpose of the advising component of the Student Success Center is to assist student decision making in terms of course selection and faculty relations by providing information (through brochures and a website) and holding individual consultations.


Investigate Functional Areas

 Cost

 Demand

 Quality: Outcomes and Measures

 Relationship to Strategic Initiatives

 Distinctiveness


Cost

 Expenditures for each function: program, activity or service

 Budget comparisons (past, present, future)

 Number of staff: headcount and FTE

 Special facilities, equipment, etc. required for each activity specified for the department

 Specialized delivery requirements

 partnerships with other areas

 off-site travel


Demand for Each Function

 Target market: who are you trying to reach?

 Number of constituents you serve

 Ability to meet demand

 Comments about future anticipated demand


Exercise 2: Target Market for the Function

 Target market: who are you trying to reach?

 Number of constituents you actually serve

 Ability to meet demand

 Comments about future anticipated demand


Quality of Department

 department outcomes

 student learning and development outcomes

 students’ and other constituents’ satisfaction

 impact on strategic initiatives:

 e.g. retention; student learning and development

 adequacy and quality of space and facilities

 service optimization

 efficiency

 demand


Defining Functional/Operational and Student Learning Outcomes

 Program Outcome

 A specific, measurable statement that describes desired performance

 Function operational outcome: a type of outcome that addresses operational or procedural tasks, such as efficiency or satisfaction

 Student learning outcome: a type of outcome that describes the intended learning outcomes that students should meet as a result of a unit’s program(s) or service(s)

 more precise, specific, and measurable than a goal

 more than one outcome can relate to each goal

 one outcome can support more than one goal


Writing Outcomes: Think SMART

Specific

 Clear and definite terms describing expected outcomes; for SLOs this includes abilities, knowledge, values, attitudes, and performance.

Measurable

 It is feasible to get the data; the data are accurate and reliable; it can be assessed in more than one way.

Aggressive but Attainable

 Has the potential to move the program forward.

Results-oriented

 Describes what standards are expected.

Time-bound

 Describes where you would like to be within a specific time period.

From: Drucker, 1954


Examples: Function Operational Outcomes

Career Services and Experiential Learning

Outcome: CSEL will provide timely service for student requests.

Admissions Office

Outcome: Transcripts will be processed efficiently.

Financial Aid Office

Outcome: There will be a 10% increase in student refunds processed through electronic transfer comparing Fall to Fall refunds.


Function Operational Outcomes: Rate the Examples

 Orientation Services will increase efficiency of online registration for transfer students.

 University Testing Center will increase the number of students served by the test preparation area.

 Students will be satisfied with the cleanliness of the Student Union.

 The Counseling Center will provide high quality services.

 Students involved in Student Activities will be retained at a higher rate than those who are not.


Examples: Student Learning Outcomes

Career Services and Experiential Learning

Outcome: Senior-level students who participate in CSEL activities will demonstrate good interview skills.

LEAD Scholars Program

Outcome: At the end of the first year in the LEAD Scholars program, participants will show improved communication with their peers.


Student Learning Outcomes: Rate the Examples

 Students will understand how to get around campus.

 Student Scholars in 2006-2007 will earn a rating of at least satisfactory on their tutoring interaction skills. A rubric will be used to rate their responses to hypothetical situations.

 Freshman students will successfully navigate the online registration process during Fall 2006 registration.

 After completing SLS 1520 students will show an increase in their ability to use technological resources to conduct research.


MATURE: Measuring Outcomes

Matches

 directly related to the outcome it is trying to measure

Appropriate methods

 uses appropriate direct and indirect measures

Targets

 indicates desired level of performance

Useful

 measures help identify what to improve

Reliable

 based on tested, known methods

Effective and Efficient

 characterizes the outcome concisely


Function Assessment Measures

 direct measure: actual evidence of something; for student learning outcomes, it means direct examination or observation of student knowledge, skills, attitudes or behaviors to provide evidence of learning outcomes

 indirect measure: perceived extent or value of operations or learning experiences


Assessment Measures for Operational Outcomes

Direct measures

 staff time

 cost

 materials

 equipment

 other resources

 cost per unit output

 reliability

 accuracy

 courtesy

 competence

 reduction in errors

 audit, external evaluator

Indirect measures

 written surveys and questionnaires:

 stakeholder perception

 students

 administration and staff

 faculty

 interviews

 focus groups


Linking Function Operational Outcomes and Measures

Career Services and Experiential Learning

Outcome: CSEL will provide timely service for student requests.

Measure 1: During the Fall of 2006, at least 90% of all telephone requests will be responded to within 48 hours. A telephone log of time, nature of request, person handling the request, and response time will be maintained.

Measure 2: During the Fall of 2006, at least 90% of all e-mail requests will be responded to within 48 hours. E-mail requests and responses will be forwarded to a centralized file and summarized weekly.
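Once the telephone and e-mail logs exist, the 48-hour measure reduces to a simple tally. Below is a minimal sketch in Python, assuming a hypothetical CSV log with "received" and "responded" timestamp columns; the file name and column names are illustrative only and are not part of the DPR process.

# Minimal sketch: summarize a request log against the "90% within 48 hours" target.
# Assumes a hypothetical CSV with ISO-8601 "received" and "responded" timestamps.
import csv
from datetime import datetime

TARGET_HOURS = 48
TARGET_RATE = 0.90

def share_within_target(path):
    """Return the share of logged requests answered within TARGET_HOURS."""
    total = met = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            received = datetime.fromisoformat(row["received"])
            responded = datetime.fromisoformat(row["responded"])
            total += 1
            if (responded - received).total_seconds() <= TARGET_HOURS * 3600:
                met += 1
    return met / total if total else 0.0

if __name__ == "__main__":
    rate = share_within_target("phone_log_fall2006.csv")  # hypothetical file name
    verdict = "meets" if rate >= TARGET_RATE else "misses"
    print(f"{rate:.1%} handled within {TARGET_HOURS} hours ({verdict} the 90% target)")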


Assessment Measures for Student Learning Outcomes

Direct measures

 student records

 locally developed exams

 embedded questions

 external judge

 oral exams

 portfolios (with rubrics)

 behavioral observations

 simulations

 project evaluations

 performance appraisals

 minute papers

Indirect measures

 written surveys and questionnaires:

 student perception

 alumni perception

 employer perception of program

 exit and other interviews

 focus groups

 student records


Linking Learning Outcomes and Measures

Career Services and Experiential Learning

Outcome: Senior-level students who participate in CSEL activities will demonstrate good interview skills.

Measure 1: Seniors who participate in CSEL interview training will achieve at least 85% on the test of interview protocol at the end of the training.

Measure 2: Seniors who participate in CSEL interview training will be rated at least “good” on the rubric completed by interviewers from industry who interview them during the Fall or Spring job fair.


Linking Learning Outcomes and Measures

LEAD Scholars Program

Outcome: At the end of the first year in the LEAD Scholars program, participants will effectively communicate with their peers.

Measure 1: First-year LEAD Scholar participants will be rated “satisfactory” or higher by staff advisors on the presentation made to spring student activity groups. Staff advisors will all use a tested rubric for evaluation.

Measure 2: Students who attend the LEAD Scholars’ presentation will score at least 80% on the five-question test of the information presented.
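Both measures above come down to counting how many participants clear a threshold. Here is a minimal sketch in Python; the rating scale, thresholds, and sample data are purely illustrative assumptions and are not drawn from the LEAD Scholars program.

# Minimal sketch: share of participants meeting the rubric and quiz thresholds.
# The rating order and the sample data are illustrative assumptions.
RATING_ORDER = ["unsatisfactory", "satisfactory", "good", "excellent"]

def share_at_or_above(ratings, threshold="satisfactory"):
    """Share of advisor ratings at or above the threshold rating."""
    cut = RATING_ORDER.index(threshold)
    return sum(RATING_ORDER.index(r) >= cut for r in ratings) / len(ratings)

def share_passing(scores, passing=0.80):
    """Share of quiz scores (as fractions correct) at or above the passing mark."""
    return sum(s >= passing for s in scores) / len(scores)

# Hypothetical cohort data
advisor_ratings = ["good", "satisfactory", "excellent", "unsatisfactory"]
quiz_scores = [1.0, 0.8, 0.6, 0.8]  # fraction correct on the five-question test

print(f"Measure 1: {share_at_or_above(advisor_ratings):.0%} rated satisfactory or higher")
print(f"Measure 2: {share_passing(quiz_scores):.0%} scored at least 80% on the quiz")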


Distinctiveness

 Regional & national reputation

 Unique features of department

 Results from benchmarking with other colleges

 Strategic niche


Impact on Strategic Initiatives

 Level of impact on appropriate strategic initiatives for the division:

 First year retention

 Student learning and development


Assessment Results: Outcomes

 Results of assessment and/or evaluation for each program, activity, or service:

 Effectiveness

 Efficiency

 Quality

 Student Learning


Recommendations by Function

 influence demand

 improve competitiveness

 achieve productivity gains

 achieve efficiencies

 reduce cost

 improve quality


Action Plan and Overall Recommendations

 Action plan

 Expand, Reduce, Maintain, Eliminate

 Outsource, Reorganize, Re-engineer, Study further

 Overall comments and recommendations

 Planned and implemented changes


Review by Unit Heads

 Review Self-Study

 Review Recommendations

 Strategic Planning:

 Reorganization

 Resource allocation:

 funds

 personnel

 space


What Administrators Learn

 Are departments providing the services they should be providing?

 Do departments have the resources they need to achieve their mission and goals?

 Are they effective in achieving their goals?

 Should programmatic efforts be reduced, expanded or eliminated?

 Is there overlap among departments that can be consolidated?


Organizations to Support and Assure Quality of Process

Support offices

 Assessment and Planning, SDES

 DPR process guidance

 data collection and analysis support

 Operational Excellence and Assessment Support

 Faculty Center for Teaching and Learning

 DPR process guidance

 survey support

 training for outcome and measure development

 website support, templates

 Institutional Research

 provide data


Effective Assessment

 High level administrative support

 Mission driven

 Resource allocation

 Assessment support: SDES, OEAS

 Culture of assessment:

 motivation

 use of assessment results

 experience


QUESTIONS?

Continue the Conversation

Dr. Ron Atwell, Director
Assessment and Planning
Student Development and Enrollment Services
ratwell@mail.ucf.edu

Ms. Pam Rea, Assistant Director
Student Disability Services
Student Development and Enrollment Services
prea@mail.ucf.edu

Dr. Mark Allen Poisel, Associate Vice President
Academic Development and Retention
mpoisel@mail.ucf.edu

