Student Learning Outcomes and Assessment
Models for Student Services
By
Angela Caballero de Cordero, Ph.D.
Executive Summary
As calls for increased student learning accountability have gained strength, student services have
found themselves challenged to keep up with instruction in developing appropriate
accountability measures, both for the student services that directly support classroom
instruction and for those that play an instructional role independent of the classroom.
This paper focuses on the student services that teach students the skills to succeed and persist
through the completion of their educational goals. It purports to
assist counseling departments, transfer centers, EOPS programs and the like to develop their
student learning outcomes and assessment plans. The Student Learning Outcomes and Student
Learning Outcomes Assessment Models presented in this paper proceed from the premise that
teaching and learning take place every time a student comes in contact with a program, be it
counseling, financial aid, or admissions and records, to name a few. The learning acquired
through these contacts is generalizable to real-world settings and situations, and it is more than
just tangential learning when the interventions are thoughtful, intentional, and purposeful.
The Student Learning Outcomes Model in this paper uses structural equation modeling (SEM)
to model the dependent, hierarchical relationships between the program mission statement and
the major competencies, and between the major competencies and the student learning outcomes.
Structural Equation Modeling is a general statistical approach that models the relationships of
observable indicators to latent variables (Kline, 1998). Like latent variables, major
competencies "bundle" skills, abilities, and/or knowledge into one construct. "Unbundling"
requires breaking this global construct down into single observable indicators. In the case of the
Student Learning Outcomes Model, major competencies are likely to be latent and can only be
measured through their observable indicators.
The Student Learning Outcomes Assessment Model is derived from a modified version of a table
developed by Merritt College. This table has been changed from a seven-step process into a
ten-step process and has been divided into three assessment layers. The aim of this model is to
facilitate the reflexive process of student learning outcome development, and to trigger—a
priori— the identification of the assessment processes that will be required to develop a plan that
is operationalizable and that addresses its macro and micro components.
The student learning outcomes assessment process is cyclical, systematic, and progressive. The
assessment cycle is repeated for as many times as there are competencies identified in the student
learning outcomes model. At the end of each cycle the program decides whether service delivery
must be changed to be more responsive to students' needs or whether the student
learning outcome needs fine-tuning. Then the next benchmark is selected, and the cycle is begun
again.
The development and operationalization of the student learning outcomes and student learning
outcomes assessment models for student services programs require a team approach. Student
services units play a critical role in student success and engaging students in the learning process
begins with the student’s initial contact with program staff. Creating the optimum environmental
conditions for students to maximize their learning opportunities includes providing a positive
college climate and customer relations that are student focused.
Student Learning Outcomes and Assessment
Models for Student Services
By
Angela Caballero de Cordero, Ph.D.
Introduction
The emphasis on increased accountability for student learning at community colleges and
universities has initiated a flurry of literature focused on classroom-specific student learning
outcomes and assessment (Grubb & Badway, 2005; Baker & Wilson, 2006). Although it has
been understood that student services programs are equally responsible for developing a culture
of inquiry and evidence of student learning (Dowd, 2005), existing literature has not covered the
development of these plans with the same level of prominence. Classroom-based instruction is
the primary activity of the institution, and logically, the first priority has been given to this
institutional activity.
There has been agreement that student services play a critical role in student success. However,
these have been conceptualized as supportive services, and there is no consensus that teaching
and learning take place through the utilization of these services, independently from what goes
on in the classroom setting. From this premise, it would seem counterintuitive to identify student
learning outcomes and it would be problematic to develop a meaningful student learning
outcome and assessment plan.
In fact, student services are divided into at least two types. The first type includes those services
that directly support classroom instruction such as tutoring and supplemental instruction. The
student learning outcomes for this group of student services focus on outcomes that translate
into better student grades, course completion, and comprehension of the subject matter. The
second type, such as counseling, transfer centers, and financial aid focus on teaching students the
skills to develop an educational plan, how to navigate the educational systems to reach a
graduation goal, and how to set and prioritize tasks, to name a few. These services support
institutional learning outcomes and contribute to the overall major competencies that students are
expected to master as a result of their educational journey.
This short paper focuses on the second type of student services. It purports to assist counseling
departments, transfer centers, EOPS programs and the like to develop their student learning
outcomes and assessment plans. The Student Learning Outcomes and Student Learning
Outcomes Assessment Models presented in this paper proceed from the premise that teaching and
learning take place every time a student comes in contact with a program, be it counseling,
financial aid, or admissions and records, to name a few. The learning acquired through these
contacts is generalizable to real-world settings and situations, and it is more than just tangential
learning when the interventions are thoughtful, intentional, and purposeful.
In developing the Student Learning Outcomes Model, structural equation modeling (SEM) is
used to model the dependent, hierarchical relationships between the program mission statement
and the major competencies, and between the major competencies and the student learning
outcomes (see Figure 1).
Figure 1. A Major Competency-Student Learning Outcomes Model
[Figure: one major competency linked to its student learning outcomes, SLO 2a through SLO 2d.]
Each major competency and its corresponding student learning outcomes is considered a model.
Structural Equation Modeling is a general statistical approach that models the relationships of
observable indicators to latent variables (Kline, 1998). Like latent variables, major
competencies "bundle" skills, abilities, and/or knowledge into one construct. "Unbundling"
requires breaking this global construct down into single observable indicators. In the case of the
Student Learning Outcomes Model, major competencies are likely to be latent and can only be
measured through their observable indicators.
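In SEM notation, this measurement relationship can be written explicitly. The form below follows the standard confirmatory factor analysis sub-model described by Kline (1998); it is offered as illustrative notation, since the paper itself does not prescribe an estimation procedure:

    x_i = \lambda_i \xi + \delta_i, \qquad i = 1, \dots, k

where \xi is the latent major competency, x_i is the observed score on the i-th student learning outcome, \lambda_i is the factor loading of that indicator on the competency, and \delta_i is its measurement error.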
The development of the Student Learning Outcomes Assessment Model builds on the work done
by Merritt College, San Diego Community College, San Francisco City College, Gavilan
College, and others. Both models are developed a priori. Each component of the model is
identified like a piece of a puzzle: if it fits during plan development, it will fit during
implementation. This process of inquiry is consistent with sound research practices, where all the
components of a study are planned and later guide each stage of the research.
In addition to their applicability to student services programs and units, these models are also
applicable to instructional programs and institutions. The macro-to-micro developmental
approach is helpful for examining and documenting student learning of global concepts such as
critical thinking, literacy, and numeracy. A brief review of program level and institutional
student learning outcomes, for example, shows the following:
• Students will be able to locate, analyze, evaluate, and synthesize relevant information.
• Students will speak coherently and effectively.
• Students will be able to perform mathematical operations.
• Students will be able to utilize technology.
• Students will develop self-awareness and confidence.
• Students will be able to manage resources, such as time and money, in order to advance personal and career goals.
These reflect “bundles of skills” that need to be assessed as latent variables. All these are major
competencies that cannot be measured by a single observable indicator.
The Student Learning Outcomes Assessment Model (see Figure 3) is derived from a modified
version of a table developed by Merritt College. This table has been changed from a seven-step
process into a ten-step process and has been divided into three assessment layers. The aim of this
model is to facilitate the reflexive process of student learning outcome development, and to
trigger—a priori— the identification of the assessment processes that will be required to develop
a plan that is operationalizable and that addresses its macro and micro components.
Figure 2. Student Learning Outcomes Model for Student Services Programs
[Figure: the Student Learning Outcomes and Assessment Team (faculty, assistants, and clerical staff) moves through four steps: the program mission statement (Step 1) gives rise to Major Competencies 1, 2, and 3 (Step 2); each major competency gives rise to its student learning outcomes, e.g., SLO 1a-1c, SLO 2a-2d, and SLO 3a-3c (Step 3); all feed into Student Learning Outcome Measurement & Assessment (Step 4; see detail in Figure 3).]
SLO = Student Learning Outcome
Developing the Student Learning Outcomes Plan
Success in developing the student learning outcomes plan for programs and student
services units depends on process as much as it depends on content. In student services units
and programs, all staff contribute to student success, which makes everyone's contribution to this
plan necessary. Experiences with faculty and staff have been found to contribute to students'
self-efficacy perceptions, which in turn impact persistence (Caballero de Cordero, 2005; Tinto,
1987), a measure of student success.
The WASC-ACCJC, like other accreditation commissions, has given a great deal of autonomy to
instruction and student services in the development of student learning outcomes; however, two
basic requirements have been set: evidence of faculty and staff participation and—when faculty
are involved in a service area—a faculty-driven process.
Faculty and staff involvement is key in establishing evidence of participation and, more
importantly, in getting the buy-in that will determine the success of the student services student
learning outcomes plan implementation and assessment.[1] Student learning begins when the
student comes in contact with the college, and representation from the different constituencies in
the development of the Student Learning Outcomes and Assessment plans assures a more
holistic approach to learning. Forming the student services student learning outcomes and
assessment teams is the foundation that assures that the two basic tenets provided by the
accreditation commission are covered. It further assures a process that is inclusive and allows
for the varying levels of staff input to be provided. Thus, faculty, counseling assistants, and
classified staff in a counseling department, for example, should be involved.[2]
Example 1: Allan Hancock College Counseling Department SLO Team Members:
Department Chair (Tenured Counseling Faculty)
Tenured Counseling Faculty (3)
Part-time Counseling Faculty (1)
Counseling Assistant (1)
Counseling Secretary (1)
Counseling and Matriculation Dean
Administrative Secretary (1)
Facilitator: Student Services Student Learning Outcomes Co-Director
See Appendix A for full model.
[1] The full model of a Counseling Department is provided in Appendix A. Segments of the model have been
embedded in sections of this paper to illustrate how the model is operationalized.
[2] The composition of the Student Learning Outcomes Team for the model being operationalized is shown in the box.

Step 1: Developing the Mission Statement
Reviewing the program mission statement is the first step in developing the student learning
outcomes plan. The program mission statement is directly derived from the institutional mission
and it is at the core of program operations. It is also a primary component of program review
and, as the student learning outcome and assessment plan is developed, it also becomes part of
the program review core. When needed, the mission statement should be modified to reflect the
learner-centered paradigm; the team’s reflective review and modification of this statement
assures that team members become intimately aware of the program mission statement. When
this is developed or modified collaboratively by the team, there is an implicit commitment to
fulfill the program’s mission.
Example 2: Counseling Department Mission Statement:
The counseling program is committed to helping each student develop his or her
educational, career, and/or social potential.
Program Functions:
The Counseling Department provides academic and career counseling, assists students with
course/program selection and educational planning, and helps students meet personal goals
related to their educational pursuits.
See Appendix A for full model.
Step 2: Identifying the Major Competencies
The major competencies are directly derived from the program’s mission statement. A major
competency can be conceptualized as an ability that is composed of a “bundle of skills.” The
National Postsecondary Education Cooperative defines competency as “a combination of skills,
ability, and knowledge needed to perform a specific task at a specified criterion” (Jones &
Voorhees, 2002). Critical thinking and autonomy are two examples of major competencies.
Both are latent constructs that would be evidenced by observable behaviors. Critical thinking for
example would be evidenced by a student’s ability to evaluate the quality of a publication, assess
the credibility of a source, and distinguish between fact and opinion. Critical thinking is said to
be a latent construct since it is not itself observable or measurable, but relies on indicators—
learning outcomes—that demonstrate that a student possesses the ability to think critically.
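Written in the measurement-model notation introduced earlier, the critical thinking example takes the following form; the loadings and error terms are illustrative notation, not estimated values:

    x_1 = \lambda_1 \xi_{CT} + \delta_1  (evaluate the quality of a publication)
    x_2 = \lambda_2 \xi_{CT} + \delta_2  (assess the credibility of a source)
    x_3 = \lambda_3 \xi_{CT} + \delta_3  (distinguish between fact and opinion)

where \xi_{CT} is the latent critical thinking competency and each x_i is the observed score on the corresponding indicator.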
Example 3: Counseling Department Major Competencies:
1) Students will be able to navigate the educational and student support services to enhance their success.
2) Students will be proactive in the decision-making process and accept final responsibility.
3) Students will be able to identify the necessary coursework for reaching their personal enrichment and/or education.
See Appendix A for full model.
Step 3: Selecting Student Learning Outcomes
Student learning outcomes are the observable, measurable constructs that make up the major
competencies (Choban et al., 2004). A major competency may be broken down into two, three,
or more student learning outcomes (See Figure 1) depending on the complexity of the major
competency. A program may choose not to identify all the major competency indicators, but
may select the most important ones that demonstrate that the competency has been acquired or
the most appropriate ones for the type of student service.
Example 4: Counseling Department Student Learning Outcomes:
1.1) Students will be able to list at least five student support services and three
instructional support services.
1.2) Students will be able to complete the registration process.
1.3) Students will be able to utilize on-line resources.
2.1) Students will be able to identify barriers to their academic success.
2.2) Students will be able to identify strategies to overcome barriers to achieving
academic success.
3.1) Students will develop a plan that identifies coursework necessary to achieve
their educational goal.
3.2) Students will be able to achieve the coursework necessary to achieve their
educational goal.
See Appendix A for full model.
Developing and Implementing the Assessment Plan
Student Learning Outcomes Assessment should be done incrementally and systematically:
Figure 3. Student Learning Outcome Measurement and Assessment Model: Detail of Step 4.
[Figure: a three-layer flow diagram. Layer 1 (program staff): for each student learning outcome, identify the target population (1), the intervention (2), and the acceptable performance level (3). Layer 2 (program and research staff): identify baseline data (4), identify the benchmark (5), select assessment methods and identify assessment tools (6), select the student sample to be assessed (7), and set the timeline for data collection (8). Layer 3 (research staff): SLO data analysis and evaluation (9), and process improvement/continuation (10). The results inform service delivery and the appropriateness of the learning outcome, feeding back into fine-tuning the SLO.]
Note: Model was derived from a modified table from Merritt College's SLO Assessment.
Improvement-oriented institutions need to put into place a process for
systematically developing a cycle of assessment—not just a set of assessments,
but an ongoing system that can lead to initial assessments, then improvements in
educational practices, and then to the revision and expansion of the assessment
system itself. (Grubb & Badway, 2005, p. 27)
Assessing student learning outcomes takes time and resources, and some have argued that it may
take up to ten years to initially assess a full student learning outcomes plan. Even after assessing
a student learning outcome, it may need to be fine-tuned depending on the results.
It has been suggested that programs tackle no more than two or three learning outcomes at a time
(Saddleback College, 2005). This paper proposes that a program assess the student learning
outcomes of one major competency at a time. As seen in Figure 1, a major competency may consist
of more than one learning outcome, and may include more than three. Assessing all the student
learning outcomes of a competency allows the program to examine not only whether the student
learning outcomes are demonstrated but also how these learning outcomes (the observable
indicators) fit as part of the major competency (the factor).
This approach supports the notion of developing a learning institution by institutionalizing a
process of inquiry that is dynamic and informs instruction, service delivery, and overall
institutional planning. As Grubb & Badway (2005) suggest,
When an institution has fully implemented a structured, systematic, and ongoing
assessment program, it will typically have become a student-centered
learning organization that is committed to continuously improving the education
its students obtain. Because its culture is focused on student learning,
assessment is woven into the fabric of everything its faculty, staff, and students
regularly do (Grubb & Badway, 2005, p. 38).
Step 4: Assessing Student Learning Outcomes
As this model has been developed, the student learning outcomes assessment model consists of a
three-layer, multiple-step process that starts in Step 3 (see Figure 3). As indicated in Step 3, the
first layer of the assessment process begins at the time of student learning outcome identification
(see Figure 3). Each student learning outcome should be examined to determine whether it is
measurable, whether the program has the means to assess the learning outcome, and/or whether
the assessment of a learning outcome would be too burdensome, and therefore not feasible.[3]
The department’s sustained engagement in this process is key in maintaining the momentum
through the implementation of the assessment steps of the plan. One of the major challenges of
this part of the student learning outcomes assessment plan is to make it complete enough that a
program doesn't need to backtrack, but simple enough that it is perceived as doable and
not overtaxing. Since student services staff may not be researchers, the office of institutional
research should work closely with the student services unit to develop surveys and do the data
analysis at every stage of data collection and during student learning outcome evaluation.
As shown in Figure 3, the team needs to identify the target population for each student learning
outcome (1). Student services programs serve new students, continuing students, exiting
students, single parents, students at risk, probationary students, low income students, students
who attend workshops, and a multitude of other groups. A student learning outcome may be
appropriate for all students who come in contact with the program, or may only be applicable to
a select group.
[3] In Appendix A the student learning outcomes and the assessment models are operationalized. A sample of a model
in process of development is presented. This is a model that has not been completed, and the last steps of the
assessment plan will be undertaken in the fall semester.
The program’s intervention also needs to be identified (2). What do faculty and staff do that
assures students learn the particular skill, or acquire the specific knowledge called for in the
student learning outcome? If no intervention is presently practiced, what will faculty and staff
do to assure that the identified learning takes place? Identifying, at the time of plan development,
the specific intervention that will result in the learning outcome assures agreement among team
members that interventions are purposive, intentional, and outcome-directed. It also assures that
the interventions become integrated into the program's service delivery, so that the teaching and
learning of a specific skill or body of information is assured and the learning outcome can be
assessed.
Identifying a priori the acceptable student performance level assists the program in establishing
objective targets (3). This unbiased identification of the student population’s performance level
allows the student service unit to establish acceptable performance targets for a specific ability,
skill, or knowledge base. What percentage of your target population should possess the skill or
knowledge identified in your learning outcome? Students’ ability to register online successfully,
for example, would be a learning outcome that should be mastered by almost all—if not all—
continuing students; what is the performance target that would be considered acceptable? Is it
70% of the students? On the other hand, is it acceptable that 30% of the students don’t
demonstrate a particular learning outcome? Are your outcomes equitable among student
populations by demographic characteristics such as race/ethnicity, gender, and age?
An acceptable performance target should be one that considers the questions above. The target to
shoot for should be one that the program finds reasonable and defensible. Program
accountability measures should demonstrate that program excellence is inclusive and equitable.
Although the students must accept personal responsibility for their own success or failure,
“institutional actors, particularly faculty members, also bear individual and collective
responsibility for student outcomes” (Bauman et al., 2005). In a study of 357 community college
students, experiences with staff significantly contributed to changes in student self-efficacy
(Caballero de Cordero, 2005), a construct that has been found to improve student performance
(Bandura, 1997; Bouffard-Bouchard, 1990).
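As an illustration of how the performance-level and equity questions above might be checked once outcome data exist, the sketch below computes an overall demonstration rate and disaggregates it by a demographic characteristic. The records, group labels, and the 70% target are hypothetical values, not figures prescribed by the model.

# Sketch: overall and disaggregated performance rates for one SLO.
# All data and the 0.70 target are hypothetical illustrations.
from collections import defaultdict

records = [  # one record per assessed student
    {"group": "Latino", "demonstrated": True},
    {"group": "Latino", "demonstrated": False},
    {"group": "White", "demonstrated": True},
    {"group": "White", "demonstrated": True},
]
TARGET = 0.70  # acceptable performance level chosen by the program

overall = sum(r["demonstrated"] for r in records) / len(records)
print(f"overall: {overall:.0%}, meets target: {overall >= TARGET}")

by_group = defaultdict(list)
for r in records:
    by_group[r["group"]].append(r["demonstrated"])
for group, outcomes in by_group.items():
    rate = sum(outcomes) / len(outcomes)
    print(f"{group}: {rate:.0%}, meets target: {rate >= TARGET}")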
The second layer of the assessment model (see Figure 3) deals with setting the ground for
learning outcome assessment. At this stage the program may need assistance from the
institutional research office, depending on the data available, program staff's level of knowledge
about research methodologies, and the level of sophistication of the methodologies identified
to establish data collection and data analysis procedures.
The baseline data or current performance rate (4) of the target population establishes the starting
point from which student progress will be measured. Existing institutional and program data
may be used to identify the current performance rate and to establish the student learning
outcome baseline. In the absence of data about student performance on a specific learning
outcome, the team must identify the method of data collection, the instruments (if any) to collect
data and a timeline. This planning process allows the team to anticipate the level of assistance
that will be needed from the institutional research office.
Establishing benchmarks (5) for improvement should be done once the baseline data is known.
Benchmarks are improvement targets, and these should be incremental, reasonable, and
time-linked. It is possible that a program's baseline data reflect that the number of students mastering
the skill, ability, or knowledge called for in a learning outcome is fairly high already. On the
other hand, the baseline data may show that significant progress must be made in order to reach
the performance target. This information should be considered when determining the
performance improvement or benchmark that is reasonable to be achieved within a specified
period of time.
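One simple way to derive incremental, time-linked benchmarks is to interpolate evenly between the baseline and the performance target over a planned number of assessment cycles. The sketch below is a minimal illustration; the baseline, target, and cycle count are hypothetical.

# Sketch: evenly spaced improvement benchmarks from baseline to target.
# The 0.45 baseline, 0.70 target, and 3 cycles are hypothetical values.
baseline, target, cycles = 0.45, 0.70, 3

step = (target - baseline) / cycles
benchmarks = [round(baseline + step * c, 2) for c in range(1, cycles + 1)]
print(benchmarks)  # [0.53, 0.62, 0.7]: one benchmark per assessment cycle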
There are many assessment methods to select from depending on the student learning outcome
(6). During the initial cycle of learning outcome assessment it is important to select more than
one assessment method. The program may elect to use the same measures used to establish the
performance baseline, or may select other measures that may be more appropriate. Student
satisfaction surveys, focus groups, interviews, institutional data, student self-reports, and
program data are only some of the possible data sources. Quantitative as well as qualitative
methods may yield significant data to evaluate whether initial interventions and service delivery
strategies or student learning outcomes should be kept or changed and will inform service
delivery enhancement efforts.
The student sample to be assessed (7) depends on the number of students in the target population
and the assessment method selected. Surveys allow for larger student samples to be assessed,
whereas focus groups require smaller groups of students. Random sampling to obtain a
representative sample of the student population is an important consideration when it is
impossible to include all students of the target population in the assessment process.
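For instance, a simple random sample can be drawn from a roster of the target population, as in the short sketch below; the roster and the sample size of 50 are hypothetical.

# Sketch: draw a simple random sample from a target-population roster.
import random

roster = [f"student_{n}" for n in range(1, 501)]  # hypothetical roster of 500
sample = random.sample(roster, k=50)              # 50 students, no repeats
print(len(sample), sample[:3])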
The timeline (8) for data collection of each student learning outcome should be identified. At the
program level, student learning outcomes may be short term or long term. For some outcomes a
year-end assessment may be appropriate, while for others semester-end is more fitting. Yet others
should be examined after some years of college attendance.
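Taken together, layers 1 and 2 of Figure 3 amount to a structured planning record for each student learning outcome. A minimal sketch of such a record follows; the field names and values are hypothetical illustrations loosely based on SLO 1.2 of Example 4, not a representation the paper prescribes.

# Sketch: one planning record covering steps (1)-(8) of Figure 3.
# Field names and all values are hypothetical illustrations.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SLOAssessmentPlan:
    outcome: str                   # the student learning outcome
    target_population: str         # step (1)
    intervention: str              # step (2)
    acceptable_performance: float  # step (3)
    baseline: Optional[float]      # step (4); None until data exist
    benchmark: float               # step (5)
    methods: list = field(default_factory=list)  # step (6)
    sample_size: int = 0           # step (7)
    timeline: str = "TBD"          # step (8)

plan = SLOAssessmentPlan(
    outcome="Students will be able to complete the registration process.",
    target_population="New students who have submitted an application",
    intervention="Orientation and advising",
    acceptable_performance=0.90,
    baseline=None,
    benchmark=0.75,
    methods=["institutional data", "post-registration survey"],
    sample_size=200,
    timeline="semester-end",
)
print(plan)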
The third layer of the assessment model focuses on the analysis of learning outcomes assessment
data to examine program effectiveness.
Effectiveness must be evaluated in terms of achievements and results. More than
simply responding to external calls for accountability, institutions engaged in the
learning college movement increasingly find it in their interest to examine
carefully the degree to which their accomplishments match their intentions
(Baker & Wilson, 2006, p. 2).
Program effectiveness may also be conceptualized as the extent to which a program reaches its
mission by meeting its educational objectives.
The data analysis (9) represents the culmination of one student learning outcomes assessment
cycle. This analysis informs service delivery and results in modifications to services and/or in
refinement of the student learning outcome (10). When using quantitative measures, the
examination of all the student learning outcomes in a competency can follow the measurement
model of structural equation modeling (Kline, 1998), in which the direct effects of the latent
variable (the major competency) on the observed scores (the factor loadings) may be analyzed.
A major competency may have several observable indicators and, depending on the student
population, some may have stronger correspondence to the major competency than others.
Assessing and analyzing the data of all the student learning outcomes of a competency in one
assessment cycle allows the correspondence of the observed indicators to the latent variable to be
evaluated for each major competency-student learning outcomes model (see Figure 1).
Major Competency-Student Learning Outcomes Model:
Example 5: Counseling Department*:
Major Competency 1: Students will be able to navigate the educational and student support services to enhance their success.
SLO 1.1) Students will be able to list at least five student support services.
SLO 1.2) Students will be able to list at least three instructional support services.
SLO 1.3) Students will be able to complete the registration process.
SLO 1.4) Students will be able to utilize on-line resources.
Example 6: Institutional Level:
Major Competency A: Students will demonstrate critical thinking skills:
SLO A.1) Students will be able to evaluate the quality of a publication.
SLO A.2) Students will be able to assess the credibility of a source.
SLO A.3) Students will be able to distinguish between fact and opinion.
*See Appendix A for the full counseling department model.
Since not all the observable indicators are selected at the beginning of the assessment cycle,
there are other possible models for each competency that may be more appropriate. Finding the
best major competency-student learning outcomes model (see text box) may require more than
one assessment cycle.
Next Steps
The student learning outcomes assessment process is cyclical, systematic, and progressive. The
assessment cycle is repeated for as many times as there are competencies identified in the student
learning outcomes model. At the end of each cycle the program decides whether service delivery
must be changed to be more responsive to students' needs or whether the student
learning outcome needs fine-tuning. Then the next benchmark is selected, and the cycle is begun
again.
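In outline, the cycle just described can be sketched as a loop over competencies. The assess() stub below is a placeholder standing in for steps (1) through (9) of Figure 3, and the data layout is hypothetical; the paper specifies the process, not any particular implementation.

# Sketch of the cyclical process: one assessment cycle per competency.
# assess() is a stub; a real cycle would run steps (1)-(9) of Figure 3.
def assess(slo):
    return {"slo": slo, "met_benchmark": True}  # placeholder result

competencies = {"Competency 1": ["SLO 1.1", "SLO 1.2", "SLO 1.3", "SLO 1.4"]}

for name, slos in competencies.items():
    print(f"Assessment cycle for {name}:")
    for result in (assess(s) for s in slos):      # one cycle, all SLOs
        if result["met_benchmark"]:
            print(f"  {result['slo']}: select the next benchmark")
        else:
            print(f"  {result['slo']}: adjust service delivery or fine-tune the SLO")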
Table 1. Sample of a Full Assessment Cycle

YEAR 1: Assessing SLOs for Competency 1 (Cycle 1) → Program enhancement or student learning outcome fine-tuning.
YEAR 2: Assessing SLOs for Competency 1 (Cycle 2) → Program enhancement or student learning outcome fine-tuning; Assessing SLOs for Competency 2 (Cycle 1) → Program enhancement or student learning outcome fine-tuning.
YEAR 3: Assessing SLOs for Competency 2 (Cycle 2); Assessing SLOs for Competency 3 (Cycle 1) → Program enhancement or student learning outcome fine-tuning.
YEAR 4: Ongoing …
Conclusion
The development and operationalization of the student learning outcomes and student learning
outcomes assessment models for student services programs require a team approach. Student
services units play a critical role in student success and engaging students in the learning process
begins with the student’s initial contact with program staff. Creating the optimum environmental
conditions for students to maximize their learning opportunities include providing a positive
college climate and customer relations that are student focused. Student learning outcomes
assessment is cyclical, incremental, systematic, and ongoing.
References
Baker, R.L. & Wilson, C. (2006). Accreditation and the Learning College: Parallel Purposes
and Principles for Practice. Community College League: Learning Abstracts.
http://www.league.org/publication/learning/edition.cfm. Downloaded on April 2, 2006.
Bandura, A. (1997). Self-efficacy: The exercise of control. New York: W.H. Freeman and
Company.
Bauman, G.L., Bustillos, L.T., Bensimon, E.M., Brown, M.C. II, & Bartee, R.D. (2005).
Achieving Equitable Educational Outcomes with All Students: The Institution's Roles and
Responsibilities. Making Excellence Inclusive: Preparing Students and Campuses for an Era
of Greater Expectations. Association of American Colleges and Universities. www.aacu.org.
Bouffard-Bouchard, T. (1990). Influence of self-efficacy on performance in a cognitive task.
Journal of Social Psychology, 130(3).
Caballero de Cordero, A. (2005). Self-Efficacy Perceptions and Their Impact on White and
Latino Community College Student Persistence. Dissertation. University of California,
California.
Grubb, W.N. & Badway, N.N. (2005). From Compliance to Improvement: Accountability and
Assessment in California Community Colleges. Higher Education Evaluation and Research
Group. Prepared for and funded by California Community Colleges Chancellor’s Office.
Carl D. Perkins Vocational and Technical Education Act of 1998 (VTEA) Contract #030392. (Revised May 2005).
Kline, R.B. (1998). Principles and Practice of Structural Equation Modeling. The Guilford
Press. New York.
League for Innovation in the Community College (2004). Assessment Framework for the
Community College: Measuring Student Learning and Achievement as a Means of
Demonstrating Institutional Effectiveness. White Paper. Questionmark Corporation.
MPlus Discussion: Confirmatory Factor Analysis.
http://www.statmodel.com/discussion/messages/9/9.html?1142787862. Downloaded on
March 19, 2006.
Saddleback College Student Learning Outcomes Implementation Team (2005). A Guide to
Developing and Assessing Student Learning Outcomes and Administrative/Service Unit
Outcomes at Saddleback College. www.saddleback.edu/gov/senate/ie/.
Structural Equation Modeling. http://www.statsoft.com/textbook/stsepath.html. Downloaded on
March 19, 2006.
Tinto, V. (1987). Leaving college: Rethinking the causes and cures of student attrition. Chicago:
The University of Chicago Press.
U.S. Department of Education, National Center for Education Statistics. Defining and Assessing
Learning: Exploring Competency Based Initiatives. NCES 2002-159, prepared by Elizabeth
A. Jones and Richard A. Voorhees, with Karen Paulson, for the Council of the National
Postsecondary Education Cooperative Working Group on Competency-Based Initiatives.
Washington, DC: 2002.
APPENDIX A:
Operationalizing the Models
COUNSELING DEPARTMENT STUDENT LEARNING OUTCOMES AND ASSESSMENT MODEL [4]
February 16, 2006
Original Document: 02-16-06, Draft #1, SLO Team
Revision #1: 03-02-06, Full Staff Meeting
Allan Hancock College Counseling Department SLO Team Members:
Department Chair (Tenured Counseling Faculty)
Tenured Counseling Faculty (3)
Part-time Counseling Faculty (1)
Counseling Assistant (1)
Counseling Secretary (1)
Counseling and Matriculation Dean
Administrative Secretary (1)
Facilitator: Student Services Student Learning Outcomes Co-Director
INTRODUCTION – FACILITATOR
• The meeting began by briefing committee members on student learning, the purpose of establishing learning outcomes, the process, the time frame, and how it will affect student learning and the accreditation process.
Mission Statement (Revised)
The counseling program at Allan Hancock College is committed to helping each student develop his or
her educational, career, and/or social potential.
Primary Functions (Revised)
The Counseling Department provides academic and career counseling, assists students with
course/program selection and educational planning, and helps students meet personal goals related to
their educational pursuits.
[4] The conceptualization of the model continues in a dynamic stage. Since the development of this plan, the model
presented in the body of this paper has changed somewhat; this appendix model differs from it and lacks steps (9)
and (10) of the Learning Outcomes Assessment Model (Figure 3).
Major core competencies:
1) Students will be able to navigate the educational and student support services to enhance their success.
2) Students will be proactive in the decision-making process and accept final responsibility.
3) Students will be able to identify the necessary coursework for reaching their personal enrichment and/or education.
Student Learning Outcomes:
1.1) Students will be able to list at least five student support services.
1.2) Students will be able to list at least three instructional support services.
1.3) Students will be able to complete the registration process.
1.4) Students will be able to utilize on-line resources.
2.1) Students will be able to identify barriers to their academic success.
2.2) Students will be able to identify strategies to overcome barriers to achieving academic success.
3.1) Students will develop a plan that identifies coursework necessary to achieve their educational goal.
3.2) Students will be able to achieve the coursework necessary to achieve their educational goal.
Assessment planning matrix. For each student learning outcome, the matrix records: Target Audience (which students will be affected?); Intentional Intervention (what do you do, or will you do, to assure that the desired learning outcome takes place?); General Description (brief overview of plan: what will you do to assess whether learning has occurred?); Baseline Data (what is your baseline? if none exists, how will you establish a baseline?); Projected Outcomes (what would you consider success? rate of change); Data Collection (what is your method of data collection?); Time Frame (by when will you collect the data?); and Projected Impact on Continuous Improvement (how will you apply what you have learned to improve your students' learning outcomes?).

1.1) Students will be able to list at least five student support services.
Target audience: All students who utilize counseling services to develop an SEP.
Intentional intervention: Will discuss and hand out a "bookmark" to students listing/explaining student support services.
General description: Develop a brief questionnaire that will be given to students before they complete the SEP.
Baseline data: Need baseline data first.
Projected outcomes: Student survey and/or random sample.
Data collection: After the fact, random sample survey; pre-survey/post-survey.
Time frame: TBD.

1.2) Students will be able to list at least three educational support services.
Target audience: All students who utilize counseling services to develop an SEP.
Intentional intervention: Will discuss and hand out a "bookmark" to students listing/explaining student support services.
General description: Develop a brief questionnaire that we will give to students before they complete the SEP.
Baseline data: Need baseline data first.
Data collection: After the fact, random sample survey; pre-survey/post-survey.
Time frame: TBD.

1.3) Students will be able to complete the registration process successfully.
Target audience: Students who take high school START and complete the application/admission form.
Intentional intervention: Orientation and advising provided for students.
General description: Gathering institutional data.
Baseline data: Need baseline data first.
Data collection: Institutional Research office.
Time frame: TBD.

1.4) Students will be able to utilize on-line resources.
Target audience: PD 101 classes and LA 101 class.
Intentional intervention: Department Chair will contact PD/LA instructors.
General description: Student survey and/or random sample; develop a brief questionnaire that we will give to students at the beginning of the semester.
Baseline data: Need baseline data first.
Data collection: Instructor rosters; pre-survey/post-survey of PD classes.
Time frame: TBD.

2.1) Students will be able to identify barriers to their academic success.
Target audience: At-risk probation students with a code of 3/7 (readmitted students).
Intentional intervention: Readmission application, appointment, P.O.A.
General description: Readmission plan, Plan of Action, successful semester.
Baseline data: Get data on dismissed students who were able to enroll due to Weber.
Projected outcomes: Readmitted (code 7) changes to (code 5) and has a successful semester.
Data collection: Institutional data, AHC Information Technology program (mainframe report); CUM 112 report.
Time frame: TBD.

2.2) Students will be able to identify strategies to overcome barriers to achieving academic success.
Target audience: Students who are on code 7 (readmitted after dismissal) and become a 3 (dismissed) or 5 (successful semester).
Intentional intervention: The student must meet with a counselor to develop a plan for improvement.
Baseline data: Need baseline data first.
Data collection: Institutional data (mainframe report); CUM 112.
Time frame: TBD.
3.1) Students will develop a plan that identifies coursework necessary to achieve their educational goal.
Target audience: Nursing students (CNA students).
Intentional intervention: Attendance at workshops, priority registration, completion of a Student Educational Plan.
Projected outcomes: Number of SEPs and students maintaining their priority registration status.
Baseline data: Need baseline data first.
Data collection: Instructor rosters; list of students maintaining their priority registration status.
Time frame: TBD.

3.2) Students will be able to achieve the coursework necessary to achieve their educational goal.
Target audience: Nursing students.
Intentional intervention: Attendance at workshops, updated SEPs, and continuation of their priority registration status.
Projected outcomes: Number of SEPs completed, and the number of students accepted and moved up the Nursing ladder.
Baseline data: Students who completed CNA and program prerequisites; need baseline data first.
Data collection: Instructor and counselor rosters, enrollment verification, and program completion.
Time frame: TBD.
Student Services
List of Areas Needing Learning Outcomes and Scoring Rubrics
Fall Retreat August 22, 2007
• Financial Literacy
• Time Management
• Interpersonal skills
• Matriculation awareness
• Self advocacy
• Decision making
• Utilizing Campus Resources
• Personal/Academic Planning
• Leadership
• Social Responsibility
• Personal Responsibility
• Ethnic/Cultural Identity
• Policies and Procedures
• Rights and Responsibilities
• Learning Strategies
• Institutional Navigation
• Academic Integrity