Paul D. Camp Community College
Institutional Effectiveness
2012-2013
Program Faculty
Section I of III
Paul D. Camp Community College
100 North College Drive
Franklin, VA 23851
The Office of Assessment & Institutional Research
757-569-6719 (office phone)
757-569-6795 (fax number)
Table of Contents
Section 1: Academic Programs (3.3.1.1) ....................................................................................... 3
Transfer Programs Student Learning Outcomes ............................................................................ 7
Transfer Programs (AA & S) ....................................................................................................... 50
Science ......................................................................................................................................... 51
Business Administration .............................................................................................................. 58
General Studies ............................................................................................................................ 61
Education ..................................................................................................................................... 66
Occupational/Technical Programs (AAS) ................................................................................... 69
Nursing ........................................................................................................................................ 69
Management ................................................................................................................................. 80
Administrative Support Technology ............................................................................................ 90
Early Childhood Development .................................................................................................... 96
Industrial Technology ............................................................................................................... 101
Administration of Justice ........................................................................................................... 106
Modes of Instruction (Traditional vs. Distance Learning vs. Dual Enrollment) ....................... 111
Section Summary ...................................................................................................................... 117
Conclusion ................................................................................................................. 119
ACADEMIC PROGRAMS
The College’s academic programs participate in an annual assessment process as well as
program reviews every five years. The College offers Associate of Arts and Science (AA&S)
degrees and Associate in Applied Science (AAS) degrees. The AA&S degrees are transfer
degrees, and the AAS degrees correspond to occupational and technical disciplines. Each
academic program identifies expected student learning outcomes (SLOs), assesses the extent to
which those outcomes are achieved, and makes program adjustments based on analysis and
results of those outcomes. During the SACS reaffirmation in 2009, the College was found in
compliance with CS 3.5. This section of the report articulates many of the reasons for the
College’s successes in program assessment and improvement.
The narrative discussion and the following matrix format are used throughout this report
to document the College’s compliance with requirement CS 3.3.1.1.
FORMAT: EDUCATIONAL PROGRAMS

Student Learning Objectives (SLOs)

Assessment Methods & Analysis Methods
What assessment tools &/or methods were used to determine achievement of the
outcome? Describe how data from tools &/or methods will be collected. Identify
the procedure to analyze the data.
Assessment Methods:
Analysis Procedure:

Results Of Assessment
What were the findings of the analysis (actual assessment results)?

Use Of Evaluation Results (Action Taken to improve program)
What changes were made as a result of the outcome assessment process?
The discussion begins with the College’s Transfer Program SLOs. The Transfer Program
SLOs are woven throughout all transfer programs. Transfer students can choose to matriculate to
many four-year colleges or universities to continue their education. The College’s Transfer SLOs
are assessed annually, and the resulting data is utilized for program improvement.
One of the mechanisms for assessing the Transfer Program SLOs is the Shock-Tucker
Assessment of General Education (STAGE). This assessment was first developed by faculty at
Virginia’s Mountain Empire Community College in response to institutional dissatisfaction with
commercially available objective tests. The initial STAGE assessment was based on the VCCS
definition of general education. STAGE development continued as a collaborative effort between
the faculty and Institutional Research (IR) staff of various Virginia Community Colleges.
Eventually, the STAGE assessment was reworked by the IR staff members from the VCCS in
1999 and revised again in 2010. The assessment scores range from 1 to 10 with 1 representing the
lowest score. The Benchmark is set at 5. The external validation and analysis of STAGE’s
reliability were conducted by James Madison University. In recent years, the VCCS has
modified its general education goals resulting in STAGE Test modifications to reflect changes in
these general education objectives. The VCCS general education goals include: (1)
communication, (2) critical thinking, (3) quantitative reasoning, (4) scientific reasoning, (5)
information literacy, (6) cultural/social understanding, and (7) personal development. The
STAGE Test assesses all of these general education objectives. However, it is only one of the
assessment instruments used to analyze general education skills. The next paragraphs briefly
present the VCCS Standardized Core Competency Tests development information.
Foremost among the mechanisms used for assessing program SLOs are the VCCS Core
Competency Tests. The VCCS, and therefore the College, defines general education program
content as “…that portion of the collegiate experience that addresses the knowledge, skills,
attitudes, and values characteristic of educated persons. It is unbounded by disciplines and
honors the connections among bodies of knowledge.” VCCS and the College’s degree graduates
are expected to demonstrate proficiency in all core competency subject matter. To this end, the
VCCS and the College continue to collaborate with James Madison University to develop or adapt
instruments and rubrics for assessing the core competencies.
Communication proficiency is measured by two instruments: the Writing Competency
Test and the Oral Communication Competency Test. The Writing Rubric, which uses a scale of
1 to 6 with 1 representing the lowest score, was developed by VCCS English Faculty and
Assessment Coordinators. The writing scoring grid is based on the following areas: focus,
organization, content, style, conventions, and documentation. The Oral Communication Rubric,
which uses a scale of 1 to 6 with 1 representing the lowest score, was also developed by VCCS
English Faculty and Assessment Coordinators. The scoring grid is based on the following areas:
appropriateness, verbal effectiveness, nonverbal effectiveness, and responsiveness. The VCCS
Core Competency Tests use mean scores, and the College’s Benchmark is the College’s mean
divided by the VCCS mean.
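Expressed as a formula, with hypothetical numbers used only for illustration, this ratio benchmark is

\[
\text{Benchmark} = \frac{\bar{x}_{\text{College}}}{\bar{x}_{\text{VCCS}}}, \qquad \text{e.g.,} \quad \frac{4.2}{4.0} = 1.05,
\]

so a ratio at or above 1.0 indicates that the College’s mean meets or exceeds the VCCS mean.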
The Critical Thinking Test was modified by James Madison University from the Watson-Glaser Critical Thinking Appraisal. The College’s Benchmark for the critical thinking test
equals the College’s sub-category mean divided by the VCCS sub-category mean. The
Information Literacy Test was developed primarily from the American Library Association
Information Literacy Competency Standards for Higher Education and James Madison
University (JMU) Information Literacy Skills for General Education (ILT). The College’s
Benchmark is that its proficiency will be at/or above VCCS Proficiency, which has the following
ratings: meets or exceeds standards with a score of 37 or higher; proficient with a score of 37 to
41.9; and advanced proficient with a score of 42 or higher. The Quantitative Reasoning Test was
developed by James Madison University and the VCCS. The College’s Benchmark is that
graduates will achieve a proficiency rate of 80% or higher. The Scientific Reasoning Test
comparison was also developed by James Madison University in conjunction with the VCCS.
The VCCS mean score of 19.97 (20/35 items) equals 57%. The College’s Benchmark is that
graduates will achieve a proficiency rate of 80% or higher. The College utilized the Standardized
Wellness Inventory developed by Notre Dame University for its Personal Development
Assessment. It was modified by the VCCS and Blue Ridge Community College. The value-added ratings calculated for the College by the Institutional Assessment and Research Director
are based on a scale from 1 to 5 with 1 representing the lowest score and 5 representing the
highest.
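For reference, the 57% figure reported above for the Scientific Reasoning Test follows directly from the stated mean and item count:

\[
\frac{19.97}{35} \approx 0.57 = 57\%.
\]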
Additional evidence of continuing improvement can be found in the following tables, drawn
from the College’s student and graduate surveys on the quality of academic program services.
The tables present those results.
PAUL D CAMP COMMUNITY COLLEGE
STUDENT SURVEY
ACADEMIC PROGRAMS QUALITY OF SERVICES
Service Unit                     2007-08   2010-11   2011-12   2012-13
Shop & Technical Instruction       3.77      4.25      3.98      4.12
Academic Instruction               4.13      4.25      4.11      4.39
Developmental Instruction          3.96      4.22      4.09      4.28
College Survival Courses           4.05      4.26      4.34      4.43

GRADUATE SURVEY
ACADEMIC PROGRAMS QUALITY OF SERVICES
Service Unit                     2011      2012      2013      2014
Shop & Technical Instruction       3.68      3.88      3.82
Academic Instruction               3.91      4.09      4.01
Developmental Instruction          3.77      4.09      3.92

Note: The ratings are based on a five-point scale with 1 being the lowest
and 5 being the highest.
Further evidence of the College’s impact can be found in its graduation rates, which have
generally risen and, in most years, exceeded those of the VCCS. The following table presents
those results.
GRADUATION RATES
Year   Cohort      Institution   Students   Graduates   Transfers   Graduate Rate   Transfer Rate
2006   Fall 2003   PDCCC             164         24          30         14.6%           18.3%
                   VCCS           11,111      1,778       1,202         16.0%           10.8%
2007   Fall 2004   PDCCC             121         21          13         17.4%           10.7%
                   VCCS           12,412      2,070       1,344         16.7%           10.8%
2008   Fall 2005   PDCCC             119         23          11         19.3%            9.2%
                   VCCS           12,288      2,135       1,816         17.4%           14.8%
2009   Fall 2006   PDCCC             136         29          25         21.3%           18.4%
                   VCCS           15,356      2,684       3,097         17.5%           20.2%
2010   Fall 2007   PDCCC             171         26          34         15.2%           19.9%
                   VCCS           17,158      3,043       4,680         17.4%           27.3%
2011   Fall 2008   PDCCC             148         40          35         27.0%           23.6%
                   VCCS           17,822      3,250       4,720         18.2%           26.5%
2012   Fall 2009   PDCCC             144         27          22         18.8%           15.3%
                   VCCS           17,822      3,250       4,720         23.0%           14.8%
Notes:
Dual enrollment students are included in these cohorts.
Cohort: Students who were first-time, full-time, & program-placed
Graduates: Students earning an award in three academic years, plus the following summer. This is
a 150% completion period, which attaches summer awards to the prior year.
Transfer: Beginning with Fall 2004 cohort, National Student Clearinghouse Data was used to
determine enrollment at another institution. This does not include graduates who transferred.
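Assuming the rates shown are each cohort’s graduate and transfer counts divided by the number of students in the cohort (consistent with the figures in the table), the 2006 PDCCC row, for example, works out as

\[
\text{Graduate Rate} = \frac{24}{164} \approx 14.6\%, \qquad \text{Transfer Rate} = \frac{30}{164} \approx 18.3\%.
\]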
TRANSFER PROGRAMS STUDENT LEARNING OUTCOMES
In May 2006, the Virginia State Board for Community Colleges (SBCC) approved seven
general education competencies. The approved general education competencies have been
adopted by the VCCS and the College’s faculty. Once again, the competencies are
communication (written and oral), information literacy, quantitative reasoning, scientific
reasoning, critical thinking, cultural and social understanding, and personal development. The
College’s students are assessed every spring semester for their performance level. The specific
general education goals and SLOs in which all the College’s degree graduates are expected to
demonstrate competency are presented in the following Transfer Programs SLOs matrix.
Student Learning
Outcomes (SLO)
Communication-Writing:
In a written discourse the
student will demonstrate
the ability to state the
purpose that addresses the
writing task in a thoughtful
way; organize content with
effective transitions &
effective beginning &
ending paragraphs;
develop logical & concrete
ideas with effective use of
paragraph structure; use
appropriate & precise
word choice; demonstrate
few mechanical & usage
errors with evidence of
control of diction.
Assessment Methods &
Analysis Methods
Assessment Methods :
VCCS Writing Rubric.
STAGE Testing.
Faculty & Staff Survey.
Graduate Survey.
Capstone Course (PHI
115).
Analysis Procedure:
Writing Rubric using a
scale of 1-6 with 1 being a
low score. The scoring
grid is based on the
following areas: focus,
organization, content,
style, conventions, &
documentation.
STAGE test was
developed by IR staff
members from the VCCS
in 1999 & revised in 2010.
The score ranges from 1-10 with 1 being low.
Benchmark is set at 5.
Faculty/Staff Survey was
developed by IA&R,
faculty & staff and is based on
a 5-point Likert scale with
one being low. Benchmark
is set at 3.
TRANSFER PROGRAM SLOs
Results Of Assessment
2012-13:
On the STAGE test which
was developed by VCCS
Assessment Coordinators,
2013 graduates scored
6.50 vs. 6.55 for 2012
graduates vs. 6.24 for 2011
graduates. All years are
above the benchmark
score set at 5.00.
The Graduate Survey
based on a 5-point scale
with 1 being low showed
value-added for the 2013
graduates. Their score
increased from 3.78 upon
entering the college to
4.47 at graduation. The
2013 graduates showed a
slight decrease from the
2012 graduates’ score of
4.51.
Use Of Evaluation Results (Action Taken)
2012-13 Action:
The college continues to improve on its capstone course. The capstone
team of instructors uses rubrics for each core competency in order to
identify weaknesses in any of the skill levels under each core
competency. The capstone instructors use the three rubrics developed by
the VCCS for written communication. The team created additional
rubrics for additional assessments.
To aid students in being able to maintain focus in their writing, English
faculty integrated varied practice exercises, videos, and current readings
into the course to help students be able to locate the thesis statement and
write essays that demonstrated a consistent focus. English faculty
directed students to select their own topics, thus making the writing
assignments more relevant and increasing students' ability to develop
their ideas and demonstrate consistent focus. In addition, faculty used
the Learning Resource Center as a literacy resource to provide more
current topics to support writing.
On the Faculty and Staff
Survey, the 2012-2013
graduates were rated 3.91
on a 5-point scale vs. 3.92
for 2011-2012 graduates.
The 2013 score was
virtually the same as the
2012 graduates and
slightly above the 2008-2009 graduates (3.89) and
the 2007-2008 graduates
with 3.64. The 2012-13
score is above the
benchmark set at 3.00.
The Capstone Course
based on a 70%
proficiency rubric showed
that the 2013 graduates
were proficient in writing
communication skills 96%
of the time vs. 89% in
2012 vs. 88% in 2011
vs. 85% in 2010, and 75%
in 2009. The rubric also
showed the 2013 graduates
had a rating of 2.27 vs.
2.20 for 2012 vs. 2.17 for
2011 (based on a 0-3 scale
with 3 being the highest).
The weakest area for 2013
graduates was in the area
of focus (Being able to
state the purpose that
addresses the writing task
in a thoughtful way) with
84% proficiency vs. 82%
in 2012.
2011-12:
STAGE test showed the
2012 graduates scored
6.55, 2011 graduates
scored 6.24 and the 2010
graduates scored 6.89. All
years are above the
benchmark score of 5.00.
Faculty/Staff Survey 2011
rated graduates 3.92 on a
5-point scale (with a
benchmark of 3.00) vs.
3.94 in 2010; and 3.73 in
2011-12 Actions:
To increase students' skills in conventions, professors taught
conventions within the confines of actual writing documents as opposed
to isolated teaching of conventions. More one-on-one conferences and
tutoring were added in course plans to aid students. Constructive
feedback on how to improve writing aided students on how to best work
on areas of conventions and focus. Various assignments were integrated
in courses to assist students in strengthening their ability to maintain
focus in writing. Professors reviewed the VCCS grading rubric with
students and had students work collaboratively to develop mini-teaching
segments on the focus and conventions.
2007.
Graduate Survey showed
an increase in value-added
for the 2012 graduates,
from 3.93 upon entering
the College to 4.51 at
graduation vs. 4.36 to 4.80
in 2011
Capstone Course based on
an 80% proficiency rubric
showed that the 2012
graduates were proficient
in writing communication
skills 89% of the time, vs.
88% in 2011. The rubric
also showed the 2012
graduates had a rating of
2.20 vs. 2.17 in 2011. In
2012, the weakest area
was in FOCUS: stating
purpose that addresses the
writing task in a thoughtful
way (82% proficiency). In
2011, the weakest area
was in Conventions:
Demonstrating few
mechanical and usage
errors (81% proficiency).
CCSSE: PDCCC's CCSSE
mean score for writing was
2.90 in 2005; 2.91 in 2008,
and 3.00 for 2012. The
PDCCC's 2012 score
(3.00) was significantly
higher than that of small
colleges (2.78) and the
2012 cohort (2.77).
2010-11:
On the VCCS Writing
Rubric, 2010 graduates
showed a statistical value-added for the writing core
competency using a cross-sectional analysis when
compared to new degree
students enrolled in fall of
2008.
Writing rubric, 2010
graduates who did not
need any developmental
courses performed better
than graduates who had
taken a developmental
course. Overall, graduates
were weakest in writing
conventions (3.88). This
was also a decrease from
2008 graduates with a
mean score of 5.35 on a
six-point scale. The
strongest scores were in
rhetorical knowledge
(4.28).
2010-11 Actions:
English faculty adopted They Say, I Say by Gerald Graff & Cathy
Birkenstein to improve student success in organization for written
communication, use of effective transitions, & development of
beginning & ending paragraphs. The book contains sentence templates &
transitional words that show students how to transpose them within their
writing to help with coherence & organization.
Faculty provided numerous examples of effective & ineffective
paragraphs & essays that provided students concrete examples of weak
& strong written expressions. These analytical skills were employed as
students wrote their own papers.
Faculty reviewed writing rubric with students & demonstrated how
organization is evaluated in their writing.
Faculty required outlines for student papers, which aided students in
developing their writing.
Science courses: Four additional current event writings required in Bio
101 to improve writing proficiency in the course.
Bio 142, instructors enhanced the use of the Learning Resource Center
(LRC) for research & writing by increasing previous requirement of two
writing units to four. BIO 270, 150, 205, NAS 125 & GOL 111 required
mandatory project writings.
New requirements in science courses were aimed at improving students’
organization-of-content skills & cohesion of ideas, which had been
identified as weak using the writing rubric.
Graduate Survey value-added: Survey was
developed by IA&R,
faculty & staff, based on a
5-point scale with 1 being
low, and shows value-added.
Benchmark is set at 3.
Capstone Course based on
an 80% proficiency rubric
& ratings based on a 0-3
scale with 3 being the
highest.
Community College
Survey of Student
Engagement (CCSSE):
general education mean
score based on a 4-point
scale and compared with
other cohorts.
Graduate Survey showed
an increase in value-added
from 3.99 to 4.45 in 2010;
and 3.72 to 4.38 in 2009.
Capstone Course based on
an 80% proficiency rubric
showed graduates were proficient 84.6% of the
time in 2010 and 74.2% in
2009. The rubric also
showed the 2011 graduates
had a rating of 2.17 vs.
2.17 in 2011 vs. 2.55 in
2010 vs. 2.23 in 2009. In
2011, the weakest area
was in Conventions:
Demonstrating few
mechanical and usage
errors (81% proficiency).
CCSSE: PDCCC's CCSSE
mean score for writing was
2.90 in 2005 and 2.91 in
2008. The PDCCC's 2008
score was significantly
higher than that of small
colleges and the cohort.
Student Learning
Outcomes (SLO)
Communication—Oral:
The student will
demonstrate skill in idea
development & verbal
effectiveness by the use of
language & the
organization of ideas for a
specific audience, setting
and occasion & to achieve
a purpose; nonverbal
effectiveness, assuring that
the nonverbal message
supports & is consistent
with the verbal message &
responsiveness,
communication skills
modified based on verbal
& nonverbal feedback.
Assessment Methods &
Analysis Methods
Assessment Methods :
VCCS Oral Competency
Exam based upon the Test
of Oral Communications
developed by James
Madison University.
Community College
Survey of Student
Engagement (CCSSE).
Value-Added Graduate
Survey.
Faculty Staff Survey
Capstone Course (PHI
115).
CST 100 (public
Speaking).
Analysis Procedure:
VCCS Oral
Communication Mean
Scores
Community College
Survey of Student
Engagement (CCSSE):
general education mean
score based on a 4-point
scale and compared with
other cohorts.
Value-Added Graduate
Survey based on a 5 point
scale with 5 being high
and 1 being low.
Results Of Assessment
2012-13:
The Graduate Survey
based on a 5-point scale
with 1 being low showed
value-added for the 2013
graduates. Their score
increased from 3.63 upon
entering the college to
4.39 at graduation. The
2013 graduates (4.39)
showed a slight decrease
from the 2012 graduates’
score of 4.49.
Use Of Evaluation Results (Action Taken)
2012-13 Action:
The oral communication skills of VCCS graduates were analyzed using
a newly developed rubric of eight competencies that were adapted from
the National Communication Association guidelines for effective
speaking. Based on the Oral Communication Assessment Report results
from the VCCS, PDCCC has two of eight objectives that need
improvement – Objective 2: Communicating the thesis/specific purpose
in a manner appropriate for the audience and occasion, and Objective 4:
Using an organizational pattern appropriate to the topic, audience,
occasion, and purpose. Previously our students were assessed using a
broad rubric analyzing four main areas: Verbal Effectiveness, Nonverbal
Effectiveness, Appropriateness, and Responsiveness. Actions taken
include the adoption of this new rubric in CST 100 courses, addressing
all eight competencies, with emphasis on Objectives 2 and 4.
On the Faculty and Staff
Survey, the 2012-2013
graduates were rated 3.97
on a 5-point scale vs. 3.87
for 2011-2012 graduates.
The 2012-2013 score was
higher than the 2011-2012
graduates and above the
2007-08 graduate score of
3.73. The 2012-13 score is
above the benchmark set at
3.00.
The Capstone Course
based on a 70%
proficiency rubric showed
that the 2013 graduates
were proficient in oral
communication skills 87%
of the time vs. 89% in
2012 vs. 76% in 2010. The
rubric also showed the
2013 graduates had a
rating (based on a 0-3
Faculty Staff Survey based
on a scale from 1-5 with 1
being low & 5 being high.
Graduate Survey based on
a 5-point scale with 1
being low showed value-added.
Capstone Course based on
80% proficiency. Rubric
rating based on a 0-3 scale
with 3 being the highest.
CST 100 (public speaking)
based on a comprehensive
test using a benchmark of
70% proficiency.
scale with 3 being the
highest) of 1.98 vs. 2.55 in
2012 vs. 2.60 in 2011. The
weakest area for 2013
graduates was in the area
of appropriateness: Idea
development, use of
language and the
organization of ideas for a
specific audience, setting,
and occasion are
appropriate (77%
proficiency).
The Oral Communication
test developed by the
VCCS colleges showed no
significant differences
between PDCCC and the
other colleges. PDCCC
scored the highest on
objective 8 (uses physical
behaviors that support the
verbal message) when
compared to the other
VCCS colleges. PDCCC
and the other VCCS
colleges’ weakest areas are
in objective 2
(Communicates the
thesis/specific purpose in a
manner appropriate for the
audience and occasion.)
and objective 4 (Uses an
organizational pattern
appropriate to the topic,
audience, occasion, and
purpose).
2011-12:
PDCCC's CCSSE mean
score for oral
communication was 2.94
in 2012 vs. 2.81 in 2008
and 2.79 in 2005. The
PDCCC's 2012 score
(2.94) was significantly
higher than that of small
colleges (2.70) and the
2012 cohort (2.68).
2011-12 Actions:
Used rubric for grading all oral assessments, which has brought about
more consistency within classes. Faculty who teach CST 100 have made
minor changes in an effort to improve overall proficiency of student
performance.
Students were given lectures, videos, and PowerPoint presentations to
facilitate the learning process of Informative and Persuasive Speeches.
These actions enabled students to design and deliver Informative and
Persuasive speeches with 80% proficiency.
Faculty & Staff Survey
2011, rated graduates 3.87
on a 5-point scale (with a
benchmark of 3.00)
vs. 3.97 in 2010 and 3.73 in
2007.
Graduate Survey showed
an increase in value-added
for 2012 graduates from
3.72 upon entering the
College to 4.49 at
graduation vs. 3.96 to 4.76
in 2011; 3.76 to 4.40 in
2010; and 3.54 to 4.37 in
2009.
Capstone Course based on
an 80% proficiency rubric
showed that the 2012
graduates were proficient
in oral communication
skills 89% of the time vs.
96% in 2011, 75.6% in
2010, and 92.6% in 2009.
The rubric showed 2012
graduate’s rating of 2.55
vs. 2.60 in 2011, 2.17 in
2010, and 2.75 in 2009.
The weakest area for 2012,
2011, and 2010 was in
Responsiveness:
Communication may be
modified based on verbal
and nonverbal feedback
(75% proficiency in 2012
vs. 90% in 2011, and 67%
in 2010).
2010-11:
Graduate Survey showed
an increase in value-added
for 2011 graduates from
3.96 upon entering the
College to 4.76 at
graduation vs. 3.76 to 4.40
in 2010; and 3.54 to 4.37
in 2009.
2010-11 Actions:
Teaching in CST100 changed from theory-based learning to skill-based
learning, with students going through authentic learning exercises in
order to apply knowledge gleaned from class materials. Students are
now required to show a higher level of comprehension by taking
information they have learned & applying the techniques in real-life
situations.
CST 100 (public speaking)
proficiency for fall 2010
was 86% vs. 89% for fall
2009.
Capstone Course based on
an 80% proficiency rubric
showed that the 2011
graduates were proficient
in oral communication
skills 96% of the time in
2011 vs. 75.6% in 2010,
and 92.6% in 2009. The
rubric showed 2011
graduate’s rating of 2.60 in
2011 vs. 2.17 in 2010, and
2.75 in 2009. The weakest
area for 2011 and 2010
was in Responsiveness:
Communication may be
modified based on verbal
and nonverbal feedback
(90% in 2011 vs. 67% in
2010).
Student Learning
Outcomes (SLO)
Information Literacy:
The information literate
student will demonstrate
the ability to determine the
nature & extent of the
information needed; access
needed information
effectively & efficiently;
evaluate information & its
sources critically &
incorporate selected
information into his/her
knowledge base & value
system; use information
effectively to accomplish a
specific purpose; &,
understand many of the
economic, legal, & social
issues surrounding the use
of information & access &
use information ethically
& legally.
Assessment Methods &
Analysis Methods
Assessment Methods :
VCCS Information
Literacy Competency Test
(ILT) developed by James
Madison University.
Community College
Survey of Student
Engagement (CCSSE).
STAGE Testing.
Graduate Survey.
Faculty Staff Survey.
Capstone Course (PHI
115) – rubric.
TRANSFER PROGRAM SLOs
Results Of Assessment
2012-13:
On the STAGE test which
was developed by the
Virginia Community
College System (VCCS)
Assessment Coordinators,
the 2013 graduates scored
6.66 vs. 7.29 for 2012
graduates vs. 6.17 for 2011
graduates and 7.02 for
2010 graduates. The 2013
graduate score is above the
benchmark score set at
5.00 and above the 2011
graduate score of 6.17, but
it is below the 2012
graduates score.
Analysis Procedure:
The PDCCC’s Benchmark
= Graduates will achieve a
success rate of 80% or
higher on proficiency with
Criteria: meets or exceeds
standards (37 or higher);
advanced proficient (42 or
higher); proficient (37-41.9).
The Graduate Survey
based on a 5-point scale
with 1 being low showed
value-added for the 2013
graduates. Their score
increased from 3.62 upon
entering the college to
4.33 at graduation. The
2013 graduates (4.33)
score was virtually the
same as the 2012
graduates’ score of 4.32,
and the 2010 graduates’
score of 4.31.
STAGE Testing score
ranges from 1-10 with 1
being low. Benchmark is
set at 5.
On the Faculty and Staff
Survey, the 2012-2013
graduates were rated 3.91
on a 5-point scale vs. 3.83
ITE 115: Computer
Applications & Concepts.
Use Of Evaluation Results (Action Taken)
2012-13 Action:
The college continues to improve on its capstone course. The capstone
team of instructors has developed an information literacy rubric in order to
better identify weaknesses in sub-categories of information literacy such
as development of a search topic, development of a search
strategy, and analysis of data quality and integrity. Each fall and spring term,
the information literacy rubric is used in the PHI 115 class assessment
and in the development of action plans for course improvement.
PDCCC met the student proficiency objectives of Information Literacy.
However, the college ranked lower than the VCCS in 4 out of 5 ILT
overall mean sub-scores by objective standard. Though PDCCC scored
higher than the VCCS standard in the area of evaluating information and
its sources critically and incorporating selected information into his or
her knowledge base, the action planned in the capstone course is to
bridge all recorded gaps by continued monitoring of 2012 modifications
and instituting additional instructional tools. Videos and the
QuickSearch @ PDCCC search tool will be emphasized to enhance
students’ understanding, improve their performance and reinforce the
objectives.
Value-Added Graduate
Survey based on a 5 point
scale with 5 being high
and 1 being low.
Faculty/Staff Survey, rated
on a 5-point scale with 1
being low.
Graduate Survey value-added based on a 5-point
scale with 1 being low.
Capstone Course based on
80% proficiency & rubric
rating based on a 0-3 scale
with 3 being highest.
ITE 115: Computer
Applications & Concepts,
based on a comprehensive
test using a benchmark of
70% proficiency.
for 2011-2012 graduates.
The 2013 score was
slightly above the 2011-2012 graduates and above
the 2007-2008 graduates
(3.73). The 2012-13 score
is above the benchmark
score set at 3.00.
The Capstone Course based
on a 70% proficiency
rubric showed that the
2013 graduates were
proficient in information
literacy skills 93% of the
time vs. 80% in 2012
vs. 74% in 2011 and 60%
in 2010. The rubric also
showed the 2013 graduates
had a rating of 1.91 vs.
1.98 in 2012 vs. 1.82 in
2011, and 1.80 for 2010
graduates (based on a 0-3
scale with 3 being
highest). The weakest area
for 2013 graduates was
being able to evaluate
information and its sources
critically, and incorporate
selected information into
his or her knowledge base
and value system with
82% proficiency. This is
an increase from the 68%
proficiency in 2012.
2011-12:
PDCCC's CCSSE mean
score for information
literacy was 3.05 for 2012
vs. 3.07 in 2005. The
PDCCC's 2012 score
(3.05) was significantly
higher than that of small
colleges (2.85) and the
2012 cohort (2.79).
2011-12 Actions:
More courses have included links to LRC resources via the course
BBsites. Some of the tools that faculty and students have reported as
highly effective are
- Library Help Button added to BB courses & student oriented to
Library Help Button in BB.
- Library Help Buttons customized according to the content area were
added to Blackboard sites, upon request by faculty, to familiarize
students with library resources to support their course(s) & make
them more accessible.
The Virginia Community
College System (VCCS)
Information Literacy Test
(ILT) showed PDCCC’s
mean score of 64.4 vs.
65.2 for the VCCS. Sub-scores showed that the
weakest area for
improvement for the
VCCS (52.1) and PDCCC
(50.6) was student’s ability
to access needed
information effectively
and efficiently. PDCCC
(72.8) scored higher than
the VCCS (71.9) in
graduate’s ability to
evaluate information & its
sources critically and
incorporate selected
information into his or her
knowledge base.
Presentations were made to acquaint students with Library Help Buttons
as needed.
Continued usage of the guided worksheet that was created to teach
students in student development (SDV) courses how to utilize library
resources has been successful.
In the capstone course, include more emphasis on students' ability to
access needed information effectively and efficiently.
Capstone course (PHI
115), the information
literacy proficiency level
for 2012 graduates was
80% vs. 74% in 2011, and
60% in 2010. The rubric
showed 2012 graduates
rated 1.98 vs. 2.02 in
2011, 1.80 in 2010, and
1.35 in 2009. A rubric
showed the weakest area
for 2012, 2011, and 2010
graduates was in the
ability to evaluate
information and its sources
critically (68% proficiency
in 2012; 63% in 2011; and
22% in 2010).
STAGE test showed the
2012 graduates scored
7.29 on a 10-point scale.
the 2011 graduates scored
6.17, and the 2010
graduates scored 7.02. All
scores are above the 5.00
benchmark score.
Faculty & Staff Survey
2011 rated graduates 3.83
on a 5-point scale (with a
benchmark of 3.00) vs.
4.06 in 2010; 3.93 in 2008;
and 3.73 in 2007.
Graduate Survey showed
an increase in value-added
for 2012 graduates from
3.64 upon entering the
College to 4.32 at
graduation vs. 3.94 to 4.63
in 2011; 3.75 to 4.31 in
2010; and 3.56 to 4.22 in
2009
2010-11:
Capstone course (PHI
115), the information
literacy proficiency level
for 2011 graduates was
74% vs. 60% in 2010. The
rubric showed 2011
graduates rated 2.02 in
2011 vs. 1.80 in 2010, and
1.35 in 2009. A rubric
showed the weakest area
for 2011 and 2010
graduates was in the
ability to evaluate
information and its sources
critically ( 63%
proficiency in 2011; and
22% in 2010).
STAGE test showed the
2011 graduates scored
6.17 on a 10-point scale
vs. 7.02 for the 2010
graduates. All scores are
above the 5.00 benchmark
score.
Faculty & Staff Survey
2010 rated graduates 4.06
on a 5-point scale (with a
benchmark of 3.00) vs.
3.93 in 2008; and 3.73 in
2007.
Graduate Survey showed
an increase in value-added
for 2011 graduates from
3.94 upon entering the
College to 4.63 at
graduation vs. 3.75 to 4.31
2010-11 Actions:
The following items were added or revised to clarify capstone course
assignments & to strengthen students’ skills in the areas of accessing
needed information efficiently & effectively, evaluating information &
its sources critically, & incorporating selected information into his/her
knowledge base & value system.
- Search Strategy Worksheet example was added.
- Discovering, Comparing & Selecting Library Resources Worksheet
was revised to combine an example with the exercise.
“The Good the Bad & the Ugly or Why It’s a Good Idea to Evaluate
Web Sources”, a website that gives criteria & examples for evaluating
websites was added.
Other strategies implemented to help students become more information
literate included:
- Library Help Button added to BB courses & student oriented to
Library Help Button in BB.
- Library Help Buttons customized according to the content area were
added to Blackboard sites, upon request by faculty, to familiarize
students with library resources to support their course(s) & make
them more accessible.
- Presentations were made to acquaint students with Library Help
Buttons. A “Guided Worksheet” was created to teach students in
student development (SDV) courses how to utilize library resources.
PHI 115 course assignments revisions: The PHI 115 Blackboard site was
revised to include a section on keyword searching tutorial, clarifications
were made on using Google Gadget & other databases, & the UNC
Documentation Information on citation builder was emphasized in the
course.
Science courses have enhanced use of the Learning Resource Center
(LRC) for research & writing by increasing previous requirement of two
writing projects units to four.
In PHI 115 the Information Literacy Rubric was revised.
The ITE 115 course has been revised to include extensive use of a
simulator. It is anticipated that students will exit this course with a
better command of the tools they have available to them. Since this
in 2010; and 3.56 to 4.22
in 2009
course is offered early in the curriculum, it is designed to help students
perform better on the information literacy tasks they are expected to
master.
For ITE 115, proficiency
in fall 2010 was 76%
based on a comprehensive
test and a benchmark set at
70% proficiency. This was
an increase from the fall 2009
proficiency of 73%.
Student Learning
Outcomes (SLO)
Quantitative Reasoning:
The student will
demonstrate the ability to
use logical &
mathematical reasoning
within the context of
various disciplines;
interpret & use
mathematical formulas;
interpret mathematical
models; use arithmetic,
algebraic, geometric, &
statistical models to solve
problems; estimate &
consider answers to
mathematical problems in
order to determine
reasonableness; recognize
& communicate the
appropriate applications of
mathematical & statistical
models; &, represent
mathematical information
numerically, symbolically,
& visually, using graphs &
charts.
Assessment Methods &
Analysis Methods
Assessment Methods :
VCCS Quantitative
Reasoning (QR) Core
Competency Test
developed by James
Madison University.
STAGE Testing.
Graduate Survey.
Faculty Staff Survey.
Capstone Course (PHI
115).
Community College
Survey of Student
Engagement (CCSSE):
general education mean
score based on a 4-point
scale and compared with
other cohorts.
Analysis Procedure:
PDCCC’s Benchmark =
Graduates will achieve a
proficiency rate of 80% or
higher.
STAGE Test score ranges
from 1-10 with 1 being
low. Benchmark is set at 5.
TRANSFER PROGRAM SLOs
Results Of Assessment
2012-13:
On the STAGE test based
on a 10-point scale
developed by the Virginia
Community College
System (VCCS)
Assessment Coordinators,
the 2013 graduates scored
4.17 vs. 4.95 for 2012
graduates vs. 3.99 for 2011
graduates. This was above
the benchmark set at 3.00,
but below the 2012
graduates.
Use Of Evaluation Results (Action Taken)
2012-13 Action:
The college continues to improve on its capstone course. The capstone
team of instructors has developed a quantitative reasoning rubric in
order to better identify weaknesses in sub-categories of quantitative
reasoning such as probability reasoning, grouping data, and various
discipline models. Each fall and spring term, the quantitative reasoning
rubric will be used in the PHI 115 class assessment and in the
development of action plans for course improvement. The college
continues to monitor increased proficiency levels that are
correlated with slightly decreased percentages of students meeting
the proficient range. We are isolating various indicators of the
correlation and instituting additional instructional communication
tools aimed at encouraging student performance and reinforcing
objectives in the capstone course.
The Graduate Survey
based on a 5-point scale
with 1 being low showed
value-added for the 2013
graduates. Their score
increased from 3.27 upon
entering the college to
4.19 at graduation. The
2013 graduates’ score
(4.19) was slightly higher
than the 2012 graduates’
score of 4.17.
On the Faculty and Staff
Survey, the 2012-2013
graduates were rated 3.83
on a 5-point scale vs. 3.83
for 2011-2012 graduates.
Graduate Survey Value-Added based on a 5-point
scale with 5 being high
and 1 being low.
Faculty Staff Survey
Rating is based on a scale
from 1-5 with 1 being low
& 5 being high.
Capstone Course based on
an 80% proficiency rubric
The 2013 score was the
same as the 2011-2012
graduates, but above the
2008-2009 graduates
(3.82) and the 2007-08
graduates (3.53). The
2012-13 score is above the
benchmark score set at
3.00.
The Capstone Course
based on a 70%
proficiency rubric showed
that the 2013 graduates
were proficient in
quantitative reasoning
skills 96% of the time vs.
76% in 2012 vs. 80% in
2011. The rubric also
showed the 2013 graduates
had a rating of 2.30 vs.
2.16 for 2012 vs. 2.02 for
2011 (based on a 0-3 scale
with 3 being the highest).
The weakest area for 2013
graduates was in the area
of recognizing and
communicating the
appropriate applications of
mathematical and
statistical models when
solving problems.
2011-12:
PDCCC's CCSSE mean
score for quantitative
reasoning (solving
numerical problems) was
2.75 for 2012, vs. 2.56 in
2005. The PDCCC's 2012
score (2.75) was higher
than that of small colleges
(2.70) and the 2012 cohort
(2.67).
2011-12 Actions:
Students were required to research how statistics is used in the real
world. This was done in order for students to gain a better
understanding of how statistics is used in their everyday lives and how
to better apply it in specific situations.
Faculty also increased the number of in-class and independent
assignments to improve student comprehension of quantitative data &
data relationships.
Faculty continued to demonstrate appropriate techniques to study and
solve a variety of problems.
Graduate Survey showed
an increase in value-added
for 2012 graduates from
3.49 upon entering the
College to 4.17 at
graduation vs.3.69 to 4.48
in 2011; 3.54 to 4.15 in
2010; and 3.37 to 4.14 in
2009.
Capstone Course based on
an 80% proficiency rubric
showed that the 2012
graduates were proficient
in quantitative reasoning
skills 76% of the time vs.
80% in 2011, 90% in
2010, 69% in 2009. Also
showed a rating of 2.32 for
2012 vs. 2.02 for 2011,
2.52 for 2010, and 2.07 for
2009. For 2012 and 2011
graduates, the weakest
area using a rubric was
estimating and considering
answers to mathematical
problems in order to
determine reasonableness (
67% proficiency in 2012,
60% in 2011).
2010-11:
Faculty & Staff Survey
2011 rated graduates 3.83
on a 5-point scale (with a
benchmark of 3.00) vs.
4.00 in 2010; 3.82 in 2008;
and 3.53 in 2007.
Graduate Survey showed
an increase in value-added
for 2011 graduates from
3.69 upon entering the
College to 4.48 at
graduation vs. 3.54 to 4.15
in 2010; and 3.37 to 4.14
in 2009.
Capstone Course based on
an 80% proficiency rubric
showed that the 2011
graduates were proficient
in quantitative reasoning
skills 80% of the time vs.
90% in 2010, 69% in
2009. Also showed a
rating of 2.02 for 2011,
2.52 for 2010, and 2.07 for
2009. For 2011 graduates,
the weakest area using a
rubric was estimating and
considering answers to
mathematical problems in
order to determine
reasonableness ( 60%
proficiency in 2011).
2010-11 Actions:
Overall data with 84% or better proficiency in six objectives were
consistent with the College’s rubric measures.
Science instructors emphasized the significance of quantitative &
scientific reasoning skills in science by requiring students to complete more
projects.
In the capstone course, a “Using Data from a Chart” section was
included in the Quantitative Reasoning module. It introduced diverse
types of graphs & provided numerous examples related to interpreting
graphs. Students learned methods that assist in describing quantitative
data & data relationships using numerical methods of analysis &
symbolic representations in an effort to organize, interpret, plan &
execute the appropriate quantitative reasoning operation. Interactive
exercises & quizzes were included. More examples were also done in
class on estimating & use of mathematical information numerically,
symbolically, and visually.
Student Learning
Outcomes (SLO)
Scientific Reasoning:
The student will be able to
generate an empirically
evidenced & logical
argument; distinguish a
scientific argument from a
non-scientific argument;
reason by deduction,
induction & analogy; &,
distinguish between causal
& correlational
relationships.
Assessment Methods &
Analysis Methods
Assessment Methods :
VCCS Scientific
Reasoning (SR) Core
Competency assessment
developed by James
Madison University.
STAGE Testing.
Graduate Survey.
Faculty Staff Survey
Capstone Course (PHI
115).
TRANSFER PROGRAM SLOs
Results Of Assessment
2012-13:
On the STAGE test based
on a 10-point scale and
developed by the Virginia
Community College
System (VCCS)
Assessment Coordinators,
the 2013 graduates scored
a 4.81 vs. 5.79 for 2012
graduates vs. 5.16 for 2011
graduates and 5.14 for
2010 graduates. The 2013
graduates’ score was
below the benchmark
score set at 5.00.
All Science Lab Courses.
Analysis Procedure:
VCCS Mean score 19.97
(20/35 items) = 57%. The
College’s Benchmark =
Graduates will achieve a
proficiency rate of 80% or
higher.
STAGE Testing: The score
ranges from 1-10 with 1
being low. Benchmark is
set at 5.
Value-Added Graduate
Survey based on a 5 point
scale with 5 being high
and 1 being low.
Faculty Staff Survey
Rating is based on a scale
The Graduate Survey
based on a 5-point scale
with 1 being low showed
value added for the 2013
graduates. Their score
increased from 3.45 upon
entering the college to
4.25 at graduation. The
2013 graduates’ score
(4.25) showed a decrease
from the 2012 graduates’
score of 4.32, but above
the 2010 graduates’ score
of 4.22.
Use Of Evaluation Results (Action Taken)
2012-13 Action:
The college continues to improve on its capstone course. The capstone
team of instructors has developed a scientific reasoning rubric in order
to better identify weaknesses in sub-categories of scientific reasoning
such as identification and collection of empirical evidence,
identification of inductive and deductive arguments experiments, and
various types of cause efficiencies and correlations. Each fall and
spring term, the scientific reasoning rubric will be used in the PHI 115
class assessment and in the development of action plans for course
improvement.
As in previous years, the focus should be on consistent attendance.
Laboratory science, as well as simulation experiments, are quite
engaging, and often require partnership. Students must understand that
unless they are conversant with many academic disciplines, especially
mathematics, they should open themselves to working in groups,
rather than individually. We continue to expose students to activities
that demonstrate thinking and practice in science, as well as have
them work in groups to discuss experiments before and after doing
them.
Perhaps the protocol of science should be directly pointed out, rather
than left to diffuse along with other concepts. One or two additional
lecture modules are dedicated to further explanation of experimental
methods.
On the Faculty and Staff
Survey, the 2012-2013
graduates were rated 3.77
on a 5-point scale vs. 3.79
for 2011-2012 graduates.
from 1-5 with 1 being low
& 5 being high.
Capstone Course based on
80% proficiency. Rubric
rating based on a 0-3 scale
with 3 being the highest.
All Science Lab Courses
based on a comprehensive
test and a benchmark
proficiency of 70%.
The 2013 score was
virtually the same as the
2011-2012 graduates, but
above the 2008-2009
graduates (3.76) and the
2007-08 graduates (3.47).
The 2012-13 score is
above the benchmark
score set at 3.00.
The Capstone Course
based on a 70%
proficiency rubric showed
that the 2013 graduates
were proficient in
scientific reasoning skills
92% of the time vs. 94%
in 2012 vs. 82% in 2011,
71% in 2010 and 58% in
2009. The rubric also
showed the 2013 graduates
had a rating of 1.84 vs.
2.09 in 2011 vs. 2.01 in
2010 (based on a 0-3 scale
with 3 being the highest).
The weakest area of 2013
graduates was in the area
of distinguishing between
causal and correlational
relationships, with 90%
proficiency vs. 88%
proficiency in 2012.
2011-12:
On the SR Test of 2011
graduates, 86% of
graduates were at least
minimally proficient in
scientific reasoning.
PDCCC graduates (82%)
and the (VCCS) as a
whole had the highest
performance on
distinguishing a scientific
argument from a non-scientific argument.
PDCCC performed the
weakest in the ability to
generate an empirically
evidenced and logical
argument (50.8%).
2011-12 Action:
The new scientific reasoning rubric has received minor modification.
This is an ongoing review process.
Students' overall success rate in science courses for the skills below
improved due to increased lab activities, modeling, and critical
thinking discussions:
Generate an empirically evidenced and logical argument.
Distinguish a scientific argument from a non-scientific argument.
Reason by deduction, induction, and analogy.
Distinguish between causal and correlational relationships.
STAGE test showed the
2012 graduates scored
5.79 on a 10-point scale;
the 2011 graduates scored
5.16; the 2010 graduates
scored a 5.14. All years
are above the 5.00
benchmark.
Faculty & Staff Survey
2011 rated graduates 3.79
on a 5-point scale (with
benchmark of 3.00) vs.
3.88 in 2010; 3.76 in 2008;
and 3.47 in 2007.
Graduate Survey showed
an increase in value added
for the 2012 graduates
from 3.70 upon entering
the College to 4.32 at
graduation vs. 3.75 to 4.55
in 2011; 3.60 to 4.22 in
2010; and 3.45 to 4.14 in
2009.
Capstone Course 2012,
graduates were proficient
in scientific reasoning
skills 94% of the time vs.
82% of the time for 2011;
71% of the time for 2010,
vs. 58% in 2009. Rubric
also showed the 2012
rating of 2.42 vs. 2.09 in
2011 vs. 2.01 in 2010 vs.
1.76 in 2009. In 2012, the
weakest area was in
distinguishing between
causal and correlational
relationships. In 2011, the
weakest area was in
reasoning by deduction,
induction, & analogy (72%
proficiency). In 2010, the
weakest area was also in
reasoning by deduction,
induction, & analogy (47%
proficiency).
2010-11:
Graduate Survey showed
an increase in value added
for the 2011 graduates
from 3.75 upon entering
the College to 4.55 at
2010-11 Actions:
Science course faculty have increased the number of lab sessions
offered that involve experimental design, from 3 to 5, covering deduction,
induction, & analogy, & provided more experiences for enrolled
students. Courses have allowed & cultivated greater creativity on the
part of the students by providing minimal, unrestrictive guidelines to
graduation vs. 3.60 to 4.22
in 2010; and 3.45 to 4.14
in 2009.
STAGE test showed the
2011 graduates scored
5.16 on a 10-point scale
vs. the 2010 graduates
score of 5.14. All years are
above the 5.00 benchmark.
Faculty & Staff Survey
2010 rated graduates 3.88
on a 5-point scale (with
benchmark of 3.00) vs.
3.76 in 2008; and 3.47 in
2007.
Capstone Course 2011,
graduates were proficient
in scientific reasoning
skills 82% of the time vs.
71% of the time for 2010,
vs. 58% in 2009. Rubric
also showed the 2011
rating of 2.09 vs. 2.01 in
2010 vs. 1.76 in 2009. In
2011, the weakest area
was in reasoning by
deduction, induction, &
analogy (72%
proficiency). In 2010, the
weakest area was also in
reasoning by deduction,
induction, & analogy (47%
proficiency).
students’ field & indoor learning activities. There has been greater
infusion of a sustainability component in most of the basic science
courses. This has enabled students to recognize the human & global
relevancies of their curricula, & promoted greater citizenry education
and responsibility.
The new scientific reasoning rubric has received minor modification.
This is an ongoing review process.
Capstone course: a Methods of Scientific Inquiry section was included
in the Empirical Evidence & Logical Argument section. The section
introduced students to various scientific methods of inquiry related to
data collection that leads to scientific knowledge. This included more
examples on deduction, induction, and analogy. It familiarized
students on how to effectively select the appropriate form of inquiry,
interpret scientific data, draw inferences from analyzed data & test the
conclusion in order to gain reasonable scientific knowledge.
In all science laboratory courses, exercises are modified to tackle more
problems that address the scientific method, especially in attempting
to distinguish scientific from non-scientific arguments. In addition, the
pre-lab segment for all weekly laboratory activities was enhanced to
incorporate specific logic, empirical, & testing principles. Increased
use of models, charts for demonstrations, & other hands-on strategies
were employed during in-class & laboratory exercises.
All Science Lab Courses
showed an increase in
proficiency. Fall 2010
proficiency was 82%. This
compared to 80%
proficiency in fall 2009 &
78% proficiency in fall
2008.
Student Learning
Outcomes (SLO)
Critical Thinking:
The student will
demonstrate the ability to
discriminate among
degrees of truth or falsity
of inferences drawn from
given data; recognize
unstated assumptions or
presuppositions in given
statements or assertions;
determine whether certain
conclusions necessarily
follow from information;
weigh evidence & decide
if generalizations or
conclusions based on
given data are warranted;
&, distinguish between
arguments that are strong &
relevant & those that are weak or
irrelevant to a particular question at
issue.
Assessment Methods &
Analysis Methods
Assessment Methods :
VCCS Critical Thinking
Core Competency Exam
based on the California
Critical Thinking Skills
Test (CCTST).
STAGE Test.
Graduate Survey.
Faculty Staff Survey.
Capstone Course (PHI
115).
Community College
Survey of Student
Engagement (CCSSE):
general education mean
score based on a 4-point
scale and compared with
other cohorts.
Analysis Procedure:
VCCS Critical Thinking:
Mean scores.
STAGE testing score
ranges from 1-10 with 1
being low. Benchmark is
set at 5.
Value-Added Graduate
Survey based on a 5 point
scale with 5 being high &
1 being low.
TRANSFER PROGRAM SLOs
Results Of Assessment
2012-13:
On the STAGE test which
was developed by Virginia
Community College System
(VCCS) Assessment
Coordinators, 2013
graduates scored 4.87 vs.
5.25 for 2012 graduates vs.
4.70 for 2011 graduates.
The 2013 graduates scored
higher than the 2011
graduates but below the
benchmark set at 5.00.
Use Of Evaluation Results (Action Taken)
2012-13 Actions:
The college continues to improve on its capstone course. The capstone
team of instructors has developed a critical thinking rubric in order to
better identify weaknesses in sub-categories of critical thinking such
as induction, deduction, analysis, inference, and evaluation. Each fall
and spring term, the critical thinking rubric is used in the PHI 115
class assessment and in the development of action plans for course
improvement. The class action plan was completed at the end of term
and was sent to their dean and to the Director of Assessment & IR.
The Graduate Survey
based on a 5-point scale
with 1 being low showed
value-added for the 2013
graduates. Their score
increased from 3.61 upon
entering the college to
4.43 at graduation. The
2013 graduates’ had a
larger value-added gain
(0.66 for 2012 graduates
vs. 0.82 for 2013
graduates. The 2013
graduates’ score 4.43 was
similar to the 2012
graduates’ score of 4.42.
On the Faculty and Staff Survey, the 2012-2013 graduates were rated 3.80 on a 5-point scale vs. 3.86 for 2011-2012 graduates. The 2013 score was slightly below the 2011-2012 graduates but above the 2007-2008 graduates (3.51). The 2012-13 score is above the benchmark score set at 3.00. (The Faculty Staff Survey rating is based on a scale from 1-5 with 1 being low & 5 being high.)
2011-12 Actions:
Students continued to maintain or increase overall proficiency in critical thinking competencies. Courses incorporated more components of critical thinking through discussion, lectures, and readings. Faculty members have focused more attention on this skill across disciplines. ENG 112 courses focused heavily on argumentative writing proficiency.
The Capstone Course
based on a 70%
proficiency rubric showed
that the 2013 graduates
were proficient in critical
thinking skills 87% of the
time vs. 80% in 2012 vs.
84% in 2011 vs. 62% in
2010. The rubric also
showed the 2013 graduates
had a rating of 1.85
compared to 1.98 for 2012
graduates (based on a 0-3
scale with 3 being the
highest). The weakest area for 2013 graduates was inductive & deductive reasoning, with 83% proficiency. This is,
however, an increase from
2012 graduates with 55%
proficiency.
2011-12:
STAGE Test: The 2012
graduates scored 5.25 on a
10-point scale; the 2010
graduates scored 5.15. All
graduates were above the
5.00 benchmark & above
2002 graduates’ score of
5.08.
Faculty & Staff Survey
2011 rated graduates 3.86
on a 5-point scale (with a
benchmark of 3.00) vs.
3.97 in 2010; 3.85 in 2008;
and 3.51 in 2007.
Graduate Survey showed
an increase in value-added
for the 2012 graduates
from 3.76 upon entering
the College to 4.42 at
graduation vs. 4.03 to 4.68
for 2011; 3.84 to 4.44 in
2010; 3.66 to 4.29 in 2009.
Capstone Course: 2012
graduates were proficient
in critical thinking skills
80% of the time vs. 84%
in 2011, 62% in 2010, and
90.2% in 2009. Rubric
showed the 2012 graduates
had a rating of 1.98 vs.
1.98 in 2011, 1.84 in 2010
and 2.70 in 2009. In 2012, the weakest rubric area pertained to inductive and deductive reasoning (55% proficiency). In 2011, the weakest area pertained to inference (60% proficiency). In 2010, the weakest areas were Analysis (60% proficiency), Inference (42% proficiency), & inductive & deductive reasoning (22% proficiency).
Student Learning
Outcomes (SLO)
Assessment Methods &
Analysis Methods
Capstone Course based on
PHI 115. 80% proficiency.
Rubric rating based on a 03 scale with 3 being the
highest.
TRANSFER PROGRAM SLOs
Results Of Assessment
Use Of Evaluation Results (Action Taken)
PDCCC's CCSSE mean
score for critical thinking
was 2.88 in 2005; 2.93 in
2008, and 3.08 for 2012.
The PDCCC's 2012 score
(3.08) was higher than that
of small colleges (2.96)
and the 2012 cohort (2.94).
2010-11:
Faculty & Staff Survey
2010 rated graduates 3.97
on a 5-point scale (with a
benchmark of 3.00) vs.
3.85 in 2008; and 3.51 in
2007.
Graduate Survey showed
an increase in value-added
for the 2011 graduates
from 4.03 upon entering
the College to 4.68 at
graduation vs. 3.84 to 4.44
in 2010; 3.66 to 4.29 in
2009.
2010-11 Actions:
All data within the critical thinking competency module was reanalyzed & assessment outcomes were heavily scrutinized. As a
result:
- Over 10% of assessment material was modified, including exercise questions, quiz questions, & test questions.
- 20% of Critical Thinking Quiz One was enhanced. Presently, it
contains case study arguments that require each student to
accurately employ good critical thinking engagement of analysis,
interpretation, evaluation, inferences, explanation, and cognitive
thought.
- 25% of student related engagement was enhanced.
- Group sessions were included to promote collaborative
transference of ideas & alternative methods of logical reasoning.
- Additional 10% of directive instructions & interpretive examples
were included within the critical thinking competency module.
Capstone Course: 2011 graduates were proficient in critical thinking skills 84% of the time vs. 62% in 2010. The rubric showed the 2011 graduates had a rating of 1.98 vs. 1.84 in 2010. In 2011, the weakest area pertained to inference (60% proficiency). In 2010, the weakest areas were Analysis (60% proficiency), Inference (42% proficiency), & inductive & deductive reasoning (22% proficiency).
In all life science courses, critical thinking, scientific reasoning & quantitative skills were taught as lecture concepts & lab activities. Although the College-wide data for critical thinking scores were rather weak, science students had better than a 75% proficiency in the area of inductive & deductive reasoning.
Student Learning
Outcomes (SLO)
Cultural & Social
Understanding:
A culturally & socially
competent person
possesses an awareness,
understanding, &
appreciation of the
interconnectedness of the
social & cultural
dimensions within &
across local, regional,
state, national, & global
communities. Degree
graduates will demonstrate
the ability to (1) assess the
impact that social
institutions have on
individuals and culture (past, present, & future); (2)
describe their own as well
as others’ personal ethical
systems & values within
social institutions; (3)
recognize the impact that
arts & humanities have
upon individuals &
cultures; (4) recognize the
role of language in social
& cultural contexts; & (5)
recognize the
interdependence of
distinctive world-wide
social, economic, geopolitical, & cultural
systems.
Assessment Methods & Analysis Methods
Assessment Methods:
STAGE Test.
Value-Added Graduate
Survey.
Faculty Staff Survey.
Capstone Course (PHI
115).
Social Science Courses
(US History & Western
Civilization).
Community College
Survey of Student
Engagement (CCSSE):
general education mean
score based on a 4-point
scale and compared with
other cohorts.
TRANSFER PROGRAM SLOs
Results Of Assessment
2012-13:
On the STAGE test, which was developed by the Virginia Community College System (VCCS) Assessment Coordinators, the 2013 graduates' score was 4.49 vs. 4.86 for 2012 graduates & 4.41 for 2011 graduates. The 2013 graduates' score is below the benchmark score set at 5.00 but does show an increase from the 2011 graduates' score of 4.41.
Value-Added Graduate
Survey based on a 5 point
scale with 5 being high
and 1 being low.
The Graduate Survey
based on a 5-point scale
with 1 being low showed
value-added for the 2013
graduates. Their score
increased from 3.61 upon
entering the college to
4.39 at graduation. The
2013 graduates showed a
decrease from the 2012
graduates’ score of 4.44,
but equal to the 2010
graduates’ score. The
weakest area again appears to be recognizing the importance of learning a second language.
Faculty Staff Survey based
on a scale from 1-5 with 1
being low & 5 being high.
Clubs and Student Support Services continue to offer activities to support cultural and social understanding.
Analysis Procedure:
STAGE testing score
ranges from 1-10 with 1
being low. Benchmark is
set at 5.
Use Of Evaluation Results (Action Taken)
2012-13 Actions:
Faculty and students have participated in college based activities and
clubs at an increased rate. Active involvement has included
participation in Women’s Empowerment seminar, Hispanic Heritage
Month speaker, as well as movie night showings that related to
societal issues. Many courses outside of the social sciences also
include information related to diversity and social issues. More course
offerings in foreign languages have been added via SSDL to increase
cultural knowledge.
Faculty in social science courses continued to maintain the added activities provided to support cultural & social understanding.
2011-12 Actions:
Faculty in social science courses continued to maintain added
activities provided to support cultural & social understanding.
Faculty have continued to build on the success of increased use of multi-media sources to expand and enhance student understanding of the impact that social institutions have on individuals and culture (past, present, & future).
Social Science Courses (US History & Western Civilization) based on a comprehensive test and a benchmark set at 70% proficiency.
On the Faculty and Staff
Survey, the 2012-2013
graduates were rated 3.86
on a 5-point scale vs. 3.82
for 2011-2012 graduates.
The 2012-2013 score was slightly higher than the 2011-2012 graduates (3.82) and above the 2010-11 graduates (3.74), the 2008-09 graduates (3.69), and the 2007-08 graduates (3.56). The 2012-13 score is above the benchmark score set at 3.00. The increase may be due to the increase in student activities, a movie series which focuses on different issues, and clubs.
2011-12:
PDCCC's CCSSE mean
score for cultural/social
(understanding people of
other racial and ethnic
backgrounds) was 2.58 in
2012 vs.2.36 in 2008 vs.
2.41 in 2005. The
PDCCC's 2012 score
(2.58) was higher than
that of small colleges
(2.41) and the 2012 cohort
(2.43).
2010-11 Actions:
Faculty organized student group study sessions.
Group review of lectured materials was conducted before each of the
(three) exams. Students took more responsibility for their own
learning, which resulted in a greater social & cultural understanding of
American History.
Social science instructors used multi-media technologies to aid in instructing students.
Current events, recent video documentaries, movies on historical
events have all been utilized to enhance instruction.
Course notes & Internet source references were made available to
students through new online connections & hyperlinks.
Some added activities provided to support cultural & social
understanding were attending cultural movies, plays like the Color
Purple, listening to lectures on a variety of topics, visiting the cultural
arts centers, the general assembly, & museums.
Faculty & Staff Survey
2011 rated graduates 3.82
on a 5-point scale (with a
benchmark of 3.00) vs.
3.74 in 2010; 3.69 in 2008,
and 3.56 in 2007.
The social science instructors used multi-media technologies to aid in
instructing students. Current events, recent video documentaries,
movies on historical events have been utilized to enhance instruction.
Also course notes & Internet source references were made available to
students through new online connections.
Graduate Survey showed
an increase in value-added
for 2012 graduates from
3.75 upon entering the
College to 4.44 at
graduation vs. 4.00 to 4.67
in 2011; 3.81 to 4.39 in
2010; and 3.82 to 4.37 in
2009.
2010-11:
STAGE Test: 2010 graduates scored 5.02, above the benchmark score set at 5.00.
Graduate Survey showed
an increase in value-added
for 2011 graduates from
4.00 upon entering the
College to 4.46 at
graduation vs. 3.81 to 4.39
in 2010; and 3.82 to 4.37
in 2009.
Students' grades in American History (History 121 & History 122) improved in the Fall 2010 semester. The average test range was 80-82 as opposed to a range of 75-78 in previous semesters.
2010-11 Actions:
Current events, recent video documentaries, movies on
historical events have all been utilized to enhance instruction.
Course notes & Internet source references were made
available to students through new online connections &
hyperlinks.
Some added activities provided to support cultural & social
understanding were attending cultural movies, plays like the
Color Purple, listening to lectures on a variety of topics, visiting
the cultural arts centers, the general assembly, & museums.
The social science instructors used multi-media technologies to
aid in instructing students. Current events, recent video
documentaries, movies on historical events have been utilized
to enhance instruction.
Clubs & Student Support
Services continue to offer
activities like constitution
day celebration with
speakers, black history
celebrations, & lecture series to support cultural &
social understanding.
Social Science Courses (US
History & Western
Civilization) met
proficiency. For fall 2010,
proficiency was 83%. This
compared to 88% for Fall
2009 and 80% for Fall
2008.
Student Learning
Outcomes (SLO)
Personal Development:
An individual engaged in
personal development
strives for physical well-being & emotional
maturity. Degree graduates
will demonstrate the
ability to (1) develop &/or
refine personal wellness
goals; & (2) develop &/or
enhance the knowledge,
skills, & understanding to
make informed academic,
social, personal, career, &
interpersonal decisions.
Assessment Methods & Analysis Methods
Assessment Methods:
Wellness Inventory
developed by Notre Dame
University & modified by
Blue Ridge Community
College in Virginia.
STAGE Test.
Value-Added Graduate
Survey.
Faculty Staff Survey.
Community College
Survey of Student
Engagement (CCSSE):
general education mean
score based on a 4-point
scale and compared with
other cohorts.
SDV 100 & 108 (College
Success Courses).
Analysis Procedure:
Wellness Inventory
developed by Notre Dame
University & modified by
Blue Ridge Community
College in Virginia. Scores
are based on a six-point
scale with 1 being low.
STAGE test based on a 10-point scale with a benchmark score set at 5.00.
TRANSFER PROGRAM SLOs
Results Of Assessment
2012-13:
The main evaluation
measure for personal
development is the
Wellness Inventory
developed by Notre Dame
University and modified
by Blue Ridge Community
College in Virginia.
Scores are based on a six-point scale with 1 being
low. The overall personal
wellness scores for 2013
graduates were positive.
The mean score for 2013
graduates was 4.96 vs.
5.03 for 2012 graduates vs.
4.91 for 2011 graduates.
This was slightly lower
than the 2012 graduates,
but higher than the 2011
graduates. The weakest
area for 2007, 2008, 2009,
2010, 2011, 2012, and
2013 appears to be in the
area of physical wellness.
Physical wellness
decreased from 4.55 in
2012 to 4.49 in 2013.
Environmental wellness
has decreased from 5.21 in
2012 to 5.14 in 2013.
Emotional/Psychological wellness decreased from 4.85 in 2012 to 4.74 in 2013.
Use Of Evaluation Results (Action Taken)
2012-13 Actions:
The SDV classes continued to incorporate an emphasis on student
health to include good eating habits, proper rest, and exercise. The
instructors provided activities that encouraged students to track their
eating habits and fitness exercises. To assist students in managing
their time, students were required to keep a journal of their daily
activities by using a daily planner provided by the College. As the
instructors focused on the physical wellness of the students, they also
concentrated on the VA Wizard and financial literacy. The VA Wizard
is a valuable tool that assists students in determining the correct
program of study and financial aid information. Financial literacy
assisted students in understanding the importance of managing their
finances and avoiding credit card debt. The instructors recognize that
the financial wellness of students is as important as their physical
wellness. Overall, College Success Skills courses provide the
educational opportunities that assist students in gaining the skills
necessary to ensure successful learning outcomes that prepare them
for the twenty-first century workforce.
Faculty & Staff Survey,
rated on a 5-point scale
with 1 being low.
Value-Added Graduate Survey based on a 5-point scale with 1 being low.
SDV 100 & 108 (College
Success Courses) based on
a comprehensive test &
benchmark set at 70%
proficiency.
Intellectual wellness decreased from 5.11 in 2012 to 5.00 in 2013.
Occupational wellness
increased from 5.21 in
2012 to 5.24 in 2013. This
increase may have resulted
from a grant which
provided a number of
career coaches to assist
students in career
development. Social
wellness decreased from
5.23 in 2012 to 5.16 in
2013. Overall, all scores
have been above the
benchmark score of 4.00
(frequently).
On the STAGE test based
on a 10-point scale and
developed by the Virginia
Community College
System (VCCS)
Assessment Coordinators,
the 2013 graduates scored
6.33 vs. 7.42 for 2012
graduates vs. 6.48 for 2011
graduates. The 2013
graduates’ score is above
the benchmark score set at
5.00, but below the 2012
graduates’ score.
The Graduate Survey
based on a 5-point scale
with 1 being low showed
value-added for the 2013
graduates. Their scores
increased from 3.72 upon
entering the college to
4.37 at graduation. The
2013 graduates (4.37)
showed a slight decrease
from the 2012 graduates’
score of 4.42.
On the Faculty and Staff
Survey, the 2012-2013
graduates were rated 3.83
on a 5-point scale vs. 3.86
for 2011-2012 graduates.
The 2013 score was slightly below the 2011-2012 graduates, but above the 2008-2009 graduates (3.80) and above the 2007-08 graduates (3.70). The
2012-13 score is above the
benchmark score set at
3.00.
2011-12:
PDCCC's CCSSE mean
score for personal
development
(understanding yourself)
was 2.79 in 2012 vs. 2.80
in 2008 vs. 2.71 in 2005.
The PDCCC's 2012 score
(2.79) was higher than that
of small colleges (2.68)
and the 2012 cohort (2.66).
STAGE Test, The 2012
graduates scored 7.42 on a
10-point scale; the 2011
graduates scored 6.48; the
2011-12 Actions:
SDV classes continued the incorporation of more emphases on
exercising & healthy living tasks.
Student activities team incorporated significantly more activities
related to personal wellness such as games, exercising, and
information sessions.
Additional workshops & seminars were delivered in stress reduction,
time management & study strategies.
In science courses, biological concepts & principles are delivered through Anatomy, Physiology, & Microbiology. Instruction emphasized the relevance of the subject to students' health & well-being.
2010 graduates scored a
6.81. All scores were
above the 5.00 benchmark
score. 2012 graduates
scored higher than 2011.
Faculty & Staff Survey
2011 rated graduates 3.86
on a 5-point scale (with a
benchmark of 3.00)
vs.3.83 in 2010; 3.80 in
2008; and 3.70 in 2007.
2010-11:
Overall personal wellness
2010 graduates mean score
on the Wellness Inventory
was 5.00, an increase from
2009 graduate scores of
4.94 & from 2008
graduate scores of 4.88.
Weakest area for 2007,
2008, 2009, & 2010
appears to be in the area of
Physical wellness
especially the lack of
exercising including
activities that build the
heart, muscles, &
flexibility. Physical
wellness, however, has
increased from 4.44 in
2007 to 4.57 in 2010.
Environmental wellness
has increased from 4.70 in
2007 to 5.15 in 2010.
Emotional/Psychological
wellness has remained
constant from 4.79 in 2007
to 4.80 in 2010.
2010-11 Actions:
Changed SDV textbook to provide more current set of learning
principles for instructors & students.
SDV classes incorporated more emphases on exercising & healthy
living tasks. Specific changes made in the SDV classes to address
weaknesses were the following:
Physical Wellness: Provided information & emphasized the benefits
of maintaining optimal physical health. Allowed students to give
examples of how they incorporate physical movement in their daily
lives.
Emotional/Psychological Wellness: Upon students identifying the many responsibilities that they have, lessons were taught on the importance of setting daily priorities. Students verbally shared that setting priorities reduced some negative stress.
Occupational Wellness: Students identified long-term goals. In
accordance with those goals, additional time was spent in class
confirming that students were enrolled in the appropriate educational
plan. In cases where students were not enrolled in the correct program
of study, they were advised on the better plan & instructed to change
their major in the admissions office.
Social Wellness: To ensure that students were exposed to working with others, the students were engaged in various in-class group activities.
Intellectual wellness has
increased from 5.02 in
2007 to 5.06 in 2010.
Occupational wellness has
fallen from 5.25 in 2007 to
5.16 in 2010. Social
wellness has also fallen
from 5.50 in 2007 to 5.26
in 2010. Overall, however,
all scores have been above
the benchmark score of 4.
STAGE Test: The 2011 graduates scored 6.48 on a 10-point scale vs. 6.81 for the 2010 graduates. All scores were above the 5.00 benchmark score.
In science courses, biological concepts & principles are delivered through Anatomy, Physiology, & Microbiology. Instruction emphasized the relevance of the subject to students' health & well-being.
Several PED classes were added such as Yoga, saltwater fishing,
bowling, & tennis to provide some variety for students.
Student activities team incorporated more activities related to personal
wellness esp. exercising.
Additional workshops & seminars were delivered in stress reduction,
time management & study strategies.
Faculty & Staff Survey
2010 rated graduates 3.83
on a 5-point scale (with a
benchmark of 3.00) vs.
3.80 in 2008; and 3.70 in
2007.
Graduate Survey showed
an increase in value-added
for 2011 graduates from
4.21 upon entering the
College to 4.79 at
graduation vs. 3.86 to 4.42
in 2010; and 3.74 to 4.34
in 2009.
Clubs & Student Activities
continued to offer personal
development activities
each semester. SDV 100 & 108 (College Success Courses) met their benchmark proficiency in personal development. For Fall 2010, proficiency was
86%. This compares to
86% in Fall 2009 & 83%
in Fall 2007.
The College is committed to using the VCCS Standardized Core Competency assessment
data across all transfer programs for overall academic improvement. The previous matrix
presents evidence of Transfer Program SLO assessment and the resulting data used for program
improvements. In 2007, the faculty adopted a capstone course, PHI 115: Practical Reasoning.
Since its inception, data-driven improvements have been made to this course. The first
modification was to eliminate open enrollment in PHI 115. Only graduating students were
enrolled in the course. Additionally, ENG 111: College Composition I was added as a
prerequisite for PHI 115. Handouts, manuscripts, and booklets for each core competency module
were prepared to assist students. Websites were added for each core competency module,
allowing students to visit and acclimate themselves to terms, principles, and formulas associated
with specific subject content. Each student was invited to visit courses related to any module or
subject in which he or she may have experienced difficulty in satisfactorily completing the
requirements. This course was instructed by a diverse team of full-time and adjunct instructors.
A Blackboard course management site was created and modified for use with the PHI 115:
Practical Reasoning course. It organized and categorized data so that students can easily follow
instructions and locate assignments or assessments. Utilizing the VCCS Core Competency
Report and BlackBoard websites, instructors began reinforcing the importance and significance
of not only successfully completing PHI 115 but also successfully completing the core
competency assessments. A PHI 115 team member coordinated and advocated the use of the
College’s Learning Resource Center (Library). Additionally, VCCS tutorials were made
available to faculty and students. The result of these interventions has been improved Core
Competency Test results for the College.
Additional program-specific examples of the College's program assessment and data-driven improvement are provided throughout the remainder of this section. This documentation provides a snapshot view of the distinct learner-centered outcomes and data results used for program improvement and actions taken. This sampling of SLO statements includes the Student Learning Outcome, Assessment and Analysis methods, Results of Evaluations for this outcome, and Use of Evaluation results and actions taken for improvement. SLOs specific to each Occupational-Technical program are presented following the transfer programs discussion.
TRANSFER PROGRAMS
The College offers a group of four degrees under the title AA&S Transfer Programs.
These programs are designed for students planning to transfer to a four-year college or university
to complete a baccalaureate degree. The College's specific Transfer Programs are Science, Business Administration, General Studies, and Education.
A review of Transfer Programs' assessment reports verified the College's assertions of its utilization of assessment data and demonstrated the practice of following assessment with data-driven program improvement based on analysis of results and actions taken. This section of the Monitoring Report presents a sample of transfer program-specific SLOs, data analysis, and actions taken resulting in program improvement. Student proficiency in core competencies is
formatively assessed throughout each program. The faculty members use multiple measures,
such as rubrics, midterm and final examinations, Blackboard discussion questions, and oral and
written assignments across the programs. Presented in the following tables are additional
strategies used to assess and evaluate program learning outcomes and the actions taken to
improve programs.
The next section presents a few program-specific objectives for the Science Program.
Science Program
The Transfer Programs lead to the Associate in Arts and Science (AA&S) degree, which represents the first two years of a four-year college or university degree. The AA&S Science Program is
recommended for students who plan to transfer to a four-year college or university to complete a
bachelor’s degree, usually the Bachelor of Science degree, in the pre-professional or scientific
fields.
Learning Outcomes
The graduates will be able
to communicate about
science using appropriate
oral and written means
70% of the time.
SCIENCE PROGRAM
Assessment & Analysis Methods
Assessment Methods
Primary assessment is performed in BIO 101-102 using classroom learning assessments, Blackboard discussions, & projects.
Students differentiate among degrees of credibility, accuracy, & reliability of conclusions drawn from given data through mandatory classroom & online discussions.
Assess the graduate's identification of presuppositions, parallels, & assumptions through observation, demonstration, & application of scientific instrumentation & laboratory protocols.
Analysis Procedure:
Instructors have used rubrics to match curriculum to core competencies & ensure well-designed instruction. Rubrics have been developed to use with the students. Based on the varied tasks used to assess as mentioned above, instructors review for trends & satisfactory progress in the selected areas of study.
Results Of Analysis
2012-13:
Both the nature of science and its application were grasped fairly well by 90 percent of the students. At a relatively lower rate of proficiency was the understanding of scientific investigation, with 82% of students demonstrating the skill on a consistent basis. As for value added, graduates were 72% proficient at the end of the program compared to when they first entered the program, on a 5-point rubric scale.
2011-12:
For 2011-12, the 70 percent proficiency expectation was exceeded in all respects. Assessment results lower than 90 percent were on account of truancy, when some students failed to attend classes or failed to complete course assessment tasks.
2010-11:
For 2010-11, greater than 70% of the students were able to demonstrate skills in recognizing links between data & conclusions more than 80% of the time.
Greater than 75% of students demonstrated the ability to recognize parallels & assumptions in scientific data & summaries more than 80% of the time.
Eight out of 10 students were able to follow lab protocol & make effective use of equipment & instruments during 9 out of 10 episodes.
Proficiency levels exceeded 80% in other, non-primary measures, namely: 85% for items from laboratory practical testing (correlation tests); 90% proficiency in identifying & using analogies; higher than 90% proficiency for scientific & deductive skills measured from routine quiz & course test items; & 87% proficiency in evidence of scientific & empirical skills, determined from assessment items in laboratory & field experiment reports.
Use Of Evaluation Results (Action Taken)
2012-13 Actions:
Perhaps the protocol of science should be directly pointed out, rather than letting it diffuse along with other concepts. One or two additional lecture modules are dedicated to further explanation of experimental methods.
2011-12 Actions:
We continued to make the effort to see that students complete all their required course work and to make referrals to counselors or tutoring.
2010-11 Actions:
Students working in lab groups stopped the practice of apportioning exercises & lab work among group members on a permanent basis. This is the practice in which an individual group member is assigned one task, e.g., recording, & may or may not collect data for the group throughout the semester. By rotating tasks, members of each group were able to develop the necessary manual dexterity & practical ability to operate lab equipment.
Learning Outcomes
The graduates will be capable of using quantitative information and/or mathematical analysis to obtain sound results and recognize questionable assumptions 70% of the time.
SCIENCE PROGRAM
Assessment & Analysis Methods
Assessment Methods
Assess the use of graphs,
symbols, & numerals on
numerical methods when
analyzing, organizing, &
interpreting data. The
primary measures will be
from BIO 101-102 lab
work using varied tasks.
Analysis Procedure:
Based on the varied tasks used to assess as mentioned above, instructors looked for trends & satisfactory progress in the selected areas of study. Students' ability to explain & interpret data in labs was 75% and 80%, & they were able to follow lab protocols that required analysis. Students' responses to online discussions & projects (99%) far exceeded exam settings (80%). Still, all progress is satisfactory. Instructors have used rubrics to match curriculum to core competencies & ensure well-designed instruction.
Results Of Analysis
2012-13:
Achieving this objective in science courses came relatively easily because students would always perform experiments in groups and thus were able to prop each other up. However, a few students working individually (by their own choice) were not able to demonstrate basic competency in managing data or presenting it. Overall percent proficiency was 88%. As for value added, graduates were 72% proficient at the end of the program compared to when they first entered the program, on a 5-point rubric scale.
Use Of Evaluation Results (Action Taken)
2012-13 Actions:
Laboratory science, as well as simulation experiments, is quite engaging and often requires partnership. Students must understand that unless they are conversant with many academic disciplines, especially mathematics, they should open themselves to working in groups rather than individually.
2011-12:
For Fall 2011, proficiency, expressed as the percentage of students succeeding on specific measures, would have been 100% at rates higher than 70 percent. However, because of late withdrawals and the failure of some students to meet certain required course tasks, the overall proportion dropped to less than 90%; the frequency of success rates has been over 80%. In other words, 90% of the students met the required, expected outcome in over 80% of the tasks involved in the course work.
2011-12 Actions:
As noted below, truancy
leads to failure. We must
ensure that students
enrolled in our courses
have the mental and
material readiness to meet
the rigor and complete the
required work. Thus, more
and better advising and
counseling is emphasized.
Most of the problems that
impair students' success
are from outside the
college. I believe that
family, social and financial
issues play a major role in
students' success or
failures. The college is
looking into those aspects
of students’ lives for a
broader, integrated
approach for students'
success.
2010-11:
For Fall 2010, there was
an improvement of nearly
9 percent in aggregate
performance, & in some
individual assessment of
separate components, the
gain was up to 11 % over
that of the previous year. It
appears that additional
class activities in data
analysis & computer
remediation helped in this
improvement.
2010-11 Actions:
Students were advised to
enroll in basic computer
courses & spreadsheet
application prior to or
along with their basic
science courses.
2009-10:
For Fall 2009: Initially, slightly fewer than 70% of students were able to demonstrate effective use of spreadsheets and to design & develop graphs. This gap was closed by the end of the weeks covering these activities.
More writing exercises are given to students, & students are encouraged to enhance their use of the Learning Resource Center and other writing aids to improve the quality of their writing. We have substantially increased the use of multimedia for illustrations & experimental simulations. We have purchased new equipment worth $70,000 during the last year & a half, which is undoubtedly helping with student success.
Students will be able to
collect, analyze, and
interpret mathematical
formulas, models, tables
and schematics with 70%
proficiency.
Assessment Method:
Assessed the use of
graphs, symbols, and
numerals for analyzing,
organizing, and
interpreting data.
Analysis Procedure:
The primary measures will
be from BIO 101-102 lab
exercises, simulation
experiments, and field
activities.
2012-13:
Success rates in BIO
courses were over 89%,
well above the 70%
expected minimum
proficiency rates. Similar
success rate applies to the
volume of work achieved.
As for value added,
Science graduates were
72% proficient at the end
of the program which
represented a gain of 0.20
points from when they first
entering the program on a
5-point rubric scale.
2011-12:
In fall 2011, 87.5 percent
of students assessed were
able to demonstrate skills
in data interpretation and
presentation by correct
tabulation of experimental
results, as well as using SI
units to assign correct
meanings to values,
namely, micrograms,
microliters, etc. These
students were also able to
respond to questions and
make accurate predictions
based on data which they
themselves collected.
Students had also shown
consistency in positive
performance over 90
percent of the times.
Fewer than 15% of the
students failed to
participate in course
activities or failed to
present enough work for
adequate assessment. Test and quiz assessments support evidence of proficiency collected from laboratory, simulation, and field activities.
2012-13 Actions:
As in previous years, the
focus is on consistent
attendance in order to
master the skills.
2011-12 Actions:
The evidence indicates that
the acquisition of
proficiency is almost
always guaranteed for
those students who
invested the time, and
completed individual and
group course work
activities, as well as
attended classes and labs.
Therefore, the college is
geared towards attendance
and completion of all
required course work - a
fact that has been in the
literature for a long time.
The Science department considered the essence of proficiency in scientific skills to be the ability to communicate ideas and concepts learned, by hands-on demonstration and oral and/or written communications. The courses used for these determinations included general biology, anatomy and physiology, and microbiology. Other courses included meteorology and oceanography. In all cases students were able to perform and set up live experiments, collect data, and respond positively to questions concerning lessons learned in those experiments. Similar learning experiences were achieved in simulated experiments, especially in field-based oceanography where site experiments were difficult to set up. In cases where data was streamed (meteorology and oceanography), students were able to interpret data or extract new subsets of data from the original ones. Students in all the courses used for the present assessment were assigned special projects, which provided the means for both written and oral demonstrations of scientific skills. Students were required to present all aspects of scientific work, whether original or a review of someone else's work. Components of these skills were assessed and recorded. There was also a major additional assessment component drawn from weekly lab exercises, which incorporated most of the elements of scientific skills. Tests and quiz items, although used to a limited extent for assessment of program success, were especially selected to measure specific aspects of scientific skills.
The previous matrices provide supporting documentation that the Science Program does
establish program-specific objectives. The Program also identified assessment methods and
analysis procedures. Finally, the results were used to take program-improving action. The College's Science Program has shown an excellent upward trend in retention and student enrollment. The Science Program retention rate has increased from a level of 60% for fall 2007-spring 2008 to 75% for fall 2012-spring 2013.
SCIENCE PROGRAM RETENTION RATES
Fall 2007-Spring 2008: 60% (45/75)
Fall 2008-Spring 2009: 68% (47/69)
Fall 2009-Spring 2010: 68% (50/73)
Fall 2010-Spring 2011: 70% (57/81)
Fall 2011-Spring 2012: 68% (56/82)
Fall 2012-Spring 2013: 75% (56/75)
The program has shown a marked increase in both FTE’s and student headcount over the
last three years of data points as depicted in the following table. The full-time equivalent students
(FTE) have increased from 38 in fall 2009 to 52 in fall 2011. This is above the SCHEV’s
standard of 17 FTES for Transfer Programs. Student headcount has also increased from 56 in fall
2009 to 77 in fall 2011.
SCIENCE PROGRAM: HEADCOUNT AND FTE
                    FALL 2009   FALL 2010   FALL 2011   FALL 2012
FTE's               38          44          52          48
Student Headcount   56          66          77          72
The program's number of graduates has been relatively constant over the past five years, averaging 6.8 graduates. It did, however, drop in 2013 due to a decline in enrollment.
SCIENCE PROGRAM GRADUATES
Year        2009   2010   2011   2012   2013
Graduates   7      5      8      10     4
The previous matrices present evidence of satisfactory progress within the program.
Instructors are continuing to analyze student knowledge and make adjustments to instruction and
assessments. Annual course assessments continue to provide data for ongoing analysis. The
actions taken by the faculty have resulted in improved Fall to Spring Retention, as shown in the retention table above.
The next section presents a few program-specific objectives for the Business
Administration Program.
Business Administration Program
There is a demand for qualified personnel in business administration to promote
leadership to facilitate economic growth in Virginia business and industry. The College’s AA&S
degree program in Business Administration is designed for students who plan to transfer to a
four-year college or university to complete a baccalaureate degree in business administration,
accounting, management, marketing, economics, or finance. The following matrix presents
evidence that this program is assessing student outcomes and using evaluation results to improve.
Learning Outcomes
Students will demonstrate
accounting principles &
application to various
businesses which cover
the accounting cycle,
income determination, &
financial reporting with
70% proficiency.
BUSINESS ADMINISTRATION PROGRAM
Assessment & Analysis Methods
Assessment Methods
Final exam in ACC 211, Principles of Accounting.
Pre- & post-testing in ACC 211, using an instructor-made rubric.
Analysis Procedure
The following rubric is used by the instructor to analyze student proficiency.
Standards:
1. Write the Accounting Equation.
2. Create an Income Statement.
3. Create a Retained Earnings Statement.
4. Create a Balance Sheet.
Criteria:
1. No attempt.
2. Partially correct response.
3. Correct response.
Results Of Analysis
2012-13:
As for value added, graduates were 87% proficient at the end of the program, which represented a gain of 1.17 points from when they first entered the program on a 5-point rubric scale.
For 2011-12:
Pre/post testing in spring classes for ACC 211 showed an increase from 30% (pre-test) to 90% (post-test). Fall ACC 211 class mid-term results were 99% correct for the accounting cycle.
2010-11:
For all 2010 pre- & post-tests, proficiency was 85%.
2009-10:
For Fall 2009, proficiency was 71% vs. 75% for Fall 2008 & 100% for Fall 2007. For Spring 2009, proficiency was 83% vs. 78% for Spring 2008.
Use Of Evaluation Results (Action Taken)
2012-13 Actions:
No additional action at
this time
2011-12 Actions:
Upgraded to
QuickBooks 2012
software. Required
formal report analyzing
annual summaries of
businesses for
profitability and
solvency.
2010-11 Actions:
Developed an accounting project focusing on student analysis of business annual reports & financial statements. Incorporated discussions of real-life examples using corporate financial statements. Expanded course incorporation of web-based software, including use of the Cengage software & commercial accounting software packages. A one-credit course in Computerized Accounting, ACC 110, was offered for students to take concurrently with the regular accounting course, ACC 211.
Upon completion of
degree program,
graduates will be able to
identify, compare, &
evaluate various
economic principles 80%
of the time.
Assessment Methods
Final exam in ECO 201:
Principles of Economics I,
using an instructor-made rubric.
Analysis Procedure
Instructor analyzed student
responses in several dimensions
through short answer, cumulative
essays & discussions. The
instructor examined the responses
to the above three dimensions by
item analysis as well as
comparison of knowledge level
to application level.
2012-13:
As for value added,
graduates were 87%
proficient at the end of
the program which
represented a gain of
1.20 points from when
they first entered the
program on a 5-point
rubric scale.
2012-13 Actions:
No additional action
needed at this time.
2011-12:
Math instructors pointed
out applications in their
math classes that
specifically related to
economic & business
situations.
2011-12 Actions:
Incorporated deeper
discussions within
lectures & provided
more frequent real-life
examples to supplement
text.
2010-11:
For Fall 2010,
proficiency was 92.5%.
2010-11:
Math instructors will
point out applications in
their math classes
related to economic and
business situations.
The College’s Business Administration AA&S degree program has shown an increase in
both FTE’s and student headcount over a three-year period as depicted in the following table.
This is above the SCHEV’s standard of 17 FTE’s for Transfer Programs.
BUSINESS ADMINISTRATION: HEADCOUNT AND FTE
                    FALL 2009   FALL 2010   FALL 2011   FALL 2012
FTE's               49          47          55          50
Student Headcount   80          76          85          86
The fall-to-spring retention rate for the Business Administration program has increased from 63% in fall 2008-spring 2009 to 70% in fall 2012-spring 2013.
BUSINESS ADMINISTRATION PROGRAM RETENTION RATES
Fall 2008-Spring 2009: 63% (43/76)
Fall 2009-Spring 2010: 67% (53/79)
Fall 2010-Spring 2011: 68% (49/72)
Fall 2011-Spring 2012: 78% (59/76)
Fall 2012-Spring 2013: 70% (60/86)
The program increased its number of graduates from 13 in 2009 to 18 in 2013 which is above
SCHEV’s standard of 12 graduates.
BUSINESS ADMINISTRATION PROGRAM GRADUATES
Year        2009   2010   2011   2012   2013
Graduates   13     11     14     13     18
The previous matrix of SLOs presents evidence of satisfactory progress within the
Business Administration program. Instructors continue to analyze student knowledge and make
adjustments to instruction and assessments. Annual course assessments continue to provide
ongoing data for analysis and new actions taken for improvement.
The next section will present a few program-specific objectives for the General Studies
Program.
General Studies Program
The General Studies Program is specifically designed for students who desire to transfer
to a four-year college or university and who need the flexibility to broaden or narrow as much as
possible their first two years of undergraduate education. There are two specializations under
General Studies: Computer Science and General Studies General. The General Studies
specialization allows students to take twelve hours of courses specific to their needs, whereas the
Computer Science Specialization has content specific to the computer field.
The Computer Science Specialization is specifically designed to provide the student with
preparation necessary to transfer to a university program in Computer Science. Students seeking
immediate employment in the computing field are better served by choosing one of the Information Service Technology (IST) specializations in the management program, such as the General Business Management or Computer Science Specialization. However, those students
who desire to complete a Bachelor’s degree in Computer Science can get the foundation
necessary to successfully transfer. Courses are taught with the Association for Computing
Machinery (ACM) guidelines and parallel university instruction. In addition, students are
encouraged to select mathematics courses based on where they plan to transfer. Students should
understand that most university computer science programs require engineering calculus. The
following matrix demonstrates how the General Studies Program assesses SLOs and uses the
results for program improvement.
GENERAL STUDIES: COMPUTER SCIENCE SPECIALIZATION PROGRAM
Learning Outcomes
Students will demonstrate an understanding of computing math concepts at the 80% level.
Assessment & Analysis Methods
Assessment Methods
Students have several projects & exam sections on the math content. The work is evaluated on the projects for students having 70 or above. The test questions in the area are the same form but different content for pre- & post-testing.
Analysis Procedure:
Students take the math content evaluation several times in the term, with access to video instruction to help on the topics. Students can repeat the process until they master the content. Students are encouraged to repeat content, & the exit skill level is what is measured.
Results Of Analysis
2012-13:
As for value added, graduates were 77% proficient at the end of the program, which represented a gain of 0.90 points from when they first entered the program on a 5-point rubric scale.
At the end of the CSC 205 assessment cycle, student proficiency on the math skills was 88% for Spring 2013. In CSC 200, these same students start with virtually no understanding of the computer math topic.
2011-12:
For Spring 2011, student performance on the math skills in the CSC 205 course was rated at the 76% level.
For Spring 2012, preliminary CSC 205 results showed students were performing at the 77% level. The final measure collected in late April of 2012 showed a completion performance at the 74% level.
Use Of Evaluation Results (Action Taken)
2012-13 Actions:
Videos developed for the CSC 200 course have shown value in improving student performance.
2011-12 Actions:
The content has been increased in CSC 200 with new videos developed and more online help being available. A computer architecture practice tool was planned and implemented in the Fall of 2012. Finally, the CSC 200 course received a grant for re-engineering to focus on those math skills and on video and online delivery of content. The revision is implemented in the Fall of 2012.
2009-10:
For fall 2009, proficiency
was 73% vs. 71% for Fall
2007 based on a
comprehensive exam.
2009-10 Actions:
Program objectives were
rewritten to be more
reflective of current
Computer Science topics.
In the Spring of 2010, student proficiency in the overall course was similar (67%), but the math-only assessment indicated the desired proficiency (80.9%). New instructional tools were developed & introduced, & the rates increased to 75% for the course & 90% for the math in Fall 2010.
Instructional video content was developed on the math content, which helped both math-specific & overall course outcomes. Because of its foundational importance to the study of computer science, the math was made a separate objective in 2009. A stronger link was established with the program exit point (CSC 205), using a pre- & post-test scheme over the two-course sequence. Major content in cluster computers was added to improve course depth.
Students will demonstrate
an understanding of
hierarchical structure of
computer architecture at
the 80% level.
Assessment Methods
Overall exam performance
in CSC 200 & 205,
Computer Organization
using an instructor made
rubric.
Analysis Procedure
Student composite exam
scores are evaluated over
the two-course sequence.
The scores of completers
for CSC 200 are the only
scores used for analysis
purposes. The exams are
on the same content, but
the CSC 205 exams are
much more detailed.
2012-13:
As for value added,
graduates were 77%
proficient at the end of the
program which
represented a gain of 0.79
points from when they first
entering the program on a
5-point rubric scale.
At the end of the CSC 205
assessment cycle, the
overal proficiency in the
content was 67% percent.
While this is lower than
the target, when average
with the computer math
skill the target is nearly
accomplished.
Improvement is still
needed in the architecture
area.
2012-13 Actions:
Additional content was
added to the start of CSC
205 with added emphasis
on the computer
architecture area.
2011-12:
With the Spring 2011 and
2012 groups, results are
moving toward the 80%
objective for CSC 205
completers.
2011-12 Actions:
The content has been
increased in CSC 200 with
new videos developed and
more online help being
available. A computer
architecture practice tool is
planned and will be
implemented in the Fall of
2012. Finally, the CSC
200 course received a
grant for re-engineering to
focus on those math skills
and video and online
delivery of content. The
revision will be
implemented in the Fall of
2012.
2010-11:
For Spring 2011, the
student understanding of
computing architecture
was at the 61% level. The
Spring 2012 assessment
completed in late April of
2012 showed a level of
64%, but the
understanding of machine
instruction meaning had
increased to 74%, which is
approaching the target.
2010-11 Actions:
A multi-test evaluation
was completed in both
CSC 200 & 205 to
compare outcomes.
CSC 205 content was
revised to discuss & then
build parts of a cluster
computer to help
understand the way
systems are built.
Similar elements from
CSC 200 for course
completers had been at
75% (exams) & 90%
(assessments) for the Fall
of 2010. Since the CSC
205 is at a higher level of
content, the outcomes had
improved.
2009-10:
For Fall 2009, proficiency
was 73% vs. 100% for Fall
2008 & 71% for Fall 2007
based on comprehensive
exam.
In Spring of 2010, the final exam was replaced by exam composites & separately evaluated assessment elements. Student
performance was up to
82% in the course & 89%
on the assessed elements.
2009-10 Actions:
A detailed rubric was
developed in 2010.
Enhanced content on
transistors & the link to
Central Processing Units
(CPU’s) was added to
course syllabus & content
to improve course depth
The program's student headcount has remained relatively constant over the last three years of data points. The FTE's are well above SCHEV's standard of 17 FTES for the Transfer Program.
GENERAL STUDIES PROGRAM: HEADCOUNT AND FTE
                    FALL 2009   FALL 2010   FALL 2011   FALL 2012
FTE's               291         273         264         223
Student Headcount   197         188         183         150
The fall to spring retention rate for the General Studies program has shown an increase
from 60.0% for fall 2008- spring 2009 to 66% for fall 2012-spring 2013.
GENERAL STUDIES RETENTION RATES
Fall 2008-Spring 2009: 60.0% (177/295)
Fall 2009-Spring 2010: 63.0% (170/270)
Fall 2010-Spring 2011: 65.7% (180/274)
Fall 2011-Spring 2012: 73.6% (209/284)
Fall 2012-Spring 2013: 66% (153/232)
The program over the past 5 years has exceeded SCHEV's standard of 12 graduates.
GENERAL STUDIES PROGRAM GRADUATES
Year        2009   2010   2011   2012   2013
Graduates   40     32     30     32     42
The previous matrices present evidence that students in General Studies Computer
Science Specialization improved when the math content was emphasized and additional study
materials were developed and made available. The fact that these were video based gives
students a better opportunity to master the skill. The results show that students improve along
the sequence as they go from CSC 200 to CSC 205. In CSC 205, the increase in the detailed
content in the architecture area has improved student performance because of the additional
depth of coverage. Additionally, the General Studies Program retention rate has increased from
a level of 32% for Fall 2007 to Fall 2008 to 40% for Fall 2009 to Fall 2010.
The next section presents a few program-specific objectives for the Education Program.
Education Program
This program of study is recommended for students who plan to transfer to a four-year
college or university to receive a bachelor’s degree and meet the state teacher certification
requirements for Early Childhood (PK-3), Elementary (PK-6), Middle School (6-8), or selected
areas of Special Education. The program conforms to the recommendations of the VCCS
Teacher Education Task Force that were adopted by the State Board for Community Colleges in
February 2003. Four-year institutions that accept this completed program for transfer include
George Mason, James Madison, Longwood, Norfolk State, Old Dominion, Radford, Virginia
Commonwealth, William and Mary, Mary Baldwin, and Virginia Union. The following matrix
demonstrates an Education Program specific SLO and how the evaluation (assessment, analysis,
and use of data-driven evaluation results and actions taken) is accomplished.
Learning Outcomes
Upon completion of
degree program, 80% of
graduates will be aware of
current issues & trends in
the K-12 education field,
& can relate principles,
theories, & the history of
education in the United
States to actual practice in
the classroom.
EDUCATION PROGRAM
Assessment & Analysis Methods
Assessment Methods
Primary: final exam in EDU 200, Introduction to Teaching as a Profession.
Instructor-made rubric.
Field component completion.
Analysis Procedure:
Based on the students' responses to the cumulative exam, the instructor will compare knowledge to the benchmarks determined. Additionally, satisfactory completion of the field component meets this learning outcome.
Results Of Analysis
2012-13:
As for value added, graduates were 87% proficient at the end of the program, which represented a gain of 1.11 points from when they first entered the program on a 5-point rubric scale.
Use Of Evaluation Results (Action Taken)
2012-13 Actions:
No additional action at this
time.
2011-12:
For Fall 2011, proficiency
was 100% vs. 91% for fall
2010 based on a
comprehensive test.
2011-12 Actions:
The instructor provided
web based resources, guest
speakers and numerous
print materials that the
students analyzed and
evaluated. Continue using
current trends and field
experiences in the
classroom to better prepare
them to teach.
2010-11:
For Fall 2010, proficiency was 91% based on a comprehensive test. This compares to Fall 2009's proficiency of 83% vs. 65% for Fall 2008.
2010-11 Actions:
Teaching methods were adjusted to accommodate the various student learning styles.
Instructors researched current trends related to course content & in their class lectures related the course content to real-world experiences.
Students are required to
increase the time they
spend observing teachers
in the classroom by one
hour per week. This allows
the students to gain a more
in depth understanding of
the profession.
Improvements made
included the addition of
topics about teaching the
social network generation.
New initiatives included
incorporating the use of
SMART Boards to
enhance learning &
provide students with the
experience of using
technology before they
enter the classroom as
teachers.
The previous matrix presents evidence that education majors are satisfactorily measured on the
planned learning outcomes. The Education Program retention rate has increased from a level of
69% for fall 2007-spring 2008 to 74% for fall 2011-spring 2012.
EDUCATION PROGRAM RETENTION RATES
Fall 2007-Spring 2008: 69% (42/61)
Fall 2008-Spring 2009: 71% (37/52)
Fall 2009-Spring 2010: 75% (38/51)
Fall 2010-Spring 2011: 71% (39/55)
Fall 2011-Spring 2012: 74% (41/55)
The program's FTE's have remained relatively constant over the last three years of data points and surpass SCHEV's standard of 17 FTES for the Transfer program.
EDUCATION PROGRAM: HEADCOUNT AND FTE
                    FALL 2009   FALL 2010   FALL 2011   FALL 2012
FTE's               37          37          40          36
Student Headcount   52          53          61          56
The program has increased its graduates from 6 in 2009 to 8 in 2013.
EDUCATION PROGRAM GRADUATES
Year        2009   2010   2011   2012   2013
Graduates   6      7      10     9      8
The instructors continue to improve student knowledge in educational history by using
current trends and field experiences to better prepare them for transition to a teacher education
program in a 4-year institution. This is supported by the following IPEDS Graduation and
Transfer Rates for the College.
IPEDS GRADUATION RATES
2011 (Fall 2008 cohort): 148 students, 40 graduates, 35 transfers; graduation rate 27.0%, transfer rate 23.6%
2010 (Fall 2007 cohort): 171 students, 26 graduates, 34 transfers; graduation rate 15.2%, transfer rate 19.9%
2009 (Fall 2006 cohort): 136 students, 29 graduates, 25 transfers; graduation rate 21.3%, transfer rate 18.4%
2008 (Fall 2005 cohort): 119 students, 23 graduates, 11 transfers; graduation rate 19.3%, transfer rate 9.2%
2007 (Fall 2004 cohort): 121 students, 21 graduates, 13 transfers; graduation rate 17.4%, transfer rate 10.7%
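As the table implies, each rate is the number of graduates (or transfers) divided by the cohort's student count; for example, for the Fall 2008 cohort, 40/148 is approximately 27.0% graduating and 35/148 is approximately 23.6% transferring.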
The previous narratives present evidence that both the Transfer Program SLOs and
program specific SLOs are being assessed and the results are used for program improvements.
Selected SLOs specific to each Occupational-Technical Program are presented in the next
section.
OCCUPATIONAL/TECHNICAL PROGRAMS
The College’s Occupational and Technical Programs are comprised of six degrees,
including Nursing, Management, Administrative Support Technology, Early Childhood
Development, Industrial Technology, and Administration of Justice. This section presents a
sample of student learning objectives and actions for improvement from each program. Student
proficiency in the majority of the learning outcomes is formatively assessed throughout each
program by faculty members using multiple measures, such as rubrics, midterm and final
examinations, Blackboard discussion questions, and oral and written assignments across the
programs. Additional program strategies used to assess and evaluate learning outcomes can be
found in the matrices below. The first section will present a few program-specific objectives for
the Nursing Program.
Nursing Program
The Nursing Program, which began at the College in Fall 2004, prepares selected
students to qualify as practitioners of technical nursing in a variety of healthcare settings. After
successful completion of the program, students are eligible to sit for the National Council
Licensure Examination for Registered Nurses (NCLEX-RN) for the Commonwealth of Virginia.
The statewide results of this examination (pass/non-pass for first time takers only) are sent to the
College’s program head quarterly. The program provides a background for maximum transfer
opportunities to four-year colleges and universities; therefore, its graduates must also meet the
College’s Transfer Programs SLOs.
The nursing faculty and the program advisory committee analyze the results of the VCCS
Core Competencies and STAGE test results for Information Literacy as they relate to the VCCS,
the College, and the Nursing Program. The STAGE test has a benchmark of 5. The College has
scored above this benchmark since 1999. The following tables demonstrate the data comparisons
(2003-2012) for the VCCS Information Literacy Core Competency and the STAGE Information Literacy (1999-2012)
Test.
THE GENERAL EDUCATION AND CORE COMPETENCIES REPORT
Information Literacy Test Comparisons
2003 Graduates: PDCCC - mean 33.90; meets or exceeds standard 38.0%; advanced proficient 18.0%; proficient 20.0%. VCCS - mean 36.40; meets or exceeds standard 53.2%; advanced proficient 28.7%; proficient 24.4%.
2004 Graduates: PDCCC - mean 35.46; meets or exceeds standard 46.2%; advanced proficient 1.0%; proficient 45.2%. VCCS - meets or exceeds standard 46.1%; advanced proficient 1.0%; proficient 45.1%.
2008 Graduates (N=112): PDCCC - mean 37.59; meets or exceeds standard 63.4%; advanced proficient 30.4%; proficient 33.0%. Capstone - mean 38.99; meets or exceeds standard 66.7%; advanced proficient 33.3%; proficient 33.3%. Nursing - mean 41.32; meets or exceeds standard 83.3%; advanced proficient 50.0%; proficient 33.3%.
2012 Graduates (N=126): PDCCC - mean 64.4; meets or exceeds standard 52.8%; advanced proficient 0.8%; proficient 52.0%. VCCS - mean 65.2; meets or exceeds standard 58.0%.

PDCCC STAGE TESTING REPORT
Information Literacy Test Comparisons (PDCCC score; benchmark 5)
1999: 7.26
2002: 6.68
2003: 6.80
2010: 7.02
2011: 6.17
2012: 7.29
The analysis of results for the Information Literacy Core Competency scores
demonstrated that nursing student scores are higher than both the VCCS and the College’s
general student scores. Analysis of the STAGE test scores indicates that the College as a whole
scores higher than the established benchmark. In the case of Nursing, this can be attributed to the
integration of computer applications throughout the nursing curriculum. This integration is
important as healthcare institutions increase their use of technology throughout their systems. In
addition, a curriculum requirement is successful completion of ITE 115: Introduction to
Computer Applications and Concepts. The faculty and advisory board have recommended no
information literacy changes at this time and continue analysis of results as they are received.
The Nursing Program assesses its courses each semester and analyzes their overall impact
on the program each May. Further, the Nursing Program has ten student-focused program
outcomes, which are mandated and approved by the Virginia Board of Nursing. These are in
addition to the VCCS Core Competency requirements. The following matrix demonstrates a few
of the Nursing program-specific SLOs and how the evaluation (assessment, analysis, and use of
data driven evaluation results and actions taken) is accomplished.
NURSING PROGRAM

Student Learning Outcomes:
By the completion of the program the student will: use the nursing process & critical thinking to meet multiple health needs for clients across the lifespan in a variety of healthcare settings.

Assessment & Analysis Methods:
Assessment Methods: VCCS Critical Thinking Core Competency. CCSSE Critical Thinking Score. PDCCC STAGE Testing. Kaplan Admission & Exit Critical Thinking test. Completion of Kaplan remediation. Kaplan NCLEX-RN Predictor Exam. Faculty/Staff Survey - perception of critical thinking skill of graduates.
Analysis Procedure: VCCS Critical Thinking Core Competency - nursing students will score above the PDCCC benchmark score. CCSSE Critical Thinking Score. PDCCC STAGE Testing. The Faculty/Staff Survey was developed by IA&R, faculty, & staff and is based on a 5-point Likert scale with 1 being low; the benchmark is set at 3. The value-added Graduate Survey was developed by IA&R, faculty, & staff and is based on a 5-point scale with 1 being low; the benchmark is set at 3. Critical thinking rating - skill level based on a scale from 1-5 with 1 being low and 5 being high; the benchmark is 3. Kaplan Critical Thinking Exams by ATI Testing services - nursing students will score above the Kaplan norm. Kaplan NCLEX-RN Predictor Exam by ATI Testing services - nursing students will score above the passage score for individuals, groups, and the national score for ATI and Kaplan exams.

Results of Analysis:
VCCS Critical Thinking Core Competency (mean score totals for all 5 content areas assessed): Sp 2006 - PDCCC 12.87, VCCS 14.83; Sp 2007 - PDCCC 13.86, VCCS 15.39.
CCSSE Critical Thinking Score (Q12), 2005/2008/2012: PDCCC 2.88, 2.93, 3.08; VCCS 2.76, 2.85, 2.96; Cohort 2.81, 2.78, 2.94.
PDCCC STAGE Testing (benchmark 5): 2002: 5.08; 2003: 5.39; 2010: 5.15; 2012: 5.25.
Faculty/Staff Survey - perception of critical thinking skill of graduates: 2008: 3.51; 2009: 3.85; 2011: 3.97.
Kaplan Admission & Exit Testing indicated that the students' critical thinking ability has increased from the time they are admitted to when they exit the program.
Kaplan Remediation: 2006-2010: 100% of students requiring ATI remediation successfully completed it. 2009-2010, 2010-2011, 2011-2012, and 2012-13: 100% of students requiring Kaplan remediation successfully completed it.
ATI NCLEX-RN Predictor Exam (administered in spring to the NUR 202 graduating class), mean %: Class of 2009 - PDCCC 72.5%, national 72.2%; Class of 2010 - PDCCC 71.2%, national 71.0%.
Kaplan NCLEX-RN Predictor Exam, Generic ADN (mean %): 2011 - PDCCC 70.6, national 68; 2012 - PDCCC 70.4, national 68.1; 2013 - PDCCC 68.2, national 66.
Kaplan NCLEX-RN Predictor Exam, LPN-RN Bridge ADN (mean %): 2011 - PDCCC 65.7, national 68; 2012 - PDCCC 70.4, national 68.1; 2013 - PDCCC 66, national 66.
Group scores are slightly above the national level, which should indicate students are ready to assume this role. Faculty will continue to analyze test results & identify strategies to improve the success rate.

Use of Evaluation Results (Action Taken):
2012-13 Actions: In NUR 202, set up a Mock NCLEX day (NCLEX simulation): check IDs as students enter the room, fingerprint them, and conduct security checks if they need to leave; they can bring nothing with them. Continue current critical thinking implementation in the classroom, while including more active participation & discussions. Continually revise/update content to remain current in practice. Continue to evaluate test item analysis and improve as needed.
2011-12 Actions: Nursing faculty completed a revised form to document students' successful completion of Kaplan remediation for all courses. During Summer 2012, curriculum content mapping resulted in movement of some content to enhance the simple-to-complex methodology used in the program. Beginning May 2012, the faculty voted unanimously to require Focus Review tests, counting as a quiz grade, and to require Integrated Tests with remediation as assigned in each course.
Earlier actions: Fall 2006 - used the ATI Integrated Testing System for content-specific tests throughout the program & an NCLEX-RN Predictor before graduation. The Classes of 2006 & 2007 were encouraged to take the Hurst NCLEX review course, arranged by Nursing and hosted on campus for their convenience; fewer than half took the course.
2008-09 Actions: From the Class of 2008 forward, all students were encouraged to take a review course prior to NCLEX testing. The Fall 2009 admission class began Kaplan Integrated Testing across the curriculum, which includes an NCLEX Review Course at completion of the program (the Class of 2011 will be the first to complete this testing requirement).
2009-2010 Actions: Increased pharmacology content on all unit & final exams to 30%. Increased simulation methods & content across the curriculum; increased simulation labs & content 20%-30% across the curriculum; increased code simulation across the lifespan. Designated a specific professor to coordinate the simulation lab experience.
2010-11 Actions: Fall 2010 - adopted a new Fundamentals of Nursing text with an increased focus on critical thinking incorporated within the text. In addition, students now receive a DVD with all the skills needed for the curriculum, which they can view 24/7 & not just when on campus. Adopted a new medical-surgical nursing text that incorporates critical thinking & is easier for students to comprehend.
Student Learning Outcomes:
By the completion of the program the student will: synthesize & communicate relevant data in a comprehensive & concise manner, verbally & in writing & through information technology.

Assessment & Analysis Methods:
Assessment Methods (Data Collection): VCCS Written Communication Core Competency. CCSSE Writing Scores (Q12). VCCS Oral Communication Core Competency. CCSSE Oral Communication Scores (Q12). Faculty/Staff Survey (Oral Skills). VCCS Information Literacy Core Competency. CCSSE Information Literacy Scores (Q12). PDCCC STAGE Computer Literacy Scores.
Analysis Procedure: VCCS Writing Competency - mean scores; nursing students will score at or above the PDCCC benchmark score; scoring: meets or exceeds standards (>37), proficient (37-41.9), advanced proficient (>42). CCSSE Writing Scores (Q12) - mean score based on a 4-point scale with 1 = very little and 4 = very much. VCCS Computer Literacy Core Competency - PDCCC's benchmark: graduates will achieve a success rate of 80% or higher on proficiency, with criteria: meets or exceeds standards (37 or higher), advanced proficient (42 or higher), proficient (37-41.9); nursing students will score at or above PDCCC scores. CCSSE Information Literacy Scores (Q12) - mean score based on a 4-point scale with 1 = very little and 4 = very much. VCCS Information Literacy Competency - PDCCC's benchmark: graduates will achieve a success rate of 80% or higher on proficiency, with criteria: meets or exceeds standards (37 or higher), advanced proficient (42 or higher), proficient (37-41.9); nursing students will score at or above PDCCC scores. Value-Added Graduate Survey - based on a 5-point scale with 5 being high & 1 being low, with the benchmark set at 3.00. Writing competency - students will score a mean of 3 on a 5-point rubric (nursing faculty developed). Oral competency - students will score a mean of 3 on a 5-point rubric (nursing faculty developed).

Results of Analysis:
VCCS Writing Core Competency (mean score), Sp 2008: PDCCC 5.07; VCCS 5.94. The Nursing score of 4.99 is above the mean of 3.5; writing across the curriculum is already incorporated in the program.
PDCCC STAGE Testing (benchmark 5): 2002: 6.75; 2003: 7.16; 2010: 6.89; 2011: 6.25; 2012: 6.55.
Faculty & Staff Survey - perception of writing skill of graduates: 2008: 3.64; 2009: 3.89; 2011: 3.94; 2012: 3.92; 2013: 3.91.
CCSSE Writing Scores (Q12), 2005/2008/2010: PDCCC 2.90, 2.91, 3.00; VCCS 2.60, 2.67, 2.78; Cohort 2.64, 2.69, 2.77.
VCCS Oral Communication Core Competency (mean score): Sp 2006 - PDCCC 55.25, VCCS 60.87 (PDCCC scores 87% of benchmark); Sp 2007 - PDCCC 56.10, VCCS 60.91 (PDCCC scores 90% of benchmark).
CCSSE Oral Communication Score (Q12), 2005/2008/2012: PDCCC 2.79, 2.81, 2.94; VCCS 2.47, 2.57, 2.70; Cohort 2.55, 2.60, 2.68. PDCCC scored above the benchmark for each year of the available data.
Faculty and Staff Survey - perception of oral skills of graduates: 2008: 3.73; 2009: 4.02; 2011: 3.97; 2012: 3.87; 2013: 3.97.
CCSSE Information Literacy Score (Q12), 2005/2008/2012: PDCCC 3.07, 2.85, 3.05; VCCS 2.68, 2.78, 2.85; Cohort 2.66, 2.71, 2.79.
PDCCC STAGE Information Literacy Scores (benchmark 5): 2002: 6.68; 2003: 6.80; 2010: 7.02; 2012: 7.29.
Faculty Survey - Information Literacy: 2007-08: 3.73; 2008-09: 3.93; 2010-11: 4.06; 2012-13: 3.91 (ratings based on a 1-5 scale with 1 being low and 5 being high).
VCCS Computer Literacy Core Competency (mean scores), Sp 2008: PDCCC 37.59; Nursing 41.32. Nursing Program students score above general PDCCC students & the VCCS; no action is required at this time. Continue with requirements per the curriculum plan.

Use of Evaluation Results (Action Taken):
Changed grading criteria for oral assignments to rubrics for more consistent grading across instructors; criteria: score of 3.0 or higher on a 5-point scale; in use Spring 2011.
2004: Writing across the curriculum was incorporated throughout the nursing program. Communication skills are taught & evaluated in each nursing course throughout the program. Weekly clinical evaluations have an objective for therapeutic communication. Every course has communication-specific content & lab. Patient teaching is required & evaluated each week in clinical. Teaching plans & implementation are required clinical components of N180 (new families) & N246 (preschoolers). Students are required to use APA format for all writing assignments (required textbook) & are evaluated on its use as a component of the grading criteria.
2004: Speech & computer literacy courses were implemented as Nursing Program curriculum requirements. Nursing students did oral presentations & written assignments in every nursing class, as well as service learning experiences, patient teaching, & communication with other healthcare professionals & paraprofessionals. Written drug cards are now required in all clinical courses.
2004 (cont.): Therapeutic communication questions are now included on all exams. Students perform community health teaching to groups & individuals as part of their Service Learning requirement for N211/212. All courses now require library research. The NCLEX exam was computerized; therefore, computerized testing is used in all nursing courses except NUR 136 (dosage calculation).
2007: Mandatory new & returning student orientation included a test-taking skills session.
2010-11 Actions: Fall 2010 - changed from "grading criteria" to "rubrics" & blind review for all written assignments for more consistent grading across instructors. Rubrics were developed & used in all Fall 2010 nursing courses. Criteria: score of 3.0 or higher on a 5-point scale. Continuing use in Spring 2011.
Student Learning Outcomes:
By the completion of the program the student will: assume the role of the associate degree nurse as care provider, advocate, teacher, and manager ("assume" meaning "to take on" the role).

Assessment & Analysis Methods:
Assessment Methods: Preceptor Survey and Preceptorship Evaluation. Student Survey. Nurses Legislative Day at the General Assembly. CCSSE (Q12) Cultural/Social: Developing a Personal Code of Values and Ethics. CCSSE (Q12) Cultural/Social: Working effectively with others and learning on your own. Faculty/Staff Survey: Cultural & Social Understanding of graduates.
Analysis Procedure: Preceptorship evaluations (nursing faculty developed) are completed by preceptors at the end of the spring semester & rated on a rubric scale of 5 (Excellent) to 1 (Poor); a mean score above 3 is needed to pass. CCSSE (Q12) Cultural/Social: Developing a Personal Code of Values and Ethics - mean score based on a 4-point scale with 1 = very little and 4 = very much. CCSSE (Q12) Cultural/Social: Working effectively with others - mean score based on a 4-point scale with 1 = very little and 4 = very much. PDCCC STAGE scores for cultural/social understanding. Faculty/Staff Survey: Cultural & Social Understanding of graduates. Student Survey based on a 5-point scale from 1-5 with 1 being low. Graduate Survey based on a 5-point scale from 1-5 with 1 being low.

Results of Analysis:
Preceptorship Evaluation Forms for May 2012 graduates were analyzed; all 32 students passed preceptorship.
Preceptorship proficiency (% met / % not met): 2006: 95.5 / 4.5; 2007: 100 / 0; 2008: 100 / 0; 2009: 100 / 0; 2010: 100 / 0; 2011: 100 / 0; 2012: 100 / 0; 2013: 100 / 0.
100% of students requiring preceptorship remediation (additional clinical assignments) successfully completed it. Results indicate students are ready to assume this role at graduation.
Generic ADN NCLEX-RN results (PDCCC): 2008: 81.8; 2009: 77.42; 2010: 73.53; 2011: 91.6; 2012: 88; 2013: pending. The Board of Nursing expects a program pass rate of 80 or more.
LPN Bridge ADN NCLEX-RN results (PDCCC): 2010: 100; 2011: 81.8; 2012: 66.7; 2013: pending. The Board of Nursing expects a program pass rate of 80 or more.
Faculty/Staff Survey - Cultural & Social Understanding of graduates: 2007: 3.56; 2008: 3.69; 2010: 3.74.
CCSSE Cultural/Social: Values & Ethics, 2005/2008/2012: PDCCC 2.48, 2.38, 2.51; VCCS 2.20, 2.31, 2.47; Cohort 2.28, 2.34, 2.43.
CCSSE Personal Development: Own Learning, 2005/2008/2012: PDCCC 3.05, 3.06, 3.13; VCCS 2.82, 2.89, 2.97; Cohort 2.86, 2.89, 2.95.
PDCCC STAGE Scores - Cultural & Social Understanding: 2002: 5.07; 2003: 5.14; 2010: 5.02; 2011: 4.86.

Use of Evaluation Results (Action Taken):
Fall 2004: Implemented a policy that students who fail one nursing course can return when the course next meets. The student who failed preceptorship in 2006 was provided an opportunity to retake the course and passed preceptorship the second time.
2009-10 Actions: A variety of settings are used for clinical experiences, for example pediatric experiences. Students attended asthma & bereavement camps during Summer 2008 as pre-clinical experiences for their NUR 246 Pediatric Nursing class in Spring 2009. In Spring 2010, CHKD became a clinical site for NUR 246.
Each spring, students attend the Annual Virginia Nurses Day at the General Assembly, focus on health care issues before the Assembly, and meet with legislators to discuss health care issues currently before the Assembly.
Students are assigned to alternative clinical experiences in various departments of the health care facilities, where they are expected to observe the various roles of nurses and other health care providers.
Fall 2008: NUR 211/212 students are assigned to the role of charge nurse.
2011-2012: Effective May 2013 in NUR 115, increase the number of class and lab hours for Maternity Nursing review, and add simulations of intrapartal and newborn nursing care. Maryview Psych will replace the community experiences and the Behavioral Medicine unit experience at Sentara Obici; students will continue to have a med/surg experience at Sentara Obici with a psychosocial emphasis. NUR 212: Two smaller lecture units were combined into one unit, which provided extra class time.
The previous matrices present evidence that SLOs from the Nursing Program are
assessed and analyzed; and data-driven results are used to make program and/or curricular
changes. This sample indicates that the assessment of learning in nursing is a process of
continuous improvement that has occurred over a number of years. Complementary evidence of
the success of the Nursing Program and its students can be found in VCCS Core Competency
Examination Scores, such as Information Literacy.
The program's FTEs have remained relatively constant over the last three years of data and
surpass SCHEV's standard of 7 FTEs for Nursing.
NURSING PROGRAM: HEADCOUNT AND FTE
Fall 2009: FTEs 52; Student Headcount 77
Fall 2010: FTEs 46; Student Headcount 66
Fall 2011: FTEs 45; Student Headcount 71
Fall 2012: FTEs 38; Student Headcount 55
Fall 2013: FTEs 38; Student Headcount 55
Further, the Retention Rate for the program has increased over time as the program has
evolved. The program retention rate is depicted in the following table.
NURSING PROGRAM RETENTION RATES
Fall 2008-Spring 2009: 81% (64/79)
Fall 2009-Spring 2010: 89% (56/63)
Fall 2010-Spring 2011: 80% (61/76)
Fall 2011-Spring 2012: 95% (55/58)
Fall 2012-Spring 2013: 83% (39/47)
The program has consistently exceeded SCHEV's graduation standard of 5 over the past
five years. The Nursing Program had 30 graduates in 2013.
NURSING PROGRAM GRADUATES
2009: 31
2010: 38
2011: 36
2012: 32
2013: 30
The following table demonstrates the viability of this program. It depicts the FTEs and
Nursing Program graduates from the first graduating class in 2006 to the graduating class of
2009. The number of program graduates in Spring 2012 was 32.
VCCS PROGRAM VIABILITY STUDY
2006-2007: FTEs 47; Graduates 18
2007-2008: FTEs 59; Graduates 32
2008-2009: FTEs 67; Graduates 30
3-Year Average: FTEs 58; Graduates 27
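For reference, the three-year averages are simple means of the annual values: (47 + 59 + 67) / 3 is approximately 58 FTEs and (18 + 32 + 30) / 3 is approximately 27 graduates.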
New assessments are introduced as the nursing faculty and its advisory board refine both
expectations of students and graduates and identify the tools with which objectives can be
assessed and measured. The College's commitment to developing a culture of evidence and
assessment has established this process as the standard practice for the Nursing Program.
The next section presents a few program-specific objectives for the Management
Program.
Management Program
The Management Program is designed to provide a broad general education as
preparation for a variety of positions and careers in business. This curriculum is intended to
provide students the option of choosing between several specialization tracks. Upon completion
of any of the Management Program concentrations, students demonstrate basic skills in human
relations and communications as well as specialized knowledge related to their selected content
area. The Computer Support Specialist Specialization AAS Program trains students to provide
technical assistance, support, and advice to customers and other users. Employees in these
positions include technical support specialists and help-desk technicians. These positions
troubleshoot and interpret problems as well as provide technical support for hardware, software,
and systems. Answering telephone calls, analyzing technical problems using automated
diagnostic programs, and resolving recurring difficulties are examples of three learned skills.
Support specialists may work either within a company that uses computer systems or directly for
a computer hardware or software vendor. Increasingly, these specialists work for help-desk or
support services firms, where they provide computer support to clients on a contract basis. In
smaller firms, the support specialists solve a variety of computing problems to help businesses
serve their customers. The General Business Management Specialization is designed to provide
basic management and communication skills for the position of general manager. Emphasis on
supervision, applied business math, accounting, economics, law and communication prepares the
student for the many aspects of general management. Additionally, the program incorporates a
major emphasis on the global/international aspects of management. Students are required to
read, write, and speak a limited set of "conversational" (tourist-level) words and phrases in
languages including, but not limited to, Spanish, Mandarin Chinese, Arabic, and Portuguese. The countries
speaking these languages are current or emerging world economic powers. Successful graduates
possess the necessary skills for entry into a variety of management positions.
The Hardware and Software Support Specialization is designed to provide the student
with preparation for a career in computing/technology hardware and software. Studying
computing in the context of business gives the student a practical domain in which to begin applying
computer problem solving and methodology. Upon completion of the program, students possess
basic skills to enter a variety of computing jobs in computer support services, including Personal
Computer (PC) repair and Installation Repair Technicians. Students may also use this program as
a step toward advanced study in specialized areas of computing, such as the Hardware and
Software Aspects of Networking Specialization. Students who complete this program are
prepared to take the A+ industry certification exam. The student also studies material covered in
other industry certification exams, such as the certification tests offered by Microsoft. In
addition, students may choose to study for the Cisco Certified Network Associate Examination
(CCNA). The Marketing Management Specialization is designed to provide a basic foundation
in marketing management. This includes financial, legal, planning and selling aspects of the
field. Additionally, more emphasis has been placed on international marketing and electronic
commerce. Upon completion of the program, the student possesses basic skills to enter various
marketing and sales positions.
The following matrix provides evidence of student learning outcome assessment and
data-driven program improvements in the College’s Management curriculum.
MANAGEMENT PROGRAM
Learning Outcomes
Communicate effectively
orally & in writing.
Students’ oral skills should
be clear, cogent, presented
with confidence & sound
under questioning.
Assessment & Analysis
Methods
Assessment Methods
VCCS Oral Core
Competency test.
Oral Presentation.
Written Papers.
Analysis Procedure:
Students were scored on a
5 point rubric for oral
presentation competency.
Peer review.
Management student
scores on the VCCS Oral
Core Competency test will
equal or exceed the VCCS
average.
Results Of Analysis
2012-13:
Fall 2012 proficiency was
85% based on a
comprehensive exam with an
80% proficiency standard, vs. 95%
in Fall 2011 using a 70%
standard. A rubric is used to
determine proficiency. As
for value added, graduates
were 84% proficient at the
end of the program which
represented a gain of 0.78
points from when they first
entered the program on a
5-point rubric scale.
2011-2012:
Management student oral
communication
proficiency was 92% for
the Fall of 2011 based on a
comprehensive exam.
Use Of Evaluation
Results (Action Taken)
2012-13 Actions:
Student are now required
to demonstrate their oral
and written
communication skills
earlier in the semester.
This gives them the
opportunity to demonstrate
and to improve on their
proficiency.
2011-12 Actions:
Formal presentation using
PowerPoint but adding
more sophisticated options
such as voice-over, music,
and/or videos
Developed student study
groups, then added
assigned (initially by
volunteer) study groups
with a team captain to
assist one another.
Students will apply
leadership, ethical
standards, and team
building skills necessary
for managerial positions in
the 21st century
Assessment Methods :
Complex case studies;
Power Point presentations
Analysis Procedure:
Students scored on 3
examinations & a
comprehensive final and a
4-point rubric which
included class
presentations
2010-11:
VCCS Oral
Communication Core
Competency:
Sp 2006: PDCCC 55.3, VCCS 60.9
Sp 2007: PDCCC 52.6, VCCS 60.9
2010-11 Actions:
Curriculum added a
presentation focusing on
improving language,
spelling, & grammatical
skills.
2012-13:
90% proficiency based on
4 equal weight exams and
optional field trip to local
courtroom case. Field trip
rubric clearly increased
final scores. As for value
added, graduates were
84% proficient at the end
of the program which
represented a gain of 0.68
points from when they first
entered the program on a
5-point rubric scale.
2012-13 Actions:
Changed textbook to
reflect more specific and
realistic business situations
rather than general law.
Students will demonstrate
the ability to use
technology in analyzing
and solving business
problems.
Students will use basic
computation skills to
analyze and solve business
problems requiring the use
of mathematics
Assessment Methods:
Team Project
Analysis Procedure:
Students worked in teams
and presented findings via
power point to class as
part of a 4-point rubric.
Assessment Methods:
Specific exam questions
Analysis Procedure:
Students scored on 3
exams and a final using a 4-point rubric
2011-12:
80% exceeded
expectations; 4 equal
weight exams and optional
field trip to local
courtroom case. Field trip
rubric clearly aided final
scores.
Some students lacked
sophisticated computers &
high speed internet
providers. Student
proficiency for fall 2011
was 82%
2012-13:
100% of graduates met or
exceeded the analysis and
construction of the
business problem vs. 88%
in 2011-12. As for value
added, graduates were
85% proficient at the end
of the program which
represented a gain of 0.61
points from when they first
entered the program on a
5-point rubric scale.
2011-12 Actions:
Continued use of team
study groups. Students
required to make a field
trip to a local courtroom and
report on and analyze 5
completed trials. The
exercise substituted for the
lowest exam grade.
Students were encouraged to use
free college technical
facilities, i.e., computer &
library lab equipment.
2011-12:
Student proficiency 88%;
working in teams helped
many students gain
confidence in the studies.
2012-13:
As for value added,
graduates were 89%
proficient at the end of the
program which
represented a gain of 0.75
points from when they first
entered the program on a
5-point rubric scale.
Students demonstrated
77% proficiency based on
specific questions on final
exam vs. 77% for 2011-12.
This met the 70%
proficiency set for the
objective.
2011-12 Actions:
Continue to use student
study groups with “senior”
students serving as team
captains (mentors)
2012-13 Actions:
Changed textbook to take
advantage of technology
and math practice online.
2011-12:
100% of students met or
exceeded via test
questions. Four exams were
2011-12 Actions:
Math Tutors encouraged;
text in use provides toll
free student tutor services;
2012-13 Actions:
Students are now going
online using case studies
involving challenges and
successful marketing
strategies.
Graduates will be able to
demonstrate management
skills in leadership, team
building, and motivating a
sales or marketing staff
70% of the time.
Assessment Methods
Marketing Projects
Final Examination.
Rubrics
Analysis Procedure
Graduates will choose an
original marketing idea
then assemble and
construct an electronic
presentation. Professor
will evaluate graduates
responses using a 4-point
rubric scale. The primary
course is MKT 220
Principles of Advertising.
Graduates will also take a
capstone course, MKT 285
(Current Issues in
Marketing) which requires
a research paper in which
the SLO is evaluated using
a 4-point rubric scale.
Graduates are also
assessed using an indirect
survey measurement
which focuses on graduate
proficiency.
administered requiring
extensive research, use of
Power-Point presentations,
and case study exercises.
Results are consistent with entry-level college students' lack of strong math skills.
2012-13:
80% of students (8/10)
constructed complex and
thorough PowerPoint
presentations on marketing
principles. Students
utilized foreign-language
computer translators
(Spanish, Mandarin
Chinese, and Portuguese).
Students were required to
view and review all other
students' presentations.
Students were required to
comment on all fellow
students' presentations. As
for value added, graduates
were 82% proficient at the
end of the program which
represented a gain of 1.00
points from when they first
entered the program on a
5-point rubric scale.
student teams assigned
with team “captains”
2011-12:
100% of students
constructed complex and
thorough PowerPoint
presentations on marketing
principles. Students
utilized foreign-language
computer translators
(Spanish, Mandarin
Chinese, and Portuguese).
Students were required to
view and review all other
students' presentations.
Students were required to
comment on all fellow
students' presentations.
2011-12 Actions:
No actions needed at this
time.
2012-13 Actions:
Reviewed and changed to
a new textbook for next
year.
MANAGEMENT PROGRAM

Learning Outcomes:
Students will be able to evaluate & build a simple network.

Assessment & Analysis Methods:
Assessment Methods: Student outcomes are based on the CISCO Student Learning Assessment and an evaluation of student performance in building a simulation network, based on rubrics reviewed by peer groups & OT faculty.
Analysis Procedure: Student performance on the CISCO Network Simulation included in the CCNA curriculum. Testing is done in the Fall term. The primary measures are the final exam in ITN 101, Introduction to Network Concepts, and an evaluation using rubrics of student performance in building a simulation network. Secondary measures: ITN 107, ITN 106, and ITN 115.

Results of Analysis:
2012-13: As for value added, graduates were 74% proficient at the end of the program, a gain of 0.70 points on a 5-point rubric scale from when they first entered the program. For the Fall 2012 term, 11/14 students were able to successfully build a network and have it work properly.

Use of Evaluation Results (Action Taken):
2012-13 Actions: The CISCO systems were converted and all ITN 101 students are now enrolled in the introductory portion of the CISCO networking academy.
2011-12:
In 2011-12, there were two
groups of people, those
taking the class live and
those working online with
a simulator. 6 out of 9
students in the live group
were able to complete the
simulation setup of a
network without
assistance. This was
significantly higher than
the results in the online
group where 2 out of 8
students were able to
complete the simulation
setup of a network without
assistance. On a second
attempt, for those who
could not set the network
up initially 7 out of 9 were
able to complete the task
successfully with
assistance. Overall, 15 out
of 17 passed the task by
the end of the course.
2011-12 Actions:
The packet tracer
examples were introduced
earlier and any problems
related to running the
software will be mediated
by the midterm of the
class. In addition, packet
tracer content embedded in
the CISCO curriculum
were emphasized and
student work critiqued
with the intent of making
the final assessment more
routine for the students.
All students used the
online test system and 8
out of 17 passed the
CISCO final exam with a
score of 70 or better. The
average score for the
exam was 68%. 11 out of
17 passed the online Cisco
Course, with an overall
average of 76%.
Graduates will be able to
integrate the advanced
computer concepts using a
productivity software suite
as measured by a 50%
pass rate on the MCAS
(Microsoft Certified
Application Specialist
Exam).
Assessment Methods
Student performance on a
test based on industry
certification objectives
with a target performance
level for success of 50%.
100% of the students
completing ITE 215 will
be prepared to sit for
certification as
demonstrated by the precertification testing. Of
those students, 50% will
pass the certification the
first time. Those who do
not pass certification will
be given an opportunity to
retest.
Analysis Procedure
Students take the precertification testing, and
decide if they wish to sit
for certification.
Once the students
complete the ITE 215
course, they are eligible to
sit for the MOS
certification exam. The
exam is offered at the
Workforce Center.
2010-11:
Fall 2009, 100% of
students attempting the
network simulation
successfully completed the
task.
2010-11 Actions:
In Summer 2010, the
department offered the IT
55 (certification prep
course). In order to
increase student
participation, the College
assisted students
financially by paying for
one of the certification
tests.
2012-13:
As for value added,
graduates were 84%
proficient at the end of the
program which
represented a gain of 0.33
points from when they first
entered the program on a
5-point rubric scale.
2012-13 Actions:
No students sat for the
certification test as the test
was not yet available. The
long range goal of the
department is to have
support for students taking
the test.
For the Fall 2012 term,
8/11 students were eligible
to sit for certification, but
the certification test was
not available for the new
version of Office.
2011-12:
Spring 2011, no students
sat for certification
although there were
students who were
eligible.
For Fall 2011, 100% of the
students took the midterm
pre-certification exam
(Word and Excel), and
100% of the students took
the final pre-certification
exam (Access and
PowerPoint). The pre-certification test was given
to 10 students: 4 out of 10
passed Word/Excel with
70% or better, and 10 out of 10
passed Access/PowerPoint
with 70% or better.
2011-12 Actions:
We are continuing the
actions from the previous
year, which include more
repetition of skills.
Modification to course
assignments will be made
to try to improve the test
scores. The next group of
students should do better
as the same format will
now be used in ITE 115,
so the ITE 215 students
should be better prepared
in the future
Overall, 7 out of 10 students
passed the composite with
70% or better.
2010-11:
Summer 2010, 14 students
enrolled in the one credit
IT 55 class for MCAS
preparation. Thirteen
students sat for the exam
& ten students passed the
exam. This equals a 77%
pass rate.
2010-11 Actions:
Added a capstone one
credit course to assist
students in preparing for the
MCAS certification exam.
Integrated accounting
software into curriculum.
The program FTE’s have remained relatively constant over the last three years of data points
and meet SCHEV’s standard of 13 FTE’s for Management.
MANAGEMENT PROGRAM: HEADCOUNT AND FTE
Fall 2009: FTEs 34; Student Headcount 56
Fall 2010: FTEs 39; Student Headcount 60
Fall 2011: FTEs 32; Student Headcount 44
Fall 2012: FTEs 23; Student Headcount 42
The fall-spring retention rates have increased over the past five years from 54% to 73%.
MANAGEMENT PROGRAM RETENTION RATES
Fall 2008-Spring 2009: 54.2% (32/59)
Fall 2009-Spring 2010: 58.0% (29/50)
Fall 2010-Spring 2011: 77.3% (51/66)
Fall 2011-Spring 2012: 68.2% (43/63)
Fall 2012-Spring 2013: 73% (35/48)
The number of graduates has increased from 8 in 2009 to 10 in 2013. This exceeds SCHEV’s
standard of 8.
MANAGEMENT PROGRAM GRADUATES
2009: 8
2010: 4
2011: 2
2012: 14
2013: 10
The previous matrix presented evidence that the program faculty are continually
reviewing the student work and improving the curriculum based on assessment and analysis
results. In addition, the table demonstrates the desire of the program faculty to associate national
skill sets (Microsoft Office Specialist [MOS] and Cisco) and use materials in that area. The
number of students receiving certifications has increased, and the program content has been
enhanced resulting in improved student outcomes. Finally, the use of a simulator, first in the
applications course and then in the networking area, helps reinforce program content, validate
results and keep budgets in check. Simulators offer great promise in the networking arena. The
simulator provided by CISCO has a “virtual” version of many pieces of equipment, allowing
high-end devices to be studied without requiring equipment purchase.
Much progress was made by the students with respect to assimilating what was learned
among and between core and elective courses in the Management degree program. IT courses,
Accounting, and keyboarding classes are producing a better-rounded student body.
For example, students’ computer skills, abilities and knowledge have improved
dramatically. They are more comfortable and proficient producing simple PowerPoint
presentations as well as more complex presentations (with voice-over narration, attached
videos, photos, and even subtitles in Mandarin Chinese, Portuguese, Spanish, Arabic, and
Russian).
It should also be noted that general education courses are clearly helping students write
and speak better. This is particularly true with respect to writing composition (all of the
program head's graded examinations are case studies using the American Psychological
Association format). Progress is being made in BUS 125, Applied Business Math, as well as
other courses, partly as a result of student teamwork homework assignments.
On a more positive note, the following strategy has been implemented to increase student
participation, grades, and retention in the Management Program. Students must submit a
Mandatory Compliance Statement to the program head. This statement reads as follows: "I, xxx,
have read the entire syllabus and understand and agree to comply with all college policies and guidelines as well as
additional stated requirements of this course as outlined by the program head. Failure to fully comply with all
requirements on or before the last day to drop without academic penalty will result in my being dropped from the
class with the appropriate grade or academic status as determined by the program head, PDCCC, and VCCS
policies. Furthermore, students eligible to receive any/all form(s) of financial aid will be referred to the
appropriate college administrator(s) and/or financial aid organization for any/all appropriate disciplinary/legal
action(s)."
The next section presents a few program-specific objectives for the Administrative
Support Technology Program.
Administrative Support Technology Program
The Administrative Support Technology (AST) Program is designed to prepare students
with the knowledge and skills necessary to make decisions and perform successfully in office
occupations. It provides opportunities for those persons employed in office occupations as well
as those seeking a promotion and/or a degree to upgrade their skills and knowledge of new
methods, practices, and innovations in business. The following matrix demonstrates a few of the
AST Program specific SLOs and the evaluation process (assessment and analysis through the use
of results and actions taken).
ADMINISTRATIVE SUPPORT TECHNOLOGY PROGRAM

Learning Outcomes:
Students will demonstrate oral & written communications, with emphasis on writing & presenting business-related materials using electronic media, with 70% proficiency.

Assessment & Analysis Methods:
Assessment Methods: Final exam in AST 205, Business Communications. Assessment by the AST advisory committee.
Analysis Procedure: Assessment by the AST advisory committee using a pass/fail evaluation for 8 out of 10 questions.

Results of Analysis:
2012-13: For Spring 2013, AST graduates completed grammar questions with 90% proficiency. As for value added, graduates were 80% proficient at the end of the program compared to when they first entered the program, using a 5-point rubric scale.

Use of Evaluation Results (Action Taken):
2012-13 Actions: New writing software was added to the AST 205 class.
2011-12:
For Spring 2012, graduates
completed the typing test
with five or fewer errors
but only one student
passed the requirement of
45 wpm.
2011-12 Actions:
Writing resources web-based software was added
to the AST 205 class.
Additional writing
exercises were added to
the program.
2008-09:
For Spring 2009
proficiency was 64% vs.
57% for Spring 2008
AST Program Graduates
For Spring 2009, based on
the data, students are
losing their typing speed
after completing the typing
classes.
2008-09 Actions:
Advisory committee met
& reviewed student
portfolios.
Offered “live” or hybrid
classes vs. “online”
classes.
Integrated Software into
class, expanding additional
hands-on grammar &
writing. Pre-tests & post-tests were given.
Sample data was reviewed
by the Advisory
Committee. Based on the
results, program faculty
continued to work with
students to increase their
grammar & critical
thinking skills.
Grammar activities were
given weekly to review &
reaffirm grammar rules &
formatting.
Critical thinking skills
expanded through online
& live class scenario
examples.
Incorporated software that
expanded hands-on
grammar & writing
activities.
Offered workplace
certification for speed &
accuracy.
Students will be able to
apply critical thinking,
analytical, and quantitative
skills in decision making
and problem solving with
70% proficiency.
Assessment Methods
Final exam in AST 243:
Office Administration.
Office Scenarios for AST
graduates.
Analysis Procedure:
Final exam.
AST Advisory Committee
review of samples of
pass/fail office scenarios
for AST graduates using a
rubric.
2012-13:
For Spring 2013, AST
graduates completed
pass/fail office scenarios
with 90% pass. As for
value added, graduates
were 80% proficient at the
end of the program
compared to when they
first entered the program
using a 5-point rubric
scale.
2012-13 Actions:
Additional scenarios were
discussed online in the
discussion forum. Threads
were graded for grammar
and spelling.
2011-12:
Fall 2011 (AST 243),
scenario problem solving
was completed with 80%
using online discussion
board and questioning.
2011-12 Actions:
Additional resources were
used to increase problem
solving activities via
discussion board in
Blackboard.
Spring 2012 graduate
testing data showed all
graduates passed the testing
with 70% proficiency.
2009-10:
Fall 2009 proficiency was
75% vs. 73% for Fall
2008.
2009-10 Actions:
Developed office scenarios
for review by AST
Advisory Committee.
Fall 2009 (AST 243)
100% participated in
Blackboard discussions of
textbook scenarios.
Incorporated various
scenarios into AST 243 &
AST 244 classes via
Blackboard discussion
board. At the end of spring
2010 semester, the
Advisory Committee
reviewed samples of office
scenarios.
Spring 2010 AST
Graduates proficiency for
proofing—name checking
was 68% & proofing—
number checking was
90%. This compares to
2008 graduates with 75%
for name checking & 90%
for number checking.
Students will use proper
keyboarding skills to
prepare documents quickly
and accurately according
to employer standards with
70% proficiency.
Assessment Methods
Final exam in AST 102.
Program graduate test
reviewed by the Advisory
committee.
Analysis Procedure:
Program graduate test with
a proficiency level of 45
wpm with five or fewer
errors evaluated by the
AST advisory committee.
2012-2013
For Spring 2013, 3
graduates completed the
typing test with five or
fewer errors and passed
the requirement of 45 wpm
and the remaining students
did not take the typing test.
As for value added,
graduates were 80%
proficient at the end of the
program compared to
when they first entered
the program using a 5-point rubric scale.
2012-2013 Actions:
Accuracy activities were
stressed and increased
speed activities.
2011-12:
Fall 2011 AST 102
students (90%) had the
proficiency level of 45
wpm with five or fewer
errors.
2011-12 Actions:
Continued to stress
accuracy and speed in
typing. Graduate testing
results in formatting
stayed the same at 100%
accurate. Continued to
increase timings in class.
Spring 2012—AST 102
students 10 out of 11 typed
at least 40 words per
minute with five or fewer
errors.
2010-11:
Fall 2009, proficiency was
88% vs. 67% for Fall 2008
& 33% for Fall 2007.
Spring 2009 proficiency
was 80% vs. 82% for
Spring 2008.
2010-11 Actions:
Added speed & accuracy
to job certification tests.
Faculty developed project
researching the tie
between speed, accuracy
& pay.
Fall 2009 (AST 102) 29%
higher than 45 wpm, 28%
at 40 wpm, 28% had too
many errors, & 14% did
not take the test.
Spring 2010 AST
graduates, proficiency for
speed was 57% >45 wpm
with 72% accuracy.
This compares to 25% for
speed with 75% accuracy
for 2008 graduates in a 5-minute timing.
Fall 2010: AST 101, 14
students completed the
class & all 14 students
formatted personal-business letters in block &
modified-block style with
100% proficiency. Of the
14 students, 12 students
(85%) typed at least 35
words per minute with 3 or
fewer errors.
Students will be able to
use software, including
word processing,
spreadsheets, databases,
presentation, or calendar
tools, to input, manage,
and interpret information
to meet organization needs
with 70% completion rate.
Assessment Methods
ITE 215 class, practice
tests through MyIT Lab
software & the Microsoft
Certification test in Word
Processing.
AST 141 class, practice
tests in SNAP web based
software & certification
test.
2012-2013:
As for value added,
graduates were 75%
proficient at the end of the
program compared to
when they first entered
the program using a 5-point rubric scale. No AST
students took the MOUS
certification because of
lack of funding.
2012-13 Actions:
More repetition was
required in preparation for
MOUS. Trying to obtain
funding next year.
Analysis Procedure:
MyIT Lab software.
Microsoft Certification test
in Word Processing.
2011-12:
No AST students (Fall
2011 or Spring 2012) took
the certification because of
lack of funding.
2011-12 Actions:
During Fall 2011 and
Spring 2012, students
prepared for certifications
by using SNAP (tutorials
and online skills training
throughout the class).
Required employment
portfolios from students.
2010-11:
In 2010-11, the pre-certification test was given to
17 students: 5 out of 17
passed Word with 80% or
better; 5 out of 17 passed
Excel with 80% or better;
3 out of 17 passed
PowerPoint with 80% or
better; 4 out of 17 passed
Access with 80% or better.
2010-11 Actions:
More student repetition of
skills was required.
Modified course
assignments working
toward improved test
scores.
2009-10:
Fall 2009, four students in
AST 141 were awarded
free vouchers & eLearning for the Word
2007 MCAS exam.
2009-10 Actions:
Students were encouraged
to take the Microsoft
Certification Exam in
word processing.
Additional funding was
sought to help pay for the
tests. Used MyIT Lab
software with tutorials &
specific skills online
training using assessments
throughout the class.
SNAP web based
software.
Certification test.
Program graduates will be
evaluated as the number
passing the industry
standard word processing
test given by Microsoft
Corporation.
For ITE 215 in Fall 2009, 65% of the students took
the final pre-certification
exam (ACCESS &
PowerPoint).
The program FTE’s have remained relatively constant over the last three years of data points and
meets SCHEV’s standard of 13 FTE’s.
ADMINISTRATIVE SUPPORT TECHNOLOGY: HEADCOUNT AND FTE
Fall 2009: FTEs 24; Student Headcount 28
Fall 2010: FTEs 22; Student Headcount 34
Fall 2011: FTEs 20; Student Headcount 35
Fall 2012: FTEs 14; Student Headcount 25
The fall-spring retention rate has increased from 75.6% in fall 2007-spring 2008 to 85% for fall
2012-spring 2013.
ADMINISTRATIVE SUPPORT TECHNOLOGY FALL-SPRING RETENTION RATES
Fall 2007-Spring 2008: 75.6% (31/41)
Fall 2008-Spring 2009: 73.5% (36/49)
Fall 2009-Spring 2010: 83.8% (31/37)
Fall 2012-Spring 2013: 85% (17/20)
Over the past five years, the program has increased its graduates from 0 in 2009 to 5 in
2013. In 2011, the number of graduates was 10, which exceeded SCHEV's standard of 8.
ADMINISTRATIVE SUPPORT TECHNOLOGY PROGRAM GRADUATES
2009: 0
2010: 6
2011: 10
2012: 7
2013: 5
The previous matrices present evidence that the results of AST assessment efforts are tied
to specific SLOs for students to achieve desired results for workplace skills. A better-prepared
student entering the workforce is the desired outcome for the AST program faculty. Rather than
relying on individual course learning alone, overall program objectives require students to prepare through
"across the curriculum" learning. The AST Program has effectively added grammar and critical thinking
activities in several classes along with computer generated writing activities.
The AST Program has also encouraged students in certificate programs to continue to
complete the two-year program. Classes in the certificate programs map to the two-year degree
for seamless transition. Additionally, several businesses have requested internships for our
students and have given our students workplace experience.
The next section will present a few program-specific objectives for the Early Childhood
Development Program.
Early Childhood Development Program
This program is designed to prepare students for the care, supervision, and education of
young children from birth to eight years of age. Graduates also qualify to work with children
up to age twelve in after-school programs. Individuals already working in the field may upgrade
their skills and qualify for advancement. The following matrix demonstrates a few of the Early
Childhood Development Program-specific SLOs and how the evaluation process (assessment and
analysis through the use of results and actions taken) is accomplished.
EARLY CHILDHOOD DEVELOPMENT PROGRAM

Learning Outcomes:
Students will recognize the stages of early childhood development through activities & experiences in the nursery, pre-kindergarten, kindergarten, & primary programs for young children.

Assessment & Analysis Methods:
Assessment Methods: Multiple measures including classroom observation, midterm examinations, final examinations, & numerous Blackboard discussions.
Analysis Procedure: Objectives for CHD 120 were examined & coordinated with the Virginia Child Care Provider Competencies.

Results of Analysis:
2012-13: As for value added, graduates were 86% proficient at the end of the program, a gain of 0.86 points on a 5-point rubric scale from when they first entered the program. For Fall 2012, proficiency was 57% vs. 75% in Fall 2011, based on specific questions on the final exam.
2011-12: For Fall 2011, proficiency was 75% vs. 88% in Fall 2010.
2010-11: Fall 2010 proficiency was 88% vs. 85% in Fall 2009.
2009-10: Fall 2009 proficiency was 85% vs. 80% in Fall 2008.
2008-09: Fall 2008 proficiency was 80% vs. 70% for Fall 2007.

Use of Evaluation Results (Action Taken):
2012-13 Actions: Modified course objectives and assignments to correlate with the objectives of the NAEYC Standards for Professional Preparation and the VA Early Childhood Standards. Focused course content more on Developmentally Appropriate Practices as related to the stages of early childhood development.
2011-12 Actions: The course syllabus objectives and assignments for CHD 120 were correlated to the objectives of the NAEYC Standards for Professional Preparation and the VA Early Childhood Standards.
2010-11 Actions: CHD 120 course requirements (assignments, activities, etc.) were coordinated to enhance student understanding & proficiency of course objectives:
- The CHD 120 course syllabus was updated accordingly.
- Each lecture & assignment is directly related to one of those objectives.
- The final exam for CHD 120 was reconstructed to reflect evaluation of the competencies.
2009-10 Actions: CHD 120 objectives have been correlated with the standards of the Virginia Core Competencies for Child Care Providers & the NAEYC Standards for Early Childhood Programs for AAS Degrees, & are now documented in the course syllabus.

Learning Outcomes:
Students will be able to illustrate developmentally appropriate techniques & methods for encouraging the development of language literacy, math, science, and social studies.

Assessment & Analysis Methods:
Assessment Methods: Student proficiency of this learning outcome is formatively assessed in CHD 118 by using multiple measures including classroom observation, midterm examinations, final examinations, & numerous Blackboard discussions.
Analysis Procedure: Objectives for CHD 118, Language Arts for Young Children, were examined & coordinated with the VA Child Care Provider Competencies & NAEYC Standards for Early Childhood Programs for AAS Degrees.

Results of Analysis:
2012-13: As for value added, graduates were 86% proficient at the end of the program, a gain of 0.86 points on a 5-point rubric scale from when they first entered the program. For Fall 2012, proficiency was 83% compared to 89% in Fall 2011. This exceeded the proficiency level set at 70%, but was below that of last year.
2011-12: For Fall 2011, proficiency was 89%; for Fall 2010, it was also 89%. This exceeded the proficiency level set at 70%.
2009-10: For Spring 2010, proficiency was 89%. This was below the fall terms of previous years.

Use of Evaluation Results (Action Taken):
2012-13 Actions: Aligned course activities and assignments with the targeted learning outcome.
2011-12 Actions: More practical applications were considered for assignments for CHD 118. In addition, CHD 118 became a prerequisite for CHD 119. Raised the proficiency level to 80%.
2009-10 Actions: Objectives for CHD 118 were reviewed & coordinated with the VA Child Care Provider Competencies, & the CHD 118 course syllabus was updated accordingly. The final exam for CHD 118 was reconstructed to reflect evaluation of the competencies. Practical applications of the teaching of language & reading, for example the Raising A Reader Program, were used; the Raising A Reader Program is used in all Early Start, Head Start, & Virginia Preschool Initiative programs in the PDCCC service region. Correlation of course objectives with the Virginia Core Competencies for Child Care Providers helped direct students to their responsibilities as teachers of young children.
The program's FTEs and student headcounts have remained relatively constant over the last four fall terms. The program meets SCHEV's standard of 13 FTEs.
EARLY CHILDHOOD DEVELOPMENT: HEADCOUNT AND FTE
                     Fall 2009   Fall 2010   Fall 2011   Fall 2012
FTEs                 26          27          24          25
Student Headcount    42          46          46          41
The fall-spring retention rate has improved significantly from 56% for fall 2007-spring 2008 to
83% for fall 2011-spring 2012.
EARLY CHILDHOOD DEVELOPMENT PROGRAM RETENTION RATES
Category    Fall 2007-     Fall 2008-     Fall 2009-     Fall 2010-     Fall 2011-
            Spring 2008    Spring 2009    Spring 2010    Spring 2011    Spring 2012
Percent     56%            69%            80%            80%            83%
Numbers     33/59          31/45          37/46          41/51          32/42
Over the past five years, the program has increased its number of graduates from 8 in 2009 to 9 in 2013.
EARLY CHILDHOOD DEVELOPMENT PROGRAM GRADUATES
Category     2009   2010   2011   2012   2013
Graduates    8      5      4      7      9
Overall, the Early Childhood Development program has grown and improved over the past several years. The growth has been slow, but the reasons for this slow growth are positive. First, most of the students are hired in childcare as soon as they finish their first 12 hours of course work, if not before. They then shift to part-time status, taking one or two classes a semester. Second, as the standards for childcare providers have risen, so have the expectations of the classwork. This is a great situation, because the young children are more likely to receive better quality care and to be better prepared for success in school and life. In addition, several of the students majoring in Early Childhood Development, an Associate in Applied Science program, are deciding to pursue their transfer degree in Education, an Associate of Arts and Science program. That is a big success!
The previous matrices present evidence that the correlation of the Child Development courses with the Virginia Core Competencies for Child Care Providers and the National Association for the Education of Young Children (NAEYC) Standards for Early Childhood Programs for AAS Degrees is effective in providing students with information that is specific and directly related to the knowledge and skills they are encountering in the workplace. When students understand the need for the information and can see its application in the workplace, they are better able to retain it.
The next section will present a few program-specific objectives for the Industrial
Technology Program.
Industrial Technology Program
The Associate in Applied Science degree curriculum is designed to provide a broad base of instruction and industrial knowledge that will prepare the graduate to enter the technical work force upon graduation. Graduates are trained for jobs with regional and local industries and with federal and local government agencies. The following matrix demonstrates a few of the Industrial Technology program-specific SLOs and the evaluation process (assessment and analysis through the use of evaluation results and actions taken).
INDUSTRIAL TECHNOLOGY PROGRAM

Learning Outcome:
Graduates will be able to plan and execute technical applications for the set-up and operation of electrical/electronic equipment with 85% proficiency.

Assessment & Analysis Methods:
IND 165, Principles of Industrial Technology I: final exam, student projects, and student demonstrations. Results from the VA Career Technology Competency (VCTC) standards.

Results of Analysis:
2012-13: 98.7% of the dual enrollment students continued to achieve an average proficiency score of 88.8 on the Virginia Work Readiness Assessment.
2011-12: Graduates demonstrated the ability to plan and execute technical applications for the set-up and operation of electrical/electronic equipment with 88% proficiency. 98.7% of the dual enrollment students achieved an average proficiency score of 88.8 on the Virginia Work Readiness Assessment.
2010-11: Proficiency was 100% vs. 96% for the previous fall.
2009-10: Fall 2009 VCTC evaluations based on student projects & demonstrations: average score of 3.92, with 80% of students being proficient (scoring 3 or higher).

Use of Evaluation Results (Action Taken):
2012-13 Actions: Continued to make students accountable for their own learning and to have students model proper techniques and general and acceptable industry standards. Students are required to complete more comprehensive group projects. Continued to work cooperatively with home schools to sponsor "Reality Day" for students to apply skills modeled in the program of study.
2011-12 Actions: To make students more accountable for their own learning, allowed students to model proper techniques and general and acceptable industry standards. Increased the number of required comprehensive group projects. Worked cooperatively with home schools to sponsor "Reality Day" for students to apply skills modeled in the program of study.
2010-11 Actions: Added comprehensive group projects. Advisory Committee input was used to establish constraints that are delineated in the updated rubrics.

Learning Outcome:
Graduates will be able to communicate their ideas and results using their knowledge of written, oral, and graphical communication with 70% proficiency.

Assessment & Analysis Methods:
Assigned team projects. VCTC standards: proficient score of 3 or higher. Grading rubrics.
Analysis Procedure: Team projects are teacher & peer evaluated based on the VCTC standards proficient score (> 3). Rubrics standard: 2.0 on a 5.0 scale reflects where the student can perform the task without supervision.

Results of Analysis:
2012-13: 87% of graduates were able to successfully demonstrate their knowledge of written, oral, and graphical communication, with 78% proficiency, in designing and laying out a wiring plan for a 12-room residence in written proposals. Based on the Virginia CTE records for modeling, 86% of the time students demonstrated a proficiency of 2.83. Result based on the Industrial Technology Rubric.
2011-12: Given a collaborative learning activity, graduates were able to communicate ideas and results using their knowledge of written, oral, and graphical communication, with 70% proficiency, in designing and laying out a wiring plan for a 12-room residence. Based on the Virginia CTE records for modeling, 88% of the time students demonstrated a proficiency of 2.83. Result based on the Industrial Technology Rubric.

Use of Evaluation Results (Action Taken):
2012-13 Actions: We continue to use multimedia for illustrations & experimental simulations. We have introduced web links on the Blackboard sites for selected YouTube videos, which have been used for online discussions. Graduate comprehensive tests from the capstone course ETR 273 have been modified to provide written discussions of the topics presented, as a means of demonstrating increased rigor and a higher level of learning.
2011-12 Actions: We have increased the use of multimedia for illustrations & experimental simulations. Students are using virtual software to gain immediate feedback. With the open lab concept, students are allowed to use the laboratory for remediation, without direct feedback from the instructor, when there is not a scheduled class. If models are connected in accordance with the schematic diagram, the virtual circuit is functional. Modify graduate comprehensive tests from the capstone course ETR 273 to increase rigor and the level of learning.

Learning Outcome:
Graduates will be able to utilize a working knowledge of electrical fundamentals, precision tools, and techniques to perform identified tasks with 80% proficiency.

Assessment & Analysis Methods:
Grading rubrics. VCTC standards (proficient score: > 3).
Analysis Procedure: A minimum of 3 graded practical laboratory applications, with grading based on rubrics, must average a minimum of 80%.

Results of Analysis:
2012-13: Based on a Likert scale of 1-4, students continued to achieve 3.6 in the proper use and selection of the particular tools used to model assignments. Students have demonstrated the ability to predict, analyze, & measure electrical quantities 86% of the time.
2011-12: Based on a Likert scale of 1-4, students achieved 3.6 in the proper use and selection of the particular tools used to model assignments. Students continue to show improvement in the ability to predict, analyze, & measure electrical quantities 85% of the time.
2010-11: 78% of the time students demonstrated the ability to execute at a 3.8 scale rating, which exceeds borderline satisfactory.
2009-10: Fall 2009 proficiency, based on the Virginia CTE records for modeling in the four basic energy systems 84% of the time, was 2.80, 2.45, 2.75 & 2.70, respectively. The result based on the Industrial Technology Rubric reflects a satisfactory rating of 3.8.

Use of Evaluation Results (Action Taken):
2012-13 Actions: Continue to use modeling and rubrics to evaluate students. Peer modeling and evaluation on selected course projects. Link applicable YouTube videos into Blackboard to demonstrate modeling of real-world practical applications. Increased opportunities for students to work cooperatively & demonstrate competence through performance examinations.
2011-12 Actions: Continue to use modeling and rubrics to evaluate students. Integrate peer evaluation on selected course projects. Increased the availability of virtual software, audio-visual materials, and teacher-demonstrated modeling of real-world practical applications. Increased opportunities for students to work cooperatively & demonstrate competence through performance examinations.
2010-11 Actions: Prior to teaching theorems, conducted math skills drills & ways to properly use calculators for quantitative problem solving. Provided students with opportunities to work cooperatively & demonstrate competence through performance examinations.
2009-10 Actions: Provided students with opportunities to work cooperatively & demonstrate competence through performance examinations.
The College’s Industrial Technology AAS degree program has shown a marked increase
in FTEs from 19 in fall 2009 to 23 in fall 2011. The program exceeds SCHEV’s standard of 9
FTEs.
INDUSTRIAL TECHNOLOGY: HEADCOUNT & FTE
                     Fall 2009   Fall 2010   Fall 2011   Fall 2012
FTEs                 19          25          23          11
Student Headcount    22          33          23          19
The fall-to-spring retention rate for the Industrial Technology program has ranged from about 72% to 90% over the past five years.
INDUSTRIAL TECHNOLOGY PROGRAM RETENTION RATES
Category    Fall 2007-     Fall 2008-     Fall 2009-     Fall 2010-     Fall 2011-
            Spring 2008    Spring 2009    Spring 2010    Spring 2011    Spring 2012
Percent     84.2%          89.5%          72.4%          79.4%          75.0%
Numbers     16/19          17/19          21/29          27/34          21/28
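Each retention percentage in these tables is a simple ratio of the students retained in the spring to the students enrolled in the preceding fall. As a minimal illustration only (the cohort labels and variable names are hypothetical; the counts come from the Industrial Technology table above), the following Python sketch recomputes those percentages:

    # Illustrative sketch: recompute fall-to-spring retention rates from the
    # retained/enrolled counts reported in the Industrial Technology table.
    cohorts = {
        "Fall 2007-Spring 2008": (16, 19),
        "Fall 2008-Spring 2009": (17, 19),
        "Fall 2009-Spring 2010": (21, 29),
        "Fall 2010-Spring 2011": (27, 34),
        "Fall 2011-Spring 2012": (21, 28),
    }

    for cohort, (retained, enrolled) in cohorts.items():
        rate = 100 * retained / enrolled
        print(f"{cohort}: {retained}/{enrolled} = {rate:.1f}%")

Run as written, this prints 84.2%, 89.5%, 72.4%, 79.4%, and 75.0%, matching the Percent row of the table.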
Over the past five years, the program has increased its number of graduates from 5 in 2009 to 12 in 2013. In 2011, 2012, and 2013, the program exceeded SCHEV's graduation standard of 6.
INDUSTRIAL TECHNOLOGY PROGRAM GRADUATES
Category     2009   2010   2011   2012   2013
Graduates    5      2      12     15     12
The previous matrices present evidence that students are developing the necessary skill sets to apply the principles of technology as they relate to the four primary energy systems: mechanical, electrical, fluid, and thermal. As evidenced by assessment results, students are demonstrating the ability to read and interpret diagrams, schematics, and working drawings. Additionally, they are demonstrating proficiency in the use of graphs and data tables to appropriately select the components needed for building working models in accordance with the Virginia Technology Competency standards with limited immediate supervision.
Over the last five years, the Industrial Technology program has met some challenges, but it has stood the test and come out with a great success story. In 2009, the economy took a turn for the worse, and numerous displaced workers came to our department for training and skills in the areas of Industrial Technology and Electronics. We worked with these individuals on a one-on-one basis, taking into consideration their wants and needs. We fine-tuned the curriculum to the types of skills they were looking to gain, worked with their schedules, and kept an open-lab classroom where students were allowed to come in and work on projects on their own time. We made the students more accountable for their learning while having them practice accepted standard workplace procedures in the lab. We used discovery laboratory projects with lab equipment so that students would leave the class with a feeling of accomplishment through exploration. After many certificates and degrees were awarded from the Industrial Technology Program areas, the workers who were once looking for jobs were well on their way to being employed once again. 90% of the graduating classes of 2011 and 2012 have been employed by local employers such as Green Mountain Coffee, California Cartedge, and International Paper.
The next section will present a few program-specific objectives for the Administration of
Justice Program.
Administration of Justice Program
The curriculum in both the Corrections and Police Science specializations has been developed and maintained in cooperation with state and local correctional and police officials. The Administration of Justice curriculum, with its specializations, was designed to provide a broad foundation that prepares students to enter any of the varied fields of corrections and/or law enforcement, or to advance professionally within them. The following matrix demonstrates a few of the Administration of Justice program-specific SLOs and the evaluation process (assessment and analysis through the use of results and actions taken).
ADMINISTRATION OF JUSTICE PROGRAM

Learning Outcome:
Graduating ADJ students will formulate an appreciation of ethical standards through their experiences (classroom & non-classroom) 70% of the time.

Assessment & Analysis Methods:
Final exam in ADJ 133: Ethics & the Criminal Justice Professional. Assessment portfolio. Student survey.
Analysis Procedure: Final exam. Rubric to evaluate the assessment portfolio. Advisory Committee.

Results of Analysis:
2012-13: ADJ 133 is taught only in the summer and is assessed the following spring with the assessment portfolio. In Spring 2012 the portfolio was replaced with a Reflective Learning Journal (RLJ) [see rubric tab] because, after reviewing the portfolios, the program head and the Advisory Committee agreed that the portfolio was showing what had been learned with no application of the material, whereas with an RLJ students could put what they had learned into practice. As for value added, graduates were 80% proficient at the end of the program, which represented a gain of 0.50 points from when they first entered the program on a 5-point rubric scale.
2011-12: The Advisory Committee met in 01/2012, reviewed the portfolios, and all were reviewed as competent. Course surveys (N=18) asked: "Have your experiences in this course led you to formulate ethical standards?" 14 strongly agreed, 3 agreed, and 1 strongly disagreed. "Has this course enhanced your knowledge about the topic of ethical standards?" 13 strongly agreed, 4 agreed, and 1 strongly disagreed.
2010-11: For Summer 2009, proficiency was 85% based on a comprehensive exam vs. 83% for Summer 2008. Met proficiency: 17 of 18 for 2010-11 vs. 15 of 18 in 2009-10 vs. 11 of 13 in 2008-09.

Use of Evaluation Results (Action Taken):
2012-13 Actions: The portfolio was replaced with a Reflective Learning Journal so that students could put what they had learned into practice. The 35 ethical dilemmas require a solution section as part of the Reflective Learning Journal (RLJ).
2011-12 Actions: Students developed 35 ethical dilemmas that were included as part of the portfolio. The portfolio was replaced with a Reflective Learning Journal, to be reviewed by the Advisory Committee in 01/2013. A student survey was added.
2010-11 Actions: Continued use of the assessment portfolio presented to the ADJ Advisory Committee & evaluated using a rubric. Students completed (35) ethical dilemma scenarios. A review of the partial portfolios (7) was completed at the Fall 2010 Advisory meeting, & all portfolios reviewed met at least the rating of "proficient" or higher.

Learning Outcome:
Graduating ADJ students will engage in listening skill improvement through their experiences (classroom & non-classroom) 70% of the time.

Assessment & Analysis Methods:
Final exam in ADJ 227: Constitutional Law for Justice Personnel. Course surveys.
Analysis Procedure: Final exam/lecture summaries. Assessment portfolio presented to the ADJ Advisory Committee & evaluated using a rubric.

Results of Analysis:
2012-13: As for value added, graduates were 80% proficient at the end of the program, which represented a gain of 1.00 points from when they first entered the program on a 5-point rubric scale. In Spring 2013 the Advisory Committee met to review the portfolios, case studies, and lecture summaries. The Committee was impressed with the portfolios with regard to how closely they followed the course objectives.
2011-12: The Advisory Committee met in 01/2012, reviewed the portfolios, and all were reviewed as competent. Course surveys (N=12) asked: "Has your experience in this course led to you improving your listening skills?" 11 strongly agreed and 1 agreed. "Did the course lecture summaries help improve your listening skills?" 10 strongly agreed and 2 agreed. "Did the lecture material support completing your case briefs?" 10 strongly agreed and 2 agreed. As a result, proficiency was met.
2010-11: Met proficiency: 12 of 12 in 2010-11 vs. 14 of 15 in 2009-10 vs. 15 of 15 in 2008-09.
For Spring 2009, proficiency was 100% based on a final exam and case briefs (45), vs. 93% for Spring 2010.

Use of Evaluation Results (Action Taken):
2012-13 Actions: For Spring 2014, students are required to present a random sample of the case studies in class.
2011-12 Actions: Added a course survey. Oral presentations of five cases, given from memory, were added because of the need for students to recall information and not rely on electronic devices, thus improving the students' short-term, working, and long-term memory.
2010-11 Actions: Added lecture summaries (14 to 25) after every class session (students debriefed the lecture for 20 minutes after each class). Placed a grade value on the summaries. An oral presentation requirement was added. Assessment portfolio (case briefs & lecture summaries) presented to the ADJ Advisory Committee & evaluated using a rubric.
The program's FTEs have increased significantly, from 29 in fall 2009 to 44 in fall 2012. This is well above SCHEV's standard of 13 FTEs.
ADMINISTRATION OF JUSTICE: HEADCOUNT AND FTE
                     Fall 2009   Fall 2010   Fall 2011   Fall 2012
FTEs                 29          40          41          44
Student Headcount    41          59          55          66
The fall-spring retention rate for Administration of Justice has increased from 65.4% for
fall 2008-spring 2009 to 73% for fall 2012-spring 2013.
Administration of Justice Fall-Spring Retention Rates
Category    Fall 2008-     Fall 2009-     Fall 2010-     Fall 2011-     Fall 2012-
            Spring 2009    Spring 2010    Spring 2011    Spring 2012    Spring 2013
Percent     65.4%          74.5%          80.0%          71.7%          73%
Numbers     36/55          35/47          48/60          43/60          55/75
Over the past five years, the program has consistently met or exceeded SCHEV's graduation standard of 8. The Administration of Justice program had 8 graduates in 2013.
ADMINISTRATION OF JUSTICE PROGRAM GRADUATES
Category     2009   2010   2011   2012   2013
Graduates    13     11     16     17     8
The previous matrices present evidence that the Administration of Justice Program has worked extensively with its Advisory Committee on curriculum improvement. The Program has implemented a series of assessment portfolios to evaluate student learning outcomes, and program changes based on portfolio results have been implemented. However, beginning in Fall 2012 the portfolios will be replaced with Reflective Learning Journals, and this change will better reflect the affective domain/andragogy intent of the Program. Having had such great success with the results of these assessment activities, the Program is very hesitant to recommend any major changes to the Program. Another area of student success is the hiring and promotion of Program students; although the Program has no hard evidence to support this claim, anecdotal evidence from graduates is plentiful. Lastly, this Program has had the distinction of having the highest ratio of graduates to FTEs of all similar programs within the VCCS.
All objectives for 2012-13 were met or exceeded. With the Advisory Committee providing oversight by reviewing portfolios, reflective learning journals, policy analysis projects, and essays, the Committee's conclusion was that the objectives were satisfactory to very satisfactory. The Committee's recommendation was not to change the overall intent of the program, but to enhance those courses where oral presentations and writing might improve students' performance. These changes could take effect beginning in fall 2014; however, the new program head (starting fall 2014) will decide whether these changes are desired or whether to take a different approach to the program objectives.
The next section will present a comparison of traditional face-to-face on-campus instruction vs. distance learning online courses vs. dual enrollment off-campus courses.
Modes of Instruction Comparisons:
Traditional On-Campus vs. Distance Learning Online vs. Dual Enrollment Off-Campus
In order to make the college more accessible to students who demand more than traditional on-campus classes, PDCCC has offered various other modes of instruction, including face-to-face on-campus, dual enrollment off-campus, and distance learning online classes.
The College does not at present have any programs that are totally online. Dual enrollment students can, however, obtain a General Education Certificate and/or a General Studies AA&S Degree. The new Virginia HB 1184 mandate requires all community colleges to develop an implementation plan for high school students to be able to earn a General Education Certificate and/or an Associate Degree while simultaneously completing high school course requirements. As a result, the number of dual credit students taking advantage of this certificate/degree has increased. In 2013, two Suffolk high-school graduates received a General Studies Degree in addition to their high school diploma. No matter how or where classes are taught, all faculty must meet the same credential standards and use the common syllabus and textbook. Online faculty must also complete online certification courses before they can teach online.
A grade comparison of the traditional face-to-face on-campus classes, dual enrollment off-campus classes, and distance learning online classes is provided in the table below. Faculty conduct a comprehensive review of the data to determine changes needed to improve student success. Another example is the student evaluation of distance learning, dual enrollment, and traditional courses. Students are provided an opportunity to comment on their course experiences (traditional, distance learning, or dual enrollment) through electronic online faculty course evaluations (IOTA Solutions). The results are then provided to the deans and faculty to review and make changes based upon the contents. Blackboard, the College's online management system for every course (traditional, distance learning, or dual enrollment), facilitates course-specific online communication, resources, and evaluations, such as feedback on student performance.
Grade Comparisons for Different Modes of Instruction
Completion Rates (Grade of C or better)

Term        Course     Traditional        Distance Learning   Dual Enrollment
                       On-Campus          Online              Off-Campus
Fall 2012   BIO 101    84%                77%                 97%
            ENG 111    66%                80%                 95%
            ITE 115    55%                80%                 98%
            HIS 121    69%                70%                 91%
            PSY 201    93%                92%                 58%
            Total      70.2% (283/403)    79.8% (119/149)     90.5% (294/325)
Fall 2011   BIO 101    75%                71%                 98%
            ENG 111    68%                68%                 97%
            ITE 115    68%                75%                 98%
            HIS 121    86%                80%                 86%
            PSY 201    88%                75%                 80%
            Total      75.9% (409/539)    73.6% (103/140)     90.8% (257/283)
Fall 2010   BIO 101    93%                76%                 97%
            ENG 111    66%                38%                 96%
            ITE 115    72%                62%                 98%
            HIS 121    89%                89%                 86%
            PSY 201    96%                96%                 69%
            Total      79.3% (414/522)    76.3% (90/118)      88.6% (210/237)
Notes: On-campus is traditional face-to-face courses; Online is virtual distance learning courses
(sections 71, not hybrid or compressed video); Off-campus is dual enrollment courses taught in the
high school and can lead to a General Education Certificate program.
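The Total rows in the table above are ratios of completers (students earning a grade of C or better) to total enrollment across the five courses for the term. As a minimal sketch, assuming only that definition of completion, the following Python fragment recomputes the Fall 2011 totals from the counts shown in parentheses (the dictionary and its labels are hypothetical):

    # Minimal sketch: recompute the Fall 2011 "Total" completion rates
    # (grade of C or better) from the completers/enrolled counts above.
    fall_2011_totals = {
        "Traditional On-Campus": (409, 539),
        "Distance Learning Online": (103, 140),
        "Dual Enrollment Off-Campus": (257, 283),
    }

    for mode, (completers, enrolled) in fall_2011_totals.items():
        rate = 100 * completers / enrolled
        print(f"{mode}: {completers}/{enrolled} = {rate:.1f}%")

This reproduces the reported 75.9%, 73.6%, and 90.8% totals.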
Results from the grade comparisons show that dual enrollment off-campus students tended to perform better than the on-campus and distance learning students taking the same classes. Overall, traditional face-to-face on-campus students performed slightly better than distance learning online students, but the gap is narrowing. One explanation for the higher completion rate of dual enrollment students is that they are the higher achievers from the local high schools and private academies. They must first pass the College's placement test and then be approved by the high school to take dual enrollment classes. Furthermore, the higher completion percentage for dual enrollment is somewhat biased, in that students who are having difficulties early in the class are dropped from the course with a full refund to the high school, rather than receiving a "W" withdrawal grade, because of the business policy governing how the high school pays for dual enrollment courses.
In the past 10+ years, the gap between traditional on-campus courses and distance learning online courses was about 20 percentage points. Significant improvement has been made in the past two years. The College has initiated a REDI test to help students gauge their readiness and computer skills before deciding to take an online course. Faculty teaching online courses are now required to obtain further online certifications and training before teaching online classes. As a result, the overall gap has narrowed to 2-3 percentage points. The grade distribution for each course can also be seen in the table below.
SAME COURSE GRADE DISTRIBUTION USING DIFFERENT MODES OF INSTRUCTION
Traditional On-Campus, Distance Learning Online, and Dual Enrollment Off-Campus

Fall 2011
Course   Mode                 A             B             C             D             F             W
BIO 101  Traditional          49% (32/65)   25% (16/65)   11% (7/65)    5% (3/65)     7% (5/65)     3% (2/65)
BIO 101  Distance Learning    21% (6/28)    29% (8/28)    21% (6/28)    4% (1/28)     18% (5/28)    7% (2/28)
BIO 101  Dual Enrollment      57% (25/44)   27% (12/44)   14% (6/44)    2% (1/44)     -             -
ENG 111  Traditional          19% (33/176)  30% (52/176)  19% (33/176)  7% (13/176)   19% (34/176)  6% (11/176)
ENG 111  Distance Learning    20% (5/25)    28% (7/25)    20% (5/25)    12% (3/25)    12% (3/25)    8% (2/25)
ENG 111  Dual Enrollment      57% (37/65)   31% (20/65)   9% (6/65)     3% (2/65)     -             -
ITE 115  Traditional          35% (45/127)  21% (27/127)  12% (15/127)  6% (7/127)    13% (16/127)  13% (17/127)
ITE 115  Distance Learning    31% (8/27)    33% (9/27)    11% (3/27)    11% (3/27)    7% (2/27)     7% (2/27)
ITE 115  Dual Enrollment      37% (14/38)   53% (20/38)   8% (3/38)     2% (1/38)     -             -
HIS 121  Traditional          48% (45/93)   27% (25/93)   11% (10/93)   1% (1/93)     10% (9/93)    3% (3/93)
HIS 121  Distance Learning    28% (8/29)    45% (13/29)   7% (2/29)     -             13% (4/29)    7% (2/29)
HIS 121  Dual Enrollment      39% (33/84)   33% (28/84)   14% (12/84)   11% (9/84)    3% (2/84)     -
PSY 201  Traditional          59% (46/78)   23% (18/78)   6% (5/78)     8% (6/78)     -             4% (3/78)
PSY 201  Distance Learning    52% (16/31)   23% (7/31)    -             6% (2/31)     3% (1/31)     16% (5/31)
PSY 201  Dual Enrollment      12% (6/52)    42% (22/52)   26% (13/52)   13% (7/52)    5% (3/52)     2% (1/52)

Fall 2010
Course   Mode                 A             B             C             D             F             W
BIO 101  Traditional          31% (20/65)   45% (29/65)   17% (11/65)   3% (2/65)     1% (1/65)     3% (2/65)
BIO 101  Distance Learning    10% (2/21)    33% (7/21)    33% (7/21)    5% (1/21)     14% (3/21)    5% (1/21)
BIO 101  Dual Enrollment      30% (8/27)    56% (15/27)   11% (3/27)    -             -             3% (1/27)
ENG 111  Traditional          16% (22/139)  28% (39/139)  22% (30/139)  11% (15/139)  16% (23/139)  7% (10/139)
ENG 111  Distance Learning    4% (1/23)     17% (4/23)    17% (4/23)    22% (5/23)    13% (3/23)    26% (6/23)
ENG 111  Dual Enrollment      47% (27/57)   40% (23/57)   9% (5/57)     4% (2/57)     -             -
ITE 115  Traditional          27% (41/153)  31% (48/153)  14% (21/153)  7% (10/153)   13% (20/153)  8% (13/153)
ITE 115  Distance Learning    21% (6/29)    38% (11/29)   3% (1/29)     3% (1/29)     7% (2/29)     28% (8/29)
ITE 115  Dual Enrollment      53% (24/45)   36% (16/45)   9% (4/45)     2% (1/45)     -             -
HIS 121  Traditional          50% (43/85)   34% (29/85)   5% (4/85)     -             6% (5/85)     5% (4/85)
HIS 121  Distance Learning    62% (16/26)   15% (4/26)    12% (3/26)    -             8% (2/26)     3% (1/26)
HIS 121  Dual Enrollment      35% (22/63)   33% (21/63)   18% (11/63)   14% (9/63)    -             -
PSY 201  Traditional          84% (67/80)   8% (7/80)     4% (3/80)     -             4% (3/80)     -
PSY 201  Distance Learning    76% (19/25)   12% (3/25)    8% (2/25)     -             4% (1/25)     -
PSY 201  Dual Enrollment      13% (6/45)    29% (13/45)   27% (12/45)   16% (7/45)    4% (2/45)     11% (5/45)
Overall, distance learning sections tended to have the greatest withdrawal rates compared with the same courses taught face-to-face on campus or off campus through dual enrollment.
When comparing all courses taught by the different modes of instruction, similar results were found (see below): withdrawal rates were higher for online and hybrid classes than for face-to-face classes. Dual enrollment classes had the highest completion rates because those students are the most motivated and are mainly college transfer students. The shared services distance learning (SSDL) students, who take their online classes from Northern Virginia Community College, have the lowest success rate of all modes of instruction. A review of the SSDL courses showed that they are largely high-level science courses such as physics, geology, and astronomy, or foreign language courses such as Japanese and Chinese. SSDL completion rates have improved significantly through better screening of students, better advising, and course selection. For example, the SSDL completion rate for fall 2012 was 67%, vs. 36% in fall 2011.
Summary of All Courses Taught Using Different Modes of Instruction

Fall 2011 Courses
Mode             %Suc   N      A      B      C      D      F      R     W      X     I     S      U
Hybrid           78     356    48%    22%    7%     3%     9%     1%    9%     0%    0%    0%     1%
In Person        79     3,273  35%    21%    13%    4%     5%     5%    5%     0%    0%    10%    2%
Video            89     36     58%    11%    19%    0%     8%     0%    3%     0%    0%    0%     0%
SSDL             36     28     7%     21%    7%     4%     25%    0%    36%    0%    0%    0%     0%
Online           76     903    45%    20%    10%    4%     12%    0%    8%     0%    0%    0%     0%
Dual Enroll      93     621    46%    33%    14%    6%     1%     0%    0%     0%    0%    0%     0%
Total All Modes  78     4,598  38%    21%    12%    4%     7%     4%    6%     0%    0%    7%     2%

Fall 2012 Courses
Mode             %Suc   N      A       B       C       D      F       R      W      X      I      S       U
Hybrid           78     333    45.3%   21.9%   10.2%   4.5%   11.1%   0.0%   6.3%   0.0%   0.0%   0.0%    0.0%
In Person        80     2,892  33.1%   21.9%   9.4%    3.9%   5.4%    4.8%   3.6%   0.4%   0.0%   15.3%   2.0%
Video            91     32     18.8%   46.9%   25.0%   0.0%   9.4%    0.0%   0.0%   0.0%   0.0%   0.0%    0.0%
SSDL             67     46     39.1%   19.6%   8.7%    6.5%   17.4%   0.0%   8.7%   0.0%   0.0%   0.0%    0.0%
Online           73     811    42.3%   20.0%   10.6%   4.9%   14.4%   0.0%   7.8%   0.0%   0.0%   0.0%    0.0%
Dual Enroll      93     633    44.4%   37.6%   11.1%   5.2%   1.1%    0.0%   0.3%   0.0%   0.0%   0.0%    0.0%
Total All Modes  79     4,126  36.1%   21.7%   9.8%    4.1%   7.8%    3.4%   4.7%   0.3%   0.0%   10.7%   1.4%
Note: SSDL=Shared Services Distance Learning classes with Northern Virginia CC.
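The %Suc column appears to count grades of A, B, and C, plus S where a satisfactory grade is awarded, as successful completions; that reading is an inference from the rows rather than a stated definition. Under that assumption, the following minimal Python sketch reproduces the Fall 2011 figures for two of the modes (the dictionaries and the function are hypothetical):

    # Minimal sketch, assuming "%Suc" counts grades A, B, and C (plus S where
    # awarded) as successful completions. Percentages are from the Fall 2011
    # rows of the summary table above.
    def success_rate(distribution):
        return sum(distribution.get(grade, 0) for grade in ("A", "B", "C", "S"))

    dual_enroll = {"A": 46, "B": 33, "C": 14, "D": 6, "F": 1}
    in_person = {"A": 35, "B": 21, "C": 13, "D": 4, "F": 5, "R": 5,
                 "W": 5, "S": 10, "U": 2}

    print(success_rate(dual_enroll))  # 93, matching the reported %Suc
    print(success_rate(in_person))    # 79, matching the reported %Suc

If the assumption is wrong for a particular grade code, only the tuple of grades counted as successful needs to change.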
The next section will present a summary for this section of the report.
Section Summary
This section of the Institutional Effectiveness Report has presented evidence that the
College’s academic units are assessing student learning outcomes, analyzing data and using the
results to improve their programs. A sample of the SLOs from each program the College offers has
been presented in matrices that support this assertion. Academic units at the College have
recognized for some time the value of assessment measures tied to clear student learning
outcomes that are measurable, objective, and, when possible, use state and national standards.
The previous matrices of SLOs provide evidence that student learning outcomes for each program
are defined in terms that are measurable and that the assessments employed are directly tied to
student learning. The matrices further demonstrate that the programs have generally made
continuous improvement in their respective areas by using the results of these assessments. For
many programs across the institution, the gold standards by which they measure the performance
of their students are nationally-normed and objectively-administered examinations. The Nursing
Program employs the NCLEX-RN in addition to the core competencies to determine its
effectiveness in teaching and student learning outcomes.
However, nationally-validated examinations are not readily available or appropriate for
the goals of all programs. Therefore, some programs continue to use course data, student
satisfaction surveys, and other evaluation measurements as secondary or tertiary means of
evaluating learning. In spite of these limitations, all programs have moved toward an outcomes-based approach for primary evaluations. Those employing in-course assessments are providing
consistency and objectivity in these measures by using such practices as committee-developed
scoring rubrics, third-party reviews, and the development of student portfolios. Various programs
use the portfolio model and juried evaluations of student works. All programs use multiple
measures to assist their continuous improvement efforts.
The evidence that program improvements are effective can be found in student and
graduate satisfaction surveys, and nationally validated exams (where available), including the
CCSSE, VCTC Standards, Career Readiness Certificates, retention rates, graduation rates, SLOs,
and others. To heighten student performance, in 2007, the College adopted a capstone course,
PHI 115: Practical Reasoning, which is a requirement for all degree students. This course focuses
on required general education and core competencies. Initial introduction of core competencies
is the responsibility of each faculty member within the individual course curriculum. Faculty
members attach core competency addendums to their course outlines to show how the
competencies are addressed within each course. This ensures the integration of Core
Competencies throughout the curriculum of all degree programs. No matter how (face-to-face or distance learning online) or where (on-campus or off-campus) classes are taught, all faculty must meet the same credential standards and use the common syllabus and textbook. Online faculty must also complete online certification courses before they can teach online. Success rates are
monitored and various strategies are implemented to increase student success and commonality.
Section II addresses the College's Administrative Support Units, followed by the Educational Support Units in Section III.
Conclusion
The Institutional Effectiveness Report (1) summarizes the institution’s assessment of the
outcomes (including student learning outcomes) of its educational programs (Section I). The
report focuses on the extent to which the intended outcomes have been met and provides evidence
of institutional or program improvement based on an analysis of the results of the assessments,
and actions taken; and (2) summarizes the institution’s assessment of the outcomes of its
administrative support (Section II) and educational support programs (Section III). The report
focuses on the extent to which the intended outcomes have been met and provides evidence of
institutional or program improvement, based on an analysis of the results of the assessments and
actions taken.
The Institutional Effectiveness Report provides many examples that reflect the culture of assessment present at the College and also provides evidence of its continued
commitment to both student learning outcomes and the assessment and analysis of student
learning. More importantly, the report shows the College’s commitment to the use of data
analysis for action steps to improve the institution as a whole, thereby enhancing its
effectiveness.
The backmap of institutional goals shows how the college’s educational programs,
educational support services, and administrative services are aligned with its institutional goals.
Backmap to Institutional Goals to Achieve Its Mission

The backmap matrix aligns each of the following units with the institutional goals (Goal 1 through Goal 9) it supports:

Administration of Justice AAS
Administrative & Financial Services
Administrative Support Tech. AAS
Admissions and Records Office
Assessment and Institutional Research
Bookstore Office
Buildings and Grounds
Business Administration AA&S
Business Office
Computing Services
Counseling, Advising, & Recruitment
Distance Education
Division of Workforce Development
Dual Enrollment Program
Early Childhood Development AAS
Education AA&S
Financial Aid Office
General Studies AA&S
Human Resources and Payroll
Industrial Technology AAS
Institutional Advancement Dept.
Learning Assistance & Testing Center
Learning Resources Center
Management AAS
Nursing AAS
Purchasing and Procurement
Safety & Building & Grounds
Safety and Security
Student Activities
Student Support Services Program
Science AA&S
To achieve the college's mission of providing diverse learning opportunities to enhance the quality of life for students and the community, the college has the following goals:
Goal 1: The college provides access to higher education for students and promotes their success and goal attainment.
Goal 2: The college provides curricula in university parallel programs that facilitate transfer to senior institutions.
Goal 3: The college provides career and technical programs that are responsive to the needs of students and employers.
Goal 4: The college provides a developmental studies program to help students meet college-level learning expectations.
Goal 5: The college provides workforce training, services, and lifelong learning opportunities.
Goal 6: The college provides skills and values students need to function effectively in their world.
Goal 7: The college provides support for partnerships for the development, growth, and renewal of the service region.
Goal 8: The college provides adequate personnel, financial resources, facilities, and technology to support its programs and services.
Goal 9: The college provides emergency preparedness planning, training, and promotion.
As stated in the opening of this report, the College is committed to enhancing its
academic excellence. The College’s commitment to excellence begins with its academic
programs. The College’s systems for assuring excellence in student learning continue to evolve
and mature. The College has committed significant resources to this end and will continue to do
so.
This document shows evidence that the College is using assessment results to improve
the overall quality and effectiveness of the institution. In addition, the report provides evidence
that positive progress has been realized. Students clearly recognize the value in all of the College's operations. The College believes that its graduates say it best, as evidenced
by the following comments taken from our graduate surveys:
GRADUATE STUDENT SURVEY RESPONSES
PAUL D. CAMP COMMUNITY COLLEGE IS LIKE….
“…the Harvard of Hampton Roads,”
“…a first date, you don’t know the person but later you begin to fall in love,”
“…a college with a desire to serve and better their community,”
“…home away from home. Like one big family,”
“…a shot of vitamin B for the brain,”
“…a stairway leading to a better future.”
“…getting a quality education right at your back door.”
“…the light that leads the way to your dreams.”
“…an old Chevy truck. It’s not too complicated, very reliable, and is a valuable
tool that prepares you for work.”
“…your first baby steps. You may start off a bit wobbly, but with much practice
and hard work, you are able to stand tall and walk independently.”
“…a mountain. The climb may be long and difficult at times, but it is all worth it
when you reach the top.”
“…the missing piece to the puzzle I have struggled to complete.”
“…Paul D. Camp Community College is like coke. It is the real thing.”
The assessment audit of all academic, educational support, and administrative support units further demonstrates that PDCCC's assessment is comprehensive.
Paul D. Camp Community College Assessment Audit
Master List of All Assessment Units (in Alphabetical Order)

Codes for Instructional Programs: 1. Curricular Change; 2. Course Revision; 3. Pedagogy; 4. Assessment Methodology; 5. Assessment Criteria; 6. Process Revision; 7. Budget; 8. Development/Training; 9. Enhanced Recruitment; 10. Other; 11. Enhanced Technology
Codes for Non-Instructional Programs: A. Revised Service; B. Revised Process; C. New Policy; D. New Process; E. Informed Budget; F. Assessment Method; G. Assessment Criteria; H. Consultant/Contractor; I. Create/Modify Instruction; J. Development/Training; K. Enhance Technology Initiative; L. Enhance Communication; M. Other

Unit | Unit Type | 2009-10 Report | 2011-12 Report (Result Codes) | 2012-13 Report (Result Codes)
Administration of Justice AAS | Academic Program | Yes | Yes (9, 2) | Yes (2, 4, 9)
Administrative Support Tech. AAS | Academic Program | Yes | Yes (9, 2) | Yes (2, 3, 7, 11)
Admissions and Records Office | Educational Support | Yes | Yes (J) | Yes (B, D, H, K)
Assessment & Institutional Research | Administrative | Yes | Yes (K, J, L) | Yes (B, E, K)
Bookstore Office | Administrative | Yes | Yes (A) | NA. Explanatory note: In 2012-13, a third party, Barnes and Noble, began operating the Bookstore for Paul D. Camp Community College.
Buildings and Grounds | Administrative | Yes | Yes (E) | Yes (K, C, D)
Business Administration AA&S | Academic Program | Yes | Yes (9, 3) | Yes (2)
Business Office | Administrative | Yes | Yes (A, K) | Yes (A)
Computing Services | Administrative | Yes | Yes (K, J) | Yes (K)
Counseling, Advising, & Recruitment | Educational Support | Yes | Yes (B, J, L) | Yes (A, B, L)
Distance Education | Educational Support | Yes | Yes (A, D, J) | Yes (B, D, F, J)
Division of Workforce Development | Administrative | Yes | Yes (A) | Yes (A)
Dual Enrollment Program | Educational Support | Yes | Yes (D) | Yes (A, B, C, D)
Early Childhood Development AAS | Academic Program | Yes | Yes (5, 2, 9) | Yes (2, 9)
Education AA&S | Academic Program | Yes | Yes (9, 2, 3) | Yes (2)
Financial Aid Office | Administrative | Yes | Yes (B) | Yes (B)
General Studies AA&S | Academic Program | Yes | Yes (9, 2) | Yes (2)
Human Resources and Payroll | Administrative | Yes | Yes (E, B) | Yes (A)
Industrial Technology AAS | Academic Program | Yes | Yes (2) | Yes (2, 3, 9)
Institutional Advancement Dept. | Administrative | Yes | Yes (B, E, J, L) | Yes (L)
Learning Assistance & Testing Center | Educational Support | Yes | Yes (A, B) | Yes (B, C, D, K)
Learning Resources Center | Educational Support | Yes | Yes (A, L) | Yes (C, E)
Management AAS | Academic Program | Yes | Yes (3, 9) | Yes (2, 3, 5, 9)
Nursing AAS | Academic Program | Yes | Yes (2, 3) | Yes (2, 6)
Purchasing and Procurement | Administrative | Yes | Yes (B) | Yes (C, E)
Safety and Security | Administrative | Yes | Yes (A, K) | Yes (A, C, H, J)
Student Activities | Educational Support | Yes | Yes (L) | Yes (J, L)
Student Support Services Program | Educational Support | Yes | Yes (L) | Yes (K, L)
Science AA&S | Academic Program | Yes | Yes (9, 3) | Yes (2)
Note: The results of this inventory demonstrate that the assessment is comprehensive. All academic programs, educational support units, and
administrative units are regularly participating in assessment.
Note: 2010-11 reports were missed due to revisions in the assessment process; however, 2010-11 data is incorporated into the 2011-12 unit reports.
Note: In 2011-12, a new taxonomic scheme was added to assessment which provided codes for better categorizing use of results in unit
improvement/change.