Table of Contents
Chapter 1: Introduction
Chapter 2: Historical Perspective of Assessment
Chapter 3: Assessment of Student Academic Achievement
Chapter 4: Assessment Implementation
Chapter 5: Summary of Responses to HLC Visiting Team Concerns
Appendix A: List of Attachments Included in Report
    Attachment 4.1: Master Course Guide Samples
    Attachment 4.2: MATH 204 Assessment Report
    Attachment 4.3: Nursing Program Assessment Report
    Attachment 4.4: Sample Program Outcomes Matrix
    Attachment 4.5: Assessment Peer Review Rubric
Appendix B: List of References Cited in Focus Visit Report
Appendix C: Assessment Web Page Contents
CHAPTER 1
INTRODUCTION
Purpose of the Report and Focus Visit
West Virginia Northern Community College underwent a comprehensive evaluation visit
from a Higher Learning Commission Team in March 2003. While the report was generally
favorable with a recommendation for continued accreditation for ten years, the Team called for a
Focus Visit in 2007-2008 on Assessment of Student Academic Achievement. This report
describes Northern’s assessment history and implementation activities and addresses the
concerns identified by the 2003 Visiting Team. While the primary purpose of this report is to
document for the HLC the progress Northern has made in assessment of student learning, the
report and the Focus Visit also serve an important role in communicating to the College
community.
Process for Preparation of the Report
The Assessment Committee at West Virginia Northern has oversight responsibility for
assessment at the College. The Committee is composed primarily of faculty, with a faculty
member from every division serving on the Committee. In preparation for the report and the
visit, the Committee decided to expand its membership by adding an additional faculty member
from each division. The purpose of this expansion was two-fold. One was to broaden
participation in preparation and review of the report. The other was to use the focus visit as a
transition to new committee leadership. The original faculty representatives have been serving
on the Committee for three years or more. The Committee felt that it would be appropriate for
new faculty to join the Committee following the Focus Visit. Adding faculty for preparation of
the report is a strategy to ensure continuity when the membership changes.
The Assessment Committee decided to organize the report around the key elements of the
“Commission Statement on Assessment of Student Learning.” Committee members contributed
examples of evidence demonstrating progress on each of the elements. The Assessment
Coordinator compiled the information and wrote the report, which was shared with the
Committee for comment and revision. The final draft was then shared with the entire faculty and
the College Cabinet for comment.
Organization of the Report
This report demonstrates the progress that Northern has made in assessment of student
learning. The first chapter is the introduction which provides information about the purpose,
preparation, and organization of the report. Additionally, the introduction describes Northern’s
accreditation history and significant changes at the College since the last visit. The second
chapter describes the assessment history. Chapter 3 explains the assessment processes at
Northern, and Chapter 4 details implementation of the assessment plan. Chapter 5 evaluates
progress in assessment of student learning since the last visit, lists strengths and challenges, and
describes future actions planned to strengthen both assessment and institutional effectiveness.
The report contains a list of Appendices and Attachments. Attachments have been
provided for the convenience of the reader as most materials are also accessible via the
Assessment Web Page. Appendix A contains the attachments referenced in this report.
Documents included in the submitted report are annotated as attachments. Attachments are
numbered according to chapter and order of appearance within the chapter (e.g., ATT 2.1,
ATT 3.1). Appendix B contains the reference listing of documentation for each chapter in this
report. Referenced documents in each chapter are identified by file location in the Resource
Room. References are numbered according to chapter and order of appearance in the chapter
(e.g., REF 2.1, REF 2.2). Appendix C contains a list of materials available on the
Assessment Web Page.
History of Accreditation at West Virginia Northern Community College
West Virginia Northern Community College has been accredited since NCA transferred
the accreditation formerly accorded the Weirton and Wheeling campuses of West Liberty State
College to West Virginia Northern effective July 27, 1972. The new College conducted its first
self-study in 1973-1974 and the transfer of accreditation was affirmed in 1975 for five years.
The College was awarded five years of continued accreditation following an evaluation team
visit in the spring of 1980. As a result of the next comprehensive self-study, the College was
awarded continued accreditation for seven years, with a scheduled visit in 1992-1993. The
1992-1993 evaluation visit resulted in continued accreditation for ten years with a focus visit
scheduled for 1996-1997 on the topics of finance, communication, and assessment. The NCA
report from the April 1997 focus visit concluded “...that the three focus issues have
been addressed successfully.”
In March 2003 the HLC Team conducted a comprehensive evaluation visit. The Team
recommended continued accreditation for ten years and the Commission accepted the Team
recommendation with the next comprehensive visit set for 2012-2013. The Team also
recommended a Focus Visit on assessment of student academic achievement in 2007-2008. In
their report the Team stated:
“Although it is clear to the team that the senior executive officers of WVNCC provide
leadership and support for assessment, the team found little evidence that the institution
has moved beyond planning for assessment. It is not clear that an assessment program
with structured processes that are continuous and provide meaningful and useful
information to the planning processes is in place, is owned by the faculty, and is being
used to make decisions to improve instruction.
During the focused visit West Virginia Northern Community College must demonstrate
the following:
• Consistent assessment of student learning outcomes across the College, regardless of location or modality.
• Implementation of assessment across the various levels, including courses, general education, degree programs/certificates, and institutional, and establish measures, indicators and specific benchmarks for acceptable levels of performance.
• Demonstrate that data collected is analyzed and used to improve subsequent instruction.
• Demonstrate that the analysis of assessment results is integrated with planning processes and is communicated to students, faculty and administration.”
Significant Changes Since the Last Visit
All institutions experience change but Northern has experienced an uncommon amount of
change since the HLC visit in the spring of 2003. While there have been changes in all aspects
of the College, the changes in leadership and in the administrative structure have had the greatest
impact upon the assessment efforts of the College. At the time of the visit in 2003, Dr. John
Hunter was the President, Garnet Persinger was in her fourth year as the Chief Academic
Officer, and the academic area was organized into three academic centers with a chairperson for
each. Ms. Persinger retired following that semester, and in the five years since, four different
individuals have served as Chief Academic Officer. In the spring of
2005, Dr. Hunter announced his resignation as President and Dr. Martin Olshinsky was named
President in August of that year. There were also numerous changes at the division/department
level during this period of time. A summary of the administrative changes follows:
• 2002-2003: G. Persinger was Provost (chief academic officer); Dr. J. Hunter was President; academic area organized into 3 centers with division (center) chairs.
• 2003-2004: J. Daley was Dean of Instruction (CAO); Dr. J. Hunter was President; academic area organized into 3 centers with division (center) chairs.
• 2004-2005: Fall semester had an interim arrangement for chief academic officer. B. Good named Dean of Instruction in January; Dr. J. Hunter was President but announced his resignation in January, effective at the end of the year; academic centers eliminated with two associate deans named to coordinate faculty and work with program coordinators.
• 2005-2006: B. Good was Dean of Instruction; Dr. M. Olshinsky was President; academic division structure was re-established in the spring semester with 4 divisions, each having a division chairperson.
• 2006-2007: Michael Koon was interim Vice President of Academic Affairs; Dr. M. Olshinsky was President; academic area organized into 4 divisions with division chairs.
In addition to changes in College academic leadership and organizational structure, there
have been significant changes in the institutional effectiveness function during the past five years.
The Office of Institutional Research was just being developed under the direction of Michael Smith
at the time of the 2003 HLC visit. Since then the College has used Title III funds to establish an
effective institutional research office, now called the Office of Institutional Research and
Information Systems (IR/IS), with Mr. Smith as the Director. The office has three staff members
in addition to Mr. Smith. The placement of the IR/IS Office within the organizational structure has
changed several times, with the director reporting first to the Dean of Institutional Effectiveness
(a position that has since been eliminated), then to the Dean of Computer Information and
Communications Technology, and currently to the Vice President of Finance and Administration.
There have also been significant changes in leadership for assessment at the College.
While the Assessment Committee has always had oversight responsibility for assessment, the
College has used several approaches for coordinating assessment. In 2004 the position of Dean
of Institutional Effectiveness was created, with direction of assessment as a prime responsibility.
When that position was eliminated, a faculty member was recruited as Assessment Coordinator
and granted release time to perform the duties. In the summer of 2007 the College hired Sherry
Becker-Gorby on a part-time basis to serve as Assessment Coordinator. Ms. Becker-Gorby is a
former Associate Dean of Instruction who coordinated assessment efforts at the College until
2001.
In addition to personnel and organizational changes, there have been two significant
changes in facilities. The College purchased and renovated a former warehouse building in
Wheeling, located adjacent to the B&O Building (the main College building). All programs
formerly housed at the Hazel Atlas Building, except the Refrigeration, Air Conditioning, and
Heating program, have been moved to the 80,000-square-foot Education Center.
With the addition of this facility and the concurrent development of the plaza behind the B&O
Building, the College now has a true campus setting in Wheeling. The other significant change
is the conversion of classrooms on all three campuses to technology-enhanced classrooms.
Using Title III funds and College capital funds, almost every classroom in the College now has
an overhead projector and a podium equipped with a computer connected to the internet and
DVD and VCR players.
CHAPTER 2
HISTORICAL PERSPECTIVE OF ASSESSMENT AT WVNCC
Chapter 2 provides the history of assessment at Northern with particular emphasis on
assessment developments since the HLC team visit in the spring of 2003. Throughout its
assessment history, Northern has developed, implemented and revised its assessment plan
several times only to experience difficulty with sustainability of defined processes. In retrospect,
many issues related to implementation and efficacy of the previous assessment plans can be
linked to data access, changes in academic leadership, institutional reorganization and degree of
faculty ownership. Since the 2003 HLC visit, the Assessment Committee, faculty, and
administration have taken steps to reinvigorate assessment and to ensure sustainable assessment
practices that support student learning and academic achievement.
Assessment at Northern had its unofficial beginnings in 1992 with the institutional self
study in preparation for a re-accreditation visit by NCA. At that time the College committed to
developing an assessment plan as part of the re-accreditation process. Like many institutions in
the early stages of assessment, Northern and its faculty grappled with terminology and
philosophy and attempted to develop an assessment plan to measure student learning which
would also satisfy state and NCA requirements. Several of the faculty and the academic vice
president attended a workshop by Patricia Cross in 1991. Given that orientation, much of the
faculty’s assessment effort at that time centered on Cross and Angelo’s classroom
assessment techniques. From 1988 through 1991, the faculty spent considerable time debating,
developing, and implementing a program for College entry-level assessment and placement. As
a result of the self study preparation for the 1992-1993 visit and a deeper understanding of
assessment by the faculty and administration, a core group was formed to coordinate
development of the assessment plan. Integral to the development of the plan was the philosophy
that assessment must be linked to the College’s mission. This commitment to mission driven
processes has remained consistent throughout all revisions of the assessment plan. Assessment
data and information gathered through focus group meetings held during the fall of 1992 and the
1993 NCA team visit provided valuable guidance regarding assessment of student academic
achievement. Not surprisingly to the institution, the NCA Team report indicated that minimal
progress had been made in the area of assessment and included this concern as an area to be
addressed in a focus visit scheduled for 1996-1997.
In the fall of 1993, a committee comprised mainly of faculty was formed to develop an
assessment plan. After a review of various assessment models and spirited debate within the
committee, the Assessment Committee recommended that the institution adopt the James O.
Nichols model, beginning with micro-assessment at the course level and progressing to a macro
approach as the assessment initiative matured. The assessment plan recommended by the
Committee was adopted by the institution, submitted to NCA, reviewed as part of the focus visit
in 1997, and subsequently approved by NCA. The visiting team report from the 1997 focus visit
concluded that “it has been determined that the concern for assessment has been addressed, since
a plan is in place and is being implemented” (Report of a Focus Visit, April 15-16, 1997, p. 10).
The Assessment Committee continued to evaluate and refine the assessment program and
faculty implemented the plan by attending professional development activities, establishing
outcomes for individual courses and programs, establishing cycles for assessing courses, and
collecting and analyzing the data. The Assessment Committee reviewed the data and made
recommendations to programs and academic divisions regarding refinements in micro-level
assessment activities and changes essential to move the College to a macro-level approach. The
Associate Dean for Academic Affairs facilitated much of the data compilation and provided
technical assistance to the program faculty for both assessment and program review. By the
spring of 2000, outcomes had been submitted for 294 of 300 possible courses (98%).
Assessment reports had been received for 166 courses by the spring of 2001. Sixty-five of the
courses without assessment reports had not been offered since the outcomes had been defined or
still were within the 2-year review cycle. Therefore, assessment reports had been received for
166 of the 235 possible courses (71%).
A significant impediment to complete implementation was the difficulty of acquiring data.
The College did not have a functional institutional research office. Prior to 2001, institutional
research was a shared function between the computer center and other departments within the
College that required data for reporting and operational purposes. Recognizing the need to
improve the institutional research area, the College included the development of an institutional
research office in a Title III grant, which was funded in 2001. The College established a full-time
position of Director of Institutional Research in summer 2001 and used the Title III grant to
help fund the position, to establish the IR office, and to establish
systems to enhance assessment of student learning and evaluation of institutional effectiveness.
This office is still operational today under the title of Office of Institutional Research and
Information Systems.
Several organizational changes occurred in the fall of 2001 which impacted
implementation of the assessment plan. The position of the Associate Dean of Academic Affairs
was eliminated and responsibility for facilitating data collection was assigned to the Director of
Institutional Research. In addition, the Assessment Committee was changed from a standing
committee to a sub-committee of the Academic Affairs Committee with the intended goal of
more effectively linking assessment and curriculum development.
By 2001 the College was in the midst of another self study process in preparation for a
NCA re-accreditation visit in 2002-2003. Through the self study process it became obvious to
faculty and the administration that assessment implementation was not proceeding as had been
anticipated. The Academic Affairs Committee determined that curriculum development and
assessment each required significantly more time and focus than one committee could effectively
address. Hence, the Academic Affairs Committee recommended that assessment once again be
the responsibility of a separate and focused assessment committee. Faculty in some divisions
and programs were continuing assessment efforts but implementation was sporadic. As a result,
progression to macro-level assessment was minimal. Further indication that the assessment
program was lagging came from the Higher Education Policy Commission (HEPC) review of
student academic achievement at all State colleges and universities conducted by the National
Center for Higher Education Management Systems (NCHEMS). The NCHEMS report indicated
that many West Virginia institutions were struggling with assessment and expressed concern that
Northern was making little progress in implementing its assessment plan.
In light of the findings of the self study committee, the Academic Affairs Committee and
the NCHEMS review, the College began the process to revitalize the assessment program. A
revised assessment plan, which built upon the previous efforts but more aggressively moved
toward macro-level assessment, was presented to the faculty in August 2002. The College
community moved to implement the revised plan. A new assessment committee called the
College Leadership Team for Assessment (CLTA) was established. The Faculty Senate also
established a General Education Committee. A new appointment to the position of Director of
Institutional Research was made. A key element in the revised assessment plan was the
requirement of Master Course Guides (MCGs) for all courses taught by the College. The
MCGs were established to improve many aspects of instruction and assessment by standardizing
the process and defining expectations for students. A format was developed for the MCG’s and
workshops were conducted for both full-time and part-time faculty to explain the process and
assure effective implementation. In order to strengthen the commitment to student learning and
ensure consistency across the curriculum, a process and standard for establishing course learning
outcomes was developed and presented to the faculty. The MCG serves as the official
institutional document by which faculty delineate the purpose, structure and course learning
outcomes of courses offered by Northern. Faculty are required to adhere to the course
requirements published in the MCG thus assuring consistency of learning outcomes across
multiple course sections. To date, 86% (253) of the courses listed in the College Catalog have
MCGs. MCGs are posted to the Assessment web link for ready access by faculty, students and
other constituents.
The 2002-2003 self study found that progress had been made in many areas but there
were numerous deficiencies. Problem areas were that the College had not moved to macro-level
assessment as planned, feedback loops were not well developed, documentation and review of
assessment activities was not systematic, general education was not assessed as a program area,
and there was little connection between assessment processes and budget development. The
HLC/NCA Team report verified many of the College’s findings. A focus visit on assessment
was scheduled for 2007-2008.
Organizational changes in 2003 and 2004 blunted some of the momentum gained during
the self study process. In two academic administration re-organizations, academic centers were
eliminated and replaced by departments, and then two associate dean positions were established.
Forms and processes referring to academic centers and divisions became obsolete, creating confusion as to
where assessment reports were to be submitted and who was coordinating the implementation.
The position of Dean of Institutional Effectiveness was created for the 2003 academic year with
coordination of assessment as a major responsibility. However, the search became prolonged
and the Dean was not hired until spring 2004. Shortly after that the Dean of Academic Affairs
office was re-organized and the Dean position was vacant from July 2004 until January 2005.
The Assessment Committee was re-constituted for the 2004-2005 academic year and
began working with faculty to revitalize assessment. The Committee began making reports at
Faculty Senate meetings and gathering input from faculty. In the spring 2005 term, the newly
appointed Vice President for Academic Affairs made a commitment to revitalize the assessment
initiative. As an initial step, an assessment audit was conducted to discern the status of
assessment practices at that time. Faculty were also asked to participate in an assessment
activity. New forms for reporting assessment activities were prepared and shared with faculty.
As a result of the revitalization efforts, all full-time faculty indicated participation in an
assessment activity; best practices in assessment were identified and shared with faculty during a
professional development session.
In 2005, the Assessment Committee once again undertook a major review and
comprehensive revision of the assessment plan, but this time with emphasis on the student as a
developing learner, faculty ownership, feasibility and sustainability. The Assessment Committee
committed to bimonthly sessions and the active engagement of the faculty in developing the
currently approved plan to assure long term commitment and ownership of the process. The
administration provided central leadership for the Committee to assure access to institutional
resources and sufficient support to lead the assessment charge to fruition. In light of the
administrative presence and role on the Assessment Committee, it is important to note that
faculty clearly led the charge for assessment redefinition and implementation. To further assure
faculty engagement, an Assessment Coordinator was appointed from faculty ranks to serve as a
liaison between the faculty, assessment committee and administration. Additionally, the
“assessment committee report” was added as a standing agenda item at Faculty Senate meetings
[REF 2.1]. This reporting function provided an additional avenue for access to assessment
information, thereby bolstering the standard committee reporting processes and division meeting
reports. After much debate and multiple levels of review, the current assessment plan was
approved by the Faculty Senate in October 2005 and the Board of Governors in March 2006. A
new reporting time line and report format were put into practice. The Master Course Guide was
also revised as a result of the new assessment initiative. The revised MCG now included both
student learning outcomes and student learning performance objectives. A faculty development
activity was provided to assure that faculty understood the changes. Faculty have ready access to
preparation guidelines and forms to prepare Master Course Guides [REF 2.2].
Although course assessment activities had been ongoing through the development of the
2006 assessment plan, renewed emphasis was placed on course level assessment in the 2006-2007
academic year. Course assessment activities were developed with a clear focus on student
learning, effective teaching and faculty discourse. An annual academic assessment and reporting
cycle (See Assessment Plan, page 17) was defined and an assessment proposal process was
incorporated in the assessment plan as an added measure to assure continuity and longevity of
the assessment program [REF 2.3]. By the close of the spring 2007 term, over one hundred
course level assessment reports had been completed and submitted to the Assessment
Committee. In addition to course level assessment activities, faculty continued with the
definition of course learning outcomes by completing Master Course Guides for courses taught
primarily by full-time faculty.
Program level activities were also undertaken in all divisions. Program assessment
activities paralleled the program review requirements of the WV Council on
Community and Technical College Education (WVCCTCE) and external accrediting agencies.
The recent revision of the assessment plan expanded on this framework by adding an annual
institutional review. This process is described more completely in Chapter 3 of this report.
Sound assessment processes are only one mechanism to assure efficacy and continuity of
assessment processes. In spring 2007, the Assessment Committee membership was increased by
including an additional faculty member from each division and employing an Assessment Coordinator.
The increase in membership is intended to bolster momentum and assure continuity as senior
members of the assessment committee begin to shift to other institutional committees.
Expansion of the committee membership also increases the opportunity for faculty input and
perspective on assessment across divisions.
In summer 2007, the assessment committee focused attention on discerning the quality of
assessment practices and strengthening the feedback loop by implementing a peer review process
for course level assessment. Specifically, the purposes of the peer review process were to:
• Provide evidence that students are achieving stated student learning outcomes
• Provide a snapshot of progress in implementing course assessment
• Identify best practices at Northern
• Identify faculty development needs
• Close the feedback loop
• Provide a mechanism to assure continuous improvement of the assessment process
A pilot project was undertaken with the Assessment Committee serving as the peer
reviewers. Results of this process are detailed in Chapter 4 of this report.
Interwoven throughout this history is an ongoing support for faculty development.
Northern has historically supported efforts to improve institutional learning regarding student
learning, assessment and accreditation processes. Teams comprised of faculty, administration
and staff participate in the HLC/NCA Annual Conferences. The College has continuously
supported faculty attendance at the Annual Conference since 2003. Results of these efforts are
evidenced in practices currently in operation at the College. The recently piloted peer review
process is one example of such practices. In addition to participation in national conferences, the
College has also committed to shared learning opportunities among faculty at the College. In
2006 multiple professional development opportunities were provided for faculty both regionally
and on campus. These included sessions on rubric development, the Alverno ability-based
education model, and embedded assessment. Each term, a portion of the development activities
held prior to the start of classes is dedicated to assessment. The College also contracted the services
of an assessment consultant to foster development of assessment within the departments, review
assessment practices and assist faculty with the development or revision of assessment plans. In
May 2006, a mini-conference was offered providing information about faculty access to the
assessment web page, instructions for the assessment reporting cycle and round table sessions
addressing best practices at WVNCC.
At the same time Northern was moving to strengthen assessment of student learning, it
was also implementing processes to improve institutional research and evaluation of institutional
effectiveness. The College used a Title III grant to help establish the Office of Institutional
Research and Information Systems and to purchase hardware and software for data collection
and analysis. Processes were implemented to systemize reporting. In the fall of 2006 an
institutional strategic plan was developed with measurable objectives for administrative areas.
Additionally, an Institutional Effectiveness Team was established to guide the institutional
effectiveness process and to integrate it with the assessment of student learning.
In 2004, the faculty undertook a review and revision of the general education core
learning outcomes. General education outcomes were revised and course mapping was
conducted. Under the revised assessment plan, general education assessment became the
responsibility of the General Education Committee. The assessment of general education is
addressed more fully in Chapters 3 and 4.
Northern has made a long term commitment to assessment of student learning. Despite
an inconsistent assessment history, assessment practices have evolved into an effective,
sustainable process. The College has learned from its history and has made significant progress
toward developing a culture of assessment. Plans for 2007 and beyond include:
• continuing the established course assessment cycle
• encouraging collaboration among faculty in development of assessment projects
• incorporating course level assessment activities into program level assessment
• offering faculty development opportunities to increase proficiency
• continuing support for faculty-led discussions on general education, curriculum development and student academic achievement
• closing the feedback loop and increasing communication between the assessment committee and faculty
CHAPTER 3
ASSESSMENT OF STUDENT ACADEMIC ACHIEVEMENT AT WVNCC
The purpose of this chapter is to provide an overview of assessment of student academic
achievement at Northern with particular emphasis on implementation and philosophical changes
occurring since the 2003 self study and HLC team visit. The goals of Northern’s assessment
initiative are to enhance student learning, and to improve instruction and curriculum [REF 3.1].
The College’s mission and faculty’s focus on the “student as a developing learner” serve as the
conceptual framework for all implementation processes defined in the assessment plan. As
Northern’s faculty again grappled with its assessment challenges, several issues clearly emerged:
sustainability, faculty ownership and focus on learning. It was important to faculty to implement
a plan that was clearly more than a compliance document and to create an assessment process
sustainable beyond initial implementation. As a result, the institutional mission and focus on the
student became the guideposts for developing and implementing the currently approved
assessment plan. The Assessment Committee, comprised mainly of faculty, developed a revised
assessment plan. To assure faculty engagement and ownership, faculty were involved in all
proposed revisions of the assessment plan through faculty senate meetings, division meetings and
start of the semester activities. At all planning stages, faculty were actively encouraged to
participate in, comment on, and provide recommendations to the assessment plan while in
development. The approved academic assessment plan is based in the institution’s mission and
values as articulated in the Strategic Plan and is interwoven with existing policies and procedures.
Northern’s Strategic Goal 2 states the College will “Move assessment to the forefront of College
agenda... (by) defining and assessing student success” [REF 3.2]. Northern’s pledge to students
further emphasizes this commitment to “excellence in teaching and learning” [REF 3.3].
The “Student as a Developing Learner” is at the core of Northern’s assessment plan [REF
3.4]. This allows for informed discussions of educational processes, development of the
curriculum and student success. Northern’s assessment plan addresses three stages of
engagement while the student is affiliated with the College: “Students at Admission” (entry
assessment), “Students During Enrollment” (process), and “Students after Graduation” (long
term) assessment. All stages address student academic achievement, student success, course and
program effectiveness, and student satisfaction. Particular emphasis is placed upon course and
program assessment activities during the second stage, “Students During Enrollment”.
Students at Admission
The open door admission policy provides learning opportunities “for all who wish to
learn” while at the same time creating many challenges for faculty and staff. Focusing on the
“Student as a Developing Learner”, assessment practices first discern student capabilities upon
entrance to the College for appropriate advisement and course placement. Course level
assessment activities and tracking studies assure effectiveness of developmental programming
and entry level assessment practices. Students are tracked from their initial placement in
developmental courses through their performance in college level courses and programs as well as
retention through graduation. Results of tracking studies are shared among faculty teaching in the
developmental program. Results of developmental tracking studies are also reported in the
WVCCTCE Compact, a report addressing the College’s accomplishment of State goals and
standards as measures of institutional effectiveness. Course level assessment activities are used to
discern overall student success and need for curriculum changes. For example, the mathematics
faculty were concerned with the success and retention of students in the three-credit
developmental algebra course (MATH 96). Hence they committed to a course level assessment
project to determine if students would be better served by extending time on task by creating a
two semester course series. As a result of tracking studies and course assessment, the MATH 96
course was eliminated from the curriculum and replaced with a two semester course series
(MATH 92 and MATH 93). Assessment of this curriculum change is currently in process.
Students During Enrollment
What do students learn while they are at Northern? How well are they learning? How can
we improve student academic achievement? These are the questions faculty address in this stage
of assessment activities. Student success during enrollment and after graduation are dependent
upon what happens at this stage of student engagement with the College. To address these
questions, the assessment plan incorporates the following characteristics:
• Goals and objectives derived from the College mission, vision, and values
• Clear and explicit student learning outcomes
• Measurable student learning performance objectives and performance standards
• Cyclical assessment to determine the extent to which outcomes/objectives are met
• Use of multiple assessment methods to collect and analyze information determining the extent to which outcomes/objectives are achieved
• Use of data to improve academic functions such as curriculum development, strategic planning and budgeting, and primarily to improve instructional activities that impact student learning and student success
• Professional development activities to ensure responsible and effective application of assessment instruments
• Communication of data and impact on learner achievement resulting from the assessment initiative
Course and program level assessment activities are the core of assessment while students
are enrolled at the College. Clear statements of student learning outcomes are the foundation for
student academic achievement. The Master Course Guide (MCG) is the official document that
provides consistent delineation of student learning outcomes and student learning performance
objectives across class sections, teaching modalities and among instructors. All courses in the
College Catalog are required to have a MCG [REF 3.5]. MCGs have been completed for courses
taught by full-time faculty. Plans are underway to develop MCGs for courses taught only by
adjunct faculty. In addition to orienting faculty through professional development sessions, the
assessment web page contains links for Master Course Guide requirements, preparation
guidelines, and MCG template [REF 3.6].
Students After Graduation
A measure of student achievement is continued application of learning. Consequently,
assessment and institutional effectiveness activities extend beyond graduation to determine if
students apply their acquired learning in the workplace or in continued studies at another college.
The Career Services Office administers follow-up surveys to obtain employment and continuing
education information on graduates. Program directors also conduct employer surveys to discern
employer perception of program graduates and program learning outcomes. The IR/IS Office
tracks graduate performance (graduation rates, licensure statistics, and transfer data) and provides
data to relevant program faculty for inclusion in program assessment activities. Acquiring
transfer data from the State has been extremely difficult. Northern has taken a lead position
among the community colleges in the state in getting the transfer information distributed to the
colleges. In August 2007, the College was provided with the system file of transfer data. The
IR/IS Office is currently analyzing the data to prepare reports for the College community.
Model for Assessment of Student Learning
Northern’s model for assessment of student learning and evaluation of institutional
effectiveness provides a flexible framework for guiding the assessment initiative and allows for
variations in selection of assessment measures appropriate to the disciplines and programs.
Consistent with the plan’s guiding principles, the model is based on the premise that curriculum
decisions are best made by the collaborative efforts of the faculty. The Assessment Committee
and IR/IS Office provide technical assistance to help faculty choose assessment measures or
instruments. However, the faculty provide leadership for defining assessment activities and
collecting data that contributes to improvement of student learning and student academic
achievement (see Assessment Plan, page 15, for the Model of Assessment of Student Learning). In
accordance with the model [REF 3.7]:
• Faculty determine outcomes, assessment, and evaluation measures for the assessment cycle.
• Faculty determine performance indicators (standards, benchmarks) for comparison purposes. Note that some performance measures and indicators may be stipulated by external agencies.
• Faculty collect and summarize data.
• Faculty analyze the data and prepare an assessment report of the results.
• After instituting changes, faculty may conduct an additional assessment to determine the impact of the changes, with a follow-up report. Such assessment activities are separate from the course’s scheduled assessment.
• Communication and completing the feedback loop are at the center of effective assessment, as indicated in the model. This is viewed as an essential aspect of student learning and institutional effectiveness.
Course Assessment
Northern is committed to assessing and improving student learning throughout the
curriculum. The College implemented course level assessment processes to assure opportunities
for early intervention in the curriculum and to provide incremental review opportunities for
program level assessment. Course level assessment places emphasis on student learning
outcomes, clear statements of student performance objectives and opportunity to strengthen the
curriculum early in the assessment cycle.
In course assessment activities, faculty:
• Identify student learning outcomes for the assessment cycle
• Develop indicators and performance standards to assess accomplishment of learning outcomes
• Use data to revise instructional strategies and curriculum
• Share and discuss results (feedback loop) to improve student academic achievement
Course outcomes for selected courses are assessed on a scheduled, rotational basis;
faculty within each division determine the rotation schedule for course assessment
activities [REF 3.8]. Each course assessment project generally
targets one to three student learning outcomes for evaluation. Depending upon assessment
findings, some outcomes may be assessed over multiple years to validate the effectiveness of
changes in curriculum or course materials. Assessment instruments include the use of direct measures
such as embedded assessment instruments (e.g., exam questions, lab exercises), indirect
measures of student achievement (student perception surveys) or other instruments selected by the
faculty. When feasible, faculty collaborate to collect comparable data across multiple sections of
a course. Performance standards are set by the faculty as deemed appropriate for the discipline.
In the absence of historical data for setting benchmarks, faculty conduct assessment activities to
establish a baseline for future comparison. Assessment results may trigger further monitoring of a
learning outcome, evaluation of course materials supporting the learning outcome, revision of
course materials or further curriculum revisions. Course level assessment activities serve the dual
purpose of assuring learning at the course level, but also providing a mechanism for determining
overall course effectiveness in meeting program level learning outcomes.
Program Assessment
Northern evaluates student learning and achievement at the program level through a
cyclical, systematic process consisting of three components: program review, annual institutional
assessment review, and if applicable, external accreditation reviews. Program assessment
addresses the extent to which learners achieve the program outcomes as defined in student/college
materials. Initially, Northern’s program assessment paralleled the WVCCTCE policy for program
review [REF 3.9]. Although this met state requirements for program evaluation, faculty found
the process to be insufficient for meaningful assessment. The five-year time-line, although
cyclical, did not provide for effective monitoring and timely intervention. Hence the annual
institutional assessment projects were added to the program level assessment cycle.
Program level assessment demonstrates whether:
• students are acquiring the knowledge/skills necessary to achieve defined program outcomes
• programs meet the needs of area employers or sufficiently prepare students for studies at the baccalaureate level
• program outcomes are derived from and support the college mission, the general education philosophy and the program mission
• the curriculum is coherent and current
• instruction is effective for student success
• resources are sufficient for effective program delivery
All degree programs have stated program level learning outcomes. These outcomes are
published in the College catalog, program brochures and on the College website. Additionally, a
“Program Objectives Matrix” is completed for approved academic programs published in the
College Catalog. Using this matrix, course mapping is completed for each academic program
linking program learning outcomes with courses required in the curriculum [REF 3.10]. The
Program Objectives Matrix makes it possible to identify target areas for extensive review and
potential modification through the assessment process.
Through use of these three components of program assessment, each program is
extensively evaluated over an extended period of time incorporating multiple direct and indirect
assessment methods. Through the annual institutional review, program faculty select at least one
program outcome for an assessment review. This annual assessment review is comparable to the
course level assessment process, in which faculty select a student learning outcome, select
appropriate indicators, identify performance standards, and collect data to determine student
achievement of the intended learning outcome. The primary difference is the focus on program
level outcomes rather than course level outcomes. In addition to these direct measures of
achievement, program review builds in the opportunity for a long range, comprehensive review of
direct and indirect performance indicators. These include, but are not limited to, graduation rates,
retention rates, transfer rates, licensure/certification rates, and student and employer satisfaction.
The reporting format and guidelines have been recently revised for program review. The
Nursing program (spring 2007) served as a pilot for implementing the new reporting process. The
report parallels WVCCTCE program review guidelines incorporating the five year review cycle
and required report criteria. The report criteria include program viability, adequacy, necessity,
and consistency with mission. Each criterion has defined indicators and data elements, which
incorporate annual program assessment reports and BOG, WVCCTCE or external
accrediting agency reports [REF 3.11]. By implementing a five-year review cycle with an annual
report to the Vice President of Academic Affairs, programs are continually monitored to assist
with early problem identification and intervention strategies to rectify problems before they
become systemic and impact long term program success. The proposed format was presented to
and endorsed by division chairpersons and program directors in May 2007. It has been
incorporated into a new institutional rule (policy) on program review. The format was shared
with faculty during the fall 2007 development day. The proposed rule is currently posted for
public comment. It should be acted upon by the College Board of Governors at the November
2007 meeting.
Assessment of General Education Core
Assessment of the general education core parallels processes defined in the course and
program level assessment activities. A General Education Committee was established to
specifically focus on the philosophy, definition and statement of student learning outcomes for the
general education core curriculum. A program outcomes matrix was completed for the general
education core and distributed to the faculty. The goal of the matrix is to link the general
education outcomes with specific courses to assure inclusion of all general education outcomes
within the College programs. Courses designated as general education core courses are assessed
on a regular basis determined by the rotation schedule developed by the divisions. The General
Education Committee is charged with working with the faculty and the Assessment Committee to
determine other processes for assessing student academic achievement and effectiveness of the
general education core.
In addition to course assessment and program assessment activities, the WorkKeys testing
series is designated as a standard measure of assessment of the general education core in career
technical programs. The West Virginia Community and Technical College System has
established WorkKeys as a measure for Perkins III Core Indicators. Following the May 2006
professional development activity, the faculty suggested that Northern should explore using
WorkKeys to assess general education in the career technical programs. The tests used in the
assessment include Reading for Information, Applied Mathematics and Locating Information.
These tests are administered annually in the spring semester to career technical students in their
final semester. Minimum performance standards are defined in accordance with Standards and
Measures for Perkins III Core Indicators [REF 3.12]. Scores falling below the designated
standard indicate the need for a curriculum review in the deficient subjects to determine cause and
appropriate corrective action.
• Applied Mathematics Performance Standard: Seventy-seven percent of program completers shall score at or above the appropriate WorkKeys level for their given occupational area.
• Reading for Information Performance Standard: Eighty-eight percent of program completers shall score at or above the appropriate WorkKeys level for their given occupational area.
• Locating Information Performance Standard: Eighty-eight percent of program completers shall score at or above the appropriate WorkKeys level for their given occupational area.
Test results are provided to program faculty to discern the effectiveness of the general education
core in supporting development of the mathematical and reading competencies required in the
workplace.
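For illustration only, the short sketch below shows one way a pass-rate check against the Perkins III performance standards quoted above might be expressed; it is not part of the College's actual reporting tooling, and the input file layout and column names are assumptions introduced for the example.

```python
# Illustrative sketch only: checks WorkKeys pass rates against the Perkins III
# performance standards quoted above. The CSV file and its column names
# ('test', 'score_level', 'required_level') are hypothetical.
import csv

# Performance standards from the report: share of program completers who must
# score at or above the required WorkKeys level for their occupational area.
STANDARDS = {
    "Applied Mathematics": 0.77,
    "Reading for Information": 0.88,
    "Locating Information": 0.88,
}

def pass_rates(rows):
    """Return {test: (pass_rate, meets_standard)} for completer score records."""
    totals, passed = {}, {}
    for row in rows:
        test = row["test"]
        totals[test] = totals.get(test, 0) + 1
        if int(row["score_level"]) >= int(row["required_level"]):
            passed[test] = passed.get(test, 0) + 1
    results = {}
    for test, standard in STANDARDS.items():
        n = totals.get(test, 0)
        rate = passed.get(test, 0) / n if n else 0.0
        results[test] = (rate, rate >= standard)
    return results

if __name__ == "__main__":
    # Hypothetical extract of completer scores for the spring testing cycle.
    with open("workkeys_completers.csv", newline="") as f:
        for test, (rate, ok) in pass_rates(list(csv.DictReader(f))).items():
            flag = "meets standard" if ok else "below standard; review curriculum"
            print(f"{test}: {rate:.0%} ({flag})")
```

A rate below the designated standard would simply flag the subject for the curriculum review described above; the sketch does not replace the formal Perkins III reporting process.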
The College does not currently administer a standardized test of the general education
core. The Academic Profile was administered for two years as part of a HEPC/WVCCTCE
system initiative to assess the general education curriculum. Although standardized testing
provided a performance measure for comparing Northern student performance against a national
standard, results did not provide an effective measure or recommendation for strengthening the
general education curriculum at the College. The state initiative was abandoned and Northern
faculty opted to discontinue the project.
Assessment Time-lines and Sustainability
Although Northern’s assessment model provides for significant flexibility and
customization, a time-line and operations cycle were developed to facilitate implementation and
assure sustainability. The time-line had to be realistic, fit within existing organizational and state
system processes and assure continuity of assessment processes. Considerable attention was
given to these parameters. Regarding time-lines and sustainability, the faculty agreed that
prescriptive measures were necessary to ensure effective implementation [REF 3.13]. The
assessment cycle incorporates the planning, data collection, review and reporting processes across
the institution. The assessment cycle begins with the submission of an assessment proposal by
the faculty to their respective division chairs. The addition of the proposal process provides a
mechanism to assure assessment activities are being planned by all faculty and program
coordinators. The division chairs in turn provide a compilation of proposed assessment activities
to the Assessment Committee. The Committee reviews proposal summaries and provides feedback,
if needed, to the respective division chair. Target dates are established for assessment report
submissions in each term subsequent to the proposed assessment activity. The Assessment
Committee tracks submissions to assure implementation. In the summer of 2007, the Assessment
Committee piloted a peer review process as an additional mechanism to monitor assessment
activities, close the feedback loop and improve assessment practices.
Institutional Effectiveness
While the emphasis of the 2003 Team report was assessment of student learning, the Team
made it clear in their second point for the focus visit and in the advancement section that they also
had some concerns about evaluation of institutional effectiveness that needed to be addressed. Since the
visit, Northern has established an effective office of institutional research and has implemented
processes to measure institutional effectiveness.
The Institutional Research and Information Systems (IR/IS) Office is headed by a director
and includes a database administrator who serves as a programmer, a computer center director,
and an application systems analyst. Using funds from a Title III grant, hardware and software,
including the Brio/Hyperion package, were purchased to facilitate data collection, analysis and
storage. Using Brio/Hyperion, a data warehouse is being established to improve data analysis and
reporting. This software enables the office to create data reports; administrative users can then
drill into the data to conduct analyses critical to their particular areas. The data
warehouse has enabled the College to create reports on admissions, enrollments, student success,
and finances and to link them together. As a result of the implementation of the IR/IS Office, the
institution has been able to become more data-driven in its decision making. The development of
the IR/IS Office has resulted in more effective integration of planning in all areas and budget
development.
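As a rough, hypothetical illustration of the kind of linked reporting described above, the sketch below joins an assumed admissions extract to an assumed enrollment extract and summarizes a simple retention figure by program. The file names, column names, and retention definition are assumptions for the example only, not the College's actual Brio/Hyperion warehouse schema.

```python
# Hypothetical illustration of linked reporting: join an admissions extract to an
# enrollment extract and summarize one-term retention by program.
# All file and column names are assumed, not the actual warehouse schema.
import pandas as pd

admissions = pd.read_csv("admissions_extract.csv")   # student_id, program, admit_term
enrollment = pd.read_csv("enrollment_extract.csv")   # student_id, term, credits

# Link the two sources on the student identifier (the "linked reports" idea).
linked = admissions.merge(enrollment, on="student_id", how="left")

# Simplistic retention flag: did the student enroll in the term after admission?
# (Assumes numeric term codes where the following term is admit_term + 1.)
linked["retained"] = linked["term"] == linked["admit_term"] + 1

# Collapse to one flag per student, then summarize by program for drill-down review.
per_student = linked.groupby(["program", "student_id"])["retained"].any()
summary = per_student.groupby("program").agg(headcount="size", retention_rate="mean")
print(summary.sort_values("retention_rate", ascending=False))
```

In practice the warehouse reports would be built in the reporting tool itself; the point of the sketch is simply that once admissions, enrollment, success, and finance data share common keys, summary reports can be generated and then drilled into by program or area.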
Institutional effectiveness and assessment operate in tandem to generate a comprehensive
institutional view of student academic achievement and student success. While assessment
focuses specifically on student academic achievement, measures of institutional effectiveness
provide the needed data for a comprehensive picture of mission implementation. The model as
defined in the assessment plan [REF 3.14] demonstrates that all departments are to be engaged in
evaluation of students’ lived experience at the college as well as overall organizational health and
mission implementation. The IR/IS Office provides institutional support for data access and
compilation. Student satisfaction surveys, retention studies, tracking studies, transfer data and
graduate data are made available to support comprehensive assessment and evaluation initiatives
[REF 3.14].
HLC Focus Visit Concerns Addressed by the Assessment Plan
Northern’s current assessment plan effectively addresses the concerns noted in the HLC
2003 Visiting Team Report (each Visiting Team concern is quoted below, followed by Northern’s responses).
“Consistent assessment of student learning outcomes across the College, regardless of
location or modality.”
• Consistency of course learning outcomes provides a framework for effective assessment. Master Course Guides (MCGs) are required for all courses. The MCGs define course content, purpose, structure and learning outcomes.
• MCGs are provided to faculty teaching a course.
• MCGs are available to all faculty, students and other constituents via the assessment web site.
• Course assessment activities are structured to assess course learning outcomes across multiple sections to address student academic achievement; hence assessment activities are not restricted by delivery modality or campus location.
“Implementation of assessment across the various levels, including courses, general
education, degree programs/certificates, and institutional, and establish measures,
indicators and specific benchmarks for acceptable levels of performance.”
• The assessment plan defines processes for multiple levels of academic delivery.
• Course level assessment processes are defined in the assessment plan. Courses are assessed on a cyclical basis to assure continuous improvement opportunities.
• Program level assessment processes include a three-tier system: annual program assessment activities provide opportunity for early intervention; program review processes paralleling the WVCCTCE program review; and external accrediting agency processes, when applicable.
• Program level assessment is required for all programs, including associate degree and certificate programs.
• The reporting format requires inclusion of performance standards.
Demonstrate that data collected is analyzed and used to improve subsequent instruction.
 Assessment reporting requires “recommendations based on assessment results”.
Recommendations for curricular changes must include implementation date,
reassessment date and projected date of submission to other college committees
if required.
 Course level assessment reports are available to all faculty for use in tracking
studies, program assessment or other longitudinal assessment projects.
Demonstrate that the analysis of assessment results is integrated with planning processes
and is communicated to students, faculty and administration.
 An assessment web page is available on the College web site for access by
faculty, students, staff and other constituents.
 Assessment recommendations may generate major curricular changes. Such
changes are submitted to the Academic Affairs Committee with supporting
assessment documentation.
 A peer review process serves to strengthen the feedback loop.
 The annual academic assessment cycle assures opportunity to implement
recommendations within the college operating cycles for schedule development,
catalog revision, budget development, etc.
 Assessment is a standing agenda item for Faculty Assembly meetings.
 The Assessment/IE model provides for multiple communication points throughout
the assessment process.
 Assessment is a major goal in the Strategic Plan (Goal 2).
 Assessment is integrated throughout the strategic plan as specific objectives in
Goals 1 and 4 in addition to the focus placed in Goal 2.
 Assessment is a standing agenda item for the Board of Governors’ meetings,
appearing on the Board agenda at least quarterly.
Northern’s academic assessment initiative is the cornerstone in furthering the College’s
mission as a student-centered organization enhancing opportunities for student success. The core
of the plan focuses on improvement of student learning and excellence in teaching. Data is used
to support institutional decisions regarding instruction, curricula, and strategic planning. The
assessment initiative is an ongoing, shared and integrated process. Data alone cannot drive
change or improve student learning. Interpretation of data, application of the analysis and shared
dialog will strengthen Northern’s support of the “student as a developing learner” and student
success opportunities.
CHAPTER 4
ASSESSMENT IMPLEMENTATION
The purpose of this chapter is to provide a review of institutional support and
implementation of assessment specifically focusing on those activities since the approval of the
revised assessment plan. This chapter builds on information provided in the previous chapters
which enumerated the history of assessment and description of the approved assessment plan.
Although Northern has had uneven progress in its assessment initiative, the College has been able
to identify and address the elements that inhibited its efforts. This chapter presents evidence
showing that the assessment initiative is back on track, with the faculty recognizing their
responsibility for the integrity of the curriculum and delivery of instruction that supports the
institutional mission. It is widely accepted that the revised plan does provide a sustainable
mechanism to reach these ends.
Student Learning Outcomes
Northern recognizes the importance of consistent and clearly stated student learning
outcomes for its courses and programs. The College has traditionally published program level
student learning outcomes in the College catalog and on the College web site to assure that all
constituents have easy access to program level information. This pattern of published and easily
accessible information for program level learning outcomes has been adapted for course level
assessment. The Master Course Guide (MCG) is the instrument used to communicate course
information to faculty, students and other constituents. As noted in Chapter 2, Northern adopted
the IPSI format for development of course outcomes. This process required listing course level
learning objectives for all intended course activities. As a result, initial MCGs listed as few as
fifteen course outcomes and sometimes as many as 150 or more. This made assessment of
student learning unwieldy and limited ability to effectively assess a course in its entirety within a
reasonable time frame. The 2003 Visiting Team recommended grouping similar outcomes by
identifying more broadly stated outcomes to facilitate assessment activities. As the Assessment
Committee reviewed assessment reports in 2004 and revised the assessment plan, it revised the
MCG format to the version currently in use. The faculty chose to retain the expanded list of
student learning performance objectives (SLPO) since the MCG is a key element to assure
consistency in instruction. However, MCGs were to be modified to group related SLPOs into a
manageable number of student learning outcomes (SLO) with the understanding that the SLOs
would be used for assessment. To assure consistency in application, a faculty development
session was conducted on the new MCG to orient faculty to the shift in format and the inclusion
of course level student learning outcomes (SLO) and student learning performance objectives
(SLPO) that would facilitate measurement and address multiple aspects of each student learning
outcome. Guidelines and forms for the MCG are easily accessible to all faculty on the assessment
web page [REF 4.1]. The revised MCGs assure consistency in course descriptions, SLOs and
SLPOs regardless of delivery modality, instructor or campus location. To date, 86% of the
courses listed in the College Catalog have current MCGs posted to the assessment web page (See
ATT 4.1) [REF 4.2]. The MCGs are accessible on the assessment web page assuring widespread
access to the course descriptions, expanded course focus, course materials and other information
relevant to the course. Curriculum proposals submitted through the Academic Affairs Committee
also include MCGs as part of the curriculum proposal if changes necessitate revision of the MCG.
Faculty have been asked to review and revise the MCGs and most courses have an MCG in the
revised format. Additionally, as a course undergoes revision, the course information will be
transitioned to the revised format. Many courses taught only by adjunct faculty do not have
revised MCGs in place. Course assessment projects identify one or more SLPOs for assessment.
Given the standardization of stated SLPOs, course assessment activities can be organized
collaboratively among the faculty teaching the sections included in the assessment activity (See
ATT 4.2). The Assessment Committee has identified the need for involvement of adjunct faculty
as a target activity specifically focusing on the development of MCGs for the adjunct taught
courses.
Annual Academic Assessment Cycle
Sustainability has been a challenge in Northern’s assessment initiative. The College has
experienced several surges in assessment only to stall in its efforts. The faculty, under the
leadership of the Assessment Committee, addressed this challenge by establishing an annual
academic assessment cycle as noted in Chapter 3. The significant shift in the cycle involves a
published “planning cycle”. This institutional level cycle provides the operational framework for
all departments. Faculty determine the rotation cycle for courses within their divisions, but all
assessment activities adhere to the published time line. In addition to the assessment cycle and
course rotation schedule, assessment proposals are submitted by the faculty to their respective
division chair assuring that course assessment activities are slated for annual review. Proposals
are accessible to others in the division and throughout the College. The proposals provide a
tracking mechanism helping to assure that projects are completed. The assessment cycle
identifies course level assessment and annual program level assessment project time lines. Both
the Assessment Committee and Division Chairs have responsibility for tracking assessment
activities and for insuring that reports are submitted on time.
Course Level Assessment
Faculty have actively engaged in course level assessment activities since the development
of the initial assessment plan. However, the College experienced numerous ambitious starts only
to stall in its efforts. The 2003 Visiting Team noted that, although Northern had established a
pattern of course level assessment, particularly at the micro level, it had not moved to a long term,
sustainable process that assured improvement in instruction and student academic achievement.
As noted in Chapters 2 and 3, the faculty revisited its course level assessment process to address
this concern. The adoption of the institution-wide, annual academic assessment cycle has aided in
addressing this problem. Additionally, course assessment reporting was expanded to include
recommendations based on assessment results, time line for implementation and notation as to
involvement of the Academic Affairs Committee, if needed, as well as a reassessment date to
support long term cyclical activities. By implementing the proposal process, tracking report and
revised reporting format, the College is moving to a sustainable course assessment process which
provides meaningful data to improve instruction and student academic achievement. Using the
revised format, course level assessment activities are well established and conducted across the
institution.
Faculty in all departments are engaged in assessment on a predictable, rotation cycle. In
multiple departments, faculty effectively collaborate to conduct assessment across multiple
sections and often engage adjunct faculty. For example, the mathematics faculty have established
an assessment rotation for all courses using selected SLO, SLPOs and performance indicators. As
a result of their assessment activities, the mathematics faculty made a significant revision to the
developmental algebra course series based on course level assessment data and tracking studies
(withdrawal rates, success rates). The MATH 96 course was eliminated and replaced with a two
semester series (MATH 92 and MATH 93) providing students with more time on task and
ultimately improving student success rates. As an additional example, a course level assessment
of OFAD 120, Introduction to Machine Transcription of Medical Records, has also led to a
curriculum revision. Based on course assessment results, multiple recommendations were made
to revise the course. These included the addition of two prerequisites (Medical Terminology and
Anatomy and Physiology) and a curriculum revision to bring the program standard into
compliance with the National Association of Medical Assisting guidelines. These are only two
examples of assessment practices being used for curriculum improvement [REF 4.3]. Completed
course assessment reports are currently available in hard copy for faculty review. Northern is
transitioning to an electronic format. The electronic copies will be uploaded to the assessment
web site to increase accessibility by faculty.
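The tracking studies mentioned above (withdrawal and success rates) can be pictured with a brief sketch; the course labels and grade lists below are hypothetical and are not the data that drove the MATH 96 revision.

    # Minimal sketch (Python): withdrawal and success rates from course grade lists.
    # Grades are invented for illustration.
    grades = {
        "MATH 96": ["A", "W", "F", "C", "W", "D", "B", "W"],
        "MATH 92": ["A", "B", "C", "W", "B", "C", "A", "D"],
    }

    def tracking_summary(grade_list, passing=("A", "B", "C")):
        n = len(grade_list)
        return {
            "withdrawal_rate": grade_list.count("W") / n,
            "success_rate": sum(1 for g in grade_list if g in passing) / n,
        }

    for course, grade_list in grades.items():
        print(course, tracking_summary(grade_list))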
Course level assessment reports include use of multiple measures in assessing student
academic achievement. Direct and indirect measures have been included in many course
assessment projects. Direct assessment measures include pre/post testing, embedded assessment
indicators in student quizzes and tests, portfolio assignments, and lab and clinical activities.
Faculty have also incorporated a student survey assessing students’ perception of whether course outcomes
were presented and their perceived level of achievement. Incorporation of the survey provides the
opportunity to compare results from direct measures with student perception of academic
achievement. The faculty teaching BIO 110 (Principles of Biology) provided the model for this
practice. Students enrolled in BIO 110 are administered a survey asking their perception as to
whether course learning outcomes are presented and the degree to which they are learned. Results
from the survey are then compared with data collected from direct assessment measures to
determine student achievement and need for course revision. In addition to assessment of student
learning outcomes, course assessment projects have included impact of technology and impact of
instructional strategies on student learning.
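The BIO 110 practice of comparing direct measures with student perceptions can be sketched as a per-outcome comparison; the outcome labels and percentages below are hypothetical, not actual BIO 110 results.

    # Minimal sketch (Python): pairing direct-measure mastery with survey-reported
    # perception for each student learning outcome. Values are hypothetical.
    direct_mastery = {"SLO 1": 0.82, "SLO 2": 0.64, "SLO 3": 0.71}      # share meeting the indicator
    perceived_learning = {"SLO 1": 0.90, "SLO 2": 0.85, "SLO 3": 0.70}  # share reporting the outcome was learned

    for slo in direct_mastery:
        gap = perceived_learning[slo] - direct_mastery[slo]
        flag = "review" if gap > 0.15 else "ok"  # a large gap may signal a need for course revision
        print(f"{slo}: direct={direct_mastery[slo]:.2f} "
              f"perceived={perceived_learning[slo]:.2f} gap={gap:+.2f} ({flag})")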
Program Level Assessment
Program level assessment has shifted to a comprehensive, cyclical process that includes
assessment of program level outcomes on an annual basis. As noted in Chapter 3, the previous
program level assessment cycle did not provide an adequate window for monitoring outcomes and
providing for timely curriculum change. The previous model was based solely on the five year
program review cycle required by the WV Council for Community and Technical College
Education (WVCCTCE) [REF 4.4]. The Assessment Committee opted to include an annual
assessment project to increase the opportunity for timely curriculum revision and to supplement
the required review processes at the state level. This annual assessment project adheres
to the “Annual Academic Assessment Cycle”. For separately accredited programs such as those
in health sciences and culinary arts, the accrediting agency’s assessment and review practices are
also included. Blending these formats has strengthened Northern’s program assessment by
providing a comprehensive view of student achievement at the program level. Course assessment
activities can be used to supplement program level assessment when deemed appropriate. The
program assessment reporting process was recently revised to strengthen the synthesis of data
from multiple sources. The proposed reporting format targets assessment of program viability,
necessity, and consistency with mission in addition to measures of student academic achievement
[REF 4.5]. To date, the nursing program was used to pilot the proposed policy (See ATT 4.3)
[REF 4.6]. Based on this pilot, no revisions were made to the proposed policy. The policy and
nursing pilot report were presented to the faculty at a fall 2007 faculty development session. The
policy is slated for final review and approval in fall 2007. The program review self-study
contains the following elements:
 Introduction with program description and description of any unique aspects of
program
 Description of review process and listing of those who participated in process
 Evaluation of viability, adequacy, necessity, and consistency with mission using
standard elements.
 Summary of strengths, concerns, recommendations (could be part of standard
elements)
Since this process is in transition, the previously established process for the program
review cycle was used for the most recent program reviews. The most recent reviews were
conducted for Appliance Repair, Industrial Maintenance and Refrigeration, Air Conditioning and
Heating programs [REF 4.7].
Program level assessment activities include use of multiple measures in assessing student
academic achievement. Direct and indirect measures have been included in program assessment
projects. Direct assessment measures include pre/post testing, embedded assessment indicators in
student quizzes and tests, portfolio assignments, and lab and clinical activities. Institutional data
is provided regarding enrollment patterns, retention, success in target and subsequent courses,
graduation rates and student satisfaction studies [REF 4.8]. In addition to assessment of student
learning outcomes, program assessment projects also have included impact of technology in the
classroom, student evaluation of program resources as well as impact of instructional strategies.
For example, the respiratory program coordinator conducts an annual survey providing students
with the opportunity to evaluate faculty, resources, and clinical sites. These results are
incorporated into the program review process and are available in the nationally submitted
program report.
Course mapping has been completed for all approved associate and certificate programs.
To assure appropriate curriculum alignment, a course outcomes matrix is completed for each
academic program [REF 4.9]. Alignment of program outcomes with specific courses provides the
ability to target specific courses when assessment data denotes need for curriculum revision (See
ATT 4.4) [REF 4.10]. Program level outcomes are stated in the catalog and on the College web
site.
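A course outcomes matrix of this kind can be represented very simply; the outcomes and course numbers in the sketch below are hypothetical (ATT 4.4 contains an actual sample), and the final check flags any program outcome not yet mapped to a course.

    # Minimal sketch (Python): program outcomes mapped to the courses that address them.
    # Outcome statements and course numbers are hypothetical.
    outcomes_matrix = {
        "Apply safe clinical practice": ["NURS 101", "NURS 210"],
        "Communicate effectively in writing": ["ENG 101"],
        "Interpret quantitative data": [],  # not yet mapped to a course
    }

    # Alignment check: each program outcome should be addressed by at least one course,
    # so assessment data can be targeted to a specific course when revision is needed.
    unmapped = [outcome for outcome, courses in outcomes_matrix.items() if not courses]
    print("Outcomes not yet mapped to a course:", unmapped)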
There are numerous examples of program revisions to enhance student learning resulting
from program assessment. The Refrigeration, Air Conditioning and Heating program added a
component on digital control of HVAC systems and purchased equipment to implement the
curriculum change based upon student and advisory committee input regarding preparation for
employment. The math faculty changed the structure of the developmental math courses to offer
a two semester introductory algebra series after analyzing data on student success collected over
several semesters. Preliminary data from spring 2006 indicates an improvement of 15% in
success of developmental algebra students after the initial year of implementation. Student
surveys and course assessments indicated that the students in the Medical Assisting program did
not have sufficient background in anatomy and physiology for the medical assisting courses.
Consequently, the program was revised to include anatomy and physiology. These are but a few
of the changes providing evidence that program assessment is being used to make changes which
lead to improvements in student learning.
A new component in the annual program assessment process is the formal inclusion of the
Program Advisory Committees. Assessment should measure not only how well students are
learning but also whether they are learning the essential skills needed for success in their chosen field.
Program Advisory Committees provide the needed input to assure programs are preparing
graduates to meet the needs of employers. Since its inception, Northern has appointed advisory
committees for all technical programs. Advisory committees meet at least once per year. In the
past, advisory committees have been involved in the five-year program review but input from the
committees at the annual meeting has been informal and sporadic. The new program review
process requires the program coordinator (director) to gather information from the advisory
committee annually regarding program strengths, needed program improvements, and trends in
the field that may affect the program and graduates. This information is shared with program
faculty and the division chairperson so it can be used to guide program improvements.
General Education Assessment
Implementation of general education assessment has been one of the biggest challenges
for assessment efforts at Northern as it is at many institutions. The College experimented with
CAAP but the faculty did not find it to be effective for determining curriculum changes in the
general education core. A General Education Committee was formed with the revision of the
assessment plan. The Committee reviewed and redefined the general education outcomes. A
general
education outcomes matrix was completed to align general education outcomes with specific
courses. Once accomplished, the course level assessment was to provide the information needed
for assessment of the general education core. WorkKeys results were also used to measure
achievement in general education for career technical programs since the WVCCTCE requires use
of WorkKeys as a Perkins Core Indicator.
The Assessment Committee and the General Education Committee have both labored in
the past two years to improve assessment of the general education core. It has become evident
that a significant impediment to moving forward with general education assessment was the
disintegration of the academic administrative structure. Without this structure, faculty in
programs continued to focus on the courses and programs, but general discussions regarding
college-wide initiatives such as general education were not occurring. This became clear in
meetings with division chairs and faculty at the start of the fall 2007 semester when many faculty
and chairs did not initially recognize a problem with general education assessment. Through the
continued discussion, it became more evident to the faculty that overall assessment of general
education was lacking. A major goal of the academic area and the Assessment Committee for the
current academic year is to reinvigorate the discussion regarding general education and to develop
consensus on appropriate ways to assess it.
Peer Review
In summer 2007, the Assessment Committee piloted a peer review process as an
additional mechanism to monitor assessment activities, close the feedback loop and improve
assessment practices. The addition of this process expands the Assessment Committee’s role
beyond that of planning and compliance to incorporate development and continuous improvement.
Essentially, the peer review process was devised to:
 Provide evidence that students are achieving stated student learning outcomes
 Provide a snapshot of progress in implementing course assessment
 Identify best practices at Northern
 Identify faculty development needs
 Close the feedback loop
 Provide a mechanism to assure continuous improvement of the assessment process
Although the peer review process is in the initial stages of implementation, the process has
yielded preliminary information regarding Northern’s course assessment activities. Based on the
pilot, course assessment appears to be occurring throughout the College; course assessment
activities are used to validate student learning; assessment is being used to improve curriculum
and instructional strategies; faculty collaborate and share assessment responsibilities; and some
assessment activities build upon previous assessment. The initial peer review process will be
completed in the fall 2007 term (ATT 4.5) [REF 4.11]. At that time it will be reviewed for
overall effectiveness, need for revision and continued implementation.
Faculty Development
Faculty development is essential to sustaining the assessment initiative. Increasing general awareness of
assessment concepts and practices and assuring that faculty were well grounded in the revised
assessment plan became a professional development priority. Development opportunities were
provided in several ways: conducting informational sessions on the assessment plan and
processes, organizing institution-based activities highlighting best practices by Northern faculty,
attending national and regional conferences and, more recently, adding information links to the
assessment web page. Each semester, Northern-based professional development activities center
on assessment practices. In spring 2006, an assessment mini-conference was organized to
showcase Northern’s assessment initiatives identifying best practices within the college. In fall
2007, the peer review process was introduced as a pilot project [REF 4.12].
Institutional Effectiveness
The IR/IS office routinely prepares reports for internal audiences and external agencies. A
listing of the reports produced by the IR/IS office will be available in the resource room. A
review of the list demonstrates that the College is collecting and analyzing data to improve
institutional effectiveness and to meet accountability standards.
The College has a number of internal measures of institutional effectiveness. Evaluation of
institutional effectiveness begins with the Strategic Plan. The Strategic Plan is grounded in the
College mission, and from that mission nine key goals were developed. All sectors of the College
community provided input into the development and refinement of the goal statements.
Following agreement on goal statements, objectives were developed under each of the goals by
the appropriate administrative units. The entire plan with measurable objectives was distributed to
all constituent groups and provides a clear statement of institutional effectiveness outcomes. The
complete plan with objectives is available in the resource room. Administrators report progress
toward meeting the objectives to the President in monthly reports and the college newsletter
provides a vehicle for reporting goal achievement to the college community. Accountability is
built into the process since a key element of the annual evaluation of administrators is progress
toward meeting the objectives.
While achievement of goals in the Strategic Plan is one measure of institutional
effectiveness, another important measure is feedback from students on two survey instruments.
The College administers a student satisfaction survey every fall semester and data from this
survey has been used to guide improvements such as enhancements in support for the telecom
system. The other emerging measure is student response on the CCSSE (Community College
Survey of Student Engagement) survey. The State Community College system funded
participation in CCSSE in 2004 and has agreed to contribute again in 2007. The State CTCS and
the College plan to use information from this survey to guide development of activities that enhance
student success and retention.
The HLC Statement on Assessment states: “while strong assessment should provide data
that satisfy any externally mandated accountability requirements, its effectiveness in improving
student learning relies on its integration into the organization’s processes for program review,
departmental and organization planning, and unit and organization budgeting.” Northern’s
institutional effectiveness efforts accomplish both goals. There are a number of external
accountability measures that the IR/IS Office must report. Two of the key external requirements
are the reporting of institutional progress on the State Compact and on the Community and
Technical College Performance Indicators. The College has integrated these with the Strategic
Plan; a crosswalk has been developed between the Compact and the goals of the Strategic Plan.
Further evidence that the College has successfully integrated accountability requirements and
institutional planning is the College’s success on the CTCS Performance Indicators for the most
recent academic year. Northern was the only community college in the state to receive an
“excellent” rating from the Council for Community and Technical College Education on the
Performance Indicators.
Institutional Support for Assessment
The HLC Statement on Assessment of Student Learning (pg 3.4-2, Handbook of
Accreditation, 3rd ed) states that there should be strong support for assessment from the Board and
the administration. The 2003 Team report noted that Northern had strong administrative support
for assessment. That support continues today. Institutional support begins with the College
Board of Governors. During the search for a new president, the Board included targeted
discussion of assessment with the candidates. In addition, the Board requests regular reports on
assessment. Board minutes provide evidence that assessment is a topic for Board discussion
[REF 4.13]. Support from administration takes many forms. The tone for administrative support
is set by the inclusion of assessment as one of the institutional goals in the strategic plan [REF
4.14]. The administration has demonstrated support through the commitment of funds for
assessment. Funds have been allocated for faculty to attend regional and national conferences
involving assessment, for bringing speakers on campus, for purchase of assessment resources, for
development of the assessment web page and for coordination of assessment activities. Most recently,
the College has hired an Assessment Coordinator on a part-time basis to guide the assessment
process.
HLC Focus Visit Concerns Addressed by the Assessment Plan
Northern’s current assessment plan effectively addresses the concerns noted in the HLC
2003 Visiting Team Report (quotes from Visiting Team report are in italics below).
In summary:
Consistent assessment of student learning outcomes across the College, regardless of
location or modality.
 MCGs have been completed for courses taught by full-time faculty
 MCGs are posted to the Assessment Web page for access by faculty, students
and other constituents
 MCGs are included as part of curriculum proposals submitted to the Academic
Affairs Committee
 The new MCG format differentiates between broadly stated course level
student learning outcomes (SLO) and measurable student learning performance
objectives (SLPO)
 Program Learning Outcomes are published on the College web site and in the
College Catalog
 Course mapping has been completed for all approved programs aligning
program learning outcomes with course requirements.
Implementation of assessment across the various levels, including courses, general
education, degree programs/certificates, and institutional, and establish measures,
indicators and specific benchmarks for acceptable levels of performance.
 Course and program level assessment activities are conducted according to the
“Annual Academic Assessment Cycle”
 An assessment rotation cycle has been established by the faculty for all courses
 Program review is conducted on a cyclical basis according to prescribed
WVCCTCE policies
 Program assessment and review is conducted according to prescribed time
lines and standards for accredited programs
 Indirect measures of assessment are administered on a cyclical basis by the
Institutional Research and Career Services Offices to provide student/graduate
perception of learning acquired and program effectiveness
 The peer review process will provide essential information to improve
assessment practices and target faculty development needs
Demonstrate that data collected is analyzed and used to improve subsequent instruction.
 Curriculum changes have been made in courses and programs based on
assessment results. Such changes have included addition of prerequisites,
elimination of specific courses and redefinition of course learning outcomes on
MCGs
 Assessment results have also validated effectiveness of current practices in
attainment of student learning outcomes
 Assessment projects have been used to compare the impact of technology on
student learning by comparing traditional course sections with web based
sections
 Program assessments incorporating standard elements are used to measure
program viability, adequacy, necessity, student achievement and consistency
with mission
 Peer review pilot project has been implemented to monitor assessment
activities, to provide another mechanism to close the feedback loop and to
improve assessment practices
Demonstrate that the analysis of assessment results is integrated with planning processes
and is communicated to students, faculty and administration.
 The Model for Assessment of Student Learning builds assessment into the
planning/budgeting cycle
 Report format incorporates implementation dates for curriculum change and
dates for submission to Academic Affairs Committee when needed.
 The College has established an assessment budget to assure ongoing operations
and provide opportunities for faculty development
 The budgeting process and strategic planning processes were aligned to
effectively incorporate strategic priorities
 At the spring 2005 faculty development activity on assessment, the biology
faculty demonstrated a survey method that could be administered with the
assistance of the IR/IS Office. As a result of that presentation, many faculty
now administer this instrument.
 At the fall 2007 faculty development activity on assessment, the faculty
reviewed the rubric for the pilot peer review project. The faculty also
completed application exercises using the piloted rubric. Faculty reported a
clearer understanding of how to improve assessment practices and reporting as
a result of this session.
CHAPTER 5
SUMMARY OF RESPONSES TO HLC VISITING TEAM CONCERNS
The purpose of this chapter is to succinctly highlight Northern’s actions to address
concerns noted in the Assurances and Advancement Sections of the HLC Comprehensive
Evaluation Visit (2003).
Faculty Ownership of Assessment
 Faculty undertook a comprehensive review and revision of the assessment plan
 Assessment is a standing agenda item for Faculty Senate; minutes from meetings are
posted for informational access
 The Assessment Committee is primarily comprised of faculty
 The Assessment Committee was recently expanded to double the number of faculty to
facilitate transition to new committee members
 Assessment plan revisions were reviewed in Faculty Assembly meetings
 Assessment plan revisions were reviewed in division meetings
 Faculty development sessions are scheduled each term to assure current access to
assessment information
 In a survey of faculty taken in January 2007, 91.2% of the faculty agreed or strongly
agreed that faculty own and drive the assessment process [REF 5.1]
Assessment Implementation and Sustainability
 Faculty revised the assessment plan with focus on sustainability
 The annual academic assessment cycle provides institutional time lines for assessment activities
 Faculty determine rotation schedule for courses within the divisions
 Assessment proposals are now submitted on an annual basis to identify proposed assessment
activities
 The IR/IS Office supports faculty data needs
 All divisions are engaged in course level assessment; since 2005, more than 100 course assessment
reports have been completed and are currently on file with the Assessment Committee
 Annual program assessment activities were incorporated into the program level assessment
activities to provide opportunity for early intervention, curriculum revisions and support for
student academic achievement
 Assessment reporting requires notification of the Academic Affairs Committee for significant
curriculum revision
 Assessment reporting requires a note of the reassessment cycle to discern the impact of
assessment-based curriculum changes
 Faculty development activities are scheduled each semester
Use of Multiple Measures of Student Academic Achievement
 Assessment reports include descriptions of assessment activities
 Reports include identification of performance indicators
 Direct methods of assessment are identified in assessment reports; measures include pre/post tests,
embedded assessment indicators in exams and quizzes, portfolios, and performance in labs and
clinical sessions
 Indirect methods of assessment are identified in assessment reports; measures include student
satisfaction surveys, student self-assessment, graduating student surveys, advisory committee
input, employer surveys, and job placement
Concise Statements of Student Learning Outcomes
 The new MCG format differentiates between broadly stated course level student learning
outcomes (SLO) and measurable student learning performance objectives (SLPO)
 MCGs have been completed for courses taught by full-time faculty
 MCGs are posted to the assessment web page to assure access by faculty, students and other
constituents
 Program outcomes are published in the College Catalog for each program
 Course mapping has been completed for all approved programs and the general education core,
aligning program learning outcomes with course requirements (Program Objectives Matrix)
General Education Assessment
 General education outcomes have been reviewed and redefined. The revised outcomes are
published in the College catalog
 A Program Outcomes Matrix has been completed for the general education core curriculum
 WorkKeys tests are used to assess the general education core in career technical associate degree
and certificate programs (Reading for Information, Applied Mathematics, Locating Information)
Differentiation between Institutional Effectiveness and Assessment of Student Academic
Achievement
 Separate processes have been identified for Institutional Effectiveness and Student Academic
Assessment
 IE and Assessment have separate committee structures
 IE reports are used to provide data for program level reporting requirements as needed, but are not
submitted for annual course assessment projects
Closing the Feedback Loop and Communication
 The assessment web page provides access to MCGs, the assessment plan, forms and assessment
resources.
 Steps are underway to post assessment reports to the assessment web page
 A peer review process was piloted in summer 2007 to expand access to assessment reports,
provide feedback to faculty, and identify institutional best practices and faculty development needs
 Assessment is a standing agenda item at Faculty Assembly meetings
 Assessment is a recurring agenda item at the Board of Governors meetings
 Assessment is a standing agenda item at Division meetings
 Assessment professional development and information sessions are included in the
start-of-semester faculty development sessions
Student Engagement
 Students have been included on the Assessment Committee but their participation has been
sporadic. The Assessment Committee is looking for other avenues to include student
participation.
 Faculty are doing a better job of communicating the importance of assessment to the institution
and to students. Student participation in WorkKeys testing improved significantly last year as a
result of program directors explaining the value of assessment.
 Students have submitted suggestions about ways to improve participation in WorkKeys testing
and other assessments
 The College Catalog contains a statement regarding expectations for student participation in
assessment [REF 5.2]
 Program directors emphasize the importance of assessment in relation to external accreditation
and program reviews and the role of assessment in strengthening the curriculum
Conclusions: Accomplishments
Since the last self study, Northern faculty have engaged in dialog and activities to
strengthen assessment at Northern. Supporting the College mission and student academic
achievement have been in the forefront of all discussions and decisions. The faculty believe that
they now have a sustainable system in place and can focus attention on areas that still require
refinement. It is clear to faculty that assessment can provide insight needed for curriculum
revision as well as validate existing practices. In summary, student learning outcomes are clearly
stated on Master Course Guides although some courses are still being transitioned to the revised
format. Master Course Guides have been completed for 86% of courses published in the College
Catalog. The courses without Master Course Guides are those taught solely by adjunct faculty.
Discussions are underway to determine a process for completing Master Course Guides for those
courses taught by adjunct faculty. Data are being collected and analyzed to support student
academic achievement. All full time faculty are engaged in course level assessment activities and
courses have been placed on an assessment rotation cycle. Program assessment activities have
been expanded to assure an annual assessment activity and focus on program level learning in
addition to those long term assessment and tracking requirements defined by state policy.
Institutional level assessment is now addressed by the Institutional Effectiveness Committee with
strong support from the IR/IS Office. The IR/IS Office actively supports the expressed data needs
of faculty and administration. The revisions to the assessment plan have helped to create
sustainable assessment practices and embed assessment into the institutional culture.
Communication processes have been established to assure that information about
assessment is widespread and accessible. The assessment web page is established and accessible
to faculty. This site is being expanded. Course and program reports and proposals will be added
to assure college wide access to assessment projects and results. Additional resources supporting
assessment activities will be added as they are available. Assessment has become a standing
agenda item for faculty based and administrative groups in the college. Minutes are available to
the college community. A peer review process was piloted to provide a mechanism to strengthen
assessment practices, identify best practices, and determine faculty development needs. The peer
review process will also provide another mechanism for closing the feedback loop.
Conclusions: Challenges
Although much progress has been made since the last self study, Northern is still faced
with several challenges. Assessment of the general education core has not progressed as
intended. A separate committee structure was originally established to focus on assessment of the
general education curriculum. Although general education courses are being assessed at the
course level, the bridge between course level assessment and assessment of the general education
program has not been established. The Assessment Committee is now exploring alternatives to
address this shortcoming.
Students have not been actively engaged in assessment. Students participate by
completing surveys or specific assessment activities; however, they have not played an active role
in developing the assessment program or evaluating its effectiveness. The Assessment
Committee is exploring methods of informing the student population in general and engaging
membership from formal student organizations.
Closing the feedback loop remains a challenge. Although the newly implemented peer
review process provides a mechanism to bridge the current gaps, the impact of this system is yet
to be realized.
In summary, West Virginia Northern has made significant strides since the last self study
visit to build a culture of assessment and to assure sustainable assessment practices to support
student academic achievement. The College has identified its assessment challenges and is confident
that it can revise processes to address them and strengthen its assessment initiative.
Appendix A
List of Attachments
Attachment Code    Document Title

Chapter 4
ATT 4.1    Master Course Guide Samples
ATT 4.2    MATH 204 Course Assessment Report
ATT 4.3    Nursing Program Assessment Report, Pilot for new format
ATT 4.4    Sample Program Outcomes Matrix
ATT 4.5    Assessment Peer Review Rubric
Attachment 4.1
West Virginia Northern Community College
Master Course Guide
Course Number: BIO 110
Course Title: Principles of Biology
Revision Date: January 2007
Faculty Signatures/Dates: D. Folger 1/07; S. Gress 1/07; T. Danford 1/07
Comments:
I confirm that the Master Course Guide was developed according to the established guidelines and that it meets
college requirements.
Division Chair Signature/Date: T. Danford 1/25/07
Rev 1/07
BIO 110
Principles of Biology
Course Description
This course is an introductory course in general biology stressing a unified approach to biological systems.
Emphasis is placed on fundamental processes at the cellular level. Genetics is stressed. Students must
register for a lecture and laboratory section.
Expanded Course Description/Course Focus
This course is presented in several formats: Lecture, Lab, Lecture/Lab, Early Entrance Lab, Early
Entrance Lecture, Course Learning Contract, SREC, Tech Enhanced Lecture/Lab, Technology Enhanced
Lecture, Technology Enhanced Lab, Web Based Dist Ed Electronic, WebCT
Prerequisites
(Undergraduate level ENG 90 Minimum Grade of K## or Undergraduate level ENG 90 Minimum Grade of
C# or Meets English Requirement 3 or ACT-ENH English 18 or SAT Verbal 450 or WVNCC English 28 or
ASSET English Test 38) and (Undergraduate level READ 95 Minimum Grade of K## or Undergraduate
level READ 95 Minimum Grade of C# or Meets Reading Requirement 3 or ACT-ENH Reading 17 or
ASSET Reading Test 36 or WVNCC Reading 25 or SAT Verbal 420)
Corequisites
none
Credit Hours:
4
Lecture 3
Lab 2
Text Information Available in Bookstore: Cell Biology and Genetics, 11th edition; by Starr & Taggart; Wadsworth
Publisher
List of Material and Supplies
Course Outcomes
The following list of course outcomes will be achieved at the successful completion of the course.
In order to successfully complete this course, the student must:
1. Demonstrate an understanding of and the ability to use the scientific method working both independently and as
part of a group.
2. Demonstrate an understanding of the unifying principles of living organisms, including the chemical basis of life,
structure of cells, functions of cellular structures and the energy relationships upon which life depends.
3. Demonstrate an understanding of genetic mechanisms, both Mendelian and current, and their relationship to
biotechnology and genetic engineering.
4. Demonstrate an understanding of the mechanisms responsible for the diversity of life on earth.
5. Demonstrate a familiarity with those scientists who have contributed to the development of the basic theories of
biology.
Student Learning Outcome Objectives
The following list of student learning performance objectives will be addressed in the course.
1. Describe the organization of our physical world, from subatomic particles to the biosphere.
2. Explain the concepts of energy transfer and flow
3. Describe the interdependency among organisms
4. Distinguish living systems from non-living
5. Recognize the diversity of life on earth
6. Apply the scientific method to learning and practical situations
7. Describe the “chemistry of Life” including atoms, molecules, and bonding
8. Explain the properties of water in biological systems
9. Describe the importance of hydrogen bonding to life
10. Apply the concepts of hydrogen ion concentration and pH to living systems and their functioning
11. Describe the chemistry of carbon molecules in cells
12. Explain the structure and main cellular functions of the carbohydrates, lipids, proteins and nucleic acids
13. Describe the cell theory and its application to life on earth
14. Explain the structure and function of both prokaryotic and eukaryotic cells
15. Compare and contrast these two cell types
16. Analyze the contribution to life of each of the following eukaryotic organelles: plasma membrane, mitochondrion, golgi body, endoplasmic reticulum, ribosome, lysosome, nucleolus, nucleus, nuclear membrane, chloroplast, central vacuole, flagella, and cilia
17. Describe the fluid mosaic model of membrane structure and function including structural features all cell membranes have in common
18. Explain the processes of diffusion, osmosis, and active and passive transport mechanisms
19. Compare and contrast exocytosis and endocytosis and the application of these processes to the life process
20. Describe the concepts of energy flow through our biosphere, including the laws of thermodynamics
21. Explain enzymes, their functions, their helpers, the factors affecting them, and how enzymatic reactions form pathways
22. Describe phosphorylation mechanisms, electron transport and the ADP/ATP cycle in biological systems
23. Describe the structure and function of chloroplasts, Photosystems I and II, and the light dependent reactions of photosynthesis
24. Explain the processes involved in carbon dioxide fixation and the light independent reactions of photosynthesis
25. Explain the aerobic oxidation of glucose to carbon dioxide, water, and ATP
26. Compare and contrast anaerobic respiration and fermentation with the aerobic respiratory processes
27. Analyze the relationships among photosynthesis, aerobic respiration, anaerobic respiration and fermentation, and the cycling of carbon, oxygen and water in our biosphere
28. Describe the eukaryotic cell cycle and how this cycle applies to the life and death of cells
29. Explain the processes of mitosis and cytokinesis, including the specific events that occur during each of the phases of mitosis
30. Relate the concepts of cellular reproduction to homeostasis in multicellular organisms
31. Describe the processes of meiosis and fertilization and how they apply to the diversity of organisms
32. Explain the specific steps of meiosis and be able to recognize cells in each phase
33. Compare and contrast mitosis and meiosis
34. Apply the concepts of mitosis, meiosis, and fertilization to the generalized life cycles of plants and animals
35. Explain the terminology involved in classic Mendelian genetics and the theories of segregation and independent assortment
36. Set up and work monohybrid and dihybrid testcrosses, including prediction of outcomes (both genotypic and phenotypic)
37. Apply the concepts of dominance, incomplete dominance, codominance, multiple effects of single genes, interactions between gene pairs and environmental effects on phenotype
38. Describe the organization of the human genome: autosomal chromosomes and sex chromosomes (sex determination in individuals)
39. Explain the concepts of crossing-over, recombination and chromosome mapping
40. Apply the patterns of autosomal and X-linked inheritance to genetic disorders in humans
41. Explain the structure and function of DNA
42. Describe in detail the organization and replication of DNA in the eukaryotic cell (including cell cycle considerations)
43. Describe the structure and function of DNA, mRNA, rRNA, tRNA, amino acids, polypeptides and proteins
44. Explain in detail all of the steps and molecules involved in transcription and translation
45. Explain the control of gene expression in the prokaryote (negative control of transcription, lactose operon)
46. Describe in detail the levels of gene control (both transcriptional and translational) in the eukaryote
47. Explain the following genetic engineering terms/tools: restriction enzymes and fragments, plasmids and cloning vectors, the polymerase chain reaction, gel electrophoresis and DNA sequencing
48. Describe the uses of DNA probes (nucleic acid hybridization) and cDNA
49. Consider the concepts of human gene therapy and genetic cloning (of cells, animals, humans) in the context of ethical, moral, social, biological and economic aspects of our biosphere
50. Describe evolution and the current tools available for its study (comparative anatomy, biogeography, fossil record)
51. Explain the theory of evolution as proposed by Darwin and Wallace, including the concept of natural selection and how it occurs
52. Apply the major microevolutionary processes (mutation, gene flow, genetic drift, and natural selection) to ecological systems
53. Describe the biological species concept including the isolating mechanisms (prezygotic and postzygotic) which bear on species development
54. Compare and contrast microevolutionary and macroevolutionary processes and apply these to speciation
55. Explain the science that underlies the macroevolutionary puzzle: fossils and their interpretation, evidence from comparative embryology, anatomy, and biochemistry
56. Use systematics to explain phylogeny and the evolutionary relationships among organisms including all three approaches (taxonomy, phylogenic reconstruction, and classification)
57. Relate the significance of scientific research and recognize the scientists responsible for various biological discoveries
Revised:2007
Attachment 4.1
West Virginia Northern Community College
Master Course Guide
Course Number: IMT 100
Course Title: Applied Basic Plumbing and Pipefitting
Revision Date: January 12, 2007
Faculty Signature: Joseph M. Remias
Comments: Student Learning Performance Objectives and List of Material and Supplies were
added. Text Book and Student Learning Outcomes were changed.
I confirm that the Master Course Guide was developed according to the established
guidelines and that it meets college requirements.
Division Chair Signature: __________________________
Date: _______________
Master Course Guide
IMT 100
Applied Basic Plumbing and Pipefitting
Course Description
This course is designed to provide beginning pipefitting students with fundamental knowledge of
the use and care of tools necessary for the performance of trade responsibility. Special emphasis
is given to the importance of recognizing job safety and health hazards. Topics include soldering
and brazing, pressure gauge reading, regulation, adjustment and sizing of pipes, meters, valves,
strainer, regulators and related components. Students must register for a lecture and laboratory
section.
Prerequisite: None
Credit Hours: 4 Credits
Lecture: 3 Hours
Laboratory: 3 Hours
Course Focus
This course provides students with the knowledge and theory in oxygen-acetylene brazing and
soldering, tubing (copper-plastic) and its capabilities within systems. Students will gain
proficiency in the use of hand tools, pipe threading and cutting machines, and the
RADIAC. After studying and successfully completing this course, students will have developed a
solid background in plumbing. Further expertise can be developed through additional hands-on
experience and by keeping current with trends and new techniques in the plumbing industry.
Instructional methods include lecture and theory, discussion, demonstration and hands-on
laboratory application.
Text Book: Available in Bookstore
List of Material and Supplies Required: None
Student Learning Outcomes
The student will:
1. be knowledgeable about Oxygen-Acetylene Brazing and Soldering.
2. be knowledgeable about tubing (copper and plastic) and their capabilities within systems.
3. be proficient in the usage of hand tools, pipe threading machine, cutting machines and the
RADIAC.
4. be knowledgeable of the importance of proper sizing, fitting, and securing of different
piping materials, fixtures and sealants.
Student Learning Performance Objectives
The student will
1. demonstrate the use of plumbing tools.
2. exercise all safety procedures.
3. perform math problems designed for plumbers.
4. utilize piping materials and fittings for project work.
5. utilize valves, faucets and meters for project work.
6. install hot water systems.
7. demonstrate the use of transit level and cold beam laser.
8. draw a complete plumbing system for a small building.
9. calculate the desired slope of pipe for project work.
10. perform math problems calculating the offset of pipe.
11. use a plumber’s rule for projects.
12. practice installing pipes.
13. discuss plumbing codes and zoning laws.
14. perform math problems converting length, area, volume and temperature from SI metric to US
conventional dimensions.
15. list the color code of piping.
16. read architectural drawings.
17. perform math problems converting drawing measurements to actual measurements.
18. perform experiments illustrating the relationship of height to water pressure.
19. identify cross connections and explain how to avoid them.
20. list the steps involved in order to bring water and sewer service to a building.
21. demonstrate rigging and hoisting of heavy equipment.
22. perform PM work.
23. solder, braze and cut using an oxygen-acetylene torch.
24. solder using a propane torch.
Attachment 4.1
West Virginia Northern Community College
Master Course Guide
Course Number: MATH 110
Course Title: Pre-Calculus Mathematics
Revision Date: February 2007
Faculty Signature ______________________________ Date ____________
Faculty Signature ______________________________ Date ____________
Faculty Signature ______________________________ Date ____________
Faculty Signature ______________________________ Date ____________
Comments:
I confirm that the Master Course Guide was developed according to the established procedures and that it meets the
college’s requirements.
Center Director Signature ______________________________ Date ____________
MATH 110
Pre-Calculus Mathematics
Course Description
This course is an integrated approach to algebra and trigonometry preliminary to the study of calculus. The course
includes sets and the real number system, relations and functions, graphs of relations and functions, polynomials,
rational functions, exponential and logarithmic functions, trigonometric functions and complex numbers.
Prerequisites
Math 086 and Math 092 and Math 093 or satisfactory placement scores on the numerical and algebra sections of the
ASSET Test.
Corequisites
None
Credit Hours: 5
Lecture: 5
Lab: 0
Expanded Description/Course Focus
This course covers math content needed by students who are preparing for the study of differential and integral calculus.
Emphasis is placed on functions and their graphs, including polynomial, rational, exponential, logarithmic, and trigonometric
functions. The characteristics of the functions and their graphs are identified and explored. Solutions are found for
equations, inequalities, and applied problems involving the various functions. Use of technology such as graphing
calculators is incorporated in the course.
Text Information: Available in Bookstore
List of Materials and Supplies: a minimum of a scientific calculator
Student Learning Outcomes (SLO)
1. Demonstrate (to be able to integrate into the study of functions) the fundamental properties of algebra, including
those of sets of real numbers and their graphs, polynomial arithmetic and factoring, algebraic fractions,
exponents and radicals, and solutions to equations and inequalities in one variable
2. Understand the Cartesian Coordinate System in order to identify and graph functions, including piecewise
functions, arithmetic combinations of functions, and one-to-one and inverse functions
3. Identify and graph polynomial functions, including linear functions, solve equations and inequalities, and apply
these skills to solve real-life application problems
4. Identify and graph rational functions, solve equations and inequalities, and apply these skills to solve real-life
application problems
5. Identify and graph exponential and logarithmic functions, solve equations and inequalities, and apply these skills
to solve real-life application problems
6. Identify and graph trigonometric functions, solve equations and inequalities, verify trigonometric identities,
interpret radian measure, and apply these skills to solve real-life application problems
Student Learning Performance Objectives (SLPO)
SLO    SLPO    Student Learning Performance Objective

SLO 1
  1.1   use interval notation
  1.2   graph intervals of real numbers
  1.3   solve polynomial and rational inequalities
  1.4   solve absolute value equations
  1.5   solve absolute value inequalities

SLO 2
  2.1   use the distance formula
  2.2   use the midpoint formula
  2.3   perform completing the square procedure
  2.4   solve circle problems
  2.5   determine intercepts
  2.6   determine symmetry
  2.7   graph equations
  2.8   use functional notation
  2.9   identify functions
  2.10  determine function domain
  2.11  determine function range
  o     identify even and odd functions
  o     find arithmetic function combinations
  o     find function composition
  o     identify one-to-one functions
  o     find function inverses
  o     graph function inverses

SLO 3
  3.1   find slope of a line
  3.2   find linear equations
  o     graph lines
  o     identify parallel and perpendicular lines
  o     identify increasing and decreasing functions
  o     graph quadratic functions
  o     apply graphical shifting techniques
  o     determine polynomial end behavior
  o     identify polynomial characteristics
  o     graph polynomial functions
  o     approximate local extrema
  o     divide polynomial functions
  o     factor polynomials
  o     apply rational zero test
  o     use Descartes’ Rule of Signs
  o     find polynomial real zeroes
  o     perform complex number operations
  o     find polynomial complex zeroes
  o     find specific property polynomials

SLO 4
  4.1   determine vertical and horizontal asymptotes
  o     determine slant asymptotes
  o     identify rational function characteristics
  4.3   graph rational functions
  o     graph rational power functions

SLO 5
  5.1   identify exponential function characteristics
  5.2   graph exponential functions
  5.3   evaluate logarithmic expressions
  5.4   identify logarithmic function characteristics
  o     graph logarithmic functions
  o     apply logarithm properties
  o     solve exponential equations
  o     solve logarithmic equations
  o     solve compound interest problems
  o     solve exponential growth and decay problems

SLO 6
  6.1   understand radian measure
  6.2   find unit circle point coordinates
  6.3   find reference numbers
  6.4   evaluate special angle trigonometric functions
  o     identify trigonometric function periods
  o     identify trigonometric function characteristics
  o     solve trigonometric equations
  o     use trigonometric identities
  o     graph trigonometric functions
  o     verify trigonometric identities
  o     perform degree/radian conversion
  o     solve right triangles
  o     solve right triangle application problems
  o     evaluate inverse trigonometric functions
  o     apply Law of Sines
  o     apply Law of Cosines
  o     perform graphing calculator operations
  o     generate graphing calculator graphs
  o     solve application problems using graphing technology
Rev: 02/07
Attachment 4.1
West Virginia Northern Community College
Master Course Guide
Course Number: NUR 111
Course Title: Foundations of Nursing Practice
Revision Date: December 2006

Linda Jo Shelek                                 12/14/06
Faculty Signature                               Date

Comments:
I confirm that the Master Course Guide was developed according to the established procedures and that
it meets the college’s requirements.

______________________________                  ________________________
Division Chair Signature                         Date
Course Number: NUR 111
Course Title: Foundations of Nursing Practice
Course Description: This course introduces students to the discipline of nursing. This course
addresses the fundamental concepts of nursing as well as the mission, philosophy, and outcomes of the
Nursing program. Other concepts addressed are the history of nursing, the role of the professional
nurse, and the nursing process as it relates to the plan of care of a patient.
Prerequisites: Completion of all developmental courses and admission to the nursing program
Corequisites: NUR 121, NUR 131, NUR 151
Prerequisites or Corequisites: BIO 114, BIO 117, and ENG 101
Credit Hours:
Lecture: 1
Lab: 0
Expanded Description / Course Focus:
This course introduces the nursing student to the discipline
of nursing and the role of the nurse in a diverse health care system. This course introduces the student to
the mission, philosophy, and outcomes of the WVNCC nursing program while identifying the elements
of the organizational framework of the nursing curriculum. A brief history of nursing and the forces that
shape nursing practice are discussed. The characteristics of the professional nurse are identified. The
steps of the nursing process are discussed, developed, and utilized in a nursing plan of care for a client.
Students will initiate and develop inter-campus discussion pertaining to aspects of the nursing profession
via WebCT.
Text Information: Available in Bookstore
Student Learning Outcomes: The following list of student learning outcomes will be achieved at the
successful completion of the course.
1. Describe the mission, philosophy, and outcomes of the WVNCC nursing curriculum.
2. Define the concepts of the nursing curriculum.
3. Identify the role of the nurse in the health care delivery system.
4. Explain the steps of the nursing process.
5. Develop a teaching learning plan for a client in an acute or long-term care facility.
6. Establish inter-campus discussion about the nursing profession.
Student Learning Performance Objectives: The following list of student learning performance
objectives will be addressed in the course.
1. Describe introduction to nursing.
2. Identify course objectives.
3. Identify grading and attendance policies.
4. Describe nursing.
5. Describe the role and function of nursing.
6. Identify ANA’s definition of nursing.
7. Explain the mission, philosophy, framework, and outcomes of the WVNCC Nursing program.
8. Identify caring behaviors of the professional nurse.
9. Describe the historical impact of nursing leaders on nursing.
10. Describe the theoretical perspective identified in various types of nursing education.
11. Identify major nursing organizations and publications.
12. Define five ethical principles and how they relate to nursing practice.
13. Define the ANA Code of Ethics.
14. Identify essential values for nursing practice.
15. Explain the legal boundaries of nursing.
16. Describe the listed sources of the law.
17. Describe intentional and unintentional torts.
18. Explain the nursing student’s role in the practice of nursing.
19. Define standards of care.
20. Identify the levels of the health care delivery systems.
21. Identify the services provided by the different levels of care.
22. Discuss the issues that influence health maintenance.
23. Describe the role and competencies of the nurse in community-based practice.
24. Explain the characteristics of clients from vulnerable populations.
25. Identify nursing interventions to achieve culturally congruent care.
26. Identify the purpose and role of the nurse in client education.
27. Describe ways to incorporate teaching with routine nursing care.
28. Discuss the nurse’s responsibility and critical thinking attitudes used in clinical decision making.
29. Explain the relationship between clinical experience and critical thinking.
30. Define the nursing process.
31. Discuss the purpose of nursing assessment.
32. Describe three qualities required to be competent in using the nursing process.
33. Explain the types of data that can be revealed from various interviewing techniques.
34. Explain the relationship between data interpretation, validation, and clustering.
35. Explain why organizing data according to more than one method promotes critical thinking.
36. Define the term nursing diagnosis.
37. Utilize critical thinking to identify the problem, define the characteristics and the etiology in a nursing diagnosis.
38. Identify errors to avoid when writing nursing diagnosis statements.
39. Differentiate between a nursing diagnosis and a medical diagnosis.
40. Identify how you will set priorities in the clinical setting.
41. Describe why specific, measurable outcomes are the key to efficient planning.
42. Explain the importance of considering types of outcomes.
43. Develop a comprehensive plan of care.
44. Select appropriate implementation methods to achieve client goals.
45. Discuss how you will set priorities and delegate care.
46. Describe how evaluation leads to continuation, modification, revision, and discontinuation of the plan of care.
47. Define methods of using the nursing process.
Attachment 4.2
WEST VIRGINIA NORTHERN COMMUNITY COLLEGE
DEPARTMENT of MATHEMATICS
ASSESSMENT REPORT
MATH 204: MATHEMATICS FOR TEACHERS I (K-9)
FALL 2006
WEST VIRGINIA NORTHERN COMMUNITY COLLEGE
DEPARTMENT of MATHEMATICS
ASSESSMENT REPORT
COURSE: MATH 204
SEMESTER: FALL 2006
Number of Students Participating = 43

QUESTION   MCG SLPO   Full Credit           Partial Credit        No Credit
                      Number    Percent     Number    Percent     Number    Percent
1a         1.4        30        69.77%      13        30.23%       0         0.00%
1b         1.4        19        44.19%      24        55.81%       0         0.00%
2          8.1        15        34.88%      17        39.53%      11        25.58%
3          2.5        21        48.84%      14        32.56%       8        18.60%
4          3.2         9        20.93%      11        25.58%      23        53.49%
5          5.3        25        58.14%       0         0.00%      18        41.86%
6a         7.1        25        58.14%       0         0.00%      18        41.86%
6b         7.1        39        90.70%       0         0.00%       4         9.30%
6c         7.1        33        76.74%       0         0.00%      10        23.26%
7          6.3        15        34.88%      14        32.56%      14        32.56%
8          6.7        23        53.49%       3         6.98%      17        39.53%
9          7.5        32        74.42%       0         0.00%      11        25.58%
10         8.4        26        60.47%      11        25.58%       6        13.95%
11         8.15       20        46.51%      18        41.86%       5        11.63%
12         8.18       30        69.77%       5        11.63%       8        18.60%
WEST VIRGINIA NORTHERN COMMUNITY COLLEGE
DEPARTMENT of MATHEMATICS
ASSESSMENT REPORT
COURSE: MATH 204
SEMESTER: FALL 2006
Number of Students Participating in the Assessment: 43
COMMENTS:
The Math 204 Assessment consisted of 12 problems with a total of 15 questions. These
questions were a common part of the final exam for each of the Math 204 sections on all
campuses. There was one Math 204 section on each campus and it was taught by a full-time
faculty member. A total of 43 students participated in the assessment.
At least 74% of the students earned full or partial credit on 10 of the 15 questions; for those
10 questions the combined percentages ranged from 74.41% to 100%. On 8 of those 10 questions,
more students earned full credit than partial credit. Four of the 15 questions had between
58.14% and 67.44% of the students earning full or partial credit, with the remaining 32.56% to
41.86% of the students earning no credit. One question had only 46.51% of the students earning
full or partial credit, with the remaining 53.49% earning no credit. The 15 questions were the
same 15 questions used in the 2004 assessment of Math 204, allowing for a comparison of results.
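Each percentage in the chart above is simply the corresponding count of students divided by the 43
participants. The short sketch below is an illustrative example only (it is not part of the original
report); the counts are taken from the chart, and it shows how the category percentages and the
combined full-or-partial figure cited in these comments are obtained.

# Counts of students earning full, partial, and no credit on three of the
# MATH 204 assessment questions (43 students participated); values taken
# from the chart above.
counts = {
    "1a": (30, 13, 0),
    "4": (9, 11, 23),
    "8": (23, 3, 17),
}
TOTAL_STUDENTS = 43

for question, (full, partial, none) in counts.items():
    # Percentage of the 43 participants in each credit category.
    full_pct = 100 * full / TOTAL_STUDENTS
    partial_pct = 100 * partial / TOTAL_STUDENTS
    none_pct = 100 * none / TOTAL_STUDENTS
    # Combined full-or-partial percentage used in the comments above.
    combined_pct = full_pct + partial_pct
    print(f"Question {question}: full {full_pct:.2f}%, partial {partial_pct:.2f}%, "
          f"none {none_pct:.2f}%, full or partial {combined_pct:.2f}%")

For example, question 1a works out to 30/43 = 69.77% full credit and 13/43 = 30.23% partial credit,
for a combined 100%, which matches the chart and the discussion above.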
RECOMMENDATIONS:
Since at least 74% of the students earned full credit or partial credit on 10 of the 15 questions,
no recommendation is necessary for those 10 questions. The MCG SLPOs for the other 5
questions should have more emphasis placed on them and those MCG SLPOs should be
evaluated again the next time MATH 204 is scheduled to be assessed. The 5 MCG SLPOs
needing more emphasis in the MATH 204 classes are SLPOs 3.2, 5.3, 6.3, 6.7, and 7.1, which
correspond to assessment questions 4, 5, 7, 8, and 6a, respectively.
The specific MCG SLPOs needing more emphasis are:
SLPO 3.2: use various function representations
SLPO 5.3: perform operations using various number bases
SLPO 6.3: apply divisibility rules
SLPO 6.7: find least common multiple
SLPO 7.1: identify whole number operation properties (The particular property needing
emphasis is the closure property. The students did a good job with questions about the
identity and commutative properties.)
COMPARISON OF 2006 AND 2004 ASSESSMENT RESULTS:
Three of the 2006 SLPOs (3.2, 5.3, 7.1) that were identified as needing more emphasis were
also outcomes identified in the 2004 assessment. The percent of students earning full or
partial credit on each of these three SLPOs in this assessment is within 2% of the
corresponding 2004 percent, so little progress was made in improving performance on these
three SLPOs.
SLPO 6.3 had 67.44% of the students get full or partial credit in 2006 and in 2004 the percent
was 67.86%. The reason that SLPO 6.3 was identified as needing more emphasis in the 2006
assessment is that its percent differed considerably from the 74% and above that the top 10
SLPOs had achieved. The percent of students getting full or partial credit on SLPO 6.7
dropped from 67.86% in 2004 to 60.47% in 2006. Because of the drop, SLPO 6.7 was
included in the 2006 list needing more emphasis.
There was significant improvement since 2004 in two of the SLPOs. In 2004 SLPO 2.5 (use
Venn diagrams) had only 51.79% of the students earn full or partial credit on the question
compared to 81.40% in 2006. Likewise, in 2004 SLPO 8.1 (use whole number operation
algorithms) had only 55.36% of the students earn full or partial credit compared to 74.41% in
2006. The 2004 recommendation of more emphasis for these two SLPOs seems to have
produced good results.
Shirley Rychlicki
Faculty Signature
2/22/07
Date
Attachment 4.3
NURSING PROGRAM
REVIEW
SPRING 2007
Submitted by:
Linda Jo Shelek
Director of Nursing
Nursing Program Review
Compiled by:
Linda Jo Shelek, RN, MSN, BC-NP
Professor/ Director of Nursing
Previous Formal Review: 1998
Accreditation Review:
NLNAC 2007 – presented at the April meeting for review
NLNAC 1999
Focused Report 2002
WV State Board of Nursing 2002
Annual Reports available for Board of Governors
Faculty:
Rita Berry
Donna Hans
JeanneAnn VanFossan
Jill Keyser
Michele Watson
Arlene Kuca
Cris Riter
Danielle Bartley
Lynn Miller
Saundra Sue Huggins
Barbara Sisarcick
Campus:
Wheeling: 1st and 2nd level
Weirton: 1st and 2nd level
New Martinsville: 1st level
Programs:
Traditional
Advanced Placement: Transfer, LPN-RN
Students:
Current Fulltime: 150 - 175
Statistical data presented in chart for previous five years.
Program implementations and changes presented in chart for previous five years.
- 2006 Annual Report to WV State Board of Nursing available upon request and at BOG Meeting.
- 2007 NLNAC Self-Study available upon request and at BOG Meeting.
Admission Year / Graduate Year: Applicants; Qualified Applicants; Accepted; Admitted; 1st Day of Class; Cohort Graduates; Cohort Retention Rate

2007 / 2009:
2006 / 2008: 249 applicants; 123 qualified; 123 accepted; 114 admitted on the 1st day of class; 78 in the cohort at the end of the 1st year; 68% cohort retention rate to date
2005 / 2007: 286 applicants; 218 qualified; 160 accepted; 136 admitted; 129 on the 1st day of class; 68 cohort graduates; 53% cohort retention rate
2004 / 2006: 283 applicants; 200 qualified; 151 accepted; 135 admitted; 133 on the 1st day of class; 62 cohort graduates; 47% cohort retention rate
2003 / 2005: 315 applicants; 197 qualified; 159 accepted; 137 admitted; 127 on the 1st day of class; 61 cohort graduates; 48% cohort retention rate
2002 / 2004, 2001 / 2003, 2000 / 2002, 1999 / 2001, 1998 / 2000, 1997 / 1999: the chart lists 250, 144, 125, 122, 113, 53, 54, 48, and 47% for these cohorts

NCLEX Pass Rate (as listed in the chart): 97%, 95%, 94%, 89%, 80%, 77%, 83%, 75%
Curriculum Changes, Admission Changes, Graduation Changes, and Program Review by cohort (Admission Year / Graduate Year):

Admission Year 2007 (Graduate Year 2009)
Admission Changes: Developmental Math placement scores increased.

Admission Year 2006 (Graduate Year 2008)
Admission Changes: NLN-PAX score greater than 96 composite.
Program Review: NLNAC Accreditation Visit; WVSBRN Annual Report accepted.

Admission Year 2005 (Graduate Year 2007)
Curriculum Changes: Added Medical Surgical Theory content to the first semester; combination of two Medical Surgical courses in the second semester; practice time for the Skills Course built into credit hours in the first semester.
Admission Changes: Certified background checks; NLN-PAX for evaluation only.
Program Review: WVSBRN Annual Report accepted.

Admission Year 2004 (Graduate Year 2006)
Curriculum Changes: Spring 2005, MedsPublishing available via internet for increased student availability.
Admission Changes: Pre-Nursing Advising / advisors employed; NLN-PAX for evaluation only.
Program Review: WVSBRN Annual Report accepted.

Admission Year 2003 (Graduate Year 2005)
Admission Changes: Pre-Nursing Information Sessions; criteria of GPA 2.5 and all developmental courses complete.
Graduation Changes: Required NCLEX Live Review prior to graduation; required passage of the NLN Pre-Licensure Exam in the exit capstone course for the final grade.
Program Review: WVSBRN Annual Report accepted and Provisional Status lifted.

Admission Year 2002 (Graduate Year 2004)
Curriculum Changes: Fall 2003, contracted with MedsPublishing Inc. to provide an NCLEX mentoring program via institution computer programs from semester 1 through 4.
Program Review: WVSBRN Annual Report accepted with Progress Report.

Admission Year 2001 (Graduate Year 2003)
Graduation Changes: Required NCLEX Live Review after graduation.
Program Review: WVSBRN Annual Report accepted with Progress Report.

Admission Year 2000 (Graduate Year 2002)
Graduation Changes: Optional NCLEX Review by outside company.
Program Review: NLNAC Focused Report; WVSBRN Accreditation Visit; WVSBRN Annual Report accepted and program placed on Provisional Accreditation for NCLEX pass rates below the required acceptable state level.

Admission Year 1999 (Graduate Year 2001)
Graduation Changes: Optional NCLEX Review by faculty.
Program Review: WVSBRN Annual Report accepted.

Admission Year 1998 (Graduate Year 2000)
Curriculum Changes: Fully implemented curriculum.
Program Review: WVSBRN Annual Report accepted.

Admission Year 1997 (Graduate Year 1999)
Curriculum Changes: Major curriculum change from large block courses to multiple smaller credit hour courses.
Admission Changes: Many options of admission criteria were utilized (GPA, ACT, ASSET, prior courses both college and HS).
Program Review: NLNAC Accreditation Visit; WVSBRN Annual Report accepted.
Viability

Indicator: Statistics Data
Data: Applicants, Admitted, and Graduates…
Indicator: Agreements
Data: WLSC; JDRockefeller Center for LPN; most all 4-year institutions hav…
Indicator: Alternative Methods of Instruction
Data: Offer 12-hour clinical days; have offered Saturday clinicals; incorporating WebCT, IPVideo…
Indicator: Program Costs
Data: Faculty salary – there are 4 fac…; Supplies – students purchase…; Equipment – utilize Perkins money…
Future Plans:
- Continue to admit qualified students at capacity of program.
- Work with BMSpurr for LPN to RN Transition admission students.
- Revisit Saturday clinicals.
- Research viability of previous-degree student application for admission to a fast track nursing curriculum. Student applicants must be highly motivated, with completion of all core or general education required courses at a higher GPA than required for traditional admission.
- Approach facilities to support salaries of clinical instructors.
- Monitor usage of supplies.

Adequacy

Indicator: External Accreditation
Data: NLNAC Accreditation 2007 – continued accreditation for 8 years with a 2-year focused report on the Self Evaluation Plan. WV State Board of Nursing – continued annual accreditation with report; a focused visit is always TBA.

Indicator: Licensure Pass Rate
Data: WV State Board of Nursing requirement: 80% of students pass the NCLEX exam as first-time test takers.
Future Plans: Licensure pass rates have been steadily increasing; the goal is to maintain the pass rate at the national level. 2006 was at 87%.

Indicator: WorkKeys Data
Data: All graduates complete. Previous years were not required for transcript release.
Future Plans: Inform students that it is the college’s privilege to hold the transcript if WorkKeys is not completed.

Indicator: NLN PreLicensure Exam
Data: 80% likelihood to pass NCLEX. 2006: 94% of students had greater than 80% likelihood to pass the NCLEX Exam on the first attempt.

Indicator: Transfer Success
Data: This is not well developed.
Future Plans: Will continue to discuss avenues to increase capture of this information.

Indicator: Student Satisfaction Survey
Data: The college survey does not give specific program information.
Future Plans: Development of a student survey of satisfaction with the program; development of an employer survey of satisfaction with student knowledge within the employment setting.

Indicator: Assessment Results
Data: Rotation of course assessment; annual program assessment on identified weaknesses; student continuing education.
Future Plans: Course assessment completed with revision of MCGs. Program assessment led to changes related to NCLEX preparation throughout the program and end-of-program review, curriculum changes, admission selection criteria changes, and pre-nursing advisors.

Indicator: Improvements
Data: NCLEX Review; NLN Pre-Licensure Exam; developmental courses completed or increased ASSET score; NLN PAX exam >96; Pre-Nursing Advising; Summer Orientation; course curriculum changes; items that focused on improvi…; items that focus on retention; Nursing Pinning is one unified…
Future Plans: Continue the Pinning Ceremony and review the potential to hold it prior to the graduation ceremony.

Indicator: Advisory Committee
Data: Annual Advisory Meetings.
Future Plans: Continue meetings focused on the Nursing Program. General college advisory meetings are not productive for the program; they are an excellent way to thank the community but are not able to focus on the program.

Indicator: Faculty
Data: Faculty listed above. For the academic year 2006-2007 there are twelve (12) full-time faculty members, including the Director of the Associate Degree Nursing Program. Nine (9) of the full-time faculty have a masters degree in nursing. Four (4) of the MSN faculty also have a second masters degree in Community Health Promotion. Two (2) MSN faculty members are Family Nurse Practitioners with prescriptive authority. Regionally and statewide, there is a shortage of master’s-prepared nurses. For this reason WVNCC has three (3) faculty members (BSN and experientially qualified) currently enrolled and making consistent progress in masters degree programs. The program has adjunct faculty who have been with us for 3-13 years and full-time faculty with up to 25 years of commitment. We have had faculty turnover for reasons of retirement and return to the workforce due to salary. It is extremely difficult to entice faculty to join and stay with us when the salary in the industry for a new graduate is more than that of an Instructor. Most full-time faculty are in overload contracts or external employment to supplement income and remain current in practice.

Indicator: Facilities
Data: Campus lab space and equipment sufficient to prepare students for clinical experience; clinical sites with patients and facilities to support theory education. 2006: Wheeling moved to the new Educational Center and Nursing Arts Lab (310); replacement of various items such as beds, cupboards to contain supplies, new computers, VitaSims manikin, and medication administration carts on all three campuses. 2005: New Martinsville Nursing Arts Lab moved across the hall to a larger room; Weirton installation of computers and tech equipment in the campus lab.
Future Plans: Maintain adequate areas of practice: Wheeling Medical Park, Ohio Valley Medical Center, East Ohio Regional Hospital, Reynolds Memorial Hospital, Weirton Medical Center, Wetzel County Hospital, Bishop Hodges Continuous Care Center, Weirton Geriatric Center, Northwood Health Systems, Ohio & Marshall County Schools, New Martinsville Health Care Center, Belmont Community Hospital, Wheeling Health Right. Maintenance of all campus labs with adequate…; order VitaSim Mannequin for New Martinsville.

Indicator: Equipment
Data: See above. Availability and utilization of Perkins money for student needs in education.
Future Plans: Maintain a faculty-driven list that is available for the college and division chair to order needed equipment.

Indicator: Technology Support
Data: Campus lab and classroom computers, faculty computers, and support for problems are available and working at capacity.
Future Plans: Working IT continues to be available for support and problems. IPVideo faculty usage and enhanced availability of student usage will be encouraged within the program and for future optimal, distance, and alternate format education of nurses.

Indicator: LRC Support
Data: Availability of all services at each campus; online availability of services.
Future Plans: Continue encouragement of student usage.

Necessity

Indicator: Placement
Data: Graduates who have passed the NCLEX Exam will be employed within 1 year.
Future Plans: All students who have passed boards are employed if they so desire. A Health Care Job Fair organized by the Health Science Division was not supported by employers; we will develop and promote it early in Fall and collectively choose a date.

Indicator: Employment Outlook
Data: Employers request to speak with our students for positions within their facilities. DONs at local hospitals verified to NLNAC at the accreditation visit that they actively recruit our graduates.
Future Plans: Employment in the area, or minimally distant, is sufficient to support the number of students on all three campuses.

Indicator: Advisory Committee Input
Data: 2006: suggestions of employe…; 2005: admission changes ben…; 2004: support of curriculum c…
Future Plans: Ask for advice to collect information from employers related to student capabilities or skills after education. Continue to request Advisory Committee input annually.

Consistency with Mission

Indicator: Relation to mission / strategic goals
Data: Nursing Program Mission and Goals reflect the WVNCC Mission and nine (9) new Strategic Goals.
Future Plans: NLNAC has requested that we “update” current program and course goals within the next two years. The Nursing Faculty will align the revised goals to the WVNCC Strategic Goals. The Nursing Faculty will need to be afforded the time to adequately review and revise goals above current position description requirements.

Indicator: Relation to other programs
Data: The Nursing Program is one of many Health Science Programs that support one another educationally and functionally. The Nursing Program supports the Science and Allied Health courses within the institution.
Future Plans: We work to promote practice requirements in facilities, support of programs, encouragement of student support within the institution, and celebration of accomplishments.

Indicator: Effect of discontinuation
Data: The community would be at great loss of nurses. WVNCC has the largest graduating class in the area. Students are local and stay within the community to work. The Nursing Program has the largest number of current and pre-major students in the college community.
Future Plans: Continue to provide nursing education in the Wheeling, Weirton, and New Martinsville area.

Summary

Strengths:
- NLNAC and State Board…
- Quality faculty
- Quality campus lab env…
- Quality clinical facilities
- Support of institution and…
- Ability to provide increas…

Concerns:
- Ability to expand withou…
- Ability to focus on progra…
- Ability to expand technol…
- Ability to maintain and e…

Recommendations for Improvement:
- Increase full-time Nursing…
- Release time for Nursing…
- Continue to promote tech…
- Increase equal secretary s…
- Expansion of program fo…
- Expansion of LPN-RN co…
Attachment 4.4
WEST VIRGINIA NORTHERN COMMUNITY COLLEGE
PROGRAM OBJECTIVES
ASSESSMENT MATRIX
INDUSTRIAL MAINTENANCE TECHNOLOGY
OCTOBER 12, 2006
Courses in the matrix: EL 112, IMT 100, MATH 100, RAH 100, RAH 206, ENG 101, ENG 115, IMT 205, PSYC 155, RAH 209, RAH 211

Program Outcomes (each outcome is marked with an X under the courses that support it):
- Communicate effectively with other plumbers and pipe-fitters
- Communicate effectively in oral and written formats
- Demonstrate threading and sizing pipe
- Demonstrate preventive maintenance
- Employ mathematics literacy skills
- Obtain gainful employment as a plumber and pipe-fitter
- Utilize team building skills
- Demonstrate welding skills
- Demonstrate electric skills
- Acquire the EPA Refrigerant Certification
- Recognize the importance of lifelong learning
Attachment 4.5
Assessment Committee
Peer Review: Course Assessment Report
Course Number and Title:
Course Assessment Report Submitted By:
Date of Assessment Report:
Peer Reviewer Name:
Date of Peer Review:
Reviewer Instructions: Please use the following guidelines to evaluate each course assessment report. Identify the level of development for each
of the four components of the report. Select only one level of development for each plan component. Include rationale and comments for
appraisal.
Peer Review Report Components/Guidelines and Levels of Development (Undeveloped / Developing / Established)

Student Learning Outcomes (SLOs) Assessed: Identification of Outcomes, Tool(s) and Performance Indicators
- Are SLOs clearly stated in terms of student learning?
- Are SLOs written in terms of what students know or are able to do?
- Are SLOs written in a measurable format?
Undeveloped: No specific SLOs listed. Evaluator Comments:
Developing: SLOs stated but measurement not defined. Evaluator Comments:
Established: SLOs clearly stated in measurable format. Evaluator Comments:

Method of Assessment/Data Collection
- Are methods to assess SLOs clearly described?
- Are methods of assessment appropriate for measuring the selected SLOs?
- Are multiple measures (direct and indirect) used to assess SLOs?
Undeveloped: Method(s) of assessment and plan for implementation not defined. Evaluator Comments:
Developing: Assessment methods or procedures for implementation are partially developed or do not clearly support assessment of selected SLOs. Evaluator Comments:
Established: Methods of assessment and procedures for implementation are well developed. Evaluator Comments:

Assessment Results: Data Summarization and Analysis
- Does analysis include description of sample used for assessment?
- Does summary of results address measures described in method of assessment?
- Are results provided for each SLO selected for assessment cycle?
- Are results compared to earlier assessments (if data is available)?
Undeveloped: No analysis of data supporting SLOs. Evaluator Comments:
Developing: Incomplete analysis of selected SLOs; not all planned activities analyzed. Evaluator Comments:
Established: Comprehensive analysis of stated SLOs. Evaluator Comments:

Recommendations/Action Plan
- Does report include recommendation(s) regarding selected SLOs?
- Are curriculum modifications/plan of action recommended based on results of assessment activities?
- Does data clearly support recommendations provided?
- When available, does assessment report build on previous assessment activities?
- Is there evidence that data is shared with other faculty (feedback loop evident in recommendation)?
Undeveloped: No recommendation given or recommendation does not appear to be supported by data. Evaluator Comments:
Developing: Recommendation or action plan is identified, but not all recommendations appear to be supported by assessment data provided. Link to data is implied but not clearly stated. Evaluator Comments:
Established: Recommendation or action plan demonstrates use of assessment data for improvement of the course. Evaluator Comments:
Appendix B
List of References cited in Focus Visit Report (Documents Available in the Resource Room)
Reference Code
Document Title
Chapter 2
REF 2.1
Faculty Senate Website http://www.northern.wvnet.edu/~tdanford/senate/
REF 2.2
Assessment website: Assessment Plan, Appendix F, Master Course Guide Requirements.
http://www.wvnorthern.edu/acadaffairs/academic%20assessment/
REF 2.3
Assessment Plan, Appendix H, Division Assessment Compilation Report
http://www.wvnorthern.edu/acadaffairs/academic%20assessment/plan%20for%20assessment%20of%20
student%20learning/
Chapter 3
REF 3.1
Assessment Plan, Goals, page 6
REF 3.2
Strategic Plan, Goal 2
REF 3.3
Strategic Plan Brochure, Pledge to Students
REF 3.4
Assessment Plan “Student as a Developing Learner”, page 12
REF 3.5
Master Course Guides, accessible on Assessment website,
http://www.wvnorthern.edu/acadaffairs/academic%20assessment/master%20course%20guides/
REF 3.6
Assessment Plan, Appendix F, Master Course Guide Requirements, page 29 or
http://www.wvnorthern.edu/acadaffairs/academic%20assessment/plan%20for%20assessment%20of%20
student%20learning/
REF 3.7
Model for Assessment of Student Learning and Evaluation of Institutional Effectiveness, Assessment
Plan, page 15 or
http://www.wvnorthern.edu/acadaffairs/academic%20assessment/plan%20for%20assessment%20of%20student%20learning/
REF 3.8
Assessment Plan, Appendix H, Division Assessment Compilation Report
http://www.wvnorthern.edu/acadaffairs/academic%20assessment/plan%20for%20assessment%20of%20
student%20learning/
REF 3.9
WVCCTCE Series 10, Policy Regarding Program Review
REF 3.10
Assessment Plan, Appendix A, Program Objectives/Program Matrix or
http://www.wvnorthern.edu/acadaffairs/academic%20assessment/plan%20for%20assessment%20of%20student%20learning/
REF 3.11
Data elements for Program Review, Draft Document
REF 3.12
WorkKeys Standards and Measures, WVCCTCE Chart
REF 3.13
Annual Academic Assessment Cycle, Assessment Plan, Page 17
http://www.wvnorthern.edu/acadaffairs/academic%20assessment/plan%20for%20assessment%20of%20
student%20learning/
REF 3.14
Model for Assessment of Student Learning and Evaluation of Institutional Effectiveness, Assessment
Plan, page 15 or
http://www.wvnorthern.edu/acadaffairs/academic%20assessment/plan%20for%20assessment%20of%20student%20learning/
REF 3.15
Institutional Effectiveness Web Site http://www.wvnorthern.edu/ie/ and Institutional Research Website
http://www.wvnorthern.edu/business/ir/
Chapter 4
REF 4.1
Assessment Plan, Appendix F, Master Course Guide Requirements, page 29 or
http://www.wvnorthern.edu/acadaffairs/academic%20assessment/plan%20for%20assessment%20of%20
student%20learning/
REF 4.2
Assessment Webpage, Master Course Guides
http://www.wvnorthern.edu/acadaffairs/academic%20assessment/master%20course%20guides/
REF 4.3
Course Level Assessment Reports
REF 4.4
WVCCTCE Series 10, Policy Regarding Program Review
REF 4.5
Data elements for Program Review, Draft Document
REF 4.6
Nursing Program Assessment Report, Pilot for new format
REF 4.7
Program Assessment/ Program Reviews
REF 4.8
Reports, Institutional Research Website http://www.wvnorthern.edu/business/ir/
REF 4.9
Assessment Plan, Appendix A and B, pages 23 - 24 or
http://www.wvnorthern.edu/acadaffairs/academic%20assessment/plan%20for%20assessment%20of%20
student%20learning/
REF 4.10
Program Outcomes Matrix, Completed matrices for current degree programs
REF 4.11
Peer Review Rubric; minutes from August 2, 2007 meeting
REF 4.12
Materials from Professional Development Sessions, List of professional Development Sessions related
to assessment
REF 4.13
BOG Agenda/Minutes
REF 4.14
Strategic Goal 2
Chapter 5
REF 5.1
Faculty Assessment Survey
REF 5.2
Assessment Requirement, College Catalog
Appendix C
Assessment Web Page Contents
The Assessment web page is located in the “Academics” pull-down menu.
You may log on using the following:
Logging on provides access to restricted locations.
Academics
Academic Assessment Link includes:
- Academic Reports (currently under construction while we transition to electronic format)
- Assessment Committee (list of committee members for 2007-08)
- Master Course Guides (courses grouped by subject area)
- Plan for Assessment of Student Learning (includes Assessment Plan, Annual Academic Assessment Cycle, appendices and report forms)
- Appendix A - Program Objectives Assessment Matrix
- Appendix B - General Education Core Outcomes Assessment Matrix
- Appendix C - General Education Core Outcomes
- Appendix D - Assessment Proposal Form
- Appendix E - Assessment Report Form
- Appendix F - Master Course Guide Requirements
- Appendix G - Master Course Guide
- Appendix H - Division Assessment Compilation Report
- Master Course Guide Archive (record of previous MCGs)
- Assessment Resources and Web Pages (active links to assessment materials and organizations supporting academic assessment activities)