THE UNIVERSITY OF TEXAS OF THE PERMIAN BASIN
INSTITUTIONAL EFFECTIVENESS HANDBOOK

Approved by the Assessment Review Committee
April 29, 2008
The Handbook is designed to assist instructional, administrative and support offices and
departments with developing and implementing institutional effectiveness processes that improve
student learning, support services, institutional activities, policies and procedures. As a basic
reference for faculty and staff members, it provides information about the development of useful
and meaningful mission statements, goals, objectives, and outcome statements, and the
selection of appropriate assessment methods in order to obtain and use the results of evaluation.
The Assessment Review Committee for 2007-2008, whose membership is shown in Appendix A,
was instrumental in the development of this document. Without them this Handbook would not
have been possible. In addition, the Handbook has benefited tremendously from the work of
colleagues at many other institutions, and their assistance is gratefully acknowledged. A listing of
helpful handbooks and manuals is shown in Appendix B. Comments and suggestions for
improvements are welcome and should be sent to Denise Watts in the Office of Institutional
Research, Planning and Effectiveness at watts_de@utpb.edu.
TABLE OF CONTENTS
Introduction
Characteristics of Institutional Effectiveness
Planning
    Strategic Planning
    The Compact with the University of Texas System
    University Budget Hearings
    Program Review and Disciplinary Accreditation
    Institutional Effectiveness Process
    Steps in the Planning Process
Assessment
    Common Misconceptions Associated with Assessment
    Some Important Philosophical Guidelines
    Good Practice for Assessing Student Learning
    Assessing Student Learning
    Preparing the Institutional Effectiveness Plan
    Student Learning Goals, Objectives and Outcome Statements
    Assessing General Education Outcomes
    Administrative and Support Outcomes
    Choosing an Assessment Method
    Common Assessment Methods
    Some Final Thoughts
Bibliography
Appendices
    A. Assessment Review Committee 2007-2008
    B. Helpful Handbooks, Manuals, and Wisdom from Other Institutions
    C. Budget and Planning Committee
    D. Guidelines and Instructions for Institutional Effectiveness Reporting
    E. Examples of Correct and Incorrect Student Learning Outcomes
    F. Action Verbs for Writing Outcome Statements
    G. Core Curriculum: Assumptions and Defining Characteristics
    H. General Education Results Form
    I. General Education Results Form Example
    J. Examples of Correct and Incorrect Administrative and Support Office Outcomes
    K. Institutional Effectiveness Forms Rubric
    L. Further Reading
    M. Reporting Departments/Offices and Degree Programs
INTRODUCTION
At the most basic level, planning is the process of thinking about and deciding how to
approach the future. Assessment is the process of understanding the level of performance
attained. When formalized, institutional effectiveness involves identifying the future’s most
significant forces and finding ways to mitigate the damage or enhance the impact while
simultaneously attaining the unit’s goals in the anticipated world defined by the planning
horizon. Uncertainty and risk always exist, and longer planning horizons involve greater
elements of both. When the margins for error are slim, the need to plan is greatest. When
times are tough, the need to be constantly gathering evidence about how well the plan is
working and where the next problematic force or event is likely to arise is most intense. Even
with its formidable downsides, thinking about ways to shape the future is worth the time and
effort involved. In anticipating the future we also anticipate a way forward and the
contingencies involved in taking the path.
Unfortunately, the world is not static and planning is never perfect, so an important attribute of
most planning systems is that they are forgiving. Plans need to be revisited often and revised
frequently, but always with the goals in sight and always in cooperation with those who are
expected to help put the plans into motion. No group on campus lacks goals, although they
may differ from person to person, especially when they remain unarticulated. Not to plan
makes the department a victim; not to revisit the plan often makes it a fool. With these ideas
in mind, planning at all levels of the University makes sense. At UTPB, planning and
assessment take place throughout the organization, and it is hoped that this Handbook can
help in the effort to do it well.
As a regional accrediting group, SACS is charged by the member colleges and universities to help improve institutions of higher education. SACS has also been concerned with helping member schools articulate their goals, devise plans, and develop information useful to the pursuit of those goals. Since 1984, when the College Delegate Assembly first voted to replace the Standards of the College Delegate Assembly with the Criteria for Accreditation, institutional effectiveness has been a centerpiece of the accreditation process; all institutions were expected to be in complete compliance with the institutional effectiveness standards by 1991. When the College Delegate Assembly approved the Principles of Accreditation in 2001, institutional effectiveness remained an important element of gaining and retaining accreditation. With the Principles, SACS has moved institutional effectiveness from the focus of accreditation to a major compliance element.
Institutional effectiveness is a phrase devised by SACS to encompass both planning and
assessment. Core Requirement 2.5 of The Principles of Accreditation: Foundations for Quality
Enhancement is focused on institutional effectiveness. Since it is a core requirement, it is
one of “the basic expectations of candidate and member institutions.” (Commission on
Colleges of the Southern Association of Colleges and Schools, 2008: 9). According to Core
Requirement 2.5, “The institution engages in ongoing, integrated, and institution-wide
research-based planning and evaluation processes that incorporate a systematic review of
programs and services that (a) results in continuing improvement and (b) demonstrates that
the institution is effectively accomplishing its mission.” (Commission on Colleges of the
Southern Association of Colleges and Schools, 2008: 10). According to the Resource
Manual for the Principles of Accreditation: Foundations for Quality Enhancement, the
institutional effectiveness process in an institution must be a “systematic, explicit, and
documented process of measuring performance against mission;” the process is “continuous,
cyclical . . . participative, flexible, relevant and responsive;” and it includes “all programs,
services, and constituencies and is strongly linked to the decision-making process at all levels, including the institution's budgeting process" (Commission on Colleges of the Southern Association of Colleges and Schools, 2005: 9).
Institutional effectiveness at the University of Texas of the Permian Basin includes three major elements: 1) the institutional planning process, which includes the University Strategic Plan, the University Compact with The University of Texas System, and the University Budget Hearings; 2) program review and disciplinary accreditation; and 3) the departmental/office institutional effectiveness process that supports both planning and the assessment of student learning, administrative, and support service outcomes. These process elements provide the major impetus for systematic and continuous improvement at the institution.
[Diagram: the continuous improvement cycle, linking Strategic Planning, the UTPB Compact with The University of Texas System, Budget Hearings, Program Review or Disciplinary Accreditation, the Academic and the Administrative and Support Department/Office Institutional Effectiveness Processes, and Unit Compacts.]
CHARACTERISTICS OF INSTITUTIONAL EFFECTIVENESS
Effective planning and assessment are characterized by a number of design and process
elements. The institutional effectiveness process is
Mission-centered: The institutional effectiveness system is designed to demonstrate that every
institutional component including divisions, colleges, schools, departments and offices is helping
to realize the mission of the University while successfully accomplishing its own mission.
Improvement-oriented: In addition to being centered on mission, planning and assessment must
be clearly focused on continuous improvement in each unit and throughout the University. It
should be clear that outcomes are evaluated and the results used to improve the level of student
learning and the effectiveness and efficiency of offices and programs.
Participative: Planning and assessment are shared responsibilities that extend to faculty and
staff involved in the programs and activities to be evaluated. Planning and assessment work best
when participation is broadly based.
On-going: Planning and evaluation are not one-time events. Institutional effectiveness is
regularly scheduled, regularly reviewed, and regularly documented.
Systematic: Planning and assessment are designed to evaluate and improve all elements of the
University through routine goal setting and evaluation of the extent to which both planning and
assessment goals are achieved. Student learning goals for every degree program are set to
ensure that each academic program attains the high goals set for it by program faculty members.
While not every element of the mission and every program goal needs to be evaluated every year, all should be evaluated on a regular schedule.
Integrated: The various planning and assessment processes are interconnected with budget,
with one another, and with institutional decision-making to provide the most productive system
possible. Planning and assessment processes are also integrated with external processes and
reporting systems that affect the University. Such areas of interconnection exist with The
University of Texas System, the Texas Higher Education Coordinating Board (THECB), other
state agencies, the Southern Association of Colleges and Schools, and federal agencies.
(University of Montevallo, 2002: 5-6)
PLANNING
Planning has an extended history at UTPB. The first long-range plan was developed in 1982. In 1991, House Bill 2009, which required every state agency and higher education institution to develop an agency plan every other year, was passed by the 72nd Legislature. The first agency plan was submitted in 1993. Since that time a number of formal planning efforts have been undertaken by the University.
The current planning process at UTPB is shown in the diagram on the next page. As a
component of the University of Texas System and a public university, UTPB must be responsive
to the strategic plans of The University of Texas System and the Texas Higher Education Coordinating Board (THECB). In addition, the University must monitor and plan for national and regional
needs and trends and internal issues.
STRATEGIC PLANNING
Strategic planning addresses the long-term (10-year) planning horizon for the institution. It is
initiated by the Budget and Planning Committee which is chaired by the Provost and Vice
President for Academic Affairs. The Committee roster for 2007-08 is shown in Appendix C. It is
composed of all of the vice presidents, a dean, a department chair, faculty members, staff
members, the Faculty Senate Chair and the Chair of the Student Senate. The Committee uses
information from a number of other planning activities including The University of Texas System
Strategic Plan, the THECB strategic plan, the UTPB budget hearings, and the issues of the nation
and the region in its deliberations. The draft strategic plan is developed by the Committee, and
released for review to the University community through the Administrative Council and via the
University homepage. Comments are incorporated as needed into the final version. Once approved, the Plan is submitted to The University of Texas System for review. The institutional
planning process is shown on the next page.
A number of other issue-oriented plans are also part of the planning activities at UTPB. These
plans are more restricted in scope than the Strategic Plan since they cover a single concern or a cohesive set of concerns. The issue-oriented plans are developed in response to perceived needs and the
requirements of a variety of state agencies and UT System initiatives. These plans include the
Facilities Master Plan, the Distance Education Plan, the Information Resources Plan, the HUB
plan, the Recruitment and Retention Plan, and the Enrollment Management Plan.
The Institutional Effectiveness System at UTPB

[Diagram: the institutional effectiveness system at UTPB, connecting the UTPB Strategic Plan, the UTPB Compact, Budget Hearings, Program Reviews/Accreditation, and the Academic Affairs and Noninstructional Institutional Effectiveness Plans with UT System, the Coordinating Board, other state agencies, regional needs, the THECB Closing the Gaps plan, the Capital Improvement Plan, and the issue-oriented plans in the appendices (Master Plan, Recruitment/Retention Plan, Distance Education Plan, Enrollment Management Plan, Information Resource Plan, HUB Plan).]
THE COMPACT WITH THE UNIVERSITY OF TEXAS SYSTEM
The Compact is a short-range action plan. It is updated annually in August and is based on the
University strategic plan. Planning horizons range from 2 to 5 years. UT System instituted the
development of the Compacts in 2003 as written agreements between the Chancellor of The
University of Texas System and the presidents of each of the component institutions. The
Compacts contain both goals and evaluation measures and document progress on the major
goals of each institution. Prior to being finalized, the UTPB Compact is reviewed by the
University community, and comments are submitted to the Budget and Planning Committee.
Once completed on campus, the Compact is submitted to and reviewed by The University of
Texas System. The completed and approved Compact is available both in the brief form, which is used by The University of Texas System, and in the full-text form on the University's home page at http://www.utpb.edu/utpb_adm/utpbcompact-fulltext.pdf.
UNIVERSITY BUDGET HEARINGS
University Budget Hearings are held over a two-day period each November. Vice presidents and
academic deans present needs, plans, and funding issues from their divisions, colleges, or
schools. These presentations are constructed from budget hearings on the departmental and
office compacts held within the divisions, colleges and schools. All University Budget Hearings
are overseen by the Provost and Vice President for Academic Affairs and the Vice President for
Business Affairs. The President and the Vice President for Student Services attend as many of
the University Budget Hearings as possible. The members of the Budget and Planning
Committee are encouraged to attend as many of the hearings as possible. They are open
meetings and notice of when and where they will be held is distributed by e-mail.
The members of the Budget and Planning Committee use the information gained from the
hearings in the development of the Strategic Plan, the update of the institutional Compact, and in
review of budget priorities. The vice presidents and President also use the information in
budgeting and planning both at the institutional and divisional levels.
PROGRAM REVIEW AND DISCIPLINARY ACCREDITATION
Each unaccredited academic degree program is reviewed by external reviewers as part of the
program review process which is a part of the institutional effectiveness process of the University.
Programs accredited by a disciplinary accrediting association are regularly evaluated by external
reviewers and are therefore not required to participate in the institutional program review process.
Since the programs in the Schools of Business and Education are nationally accredited, program
reviews are undertaken primarily in the College of Arts and Sciences.
Over a five-year period all unaccredited programs are scheduled for review. Degree programs
designated for review conduct and write a self-study using guidelines identified by the Dean of
Arts and Sciences. Reviewers are recommended by the disciplines under review and approved
by the Dean and Provost and Vice President for Academic Affairs. All reviewers for all programs
are brought in and conduct the reviews simultaneously. Thus, an external review team will make
both general remarks that transcend the disciplines being reviewed and remarks uniquely
addressed to the particular degree program for which each reviewer is responsible. The external
reviewers prepare a written report to which the degree programs reply in writing.
Recommendations accepted from the program reviews are expected to become a part of the
planning and evaluation at the departmental and discipline level and are reflected in disciplinary
and departmental planning and compacts.
INSTITUTIONAL EFFECTIVENESS PROCESS
Academic Planning and Assessment
The Assessment Review Committee is responsible for oversight of all campus assessment
activities. The Budget and Planning Committee is responsible for review of the University mission
and institutional strategic planning, and it is part of the linkage between planning and budget. As
is true at all levels of the institution, planning is bound to evaluation and both are bound to
improvement in all aspects of a department's, discipline's, and/or college or school's programs,
activities, and services. When academic goals focus on the achievement of student learning
outcomes, evaluation is called assessment. Assessment of student learning goals in academic
areas is so important that it is covered extensively in another section and therefore will not be
discussed here. Other goals, known as program goals, might cover, but are not limited to, student advising, student or faculty recruitment, retention, degree completion rates, curricular review, research or scholarly productivity, grant activity, faculty or student service activities, or student satisfaction; these are covered in the planning for the department.
Planning and assessment are directly related to a department’s mission statement. A good
mission statement is
 Focused: The unit's purpose within the University is the nucleus of the statement.
 Brief: It usually contains no more than 100 words in 3 or 4 succinct sentences.
 Clear: It is coherent, free of jargon, and communicates plainly.
 Positive: The statement projects a positive image without being self-aggrandizing.
 Values Driven: It delineates the principles shared and practiced by members of the unit.
 Adaptable: The mission statement changes as the unit's purpose changes.
Beginning with the mission, faculty and staff define a series of goals. Since most departments
contain more than one discipline and/or degree program, it is important to note that goals include
those for the entire department and major goals for disciplines and/or degree programs.
Academic planning defines short-term academic goals called planning outcomes that are
documented at the level of the department. Planning outcomes encompass the upcoming year
and are identified on Form 1 of the institutional effectiveness documentation (See Appendix D).
Planning outcomes are aligned with the goals of the University Strategic Plan, the Compact, and
the College or school compact. Each department is different and will have some planning
outcomes that address their unique circumstances as well. For instance, some applied programs
have industry regulatory issues that must be addressed either in the curriculum or in other ways;
disciplines that are moving toward accreditation may have outcomes that focus on compliance
with accreditation mandates; departments and disciplines in rapidly changing fields might have
equipment and/or curricular outcomes that focus on maintaining currency in the discipline and so
on. The extent to which a unit is achieving its goals is critical information for continued
improvement. The planning documentation includes progress measures and how the results of
evaluating progress have led to additional change or changes in strategies or tactics (Form 3
Appendix D).
Support and Administrative Office Planning
Offices that do not deliver direct instruction are also expected to plan and document their
planning. As is true with academic units, support and administrative offices plan for
improvements in such areas as services, programs, personnel, policies, procedures, resource
acquisition, activities, events, and documentation. Planning goals are developed with an
understanding of the environment within which the office operates. Broad-based participation by
the members of an office who will be affected by the plan is also an important attribute for
planning success. The broader the participation in setting the goals, the more likely members of the unit are to actively attempt to achieve them.
Planning must be congruent with the goals of the University Strategic Plan, the Compact, and the division compact, and must address the unique elements of the office. For example, financial aid
offices must address a challenging regulatory environment and need to address the continuing
education needs of staff. Computing services units face rapidly changing hardware and software
demands, staff training needs, changing educational technology, a rapidly shifting security
environment, and considerable regulation. The capacity of many offices to maintain acceptable
programs and services is dependent on their ability to stay ahead of the next big change in their
environment.
Planning outcomes are short-term and usually restricted to the next one to two years. In most cases a planning outcome contains the objective and the level at which the office hopes to accomplish it. Planning outcomes, progress measures, and methods for measuring progress are
documented on the office planning form (Form 5 is included in Appendix D). The results of
activities associated with achieving the outcome and the use that the office will make of the
results obtained are on the planning results and use form (Form 7) shown in Appendix D.
Assessment of support and administrative office outcomes will be covered in the next section.
STEPS IN THE PLANNING PROCESS
Step 1: Mission or Purpose. There are usually several reasons for the existence of a unit. Once
those reasons are enumerated, the unit develops a brief statement of purpose or mission. This
statement should reflect a purpose to which the members of the unit are committed because it will
become the foundation for all institutional effectiveness activities. While a mission statement may
capture the purposes of a unit perfectly at one particular time, no unit is immune to change.
Significant changes in purpose should be captured in the mission statement which means that
unit mission statements need to be reviewed periodically to keep them current and accurate.
This need to review why a unit exists is especially important in a University undergoing rapid
change. A serious reconsideration of the purpose of a unit should be undertaken any time the
unit adds or loses responsibilities and at least once every five years.
Step 2: Understand the Environment. All units exist in a rich environment of goals, issues, needs,
regulations, expectations and requirements. These forces exist both inside and outside the
University. Once the important elements of the environment are identified, it is clearer what the unit needs to do to move forward. Since no one can predict the future with unerring accuracy, an understanding of the likelihood of a future event and the type of response required is critical: important, high-priority events must be addressed, while other events may have a low probability of occurrence but serious consequences if they were to occur. For instance, the possibility that the University might suffer a shooting incident like the one that occurred at Virginia Polytechnic Institute and State University is not high; however, should such an incident occur, emergency services on campus, including the Police Department, Physical Plant, Student Services including Counseling, Academic Affairs, Public Relations, and the President's Office, will be expected to act rapidly and decisively. Rapid, decisive action is almost impossible in the absence of adequate planning.
Step 3: Develop Goals. Offices and departments use a variety of different sources to assist them
in developing goals. The University goals articulated in the Strategic Plan, the Compact, the
plans from the UT System and THECB, College and school goals, program review, accrediting
standards, disciplinary trends, state and federal regulations, and departmental issues among
other elements form the basis for departmental goal formation. As much as possible, plans
should be synchronized with the goals of the University and should cover the major internal and
external challenges. Everyone in the unit who will be expected to help achieve the goals should
be involved in development of the plan. Involvement fosters understanding and buy-in, both of which contribute to goal attainment. The goals are broken into a series of shorter-term objectives
that are enumerated on the appropriate forms each year.
The forms that document the plans and assessments that will occur each year are submitted
once a year. However, plans are always subject to change as the environment changes. When
planned activities and assessments change, the department or degree program amends its forms
and submits them through its administrative channels to the Office of Institutional Research,
Planning, and Effectiveness.
Step 4: Implement the Plan. Departments must put their plans into action to realize the benefits.
Working on implementation of objectives should be continual over the year rather than a hurried
attempt to complete the objectives because the results are overdue or due in the near future. A
simple, but highly effective technique for supporting a set of objectives is to assign responsibility
for implementation and establish a timeline. The timeline provides obvious opportunities for
checkpoints on what has and has not been completed and a basis on which to encourage action.
Unassigned responsibilities are not likely to be performed unless the department chair or the
director initiates implementation activities.
Step 5: Build on Success. Once a plan has been put into motion, it is important to check progress regularly. Unless a planned activity is very simple and straightforward, there are milestones along the way to accomplishing the goal or objective. Taking time to
consider progress at those milestones allows mid-course corrections that reinforce the likelihood
of success. Attaining an objective is a moment to celebrate, but also a moment to consider what
comes next. Documenting the results of planning takes place on either Form 3 for instructional areas or Form 7 for administrative or support offices (Appendix D). Results of planning need to be
displayed in column three of both forms. It should be clear how the planning results have led to
actions shown in column four of the forms.
ASSESSMENT
Assessment has been widely discussed in higher education since the 1980s. Interest in finding
ways to improve learning in colleges and universities was the subject of numerous reports and
commissions in the 1980s and 1990s. What began as a call within the academy for greater
concern about our students' learning has become a call for greater accountability by all
educational institutions. Interest in assessment now extends from the federal government
through governors and state officials and into coordinating boards and boards of trustees and
regents. Both disciplinary and regional accreditation groups now require student learning to be
measured and action taken to improve it. The belief that public colleges and universities should
be more transparent and accountable to parents, state officials and taxpayers is not confined to
any political party, and it is unlikely to diminish in the near future.
Despite changes in rhetoric, assessment should be of continuing interest among faculty members
in colleges and universities. It is the learning of our students that is the focus of so much
attention. The degrees those students obtain have the names of our universities imprinted on
them. Assessment should be our concern just as teaching and learning are our concern.
There are many definitions of assessment in the literature; however, for the purposes of the Handbook we will use the following definition from Upcraft and Schuh (1996: 18):
Assessment is any effort to gather, analyze, and interpret evidence which
describes institutional, divisional, or agency effectiveness.
Assessment takes place throughout the University in both instructional and non-instructional
areas. While the subjects of assessment differ depending on whether the unit offers formal
instruction, all assessment is focused on improvement. As Schuh and Upcraft (2001: 4) note:
Effectiveness includes not only assessment of student learning outcomes, but
assessing other important outcomes such as cost effectiveness, clientele
satisfaction, meeting clientele needs, complying with professional standards, and
comparisons with other institutions. Assessment . . . is not restricted to students,
but may include other constituents within the institution, such as the faculty,
administration, and governing boards, and outside the institution, such as
graduates, legislators, funding sources, and accreditation agencies.
COMMON MISCONCEPTIONS ASSOCIATED WITH ASSESSMENT
1. Faculty members already evaluate students through the grades they issue in their
classes. Program assessment is redundant.
Faculty members do evaluate students routinely and assign grades to reflect individual levels of accomplishment on tasks, tests, and courses. However, grades reflect a variety of types of learning, activities, bonus credits, and expectations that vary from instructor to instructor. Assessment requires a program's faculty to make a joint decision about what specific learning should be evaluated and what knowledge, skills, and attitudes are indicative of the major learning goals set for the program. Programmatic assessment, unlike grades, is designed to evaluate the level of accomplishment of a program, not an individual student. Programmatic evaluations examine the aggregate level of accomplishment of a program's students as a way to establish whether or not the program as a whole is performing at a high level.
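To make the aggregate-versus-individual distinction concrete, here is a minimal sketch in Python of how the arithmetic might look. It is purely illustrative: the rubric scores and the criterion of 4 on the rubric are invented for the example and are not drawn from any UTPB program or form.

    # Illustrative only: programmatic assessment summarizes a cohort;
    # it does not judge individual students. All numbers are invented.
    rubric_scores = [4, 5, 3, 5, 4, 2, 5, 4]  # one rubric score per student

    average_score = sum(rubric_scores) / len(rubric_scores)
    at_or_above = sum(1 for score in rubric_scores if score >= 4)
    percent_at_or_above = 100 * at_or_above / len(rubric_scores)

    # The program-level question is about the aggregate (e.g., "did the
    # cohort average at least 4?"), not about any one student's score.
    print(f"Cohort average rubric score: {average_score:.2f}")
    print(f"Students at or above criterion: {percent_at_or_above:.0f}%")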
2. Staff members already know about the processes and procedures, bottlenecks and
issues that they have with their users. Assessment does not tell us anything new.
Staff members do know a lot about the issues and problems that exist in their offices, but are
often too close to the problems, too invested in the processes and procedures, or just too
busy to develop solutions to the problems that they know exist. Assessment forces staff
members to step back, examine issues and problems from a new perspective, and rethink
the activities of the office. No office is so perfect that examination of issues and problems will
not allow staff members to make improvements in how the office functions.
3. Assessment violates academic freedom.
When assessment is appropriately conducted, neither faculty knowledge nor judgments in teaching or research are violated. While the joint faculty deliberations that produce program goals do
produce parameters within which instruction takes place, it has always been true that the
faculty as a group is responsible for a program’s curriculum. No faculty member is free to
teach anything they wish in a course. The discipline defines what is appropriate in terms of
knowledge, skills and attitudes, while course descriptions and programmatic needs define
what is taught in any particular course.
No one has ever had the right to teach a course just as she pleases; we always
are bound by the rules of responsible interaction with students, by
departmental agreement about what a course will cover, and the requirement
that we assign each student a grade that is public to limited audiences. We
hand out a syllabus or put it on the web. We establish goals for the course and
share them with colleagues and students. We share problems in student
learning and plans for a course whenever we submit a course to the curriculum
committee for approval, ask for new resources, come up for tenure, or engage
in a departmental discussion about improving our teaching. Assessment asks
for an extension of this collegial work (Walvoord, B.E. 2004: 8).
In a degree program, the disciplinary faculty define the goals of the program, interpret the meaning of assessment results, determine the degree to which students have acquired appropriate learning, and decide what will improve student attainment.
4. Assessment should only be used to demonstrate a program’s or office’s successes.
Assessment in instructional areas is based on the improvement of student learning and
program success. In non-instructional offices or departments, it is based on improvement in
the efficiency or quality of services. If the program or office has no room for improvement
then assessment is of no use. It follows then that the areas in which assessment should be
concentrated are those in which the greatest improvements can be made. The reality is that
every program and every office can get better, and assessment can provide information to
inform decisions about improvement.
5. Assessment is just another form of faculty and staff evaluation.
The University has processes for evaluating faculty and staff. There is no interest on any
administrative level at UTPB in evaluating individual faculty or staff members through the
assessment system. Assessments are created by the faculty and staff, conducted by the
faculty and staff, analyzed by the faculty and staff, and any changes are initiated by the
faculty and the staff. The types of changes that are appropriate in instructional areas include
but are not limited to adding or changing student assignments, adding or dropping
prerequisites, adding courses, changing course descriptions, restructuring degrees and so
on. In non-instructional areas appropriate changes are related to finding new efficiencies,
changing policies or procedures, and improving communication with their users. Changes
are not related to personnel; they are related to improving academic programs and university
services.
6. Assessment is only useful for getting through the SACS reaffirmation process.
The gist of this misconception is that assessment is a waste of valuable faculty and staff time,
but we have to do it so we will put in as little time and effort as possible. It is true that the
Southern Association of Colleges and Schools (SACS) requires that assessment be
conducted and used. If, however, SACS requirements are the only reason why we conduct
assessments, then faculty and staff time is indeed being wasted, but SACS is not the guilty party. Assessment is like most things: the value of the return is dependent on the time, effort, and thought expended. Faculty members who view assessment as a form of research into the effects of their program are often rewarded with information that can make a real difference in the extent to which students achieve. Staff members who are willing to
examine the services that they provide in-depth are often surprised by the difference that
changes can make. Faculty members, especially those who just want to get through the
chore, can probably predict how valuable the effort will be—they have experience with the
same attitudes and outcomes in their classes.
SOME IMPORTANT PHILOSOPHICAL GUIDELINES
1. Assessment matters because the faculty and staff at UTPB believe that student learning
matters. Assessment will therefore be created, conducted, analyzed, and used by the faculty
and staff for improvement.
2. The assessment process will respect academic freedom and honor faculty responsibility for
general education and program curricula.
3. The assessment process will respect the expertise of the staff and honor their commitment to
conduct their responsibilities appropriately and professionally.
4. Assessment includes the routine, systematic collection of reliable and valid information about
student achievement of program goals and the operation of non-instructional services. Its
purpose is to assist faculty in increasing student learning and staff in improving university
functioning at all levels.
5. While assessment results are used to improve student learning in programs and operational
effectiveness in offices, results alone do not dictate how improvement should take place. The
faculty and staff, exercising their professional judgment and values, are responsible for
deciding on appropriate changes to improve student achievement and University operations.
6. Assessment will not be used for the purpose of evaluating faculty or staff members.
7. Assessment is an on-going process that will be a part of the institutional culture and regular
faculty and staff work at UTPB.
GOOD PRACTICE FOR ASSESSING STUDENT LEARNING
The American Association for Higher Education published the following nine principles for
assessment of student learning in 1992. While these good practice characteristics were
formulated for academic instructional assessment, they are valuable for all members of the
University community who are involved in the assessment process. They are as valid today as
they were when they were first published.
The assessment of student learning begins with educational values. Assessment is not an
end in itself but a vehicle for educational improvement. Its effective practice, then, begins with
and enacts a vision of the kinds of learning we most value for students and strive to help them
achieve. Educational values should drive not only what we choose to assess but also how we
do so. Where questions about educational mission and values are skipped over, assessment
threatens to be an exercise in measuring what’s easy, rather than a process of improving what
we really care about.
Assessment is most effective when it reflects an understanding of learning as
multidimensional, integrated, and revealed in performance over time. Learning is a complex
process. It entails not only what students know but what they can do with what they know; it
involves not only knowledge and abilities but values, attitudes, and habits of mind that affect both
academic success and performance beyond the classroom. Assessment should reflect these
understandings by employing a diverse array of methods, including those that call for actual
performance, using them over time so as to reveal change, growth, and increasing degrees of
integration. Such an approach aims for a more complete and accurate picture of learning, and
therefore firmer bases for improving our students’ educational experience.
Assessment works best when the programs it seeks to improve have clear, explicitly
stated purposes. Assessment is a goal-oriented process. It entails comparing educational
performance with educational purposes and expectations derived from the institution’s mission,
from faculty intentions in program and course design, and from knowledge of students’ own
goals. Where program purposes lack specificity or agreement, assessment as a process pushes
a campus toward clarity about where to aim and what standards to apply; assessment also
prompts attention to where and how program goals will be taught and learned. Clear, shared,
implementable goals are the cornerstone for assessment that is focused and useful.
Assessment requires attention to outcomes but also and equally to the experiences that
lead to those outcomes. Information about outcomes is of high importance; where students
“end up” matters greatly. But to improve outcomes, we need to know about student experience
along the way; about the curricula, teaching, and kind of student effort that lead to particular
outcomes. Assessment can help us understand which students learn best under what conditions;
with such knowledge comes the capacity to improve the whole of their learning.
Assessment works best when it is ongoing, not episodic. Assessment is a process whose
power is cumulative. Though isolated, “one-shot” assessment can be better than none,
improvement over time is best fostered when assessment entails a linked series of cohorts of
students; it may mean collecting the same examples of student performance or using the same
instrument semester after semester. The point is to monitor progress toward intended goals in a
spirit of continuous improvement. Along the way, the assessment process itself should be
evaluated and refined in light of emerging insights.
Assessment fosters wider improvement when representatives from across the educational
community are involved. Student learning is a campus-wide responsibility and assessment is a
way of enacting that responsibility. Thus, while assessment efforts may start small, the aim over
time is to involve people from across the educational community. Faculty members play an
especially important role, but assessment’s questions can’t be fully addressed without
participation by student affairs educators, librarians, administrators, and students. Assessment
may also involve individuals from beyond the campus (alumni/ae, trustees, employers) whose
experience can enrich the sense of appropriate aims and standards for learning. Thus
understood, assessment is not a task for small groups of experts but a collaborative activity; its
aim is wider, better-informed attention to student learning by all parties with a stake in its
improvement.
Assessment makes a difference when it begins with issues of use and illuminates
questions that people really care about. Assessment recognizes the value of information in
the process of improvement. But to be useful, information must be connected to issues or
questions that people really care about. This implies assessment approaches that produce
evidence that relevant parties will find credible, suggestive, and applicable to decisions that need
to be made. It means thinking in advance about how the information will be used and by whom.
The point of assessment is not to gather data and return “results”; it is a process that starts with
the questions of decision-makers, that involves them in the gathering and interpreting of data, and
that informs and helps guide continuous improvement.
Assessment is most likely to lead to improvement when it is part of a larger set of
conditions that promote change. Assessment alone changes little. Its greatest contribution
comes on campuses where the quality of teaching and learning is visibly valued and worked at.
On such campuses, the push to improve educational performance is a visible and primary goal of
leadership; improving the quality of undergraduate education is central to the institution’s
planning, budgeting, and personnel decisions. On such campuses, information about learning
outcomes is seen as an integral part of decision-making, and avidly sought.
Through assessment, educators meet responsibilities to students and to the public. There
is a compelling public stake in education. As educators, we have a responsibility to the publics
that support or depend on us to provide information about the ways in which our students meet
goals and expectations. But that responsibility goes beyond the reporting of such information; our
deeper obligation to ourselves, our students, and society is to improve. Those to whom
educators are accountable have a corresponding obligation to support such attempts at
improvement.
ASSESSING STUDENT LEARNING
As is true with the definition of assessment, there are many different definitions of academic
assessment in the literature. One of the better definitions emphasizes the four major attributes of
student learning assessment: 1) faculty articulating their values and expectations; 2) setting high standards for learning; 3) investigating the extent to which the learning goals are attained; and 4) using the information to further improve student learning.
Assessment is an ongoing process aimed at understanding and
improving student learning. It involves making our expectations explicit
and public; setting appropriate criteria and high standards for learning
quality; systematically gathering, analyzing, and interpreting evidence to
determine how well performance matches those expectations and
standards; and using the resulting information to document, explain, and
improve performance (Tom Angelo, as quoted in the unpaginated University
of Texas System web page, Student Learning Assessment, Frequently
Asked Questions).

When applied to degree programs, assessment of student learning involves the faculty in a degree program deciding on the program's major goals, setting learning expectations for those goals, deciding how the goals will be evaluated, and using the results of the evaluation to improve the program's ability to provide a high-quality learning experience for students.
PREPARING THE INSTITUTIONAL EFFECTIVENESS FORMS
Institutional effectiveness in instructional areas has two components: departmental planning and
degree program assessment. The forms for those activities are shown in Appendix D. On all
forms, the information about a particular outcome (objective, outcome, methodology or progress measure, results, use) must be lined up horizontally to facilitate reading and understanding the material.
Planning
Planning is used in two contexts in this discussion. The first usage is more global than the
second. This more global usage indicates that departments and offices must make explicit what
they will be doing in both new initiatives and assessments to be conducted. The second more
limited use of planning is to restrict it to the items listed on form 1 for the upcoming year. In this
usage an academic department, college, or school uses form 1 to describe the major initiatives to
be undertaken during the upcoming year. These plans may be initiated in any area of the unit’s
work and may be based on any legitimate source of information including professional judgment,
direction of the discipline, recognized needs in the department, recommendations from program
review committees or accreditation site visits, etc. In deciding what changes to activities, services, policies, procedures, equipment, staffing, etc. will be of high priority in the coming year, the department/office must take into account the elements of the University's current Compact with The University of Texas System. The Compact is available on the University's Web site in the links listed underneath the University's address at http://www.utpb.edu/utpb_adm/utpbcompact-fulltext.pdf.
Each dean’s office and academic department must submit a planning form to their next higher
level supervisor on the schedule issued by the Office of Institutional Effectiveness each year.
The results of planned change in the unit are reported on Form 3, and how the current state of progress will be used to inform continuing change or institutionalization of the change initiative is also reported in the column on use of results.
Assessment
In instructional areas assessment of student learning is the primary focus. As Banta et al.
(1996:11) write, “Effective assessment programs reflect the imaginative, creative, and energizing
aspects of learning, not only so as to more accurately measure the breadth and depth of the
learning experience but also to contribute to the ongoing spirit of inquiry, reflection, and growth
that characterize the university as an institution.” Assessment focuses on three elements of
learning: core content or knowledge, skills that allow knowledge and facts to be applied, and
dispositions that represent the beliefs that students should attain as educated citizens and
members of their discipline.
Student Learning Goals, Objectives and Outcome Statements
Student Learning Goals and Objectives. Program assessment is focused on five major
questions:
 What do we, as a faculty, expect a student to know, be able to do, or believe as a result of going through the program?
 How well are students achieving those results?
 How do we know?
 How do we use the information to improve student learning?
 Do the improvements to the program make a difference?
Assessment is directly related to a degree program’s major student learning goals. These
goals are the beginning point for assessment. Goals are broad statements of what students who
complete the program will know, be able to do or will believe. Learning goals are not usually
directly testable. For instance, the goal that “students will be able to write an acceptable research
report” would be difficult to test without specific information about what constitutes “acceptable” in
the discipline, what the correct format for a research report is in the discipline, and what
constitutes appropriate content for a research report in the discipline. Examples of goals might
be:
“Students will have the scientific tools to expand knowledge in the discipline.”
“Students will understand the major theoretical frameworks in the field.”
“Undergraduate majors will have a broad understanding of the major concepts and vocabulary of the discipline.”
Statements which begin with “The department will provide . . .” or “The program will strive to . . .”
may be goals, but they are not student learning goals. The focus of a student learning goal is on what students learn, not what faculty members teach or what the program seeks to accomplish.
The goals are translated into more testable statements called objectives. These objectives
make clear the meaning of the goal within the framework of the discipline and the specific degree
program. For instance, an acceptable research report in psychology might entail the objective that
“students will write a research report using APA format that contains a clear description of the
research question, an appropriate review of the literature, acceptable hypotheses for the original
research that the student has undertaken, appropriate tests of the hypotheses using data
gathered and analyzed by the student and the drawing of appropriate conclusions from the
literature and the analysis.” The same type of operationalization of the goals could be done for
each of the goals noted above.
Writing Student Learning Outcomes. The next element in a degree program assessment is
the statement of the student learning outcomes.
Learning Outcomes are statements of what students will know, be able to
do, or value as the result of an academic program or learning activity.
An outcome is stated in such a way that it is clear what target a group of students must attain on
a specific measurement tool in order to be considered to have successfully attained the student
learning goals and objectives.
Student learning outcomes are specific, observable, and measurable. There are a number of interesting systems for writing outcome statements. The University of Central Florida, in its Academic Program Assessment Handbook (2005: 30-31), describes the SMART system.
Specific
 Define specific learning outcomes. Clearly define the expected abilities, knowledge, values, and attitudes a student who graduates from your program is expected to have attained.
 Focus on critical outcomes. When data are available, there should be an opportunity to make improvements in the program.
Measurable
 It should be feasible to collect accurate and reliable information on the student learning
outcome.
 Consider your available resources in determining the method for data collection.
Aggressive but Attainable
 When defining the learning outcomes and setting targets, use targets that will move you
in the direction of your vision, but don’t try to be perfect all at once.
 Some questions that might be helpful are:
o How have students’ experiences in the program contributed to their abilities,
knowledge, values, and/or attitudes?
 What do students know?
 What can students do?
 What do students care about?
o What knowledge, abilities, values, and attitudes are expected of graduates of the
program?
o What would the outcomes in a perfect program look like?
o What would the outcomes in a good program look like?
Results Oriented and Timely
 Define where the program would like to be within a specified time limit (e.g., an increase of
10% in test scores over the next year, 90% attainment this year, or 15% improvement in the
rubric scores on communication within a year).
 Also determine what standards are expected from students in your program. For some
outcomes, you may want 100% attainment, while for others a lower target is reasonable.
In order to write a student learning outcome, both the student learning objective and the method
of assessment must be known. An outcome statement defines explicitly what constitutes
successful attainment of a particular objective and therefore successful attainment of the goal to
which the objective is related. Thus for instance, a student outcome might be “Overall, students
in PSY 4350 will have an average of 5 on the research report rubric.” If use of a microscope were
defined as one of the “scientific tools” that students had to master in order to “expand knowledge
in the discipline,” the objective might be phrased as “All students will be able to set up a
microscope and successfully view a prepared slide.” A student outcome from this objective might
be, “By their sophomore year 100 percent of biology majors will pass the microscope test with an
80 or above.” A list of correct and incorrect student learning outcomes is shown in Appendix E.
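Although targets are normally checked by hand, the arithmetic is simple enough to sketch. The following minimal Python sketch uses invented scores and the two hypothetical outcomes above to show how attainment of a target can be computed and recorded.

```python
# A minimal sketch, using invented scores, of how an outcome target is checked.

microscope_scores = [95, 88, 80, 91, 84]   # hypothetical biology majors
rubric_scores = [5, 6, 4, 5, 6]            # hypothetical PSY 4350 rubric scores

# Outcome 1: 100 percent of majors pass the microscope test with an 80 or above.
pass_rate = sum(s >= 80 for s in microscope_scores) / len(microscope_scores)
print(f"Microscope pass rate: {pass_rate:.0%} (target 100%) ->",
      "met" if pass_rate >= 1.0 else "not met")

# Outcome 2: students average 5 or better on the research report rubric.
average = sum(rubric_scores) / len(rubric_scores)
print(f"Rubric average: {average:.1f} (target 5.0) ->",
      "met" if average >= 5.0 else "not met")
```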
Identifying Skills and Knowledge for Student Learning Outcomes. The use of a taxonomy of
learning domains may be helpful in writing learning outcomes. The best known of these
frameworks is Bloom’s Taxonomy of Educational Objectives (1956). Bloom’s taxonomy
recognizes three domains of educational objectives: cognitive, skills, and affective.
Cognitive Domain

Knowledge: Mastery of subject material; includes observation and recall of information; knowledge
of dates, events, places; knowledge of major ideas.

Comprehension: Ability to predict consequences and future trends; includes understanding
information; grasp of meaning; translating knowledge into new contexts; interpreting, comparing
and contrasting material; ordering, grouping and inferring causes.

Application: Ability to solve problems using required knowledge/skills; includes using information,
material, methods, concepts, theories, etc. in new situations.

Analysis: Ability to break down material and recognize structure of organization; includes seeing
patterns, organization of parts, recognition of hidden meanings, identification of components.

Synthesis: Ability to use old ideas to create new ones; includes generalizing from given facts,
relating knowledge from several areas, predicting and drawing conclusions.

Evaluation: Ability to judge and assess value of material; includes comparing and discriminating
between ideas; assessing value of theories, presentations, etc.; making choices based on
reasoned argument; verifying value of evidence; recognizing subjectivity.
Affective Domain

Receiving: Awareness; willingness to participate.

Responding: Actual participation in the learning activity; demonstrates interest.

Valuing: Attaching value or worth to an object, person, activity, or phenomenon.

Organization: Prioritizing values; comparing and contrasting values to build a new value system.

Characterization by value: Modifies behavior based on the new value system.
Skill Domain

Perception: Use of sensory organs to guide actions.

Set: Readiness to act.

Guided Response: Imitation; knowledge of the steps required to complete a task.

Mechanism: Ability to repeat a complex motor skill.

Complex Overt Response: Displays complex movement with skilled performance.

Adaptation: Modifies a motor skill to address a changed situation.

Origination: Creates new movement patterns in changed situations.
Action Verbs Associated with Different Learning Domains. Action verbs associated with
various learning domains may be helpful in constructing learning outcomes. Use of the verbs
below helps to clearly define what students are expected to demonstrate on the
assessments.
Learning Domain: Associated Action Verbs

Knowledge: Articulate, describe, define, name, indicate, order, recognize, know, repeat,
memorize, label, tabulate, quote

Comprehension: Discuss, explain, interpret, distinguish, suggest, summarize, understand,
translate, classify, contrast

Application: Apply, investigate, experiment, solve, practice, predict, utilize, develop, illustrate

Analysis: Analyze, categorize, correlate, inform, infer, prioritize, criticize, differentiate, examine,
interpret

Synthesis: Arrange, collect, compose, assemble, compile, create, design, formulate, organize,
manage, propose, validate

Evaluation: Rate, conclude, appraise, evaluate, judge, defend, grade, assess

Receiving: Identify, select, choose, describe

Responding: Recite, discuss, present, answer

Valuing: Describe, explain, differentiate, join, share

Organization: Order, arrange, combine, integrate, synthesize, generalize

Characterization by Value: Qualify, practice, listen, influence, share, propose

Perception: Identify, detect, describe, isolate

Set: Respond, show, react, display

Guided Response: Construct, manipulate, assemble

Mechanism: Build, fix, organize, work, calibrate

Complex Overt Response: Manipulate, measure, mix, dismantle

Adaptation: Alter, revise, change, vary

Origination: Compose, construct, design
(Adapted from Western Carolina University’s Handbook for Program Assessment, pages 29-30)
Additional action verbs useful for writing outcomes are shown in Appendix F.
ASSESSING GENERAL EDUCATION OUTCOMES
The general education curriculum must also be assessed. Its assessment differs from that of other
programs because the most basic goals of the curriculum are defined by the Texas
Higher Education Coordinating Board (THECB). In 1997, the Texas Legislature required THECB
to develop the “content, component areas, and objectives of the core curriculum.” This resulted
in the “Core Curriculum: Assumptions and Defining Characteristics” (Appendix G), which
identified the assumptions, defining characteristics of intellectual competencies, perspectives,
and exemplary educational objectives by component area. As a result of THECB guidance on
evaluation of the core curriculum, UTPB has identified the following college-level competencies for
its core curriculum:
 Students will be able to communicate effectively in clear and correct prose appropriate to
the subject, occasion, and audience.
 Students will be able to apply mathematical tools in the solution of real-world problems.
 Students will be able to understand, construct and evaluate relationships in the natural
sciences and understand the bases for building and testing theories.
 Students will engage in critical analysis, form aesthetic judgments and develop an
appreciation of the arts and humanities through knowledge of the human condition and
human cultures, especially in relation to behaviors, ideas, and values expressed in works
of human imagination and thought.
 Students will have knowledge of how social and behavioral scientists discover, describe,
and explain the behaviors and interactions among individuals, groups, institutions, events
and ideas.
Each of the competencies is associated with a series of exemplary educational objectives
(EEOs). The University is required by law to evaluate and report to THECB on attainment of the
EEOs every 5 years. According to THECB’s “Core Curriculum: Assumptions and Defining
Characteristics (Rev. 1999),” “exemplary educational objectives become the basis for faculty
and institutional assessment of core components” (p. 4).
With the competencies and the exemplary educational objectives in mind, the disciplines and
departments have identified the defining characteristics, perspectives and exemplary educational
objectives addressed in each core course. Instructors in each core course have developed
student learning outcomes that operationalize the exemplary educational objectives chosen for a
course and conducted assessments of the extent to which students have attained the EEOs. Any
course that seeks to become a part of the core curriculum must identify the EEOs addressed by
the course and develop a methodology for assessment of the student learning outcomes that
correspond to the chosen EEOs in the appropriate component area. Forms for the identification
of appropriate EEOs may be obtained by contacting the Office of Institutional Research, Planning,
and Effectiveness. The reporting forms for identification of the student outcomes for each
selected EEO, the assessment methodology, the results and use of assessment are shown in
Appendix H along with an example in Appendix I. The Office of Institutional Research, Planning
and Effectiveness is responsible for working with the General Education Oversight Committee in
supporting the assessment process and completing appropriate reports.
ADMINISTRATIVE AND SUPPORT OUTCOMES
Administrative and support offices and departments provide critical services to the University and
to students even though they do not provide direct instruction in classes. They include, but are
not limited to, offices like Financial Aid, Student Activities, Human Resources, Physical Plant,
Writing Center, Dunagan Library, and the Advising Center.
Like instructional departments, administrative and support offices develop plans and conduct
assessments in order to provide information for improvement in their programs, policies,
procedures, publications, services, activities, and events. Institutional effectiveness forms and
instructions are located in Appendix D. Forms 5 and 6 can be amended by sending the revised
forms through the office’s administrative line to the Office of Institutional Research, Planning and
Effectiveness. In all cases, the information about a particular outcome must be lined up
horizontally on each form (outcome, methodology, results, and use) to facilitate reading and
understanding the material.
PREPARING THE INSTITUTIONAL EFFECTIVENESS FORMS
Planning
Administrative and support office planning should be aligned with the University strategic plan,
the University Compact http://www.utpb.edu/utpb_adm/utpbcompact-fulltext.pdf, the compacts of
their division and any externally imposed goals that result from regulatory bodies, government
agencies or program audits. Every office is a part of the whole and needs to work diligently to
help the University achieve its highest priority goals. Every office is also unique and needs to
ameliorate its weaknesses and promote its strengths and opportunities for success.
Planning is related to the department or office mission which should fall within the mission for the
University. Missions are brief, clear statements about the essential reasons for a unit’s existence.
Goals for the department or office are written, coordinated with progress measures, and used to
increase the effectiveness of the unit.
Assessment
Outcomes Statements
Assessment among administrative and support departments and offices differs from assessment
in instructional units, although many of the same techniques can be used.
Unit outcomes are intended, observable, measurable results of processes, programs,
services, policies, activities, or events. It is not useful to evaluate all possible outcomes for
an office.
As a rule, outcomes that are most closely related to the mission need to be evaluated most
frequently. Other areas that may be the foci of assessment include processes perceived as
problematic, or procedures or activities that have been changed recently. Support units that have
direct contact with students may also have student outcomes. Student outcomes should be
developed as in academic areas. Please refer to pages 12-18 for more information on student
learning outcomes.
A good paradigm for construction of administrative or support outcomes is the ABCD model:

 Audience (specify the group that is affected by the process, policy, activity, etc.)
 Behavior (the measurable, observable variable that is being assessed)
 Condition or Circumstance (the situation within which the expected behavior must occur)
 Degree (the minimum acceptable performance target)

(Adapted from the University of Virginia Assessment Workshop, Assessment Plan Development -
Instructional Programs Open Forum Presentation)
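As an illustration only, the four ABCD slots can be kept explicit in a small data structure so that each part of the statement is deliberately chosen. The Financial Aid example below is hypothetical, not a prescribed UTPB outcome.

```python
# A minimal sketch: composing an outcome statement from the four ABCD slots.
# The Financial Aid example is invented for illustration.
from dataclasses import dataclass

@dataclass
class ABCDOutcome:
    audience: str   # group affected by the process, policy, or activity
    behavior: str   # measurable, observable variable being assessed
    condition: str  # situation within which the behavior must occur
    degree: str     # minimum acceptable performance target

    def statement(self) -> str:
        return f"{self.degree} of {self.audience} will {self.behavior} {self.condition}."

example = ABCDOutcome(
    audience="students who submit a complete financial aid application",
    behavior="receive an award notification",
    condition="within three weeks of submission",
    degree="90 percent",
)
print(example.statement())
```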
Another model for developing administrative or support unit outcomes gives a fill-in-the-blank
pattern for outcome statements:

[Name of unit] will [provide / improve / decrease / increase / provide quality] . . . [target] . . .
[name of current service].

[Client] . . . will be satisfied with . . . [target] . . . [name of current service].

In the pattern, the client may be students, faculty, alumni, or staff; the verb phrase may be “will
be satisfied,” “will improve,” “will increase,” “will understand,” or another verb plus objective; and
the service may be tutoring, academic advising, counseling sessions, training, or another current
service.

(Adapted from Cleaves, Cheryl, et al., A Road Map for Improvement of Student Learning and
Support Services through Assessment, p. 148, as shown in Assessment of Institutional
Effectiveness: An Introductory Manual, Texas Tech University)
In student services areas, including academic advising, housing and residence life, learning
assistance, student conduct programs, financial aid programs, and 17 other areas, the Council for
the Advancement of Standards in Higher Education (CAS) has developed a series of national
standards to facilitate the improvement of support services. These standards provide an
excellent starting place for offices looking for significant planning and outcome areas.
A list of verbs that can help you in devising outcomes is contained in Appendix F and some
examples of correct and incorrect outcomes by common university offices are shown in Appendix
J.
Assessment Methodologies
Methods for conducting outcome assessment are often congruent with those for academic
instructional areas, especially in terms of assessing student learning outcomes. Student learning
should be assessed using direct methods for knowledge and skills outcomes. Attitudinal
outcomes such as satisfaction are evaluated using indirect methods such as surveys and focus
groups. Non-instructional areas are most likely to use indirect means of assessment and may
also use other forms of assessment including, but not limited to, techniques such as participant
counts, time to completion, ratios, utilization measures, error rates, demographic information,
activity logs, audit findings, etc.
Results and Use of Results
As with student learning outcomes, all administrative and support offices are required to report
the results of their assessments and discuss the use made of the results to improve their
services, policies, programs, and procedures. Results need to be reported in such a way that it is
possible to understand whether or not the target set in the outcome statement has been
achieved. If the outcome target was achieved, no further action on that particular outcome is
necessary unless the target is to be raised. However, as is often the case, the target may have
been achieved, but in the course of conducting the assessment other information about the
outcome may have been obtained that allows for improvements to be made. In those cases, the
results column on the form should show not only whether or not the target was achieved, but also
the information that led to the particular improvements to be implemented.
Documentation
Documenting the planning and assessment outcomes, the progress measures, methodologies of
assessment, results, and use is extremely important. A schedule that lists due dates for plans
and results is published and distributed by e-mail each year. It is extremely important that
deadlines be respected and materials sent forward on time. Missed deadlines increase
everyone’s workload. Problems and questions can be referred to the Office of Institutional
Research, Planning, and Effectiveness. The Office is always interested in helping offices to
develop, conduct, and use the results of planning and assessment.
CHOOSING AN ASSESSMENT METHOD
There are several important choices to be made as you consider assessment methods. The first
consideration is how appropriate the measure is for the objective to be measured: applicability
refers to the degree to which a measure can actually return the information necessary to
understand whether or not an objective has been achieved. A second consideration is the degree
to which the method returns diagnostic information that will allow intervention in the program in
ways that improve the extent to which a particular objective is attained. A third consideration is
the extent to which the measure will return consistent information about the objective under
consideration. A fourth consideration is whether or not the measure is unbiased: does it offer
reasonable and useful information across a broad spectrum of groups? Finally, are the data
obtained from the chosen method understandable? The difference between data and information
is the difference between having results and being able to use them.
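As an illustration, these considerations can be treated as a simple checklist. The sketch below rates hypothetical candidate methods against the five criteria just discussed; all ratings are invented and would in practice come from faculty and staff judgment rather than a script.

```python
# A minimal sketch: rating candidate assessment methods on the five criteria
# discussed above (1 = poor, 5 = strong). All ratings here are invented.
criteria = ["applicability", "diagnostic value", "consistency",
            "freedom from bias", "understandable data"]

candidates = {
    "locally developed exam": [5, 4, 3, 3, 5],
    "standardized exam":      [3, 2, 5, 4, 4],
    "portfolio":              [5, 5, 3, 3, 3],
}

for method, ratings in candidates.items():
    detail = ", ".join(f"{c}={r}" for c, r in zip(criteria, ratings))
    print(f"{method}: total {sum(ratings)} ({detail})")
```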
COMMON ASSESSMENT METHODS
Categories of Assessment Methodologies
Methods fall into two categories: direct and indirect. Direct methods return evidence of
attainment through a demonstration of accomplishment. These performance-based methods
include such things as examinations, projects, portfolios, juries, and audio or videotape
evaluations. Objectives and outcome statements related to knowledge and skills should be
evaluated using at least one direct method of assessment. Indirect methods include assessments
based on perception. These methods include, but are not limited to, surveys (both questionnaires
and interviews), job placement rates, focus groups, benchmarking, and graduate or professional
school acceptance rates. These methods are good secondary measures for knowledge and skills,
but do not return actual evidence acceptable for student learning outcome statements. Indirect
methods may be the best evidence for other types of outcome statements, like student
perceptions of the advising system or satisfaction with services.
Course-embedded assessments, like all other programmatic assessments, need to be selected,
evaluated and approved by a designated faculty group or by the faculty as a whole in the
disciplinary area in which the assessment will be conducted. The assessments take place within
courses, and students experience the assessment as a regular course assignment or a regular
examination. The examination or assignment may be graded by the faculty member and be a
part of a student’s grade. However, when used for assessment, the test or assignment is graded
according to criteria that are specific to one or more student learning outcome statements for the
purposes of evaluating and improving student learning. Unless the embedded assessment is very
straightforward, it is desirable that more than one faculty member perform the evaluation. If a
sampling methodology is employed, not all student assignments will be selected for evaluation.
A disadvantage of embedded assessments is that faculty members may be asked to include
assessment instruments in their courses that their colleagues have agreed upon but that they
themselves do not find desirable.
For all assessment methods, the faculty as a whole or a designated faculty group must decide
how, when, and under what conditions the assessment will take place, and how the results of the
evaluation will be considered by the disciplinary faculty as a whole. It will also be necessary to
decide which students or student groups need to participate and whether all students in the
group or only a sample of the students will be evaluated. If a sample is used, it is important that a
representative group of students be included in the evaluation.
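One straightforward way to keep a sample representative is to draw at random within each relevant subgroup. The sketch below illustrates a proportional random draw using an invented roster; the subgroups and the sampling fraction are assumptions for the example only.

```python
# A minimal sketch: a proportional random sample within each subgroup, one
# simple way to keep an assessment sample representative. The roster and
# subgroups are invented for illustration.
import random
from collections import defaultdict

roster = [("Ana", "junior"), ("Ben", "junior"), ("Cal", "senior"),
          ("Dee", "senior"), ("Eli", "senior"), ("Fay", "junior")]
SAMPLE_FRACTION = 0.5

by_group = defaultdict(list)
for student, group in roster:
    by_group[group].append(student)

random.seed(2008)  # fixed seed so the draw can be documented and reproduced
sample = []
for group, students in by_group.items():
    k = max(1, round(len(students) * SAMPLE_FRACTION))
    sample.extend(random.sample(students, k))

print("Students selected for evaluation:", sorted(sample))
```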
The writer of this manual is indebted to the Skidmore College Assessment Handbook for the
format and some of the information in the following discussion.
Direct Assessment Methods
Examinations
Examinations may be either standardized or locally developed, and they are used in a variety of
ways in assessment.
Standardized examinations may be either norm-referenced or criterion-referenced.
Norm-referenced tests compare a student’s score against the scores of a group that has been
selected as representative of the larger group of which the student is a part. The representative
group is known as the norm group. The Major Field Tests in selected disciplines, sold by the
Educational Testing Service, are examples of standardized norm-referenced examinations.
Criterion-referenced tests demonstrate to what extent a particular body of knowledge has been
learned. There is no comparison group.
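The distinction lies in the reference point for a score, as the brief sketch below illustrates with invented numbers: a norm-referenced interpretation locates a score within a norm group, while a criterion-referenced interpretation compares it only to a fixed cutoff.

```python
# A minimal sketch, with invented scores, of the two reference points.
from bisect import bisect_left

# Norm-referenced: the score is located within a norm group's distribution.
norm_group = sorted([45, 52, 58, 61, 63, 67, 70, 74, 78, 85])  # invented norms
student_score = 72
below = bisect_left(norm_group, student_score)  # norm scores below the student
print(f"Norm-referenced: {below} of {len(norm_group)} norm-group scores fall "
      f"below {student_score} (percentile rank {below / len(norm_group):.0%})")

# Criterion-referenced: the same score is judged only against a fixed cutoff.
CRITERION = 75  # invented mastery cutoff
verdict = "criterion met" if student_score >= CRITERION else "criterion not met"
print(f"Criterion-referenced: score {student_score} vs cutoff {CRITERION} -> {verdict}")
```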
Advantages
1. The tests are already developed and field tested, reducing faculty time and effort.
2. Reliability, validity and bias have already been evaluated and high quality information is
available to facilitate evaluation of the tests.
3. In most cases, the testing company will score the examination and return a series of
standard reports.
4. It is possible to compare one’s students to students nationally.
Disadvantages
1. The test may not reflect the program of study for a particular program.
2. It may be difficult or impossible to disaggregate results in such a way that program
components can be evaluated.
3. Students may not be highly motivated to do well if the assessment is not a part of their
grade.
4. Standardized tests are expensive.
5. The purpose for which the examination was constructed must be carefully considered.
Examinations such as the GRE, MCAT, GMAT, and LSAT were constructed in order to
predict probable success in graduate and professional school, and their value for even
that purpose tends to be limited to predicting success only in the first semester. The
instruments were not constructed to measure knowledge or ability at the conclusion of an
academic program. Scores may be returned for the entire examination, or at most only a
few subscores may be available, which seriously limits their diagnostic potential for a
degree program.
Implementing a Standardized Examination
1. Decide on the student learning objectives the test will evaluate.
2. Decide on the level at which the learning objectives will be evaluated (e.g., recall,
comprehension, analysis, application).
3. Make the specific knowledge, skills, or affective components that are to be evaluated
explicit. It is important that faculty members have a clear conception of exactly what they
expect the results of the examination to tell them about their students’ knowledge and
abilities.
4. Obtain examination copies of the standardized examinations that appear to be most
appropriate.
5. Decide on whether or not the scores and subscores will return enough information for the
examination to be useful in program evaluation.
6. Decide whether students should get individual score reports.
7. Decide how to pay for the examinations.
8. Decide who will be responsible for ordering the tests and how the testing company will
get the raw data (online or on paper).
9. Decide how, when and by whom the examinations will be conducted.
10. Decide who will be responsible for conducting the analysis of the score reports and
reporting to the faculty as a whole.
Locally developed examinations are composed by departmental faculty members. Questions
should be field tested to make sure that they return valid and reliable information about the
student learning outcomes for which they will be used.
Advantages
1. Locally developed examinations are inexpensive to develop.
2. They are inexpensive to administer.
3. They reflect actual rather than generic program objectives.
4. Faculty members are more likely to trust and use the results since they have participated
in their construction.
Disadvantages
1. Locally developed examinations usually have not been examined for reliability, validity, or
bias.
2. Student results can only be compared to prior student groups taking the test and to
standards set by the faculty in the department rather than to a national sample or norm
group.
3. Faculty time is necessary in order to score the examination and document the results.
Developing Local Examinations
1. Decide on the student learning objectives to be evaluated.
2. Decide on where in the curriculum the examination will be given.
3. Decide on the level at which the objectives will be evaluated (e.g., recall, comprehension,
analysis, application).
4. Develop and field test the questions to make sure that they are comprehensible and
appropriately discriminate student ability or knowledge (one simple check is sketched
after this list).
5. Decide on how and when the scoring and analysis will be conducted.
6. Decide on how the results and the analysis will be considered by the faculty.
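For the field testing in step 4, one common check is an upper-lower discrimination index: the proportion of high scorers answering an item correctly minus the proportion of low scorers doing so. The sketch below uses invented responses; a D value near zero or negative flags a question that does not separate stronger from weaker students.

```python
# A minimal sketch of an upper-lower discrimination index for field testing.
# Responses are invented (1 = correct, 0 = incorrect), one row per student,
# sorted here by total test score from highest to lowest.
responses = [
    [1, 1, 1, 1],   # strongest student
    [1, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 1, 0],
    [0, 1, 0, 0],
    [0, 1, 0, 0],   # weakest student
]
n_group = len(responses) // 3            # top and bottom thirds
upper, lower = responses[:n_group], responses[-n_group:]

for item in range(len(responses[0])):
    p_upper = sum(row[item] for row in upper) / n_group
    p_lower = sum(row[item] for row in lower) / n_group
    d = p_upper - p_lower                # D near 0 or negative = poor item
    print(f"Question {item + 1}: discrimination D = {d:+.2f}")
```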
Portfolios
Portfolios are a compilation of student work which may include works over some time period
and/or works in which students may be asked to reflect on their learning in terms of specific
learning objectives and discuss how the elements of the portfolio support their conclusions.
There are several different types of portfolios, not necessarily mutually exclusive in use across
the country, including electronic or e-portfolios, showcase portfolios, and developmental
portfolios.
Electronic portfolios may use either specialized software available from several vendors,
including LiveText (http://college.livetext.com/college/index.html), eFolio
(http://www.avenetefolio.com/), Concord (http://www.concord-usa.com/), iWebfolio
(http://www.nuventive.com/index.html), the Open Source Portfolio Initiative
(http://www.osportfolio.org/), and many others, or other non-specialized software
(http://www.kzoo.edu/pfolio/). Portfolios contain electronic artifacts (text, images, audio clips,
video clips, blogs, etc.) assembled by a student and managed by the student usually on-line. The
electronic portfolios are usually submitted on websites or on compact discs. They solve the
storage problems encountered when students must submit their work in paper folders or binders
and allow a much greater variety of products to be included.
Showcase portfolios ask students to submit their finest work or the work of which they are
proudest in order to demonstrate their learning. Typically, the student introduces the portfolio and
each piece of work and then discusses why the particular piece was chosen and how the
student believes it demonstrates his or her learning on the student learning objectives. Showcase
portfolios have been common in the arts for some time, although the rest of the academy is
learning to understand their value. Showcase portfolios may be either paper or electronic,
although electronic portfolios are more versatile.
Developmental portfolios demonstrate the acquisition of one or more student learning
objectives by including both early work and later work. The inclusion of work over time allows the
student to demonstrate how and in what ways his/her understanding and skill levels have
changed over time to give a value-added component to the portfolio. It also forces a student to
reflect on the learning experience in order to demonstrate its progression.
Advantages
1. Portfolios can be a rich source of information about what students learn.
2. They can be used to evaluate complex learning including applications and integration of
learning in the discipline and even in interdisciplinary products.
3. They can be used to provide evidence of both the finished products and earlier drafts of
products or projects.
4. They allow students to think about what they have learned and how to demonstrate it
effectively.
5. They can be used to demonstrate to potential employers, graduate student selection
committees and others the quality and potential that the student brings to their tasks.
Disadvantages
1. They are time consuming for students to construct and for faculty to encourage and
evaluate.
2. They require careful planning and management for implementation to be
successful.
3. They require storage capacity combined with careful consideration of student privacy and
confidentiality.
Development of a Portfolio Assessment. Although portfolios are a rich and varied source of
information about student learning, they require considerable thought and deliberation in their
implementation. Use the steps below to initiate a system:
1. Spend time looking at resources on portfolios and at information on how portfolios
have been implemented in other colleges and universities.
2. Decide on the student learning objectives that the portfolio will be designed to
demonstrate.
3. Decide on the type of portfolio that will best demonstrate the objectives you have
selected and over what period of time. Portfolios can span a student career or major
or may be confined to a particular course or cluster of courses.
4. Obtain agreement among faculty members about what particular pieces must be
included in the portfolio given the objectives that it is designed to reveal. Portfolios
might contain examinations, essays, projects, case studies, recordings of recitals,
pictures of juried pieces, research papers and many other items. Also consider
whether or not students will be allowed to substitute pieces if, through no fault of their
own, they are unable to include a required piece of work.
5. Decide whether or not students will be allowed to include other information.
6. Decide whether there will be a minimum length or a maximum length.
7. Decide how the portfolio will be handled. Will it receive a grade? If yes, consider
whether or not the pieces in the portfolio have previously been graded. If the portfolio
pieces have already been graded, will the portfolio itself be graded, and how? How
will the pieces be verified as the authentic products that were originally graded? Will
submission and/or creation of the portfolio be a course requirement and, if so, in what
course(s)? If the portfolio is a course requirement, what kinds of accommodations
might be necessary for students with disabilities?
8. Decide how the portfolios will be evaluated to provide the necessary information on
the student learning objectives.
9. Develop the rubric that will be used to evaluate the portfolio.
10. Consider when and what students will be told about the portfolio. Write instructions
indicating how to create the portfolio, what is included, what an exemplary portfolio
looks like, and what other resources might be available to improve the portfolio, and
be sure to include the portfolio rubric so that students understand what will be
evaluated.
11. Make it clear that the student is responsible for the creation and maintenance of the
portfolio.
12. Determine and explain how the student should partition the portfolio for different
learning objectives.
13. Decide whether the portfolio belongs to the student or to the program.
14. Determine who will have access to the portfolios and under what circumstances. You
must determine in advance how student privacy and confidentiality will be
maintained.
15. Decide when and by whom the portfolios will be scored on the rubric for evaluation of
the student learning objectives, and whether all portfolios will be evaluated or a
sample will be used. If more than one evaluator will be used, decide on the measures
to promote inter-rater agreement and what will be done when there is score
disagreement among raters.
16. Decide whether or not portfolios that are truly distinguished will be acknowledged in
some way.
Rubrics for Scoring Portfolios
1. List each of the learning objectives that the portfolio is designed to demonstrate.
2. Identify the elements of the learning objectives that collectively demonstrate the
objective’s attainment.
3. Develop a scale for scoring the rubric. You can use a numeric scale or a series of
descriptors (e.g., unacceptable, acceptable, exemplary; pass, fail; poor, below
average, average, above average, commendable), and you can use any number of
scale categories.
4. Decide what each scale category looks like for each element of the rubric. You must
describe each category of work quality precisely enough that different raters will be
able to use the rubric in ways that result in similar evaluations.
5. Calibrate the use of the rubric by having raters evaluate actual portfolios prior to
beginning the actual evaluation. Discuss the ratings obtained for each element in the
rubric, and use the discussion to improve the rubric over time (a simple agreement
check is sketched after this list).
6. Give raters space on the rubric form to make notes on what they see in particular
portfolios in order to improve the rubric form and to provide additional information
about the strengths and weaknesses of the students, enriching the discussion of the
extent to which student learning objectives have been attained and the ways in which
they can be improved.
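The calibration in step 5 is easier to conduct with a simple agreement figure in hand. The sketch below, using invented ratings from two raters on one rubric element, reports exact and within-one-point agreement; low agreement signals that the category descriptions need sharpening before the actual evaluation begins.

```python
# A minimal sketch: percent exact and adjacent (within one point) agreement
# between two raters scoring the same portfolios on one rubric element.
# All ratings are invented for illustration.
rater_a = [4, 3, 5, 2, 4, 3, 5, 4]
rater_b = [4, 4, 5, 2, 3, 3, 4, 4]

pairs = list(zip(rater_a, rater_b))
exact = sum(a == b for a, b in pairs) / len(pairs)
adjacent = sum(abs(a - b) <= 1 for a, b in pairs) / len(pairs)

print(f"Exact agreement: {exact:.0%}")        # identical scores
print(f"Adjacent agreement: {adjacent:.0%}")  # scores within one point
```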
Demonstration (Performance)
Demonstration encompasses a wide range of different types of student work that can be
evaluated. The evaluation of student performances has a long and distinguished history in higher
education. It can encompass a wide variety of activities, including juried art exhibits, musical
recitals and juried recitals, experiments, oral examinations, paintings, speeches and physical
performances (American Sign Language, swimming, marathons, gymnastics, CPR, etc.).
Demonstrations used to assess student learning outcomes are usually scored by using rubrics or
scoring guides.
Advantages
1. Students usually perform demonstrations as a regular part of their student work.
2. Faculty usually grade demonstrations as a regular part of the course grade and the
performances can be used as embedded assessments.
3. The performance often offers a rich, diagnostic experience that can be useful for students
and very useful for programmatic assessment.
Disadvantages
1. Faculty members may resent the use of performances as embedded assessments,
viewing them as intrusions into their courses.
2. Scoring guides and rubrics require faculty members to agree on the elements of the
performance to be evaluated and the weight to be given to each element.
Development of a Demonstration Assessment
1. Faculty members must identify the student learning objectives to be evaluated.
2. It must be decided what individual or group will conduct the assessment. It is always
more appropriate to have more than one evaluator.
3. A rubric or scoring guide needs to be constructed that indicates the elements of the
student learning objective to be evaluated in order to decide whether or not the objective
has been attained, the categories of attainment, and the description of each category of
attainment on each element of the rubric.
4. Decide when and where the assessment will take place.
5. Decide how the assessment will be used to improve programs.
Paper and Project Assessments
Research papers, essays, computer programs, projects and other written and oral activities can
also be used to evaluate student attainment of learning objectives. As embedded assessments,
paper and project assessments offer the opportunity to evaluate reasonably complex learning
outcomes and depending on the instructions, some beliefs and attitudes. The assignments that
prompt the paper or project activities need to evoke products that demonstrate the student
learning outcomes of interest.
Advantages
1. The students perform such activities as a regular part of their course work.
2. Faculty members are experienced at evoking learning on particular desired learning
outcomes through the use of such activities with students.
3. The learning outcomes examined may be relatively simple or quite complex.
4. Such activities may encompass several different kinds of skills and knowledge. For
example, students might be asked to develop a research paper that shows their ability to
choose an appropriate topic, do a literature review, conduct a piece of research using
appropriate methods for the discipline, write it in a way that is both grammatically correct
and competently written, and present it orally in their native or a target language using
appropriate technology.
5. Sharing the rubric or scoring guide with students may give them a clearer idea of what
is expected of them in the paper or project.
Disadvantages
1. Faculty members may find it difficult to select a particular project or paper that is a
culminating expression of one or several student learning objectives especially in
disciplines that are more horizontally organized.
2. Depending on the number of parts of the paper or project to be evaluated, faculty
members may need to construct several scoring guides or rubrics to conduct the
assessment.
3. The weights given to different elements of the scoring guides or rubrics may be a matter
of contention among faculty members.
Development of a Paper or Project Assessment
1. Decide on the student learning objectives to be evaluated.
2. Decide in which course or courses the evaluation assignments will be conducted and
when the assignments will be conducted both in the curriculum and in the career of the
student.
3. Decide on the instructions for the assignment such that the degree of attainment of the
student learning objectives among students can be identified.
4. Decide on the scoring guides or rubrics to be used for each of the student learning
objectives to be evaluated.
5. Decide who will use the rubrics or scoring guides and when. As with all
assessments using rubrics or scoring guides, it is strongly encouraged that more than
one individual score the paper or project.
6. Decide whether all the papers or projects will be evaluated or a sampling design will be
used.
7. Decide how the information will be used to improve the academic program.
Field Experiences and Internships
A wide variety of disciplines in higher education offer some form of field experience, practicum or
internship experience. The experiences provide an important opportunity to gain information on
student learning as students apply the knowledge, skills and attitudes gained in a program in a
more “real world” environment. There is usually an evaluation form that the internship supervisor
completes and one that the faculty supervisor completes, and often there is a self-evaluation
form that the student completes. If the feedback forms are to be used for program assessment, it
is necessary
that the student learning outcomes of interest be included on the feedback forms.
Advantages
1. Student motivation to do well is high.
2. Students must apply knowledge and skills, thus demonstrating higher levels of learning.
3. Attitudes and beliefs can be observed at the level of action.
4. There are often two observers (the internship supervisor and the faculty supervisor).
5. Evaluation instruments can be targeted to student learning outcomes.
6. Faculty and student time is conserved.
7. The cost is relatively low in both time and money.
Disadvantages
1. Supervisors who are not members of the faculty may not have a clear idea of the student
learning outcomes and may not give as useful feedback as hoped.
2. Different types of field experiences and internships within the same program may not
yield comparable information depending on what students actually do.
Development of a Field Experience or Internship Evaluation
1. Decide on the student learning objectives to be evaluated.
2. Decide on or revise the evaluation instrument to be used for the field experience or
internship experience in order to have particular student learning objectives evaluated.
3. Decide who will gather and analyze the data.
4. Decide how and when the analysis will be considered by the faculty as a whole.
5. Decide how the information will be used to improve the academic program.
Indirect Assessment Methods
Surveys
Surveys include both questionnaires and interviews. They gather opinions, beliefs, attitudes and
perceptions. The instruments themselves and methods of administration vary widely. Common
surveys on college and university campuses include those on advising, administrative and
support services, student characteristics, student expectations, and perceptions of the academic
program. Surveys are used with a wide variety of groups including alumni, employers, and
students at various times in their matriculation. It is common for campuses to have departments
that conduct exit interviews and phone interviews on specific topics. Surveys may be constructed
locally or purchased commercially. Commercial surveys on a variety of topics and appropriate for
use with a variety of respondents are widely available. Surveys may be conducted in-person, in
groups, or online, and they may be narrowly or broadly focused.
Advantages
1. They are relatively easy to administer and may be administered to large numbers of
subjects.
2. They can be designed to allow statistical analyses to be conducted.
3. Faculty members in several different departments in most universities have experience
with the design and conduct of surveys.
4. They can cover a broad range of information in a relatively short period of time.
5. The results are relatively easy to understand.
6. They may offer access to individuals who might be difficult to include in other forms of
assessment (e.g., alumni, employers, parents).
Disadvantages
1. They provide perceptions only. Thus they are not usually appropriate as a primary
measure for knowledge and skills.
2. Designing valid and reliable surveys is not necessarily easy. Survey results may be
influenced by the instructions, word and/or question order, vocabulary used, survey
organization, different methods of administration and the personality of those who
conduct the survey. Interviewers may require training to return reliable results.
3. Designing good surveys and interviews usually takes a considerable amount of time and
effort.
4. Interviews can be very challenging to conduct and analyze especially if a large number of
interviews will be conducted.
5. Unless there is a captive subject pool available, return rates for mail and online surveys
may be low, and there are real difficulties in obtaining an unbiased sample. For mail or
online surveys it is not unusual to need to make several attempts to stimulate the return
of surveys.
6. Certain types of surveys may be expensive in terms of time and money to administer.
7. Commercial surveys usually are written to be appropriate for a broad range of institutions
and thus, will not reflect any particular college or university.
8. Research has demonstrated that beliefs and attitudes are usually not good guides to the
actions of respondents.
9. Surveys that rely on the memories of respondents to answer particular questions must be
very carefully worded since memory has not proven to be highly reliable among survey
respondents.
10. Unless the surveys are conducted online or can be scanned into a computer file, there
may be a considerable cost incurred in time and money for data entry and validation.
11. Surveys must be designed to facilitate the statistical analysis to be performed; thus data
analysis needs to be a component of survey design.
Developing a Survey
1. Decide clearly what objectives are to be evaluated.
2. If possible seek assistance from an individual with experience in designing and
administering surveys of the type proposed.
3. Design questions carefully and in field testing, use several versions of the same question
to test which one returns the best information.
4. Keep the survey as concise as possible to encourage higher response rates and maintain
the goodwill of participants.
5. In both questionnaires and interviews be sensitive to the effect that particular interviewers
and survey administrators may have on the results. Faculty members who may be a
continuing influence on a student’s career may not be appropriate as an interviewer or
survey administrator for some surveys, just as a student’s advisor may not be the best
person to administer an advising survey.
6. If you are considering a commercial product, obtain review copies and evaluate the
product carefully to decide whether it will successfully return the type of information
required.
7. For commercial products, evaluate carefully whether or not you can successfully perform
the tasks that will be required of you and whether the information the product will return
justifies the cost.
8. Make provisions to maintain respondent confidentiality, and anonymity if anonymity has
been promised.
9. If responses are to be disaggregated by particular student characteristics, be sure to
include information that will allow that disaggregation to occur, and if you desire to
compare groups within the response set, be sure that enough respondents will be
included in each group to make comparisons meaningful (a simple check is sketched
after this list).
10. Do not abuse the patience of respondents by surveying the same group of respondents
over and over.
11. Consider carefully how, when, and by whom the responses will be analyzed, and if a
particular statistical package is to be used, be sure that the package is available for use
on campus and that the data analyst is skilled at using it.
12. Decide when and by whom the results will be brought to the faculty for discussion and
how decisions about use of the information will be made.
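The subgroup check in item 9 can be performed before analysis begins. The sketch below uses invented response counts; the minimum group size of 10 is an arbitrary placeholder for whatever threshold makes comparisons meaningful in a given survey.

```python
# A minimal sketch: overall response rate plus a check that each subgroup is
# large enough to disaggregate. Counts and the minimum of 10 are invented.
surveys_sent = 400
responses_by_group = {"freshman": 42, "sophomore": 35, "junior": 8, "senior": 51}
MIN_GROUP_SIZE = 10

total = sum(responses_by_group.values())
print(f"Response rate: {total / surveys_sent:.0%} ({total} of {surveys_sent})")

for group, n in responses_by_group.items():
    if n < MIN_GROUP_SIZE:
        print(f"Warning: only {n} {group} responses; "
              f"comparisons for this group may not be meaningful")
```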
Focus Groups
Focus groups are structured, face-to-face, group interviews. Much of the success of a focus
group depends on the skill of the facilitator who must maintain the group’s concentration on the
particular topics to be covered, elicit comments from all members of the group and ask probing
questions without losing the confidence and trust of group members. Leaders in focus groups
must be viewed by participants as credible, competent and trustworthy and they must be highly
skilled in group dynamics and handling conflict. The 5 to 7 participants in focus groups must be
carefully chosen to represent the group whose opinions are being sought. In many cases a
series of focus groups may be the most appropriate technique for obtaining the quality of
information sought.
In planning for a focus group, it is necessary to identify the set of topics to be considered and the
depth in which each will be covered. The leader uses a series of scripted questions and
prompts to elicit the desired information. There is some leeway in most focus group scripts for
the leader to probe interesting or unanticipated information, but for the most part the leader will
stick to the script provided. The focus group is usually documented with an audio recording,
although it is not uncommon for videotaping to also be used. Participants need to be aware of
and give consent to audio and/or video recording. Usually transcripts of the proceedings are also
prepared to facilitate analysis. In order to get the highest quality responses, participants must be
assured that no information they divulge will be used in ways that would be detrimental to them,
and if confidentiality at some level or anonymity is promised, that promise must be kept.
It is usually preferable to have a neutral leader for a focus group. If the focus group is being held
to evaluate a particular program or service, faculty and staff members associated with the
program or service may not be able to maintain an appropriate distance from the content of the
questions or responses. In addition, participants who are likely to find themselves in continuing
contact with a faculty or staff member in the course of their student careers may be prone to
withhold information or be less than candid because of fear of reprisal or of the loss of a faculty
or staff member’s esteem or friendship.
Advantages
1. There is usually a pool of individuals on college and university campuses trained to
conduct focus groups.
2. The format is flexible and can include a wide variety of questions.
3. There is the opportunity to probe topics in greater depth than in some other formats.
4. It may be possible to elicit the reasons for a participant or participants’ beliefs, attitudes or
actions.
5. Because the format is with a group, it may be possible to uncover the degree of
consensus among the participants on topics and to probe the depth with which particular
positions are held.
(Adapted from Mary J. Allen, Assessing Academic Programs in Higher Education, Bolton,
Massachusetts: Anker Publishing Company, 2004, pp. 128-129.)
Disadvantages
1. Since focus groups are an indirect measure, it is not possible to ascertain actual levels of
student learning.
2. A skilled facilitator is a necessary component of the process.
3. The quality of the questions is extremely important in the process for eliciting useful
feedback from participants.
4. Recruiting appropriate participants and scheduling focus groups can be extremely
challenging.
5. Focus groups can be quite expensive if a facilitator must be hired, participants paid
monetary inducements, and the transcript must be prepared by a professional.
6. A single group may not return either the quality or the range of opinions, perceptions or
attitudes that exist within the population as a whole.
7. Analysis of focus group data can be very challenging.
(Adapted from Mary J. Allen, Assessing Academic Programs in Higher Education, Bolton,
Massachusetts: Anker Publishing Company. 2004. pp. 128-129.)
Using a Focus Group
1. Decide on the outcomes to be assessed.
2. Decide on the number of groups to be conducted.
3. Decide whether appropriate facilitators are available to conduct the group(s).
4. Decide whether or not appropriate participants are available and whether inducements to
participate are likely to be necessary.
5. Decide on the topics to be covered, the questions and prompts, and the degree to which
the facilitator may probe participant responses.
6. Establish the script and decide whether the group will be audio and/or videotaped.
7. Find an appropriate place to conduct the group.
8. Schedule the group or groups to facilitate participation.
9. Decide how, when and by whom the analysis will be conducted.
10. Decide how the analysis will be used by the faculty.
External Review
A wide variety of external review options exist for a program. Entire programs, samples of
student work, and/or particular programs or services may be reviewed by one or more external
reviewers. External reviewers may be representatives of disciplinary groups, members of a
discipline, or, in some cases, practitioners in the discipline. The basis of the evaluation is usually
standards in the discipline or the area of practice.
Advantages
1. External reviewers offer an unbiased, third-party evaluation of the program, service, or
student products.
2. Reviewers use standards common in the discipline.
3. The review is normally structured by the program or service provider to focus on
particular issues.
4. The self-study that is often a part of external program reviews can be very useful in
providing programs an opportunity to evaluate themselves.
5. Since the external reviewers are not involved in the program or service on the campus,
the recommendations or conclusions offered may have greater credibility at the program
level and at higher levels within the university.
6. External reviews are typically structured in such a way that replies to the
recommendations and conclusions of the outside reviewers are normal and expected,
and discussions about appropriate changes to the program or department take place at a
number of levels within the university.
7. An external review report may provide support for needed changes that will require
resources.
Disadvantages
1. An external reviewer(s) must be chosen carefully to provide an unbiased and useful
evaluation of the program, service or student products.
2. Conducting an external review takes a great deal of time and effort on the part of a
program or service provider especially when self-studies must be conducted.
3. External reviewers may be costly since a program must pay for transportation, lodging,
and usually a stipend for the reviewer(s), and for the costs of the external review report’s
development and reproduction.
4. If colleges and universities expect programs and services to prepare and conduct
external reviews, then the institution incurs the responsibility for supporting needed
changes with appropriate resources.
Conducting External Reviews
External program review processes are usually structured by institutions.
External reviews of student products
1. External reviews of student products need to be structured so that student learning
outcomes are an explicit part of the review.
2. Students need to give consent to have their work evaluated.
3. Appropriate safeguards for the transportation of the work and the confidentiality of
students submitting work need to be established.
4. If the reviewers come to the campus to review the student work, then appropriate
arrangements for transportation and lodging for reviewers need to be considered.
5. Expectations for when review reports are due and for the format and elements of the
review need to be clear.
6. Reviewers need to be respected members of the discipline or practitioners in order for
faculty to have confidence in their assessments.
7. Program faculty need to decide who will choose the reviewers, who will make the
arrangements, how expectations are developed and approved and who will be
responsible for follow-up.
8. Program faculty need to decide how the information obtained from reviewers will be used
to improve the program.
Documenting and Using Results
The institutional effectiveness system is also used to document results and the use of results of
planning and assessment. Academic degree programs should submit goals and objectives for
the program and, to the extent that it is appropriate, conduct assessments and document results;
however, within the first 4 years of a degree program’s existence a program is not required to
make changes based on assessment results unless it believes that the changes will be an
improvement to the program. It is important that new programs have an opportunity to see the
curriculum through at least once before programmatic changes are required.
Documenting data and conducting an analysis of the data are required on at least an annual basis.
Results are the information gathered or progress made on planned outcomes and assessment
activities. For instance, a department has planned to increase student enrollment in a particular
degree program by having faculty members give lectures in high school classrooms at the
request of the high school teachers. Two faculty members give 3 lectures apiece in local high
school classrooms, and student enrollment does not increase. The use of the results is that the
department will attempt to increase student enrollment by asking the faculty member and a major
in the program who graduated from the same high school to team teach the lecture in the local
high school classroom. Or suppose 60 percent of students attained an outcome against an 85
percent target; because the knowledge or skill evaluated is important to a graduate educated in
the field, the target was not met and action is warranted. Use of the results might include
restructuring the program, teaching the knowledge or skill multiple times, or improving
prerequisites for a course. Staff evaluators may find that a particular event is not particularly
helpful to their users and decide to redesign the presentation, and so on. Whatever the
evaluation, it is very important to record the results and any activities designed to enhance
attainment. An assessment rubric is contained in Appendix K to assist you in evaluating
the extent to which your documentation is written appropriately.
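Where a unit tracks several outcomes, the comparison of results to targets that signals when action is needed can be tabulated very simply. The short Python sketch below is offered only as an illustration of that bookkeeping and is not part of the required reporting process; the outcome descriptions, targets, and results in it are hypothetical.

# Minimal sketch: compare documented results to their targets and flag
# outcomes whose shortfall should trigger a documented "use of results"
# action. The outcome descriptions, targets, and results are hypothetical.

outcomes = [
    # (description, target percent, actual percent attained)
    ("Majors scoring 70% or above on the exit exam core section", 75, 82),
    ("Students attaining the written communication outcome", 85, 60),
]

for description, target, actual in outcomes:
    status = "MET" if actual >= target else "NOT MET"
    print(f"{status}: {description} -- {actual}% attained (target {target}%)")
    if actual < target:
        # A shortfall signals that a change to the program or service
        # should be put into motion and documented.
        print("  Record the planned improvement under 'Use of Results'.")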
SOME FINAL THOUGHTS
Planning and assessment, if done deliberately and diligently, can make a tremendous difference in
the quality of UTPB’s graduates and in the services that the institution provides. The University
believes in the ability of the faculty and staff to develop high quality, challenging goals, to design
appropriate methods to do the research necessary to establish an in-depth understanding of
whether the goals are being achieved, and to devise strategies for improvement. The Office of
Institutional Research, Planning and Effectiveness can provide a sounding board for ideas,
suggest methodologies, improve documentation and find other forms of assistance. The staff
would be delighted to provide any assistance required; please use them as a resource to
enhance your achievement.
BIBLIOGRAPHY
Allen, M.J. (2004). Assessing Academic Programs in Higher Education. Bolton, Massachusetts:
Anker Publishing Company.
Astin, A., et al. (1992). Nine Principles of Good Practice for Assessing Student Learning.
Developed under the auspices of the AAHE Assessment Forum with support from the Fund
for the Improvement of Post-Secondary Education with additional support for publication and
dissemination from the Exxon Education Foundation.
Banta, T. et al. (1996). Assessment in Practice: Putting Principles to Work on College
Campuses. San Francisco, CA: Jossey-Bass.
Bloom, B., Engelhart, M., Furst, E., Hill, W. and D. Krathwohl. (1956). Taxonomy of Educational
Objectives: Cognitive Domain. New York: David McKay.
Cleaves, C. et al. (August 2005). A Road Map for Improvement of Student Learning and Support
Services through Assessment. New York: Agathon Press, as quoted in Office of Institutional
Planning and Effectiveness. Assessment of Institutional Effectiveness: An Introductory
Manual, Texas Tech University.
Commission on Colleges. (2008). Principles of Accreditation 2008 Interim Edition. Decatur,
Georgia: Southern Association of Colleges and Schools.
Commission on Colleges. (2005). Resource Manual for the Principles of Accreditation:
Foundations for Quality Enhancement. Decatur, Georgia: Southern Association of Colleges
and Schools.
Dean, L.A. (2006). CAS Professional Standards for Higher Education (6th ed.). Washington, D.C.:
Council for the Advancement of Standards in Higher Education.
Information, Analysis and Assessment. (February 2005). UCF Academic Program Assessment
Handbook as adapted from the Guidelines for Program Assessment: Standards and Levels,
2002; and UCF Continuous Quality Improvement website, 2003.
Office of Institutional Research and Planning. (Spring 2006). Unit Effectiveness Process
Assessment Handbook. The University of Texas at Arlington.
Ohia, Uche (September 2004). “Assessment Plan Development-Instructional Programs.” Open
Forum Presentation.
Rodrigues, R.J. (Unknown). Skidmore College Assessment Handbook.
http://www.skidmore.edu/administration/assessment/handbook.htm
Schuh, J.H. and M.L. Upcraft (2001). Assessment Practice in Student Affairs.
San Francisco: Jossey-Bass.
University Planning Committee (Fall 2002). A Guide to Planning and Assessment at the University
of Montevallo (3rd ed.). Montevallo, AL: University of Montevallo.
Upcraft, M.L. and J.H. Schuh. (1996). Assessment in Student Affairs: A Guide for Practitioners.
San Francisco: Jossey-Bass.
Walvoord, B.E. (2004). Assessment Clear and Simple. San Francisco: Jossey-Bass.
Wargo, M.C. (August 2006). Handbook for Program Assessment. Western Carolina University.
Appendix A
Assessment Review Committee 2007-2008
Leslie Toombs, Chair
Chad Vanderford
Kay Ketzenberger
Emilio Mutis-Duplat
Chad Greenfield
Sherry McKibben
Cherry Owen
Jeannine Hurst
Linda Isham
Hector Govea
William Fannin - ex officio
This committee oversees and makes recommendations on campus assessment activities as
contained in the plan submitted to the Southern Association of Colleges and Schools (SACS).
From University Committee Assignments 2007-2008
Appendix B
Helpful Handbooks, Manuals and Wisdom from Other Institutions
Assessment Steering Committee. (June 29, 1999). The Assessment Program. University of the
Sciences in Philadelphia.
Dane, A. J. (2005-2006). Institutional Effectiveness Manual.
http://angelostate.net/publications/institutional_effectiveness/documents/IE_Manual_2005F.doc.
Kohout, S.L. and H.D. Stearman. (September 2006). Assessment of Institutional Effectiveness:
An Introductory Manual. Texas Tech University Health Sciences Center.
Lyon, L. (Undated). “Ten Myths of Academic Program Assessment at Baylor.” Baylor SACS
Reaffirmation Web Site http://www.baylor.edu/sacs/index.php?id=25925.
Office of Institutional Effectiveness and Planning. (January 2006). Institutional Effectiveness
Practitioner’s Manual. Texas A&M International University.
http://www.tamiu.edu/adminis/iep/pdf/TAMIU-IE-Practitioners-Manual.pdf.
Office of Institutional Research and Planning. (Spring 2006). Unit Effectiveness Process
Assessment Handbook. The University of Texas at Arlington.
http://www.uta.edu/irp/unit_effectiveness_plans/assets/UEPHandbook.pdf.
Office of Planning and Assessment. (2001-2002). Eastern Kentucky University Institutional
Effectiveness and Assessment Principles, Procedures and Resource Manual.
Wargo, M.C. (August 2006). Western Carolina University Handbook for Program Assessment.
http://www.wcu.edu/assessment/documents/AssessmentHandbook_Sept06.pdf.
Appendix C
Budget and Planning Committee
Academic Department Chair:
Faculty Senate President:
Human Resources Director:
Vice President for Academic Affairs, Chair:
Vice President for Student Services:
Vice President for Business Affairs:
Director of Accounting:
Interim Director of Physical Plant:
Academic Dean:
Student Senate President:
Information Resources Director:
Faculty A:
Faculty B:
Faculty C:
Institutional Effectiveness Director:
This committee makes budgetary, mission, and strategic planning recommendations, both on
process and substance.
Appendix D
Guidelines and Instructions for Institutional Effectiveness Reporting
For 20XX-20XX, each budget unit will submit institutional effectiveness planning and assessment forms for this year on the dates shown on page
XX of this document. All institutional effectiveness reports will be submitted through regular reporting channels both electronically and on paper.
The Institutional Effectiveness Program (IEP) is a method of tracking improvement in programs and services over time. The IEP reports will
include the following items:
INSTRUCTIONAL DEPARTMENTS
Unit Planning Form for Instructional Departments (Form 1)
University Mission
This will be the same for every unit and program. It is consistent from year to year, updated only when the University adopts a new, authorized
mission statement.
Mission of Academic Affairs
Each of the major divisions of the University has a mission statement. It is consistent from year to year, updated only when the division adopts a
new statement.
Mission of the College/School
Each college/school has a mission statement. It is consistent from year to year, updated only when the college/school adopts a new statement.
Mission of the Department
This statement identifies the purposes of the unit and defines its role in the University. The mission will be different for every unit, but will be
consistent from year to year, unless the unit adopts a revised mission statement.
Planning Outcomes
Each department/office will identify planning outcomes for the next 3 to 5 years based on the strategic plan of the university, the university
compact, and the missions of the university, the division, the college/school and the unit itself. Planning outcome statements focus on what the
unit intends to accomplish in the areas of university compact success, academic quality, educational support quality, operational efficiency and
effectiveness, and service or unit program delivery. The areas of the university compact are growth, quality, graduation rate improvement,
research, and partnerships. Departmental planning outcomes should produce a clear focus on what needs to be accomplished given the
conditions that must be addressed.
Planning has no magic formulas. Outcomes, in and of themselves, do not produce resources nor are there simple formulas for identifying and
implementing important and/or hard to accomplish objectives. Plans do not “work” – people do. The magic in planning is the quality of thought,
ingenuity and ability of those who do the planning.
Progress Measures
Progress measures identify the methods for evaluating progress associated with each planning outcome.
College or School:
Submitted by:
Academic Year 20XX-20XX
Department:
Date Submitted:
UNIT PLANNING FORM FOR INSTRUCTIONAL DEPARTMENTS 20XX-20XX (Form 1)
Compact elements this year are growth, quality, graduation rate improvement, research, partnerships, and public trust and accountability.
University Mission Statement:
Academic Affairs Mission Statement:
(College/School) Mission Statement:
(Department or Office) Mission Statement:
Planning Outcomes
Progress Measures
Instructional Unit Assessment Plan (Form 2)
An Instructional Unit Assessment Plan Form needs to be submitted for each degree program in the department.
Degree Program Learning Goals
Learning goals are broadly focused on knowledge, skills, and attitudes that students should attain as a result of completing the degree program.
Goals are the foundation for all that follows in assessment. They are the basis on which assessment is constructed. For each academic degree
program at both the undergraduate and graduate level, the faculty need to identify 3 to 4 major goals focused on what students know, can do, and
believe as a result of being in the degree program.
Degree Program Learning Objectives
For each degree program at both the undergraduate and graduate level, objectives need to be developed. Objectives identify specific, observable
behaviors and actions related to a particular degree program goal that faculty will use to describe, monitor and assess student achievement.
Objectives are indicators for goals. In any single year, no more than 3 to 5 objectives should be identified for evaluation.
Student Learning Outcomes
Student learning outcomes will be identified for each degree program objective. These outcomes should identify the precise behaviors and
actions of students that will be evaluated and the desired levels at which those behaviors and actions must be demonstrated by students in order
for the objective to be considered satisfactorily accomplished. Expected outcomes contain a reference to time or other constraints (if any), a
description of the knowledge, skill or attitude desired, and a level of attainment. The level of attainment does not need to be a number or a
percentage, but it should be specific enough that it can serve as a triggering mechanism to signal when a change to the program or service should
be put into motion. Learning outcomes are dynamic. They should be revised over time to maintain their relevance.
Assessment Methodology
The assessment methodology specifies the means of measurement. The assessment methodology section of the assessment plan needs to
include a description of the methodology that contains enough detail that a third party reader can readily understand what is being planned.
The assessment methodology should be the best possible evaluation of the outcome balanced against the cost of conducting the measurement.
Measurements may be direct or indirect, with the proviso that if the outcome being measured is a knowledge or skill outcome, then at least one
direct measure is required. Direct measures require that students demonstrate the knowledge or skill through a performance of that knowledge or
skill. Direct measures include, for example, a content examination, a case study analysis, a computer program, an oral proficiency evaluation, etc.
Indirect measures examine students’ or others’ perceptions of learning or other attributes and include such measures as surveys, interviews, focus
groups, etc. Indirect measures are acceptable and, for some outcomes, they represent an excellent evaluative tool. Methodologies for evaluating
outcomes may include both quantitative and qualitative methods and may be expressed in either numerical data or narrative description.
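Because outcomes focused on knowledge or skills must carry at least one direct measure, an assessment plan can be screened for that proviso mechanically. The Python sketch below is a hypothetical illustration of such a check; the record layout and the sample entries are assumptions for the example, not part of the official forms.

# Minimal sketch: verify that every knowledge or skill outcome in a plan
# lists at least one direct measure, per the proviso described above.
# The record layout and sample entries are hypothetical.

plan = [
    {"outcome": "75% of majors score 70% or above on the exit exam core section",
     "focus": "knowledge",
     "measures": [("departmental exit examination", "direct"),
                  ("graduating senior exit survey", "indirect")]},
    {"outcome": "80% of seniors report a basic understanding of the subfields",
     "focus": "attitude",
     "measures": [("graduating senior exit survey", "indirect")]},
]

for entry in plan:
    has_direct = any(kind == "direct" for _, kind in entry["measures"])
    if entry["focus"] in ("knowledge", "skill") and not has_direct:
        print("Add a direct measure for:", entry["outcome"])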
Department:
Submitted by:
Academic Year 20XX-20XX
Degree Program:
Date Submitted:
INSTRUCTIONAL UNIT ASSESSMENT PLAN 20XX-20XX (Form 2)
One form for each degree program in the department
(Degree Program) Learning Goals
Degree Program Learning Objectives
Student Learning Outcomes
Assessment Methodology
Instructional Unit Planning Results Form (Form 3)
The results form documents the planning and evaluation conducted during the academic year. The items in the form include:
Planning Outcomes
These are the same planning outcomes planned for the year unless an amendment to the plan has been filed.
Progress Measures
The progress measures for each planning outcome should be the same as those in the planning forms submitted for the year unless an
amendment to the plan has been filed.
Results
The section should briefly describe the actual results from each progress measure. Please feel free to attach any appendices that you feel are
necessary to describe the results in greater detail. It should be possible for a third party reader to understand the results and to make a judgment
about how the results obtained led to the uses made of them to improve the programs or services of the department.
Use of Results
Describe any actions designed to improve the department or programs within the department as a consequence of the results obtained for each
planning outcome. It should be clear to a third party reader how the use of results is related to the actual results of the progress measure.
College or School:
Submitted by:
Academic Year 20XX-20XX
Department:
Date Submitted:
UNIT PLANNING FORM FOR INSTRUCTIONAL DEPARTMENTS 20XX-20XX (Form 3)
The planning outcomes shown below must correspond to those planned for academic year 20XX-20XX, including those required by the University
Compact with The University of Texas System.
Planning Outcomes
Progress Measures
Results
Use of Results
Instructional Unit Assessment Results Form (Form 4)
An Instructional Unit Assessment Results form needs to be submitted for each degree program in the department.
Student Outcomes
These are the same student learning outcomes planned for the year unless an amendment to the plan has been filed.
Assessment Methodology
The assessment measures for each student outcome should be the same as those in the planning forms submitted for the year unless an amendment to
the plan has been filed.
Results
The section should briefly describe the results from each student learning outcome. Please feel free to attach any appendices that you feel are necessary
to describe the results in greater detail. It should be possible for a third party reader to understand the results and to make a judgment about how the
results obtained led to the uses made of them to improve the degree program.
Use of Results
Describe any actions designed to improve the degree program as a consequence of the results obtained for each student learning outcome. It should be
clear to a third party reader how the use of results is related to the actual results of the student learning outcome.
Department:
Submitted by:
Academic Year 20XX-20XX
Degree Program:
Date Submitted:
UNIT PLANNING FORM FOR INSTRUCTIONAL DEPARTMENTS 20XX-20XX (Form 4)
Student outcomes shown below must correspond to those planned for academic year 20XX-20XX
Student Outcomes
Assessment Methodology
Results
Use of Results
NON-INSTRUCTIONAL DEPARTMENTS/OFFICES
The requirements for institutional effectiveness plans for support offices and departments differ from those required of degree programs in several
important ways. Since there are no degree programs in support offices, they will have outcomes for the office rather than for a particular program.
In addition, in many support offices, student learning may not be the primary focus for direct accomplishment of their mission. As a consequence,
outcome statements in support offices will focus on the basic services and consumers of services provided by the office.
Unit Planning Form for Non-Instructional Departments (Form 5)
University Mission
This will be the same for every unit and program. It is consistent from year to year, updated only when the University adopts a new, authorized
mission statement.
Mission of the Division
Each of the major divisions of the University (i.e., Academic Affairs, Student Affairs, and Business Affairs) has a mission statement. It is consistent
from year to year, updated only when the division adopts a new statement.
Mission of the College/School (if applicable)
This segment of the form is only applicable to staff offices that reside inside a college/school in the division of Academic Affairs. Each
college/school has a mission statement. It is consistent from year to year, updated only when the college/school adopts a new statement.
Mission of the Department
This statement identifies the purposes of the unit and defines its role in the University. The mission will be different for every unit, but will be
consistent from year to year, unless the unit adopts a revised mission statement.
Planning Outcomes
Each department/office will identify planning outcomes for the next 3 to 5 years based on the strategic plan of the university, the university
compact, and the missions of the university, the division, the college/school and the unit itself. Planning outcome statements focus on what the
unit intends to accomplish in the areas of service quality, operational efficiency and effectiveness, user satisfaction, service delivery, and the
university compact. The areas of the university compact are growth, quality, graduation rate improvement, research, and partnerships. Planning
outcomes should produce a clear focus on what needs to be accomplished given the conditions that must be addressed.
Planning has no magic formulas. Planning outcomes, in and of themselves, do not produce resources nor are there simple formulas for
implementing important and/or hard to accomplish tasks. Plans do not “work” – people do. The magic in planning is the quality of thought,
ingenuity and ability of those who do the planning.
Progress Measures
Progress measures identify the methods for evaluating progress associated with each planning outcome.
College or School or Division:
Submitted by:
Academic Year 20XX-20XX
Department:
Date Submitted:
UNIT PLANNING FORM FOR NON-INSTRUCTIONAL DEPARTMENTS 20XX-20XX (Form 5)
Compact elements this year are growth, quality, graduation rate improvement, research, partnerships, and public trust and accountability.
University Mission Statement:
(Division) Mission Statement:
(College/School) Mission Statement
(Department or Office) Mission Statement:
Planning Outcomes
Progress Measures
Non-Instructional Unit Assessment Plan Form (Form 6)
Department/Office Mission Statement
This statement identifies the purposes of the unit and defines its role in the University. The mission will be different for every unit, but will be
consistent from year to year, unless the unit adopts a revised mission statement.
Department/Office Outcomes
Department/office outcomes must be identified for each significant element in the mission statement. These outcomes will be primarily focused on
whether services are efficient, effective, and satisfactorily delivered, and they need to demonstrate that the mission of the unit is fulfilled.
Support offices, especially in some academic areas and in some student affairs areas, may be and probably should be focused on the students who
are their primary target user group. In that case, those offices would also write one or more outcome statements focused on what students
know, can do or believe as a result of participation in the programs and services provided by the office.
Expected outcomes contain a reference to time or other constraints (if any), a description of the student or service outcome, and a level of
attainment. The level of attainment does not need to be a number or a percentage, but it should be specific enough that it can serve as a
triggering mechanism to signal when a change to the program or service should be put into motion. Outcomes are dynamic. They should be
revised over time to maintain their relevance.
Assessment Methodology
The assessment methodology specifies the means of measurement. The assessment methodology section of the assessment plan needs to
include a description of the methodology that contains enough detail that a third party reader can readily understand what is being planned.
The assessment methodology should be the best possible evaluation of the outcome balanced against the cost of conducting the measurement.
Measurements may be direct or indirect, with the proviso that if the outcome being measured is a student outcome focused on a knowledge or skill,
then at least one direct measure is required. Direct measures require that students demonstrate the knowledge or skill through a performance of
that knowledge or skill. Direct measures include, for example, a content examination, an actual performance of a skill, a case study analysis or
other directly observable demonstration. Indirect measures examine students’ or others’ perceptions of learning, satisfaction, understanding, or
other attributes and include such measures as surveys, interviews, focus groups, etc. Indirect measures are acceptable and, for some outcomes,
they represent an excellent evaluative tool. Methodologies for evaluating outcomes may include both quantitative and qualitative methods and
may be expressed in either numerical data or narrative description.
College or School or Division:
Submitted by:
Academic Year 20XX-20XX
Department:
Date Submitted:
UNIT PLANNING FORM FOR NON-INSTRUCTIONAL DEPARTMENTS 20XX-20XX (Form 6)
Department/Office Outcomes
Assessment Methodology
Non-Instructional Unit Planning Results Form (Form 7)
The form reports the results of the planning and evaluation conducted during the academic year. The items in the form include:
Planning Outcomes
These are the same planning outcomes planned for the year unless an amendment to the plan has been filed.
Progress Measures
The progress measures for each planning outcome should be the same as those in the planning forms submitted for the year unless an
amendment to the plan has been filed.
Results
This section should briefly describe the results from each progress measure. Please feel free to attach any appendices that you feel are
necessary to describe the results in greater detail. It should be possible for a third party reader to understand the results and to make a judgment
about how the results obtained led to the uses made of them to improve the programs or services of the department.
Use of Results
Describe any actions designed and implemented to improve the department or programs within the department as a consequence of the results
obtained for each planning outcome. It should be clear to a third party reader how the use of results is related to the actual results of the progress
measure.
College or School or Division:
Submitted by:
Academic Year 20XX-20XX
Department:
Date Submitted:
UNIT PLANNING FORM FOR NON-INSTRUCTIONAL DEPARTMENTS 20XX-20XX (Form 7)
The planning outcomes shown below must correspond to those planned for academic year 20XX-20XX, including those required by the University
Compact with The University of Texas System.
Planning Outcomes
Progress Measures
Results
Use of Results
Non-Instructional Unit Assessment Results (Form 8)
Department/Office Outcomes
These are the same assessment outcomes planned for the year unless an amendment to the plan has been filed.
Assessment Methodology
The assessment measures for each outcome should be the same as those in the assessment planning forms submitted for the year unless an
amendment to the plan has been filed.
Results
This section should briefly describe the results from each outcome. Please feel free to attach any appendices that you feel are necessary to
describe the results in greater detail. It should be possible for a third party reader to understand the results and to make a judgment about how
the results obtained led to the uses made of them to improve the programs or services offered by the office.
Use of Results
Describe any actions designed and implemented to improve the programs or services of the office as a consequence of the results obtained for
each outcome. It should be clear to a third party reader how the use of results is related to the actual results for each outcome.
College or School or Division:
Submitted by:
Academic Year 20XX-20XX
Department:
Date Submitted:
UNIT PLANNING FORM FOR NON-INSTRUCTIONAL DEPARTMENTS 20XX-20XX (Form 8)
The outcomes shown below must correspond to those planned for academic year 20XX-20XX.
Department/Office Outcomes
Assessment Methodology
Results
Use of Results
Generic Institutional Effectiveness Schedule
20XX-20XX
Activity: All planning/assessment plan forms for 2008-09 due (academic departments submit Form 1 and a Form 2 for each degree program; non-instructional units submit Forms 5 and 6).
Date: May XX, 20XX
Responsible: Department heads submit to deans or VP

Activity: Academic departments, degree programs and offices submit completed results/use forms for 2007-2008 (instructional Forms 3 and 4, non-instructional Forms 7 and 8).
Date: May XX, 20XX
Responsible: Department heads submit to deans

Activity: Administrative/support offices submit completed results/use forms for 2007-2008 (Forms 7 and 8).
Date: May XX, 20XX
Responsible: Office heads submit to VPs

Activity: Academic planning/assessment forms for 2008-2009 and academic results/use forms for 2007-2008 forwarded.
Date: June X, 20XX
Responsible: Deans submit to Provost

Activity: All planning/assessment forms for the upcoming year and results/use forms for the current year submitted.
Date: June XX, 20XX
Responsible: Provost and VPs submit to President
EXAMPLES
INSTRUCTIONAL AND NON-INSTRUCTIONAL DEPARTMENTS
College or School: Arts and Sciences
Submitted by: Dr. J. D. Clark
Academic Year 20XX-20XX
Department: Anthropology
Date Submitted: May 14, 20XX
UNIT PLANNING FORM FOR INSTRUCTIONAL DEPARTMENTS 20XX-20XX (Form 1)
Compact elements this year are growth, quality, graduation rate improvement, research, partnerships, and public trust and accountability.
University Mission Statement:
The mission of The University of Texas of the Permian Basin is to provide quality education to all qualified students in a supportive educational
environment; to promote excellence in teaching, research, and service; and to serve as a resource for the intellectual, social, economic, and technological
advancement of the diverse constituency in Texas and the region.
Academic Affairs Mission Statement:
Academic Affairs promotes teaching, learning, inquiry and public service in support of the mission of the University of Texas of the Permian Basin and The
University of Texas System. We are committed to
• Providing an innovative and dedicated faculty and staff;
• Supporting excellence in instruction and enhancing the teaching-learning environment;
• Maintaining academic programs responsive to the needs of learners, the State and the region;
• Supporting faculty research and creative activities of regional, state and national distinction;
• Serving as a resource for the intellectual, cultural, technological and economic advancement of Texas’ citizens, particularly those in West Texas; and
• Enhancing the effectiveness and efficiency of instructional and support programs to achieve the standards of performance Texans expect from their higher educational institutions.
Arts and Sciences Mission Statement
The mission of the College of Arts and Sciences of The University of Texas of the Permian Basin is
• To support The University’s goal of providing the responsible student with a quality liberal arts and sciences education within the context of a growing computerized environment and an ethnically and culturally diverse and global society.
• To interweave the arts and sciences with professional education which provides the student with the freedom to realize one’s potential as an independent person with critical thinking, openness, adaptability, tolerance, integrity, and a capacity for life-long learning. Central to this task is the general education curriculum, which requires study in a broad array of disciplines designed to provide breadth and diversity of knowledge and skills.
• To address the needs of those students who desire intensive study in a special field in the arts and sciences. To make the most of a liberal arts education, these students must move beyond the breadth of general education to an emphasis on learning about a discipline in considerable depth and to be able to speak and write effectively.
• To offer a quality set of master’s level research and applied programs designed to prepare advanced students for careers in teaching, research, creativity and other areas of scholarly or public service.
• To provide a level of excellence in the teaching field for students seeking teacher certification.
• To contribute through excellence in teaching, scholarship and creative activities to the advancement and dissemination of knowledge; and, in so doing, maintain statewide and national recognition with regard to these activities.
• To recognize the special needs of a community and nontraditional student population.
• To provide special programs and services for the cultural enrichment of the community.
Department of Anthropology Mission Statement:
The mission of the department is to provide students with an understanding of the nature and role of cultural and physical diversity in the world
and throughout the history of human development through the sub-disciplines of archaeology, physical anthropology, linguistics, and cultural
anthropology. In addition, through research and service, the department provides the community and region with professional expertise in a
variety of specializations within the discipline.
Planning Outcomes / Progress Measures

Planning Outcome 1: The department will fund two internal research projects from departmental development funds, one in linguistics and one in cultural anthropology, at a level of $3,000 for one year. (Compact Element: Research)
a. Physical anthropology and archaeology will compete for the funds next year.
b. The applications for research funds will be competitive and evaluated by a committee of faculty members including at least one in the sub-discipline and 3 members of the advisory committee.
Progress Measure 1: The process for evaluating the applications will work smoothly and a research stipend will be given in both linguistics and cultural anthropology.

Planning Outcome 2: Linguistics will undergo a complete curriculum review this academic year. (Compact Element: Quality)
Progress Measure 2: The curriculum review will be completed and changes to the curriculum will be prepared for submission through the university curriculum process by the end of the academic year.

Planning Outcome 3: At least three members of the department will be asked to give a lecture in a high school classroom in the coming year in order to increase freshman anthropology majors. (Compact Element: Growth)
a. The chairman of the department will contact social studies teachers in high schools in the surrounding area with a list of topics faculty would be willing to lecture on in their classrooms.
b. The chairman will also accept offers to discuss topics not on the list and attempt to find a faculty member qualified and willing to give the lecture.
Progress Measure 3: At least three faculty members will give at least one lecture in a high school classroom in this academic year.

Planning Outcome 4: The department will continue to pursue a contract with the City Coroner’s Office for forensic anthropology services.
Progress Measure 4: A contract will be finalized with the City Coroner’s Office this academic year.
Department: Anthropology
Submitted by: Dr. J.D. Clark
Academic Year 20XX-20XX
Degree Program: BS Anthropology
Date Submitted: May 14, 20XX
INSTRUCTIONAL UNIT ASSESSMENT PLAN 20XX-20XX (Form 2)
One form for each degree program in the department
Anthropology Program Learning Goals
The undergraduate program is committed to applying the scientific method to the study of evolution, diversity and commonality of humans as
members of a species, as members of societies, and as members of particular cultures. We seek to promote in students a critical
understanding of biological diversity and the social worlds in which they live. Students will gain basic knowledge in the four subfields of the
discipline and will learn to think holistically and comparatively about humans and their evolutionary relatives. Upon completion students will be
prepared to proceed to graduate school or attain positions commensurate with their training.
Degree Program Learning Objectives / Student Learning Outcomes / Assessment Methodology

Objective 1: Demonstrate knowledge of the breadth of anthropology, including its main subfields, and its ties to other scientific disciplines.
Outcome 1a: 75 percent of anthropology majors will score 70 percent or above on the core portion of the departmental exit examination.
Methodology 1a: A locally developed 100 question exit examination is given as the final in the capstone course. The multiple choice examination is graded by the faculty member of record and the grades on each section of the test are brought to the first faculty meeting of each academic year.
Outcome 1b: 80 percent of graduating seniors will agree or strongly agree on the exit survey that they have a basic understanding of the main subfields and anthropology’s ties to other scientific disciplines.
Methodology 1b: A short exit survey is conducted in the last week of classes in the capstone course. The survey covers self-reported learning items, scheduling, advising, plans for the future, and areas for improvement in the anthropology program.

Objective 2: Demonstrate knowledge of the range of past and present human cultural systems, including ecological relationships, subsistence, social organization, and belief systems.
Outcome 2: 75 percent of anthropology majors will score 70 percent or above on the cultural portion of the departmental exit examination.
Methodology 2: A locally developed 100 question exit examination is given as the final in the capstone course. The multiple choice examination is graded by the faculty member of record and the grades on each section of the test are brought to the first faculty meeting of each academic year.

Objective 3: Demonstrate knowledge of the range of human language systems and elements, their relationships, and their relationship to social organization and belief systems.
Outcome 3: 75 percent of anthropology majors will score 70 percent or above on the linguistics portion of the departmental exit examination.
Methodology 3: A locally developed 100 question exit examination is given as the final in the capstone course. The multiple choice examination is graded by the faculty member of record and the grades on each section of the test are brought to the first faculty meeting of each academic year.

Objective 4: Demonstrate knowledge of evolutionary theory as it applies to human and nonhuman primate biological phenomena; this should include the ability to summarize the basic timeline and processes of general primate and specific hominid biological evolution.
Outcome 4: 75 percent of anthropology majors will score 70 percent or above on the physical anthropology portion of the departmental exit examination.
Methodology 4: A locally developed 100 question exit examination is given as the final in the capstone course. The multiple choice examination is graded by the faculty member of record and the grades on each section of the test are brought to the first faculty meeting of each academic year.

Objective 5: Demonstrate knowledge of the methods used to discover physical remains of peoples and cultures and to reconstruct and develop propositions about the past.
Outcome 5: 75 percent of anthropology majors will score 70 percent or above on the archeology portion of the departmental exit examination.
Methodology 5: A locally developed 100 question exit examination is given as the final in the capstone course. The multiple choice examination is graded by the faculty member of record and the grades on each section of the test are brought to the first faculty meeting of each academic year.

Objective 6: Formulate a critical, scientific understanding of the basis for contemporary human variation, in the student’s area of specialization, including appreciation of related ethical concerns.
Outcome 6: 75 percent of graduating seniors will receive a 25 or above overall on the research paper rubric of the capstone course and 90 percent of seniors will receive nothing less than a grade of satisfactory on each element of the rubric.
Methodology 6: The culminating project in the capstone course is a major research paper in the student’s area of specialization. Papers in each subfield will be read and scored on the research paper rubric by one or two faculty members in the appropriate subfield. Rubric scores, both aggregated and disaggregated by field, are brought to the first faculty meeting of the fall semester for discussion.

Objective 7: Graduates will be prepared for employment or acceptance to graduate school.
Outcome 7: At least 60 percent of graduates will either be employed or in graduate school within 1 year after graduation.
Methodology 7: The results of the alumni survey question on employment and the question on enrollment in graduate school will be brought to the first faculty meeting of the year.
College or School: Arts and Sciences
Submitted by: Dr. J.D. Clark
Academic Year 20XX-20XX
Department: Anthropology
Date Submitted: May 9, 20XX
UNIT PLANNING FORM FOR INSTRUCTIONAL DEPARTMENTS 20XX-20XX (Form 3)
The planning outcomes shown below must correspond to those planned for academic year 20XX-20XX, including those required by the University
Compact with The University of Texas System.
Planning Outcomes / Progress Measures / Results / Use of Results

Planning Outcome 1: The department will fund two internal research projects from departmental development funds, one in linguistics and one in cultural anthropology, at a level of $3,000 for one year. (Compact Element: Research)
Progress Measure 1: The process for evaluating the applications will work smoothly and a research stipend will be given in both linguistics and cultural anthropology.
Results 1: Two research stipends were awarded, one to Dr. Smith to assist with his studies of Urdu and one to Dr. Jones to assist him in his work among the Ainu in Japan this summer. All participants were basically satisfied with the process; however, several faculty members thought that extra time should be allowed for submission of materials.
Use of Results 1: The schedule for research proposal submissions has been changed from January 15th to February 15th of each year.

Planning Outcome 2: Linguistics will undergo a complete curriculum review this academic year. (Compact Element: Quality)
Progress Measure 2: The curriculum review will be completed and changes to the curriculum will be prepared for submission through the university curriculum process by the end of the academic year.
Results 2: The curriculum review was completed as scheduled and the 2 course changes and 1 added course have been prepared for submission to the university curriculum process.
Use of Results 2: If the curriculum submissions are successful at the university level, the department will continue to monitor the changes as they go through the system and THECB levels.

Planning Outcome 3: At least 3 members of the department will be asked to give a lecture in a high school social studies classroom in the coming year in order to increase freshman anthropology majors. (Compact Element: Growth)
Progress Measure 3: At least three faculty members will give at least one lecture in a high school social studies classroom in this academic year.
Results 3: Two faculty members were invited to give at least one lecture in a high school social studies classroom this academic year. An additional faculty member in physical anthropology was invited to discuss osteology in a biology classroom.
Use of Results 3: Next academic year we will broaden the program to include high school biology classrooms. We will also monitor how many freshman majors, if any, result from these lectures over the next 3 years through a poll of new freshman majors.

Planning Outcome 4: The department will continue to pursue a contract with the City Coroner’s Office for forensic anthropology services.
Progress Measure 4: A contract will be finalized with the City Coroner’s Office this academic year.
Results 4: A contract was finalized with the City Coroner’s Office as of April 30, 20XX.
Use of Results 4: In addition to the faculty member mentioned in the contract, two additional student internship positions will be set up by the department to comply with the terms of the contract. The internships will be paid by the city.
Department: Anthropology
Submitted by: Dr. J.D. Clark
Academic Year 20XX-20XX
Degree Program: BS Anthropology
Date Submitted: May 9, 20XX
UNIT PLANNING FORM FOR INSTRUCTIONAL DEPARTMENTS 20XX-20XX (Form 4)
Student outcomes shown below must correspond to those planned for academic year 20XX-20XX
Student Outcomes / Assessment Methodology / Results / Use of Results

Outcome 1a: 75 percent of anthropology majors will score 70 percent or above on the core portion of the departmental exit examination.
Methodology 1a: A locally developed 100 question exit examination is given as the final in the capstone course. The multiple choice examination is graded by the faculty member of record and the grades on each section of the test are brought to the first faculty meeting of each academic year.
Results 1a: 82 percent of majors scored 70 percent or above on the core portion of the departmental exit examination.
Use of Results 1a: No further action is anticipated at this time.

Outcome 1b: 80 percent of graduating seniors will agree or strongly agree that they have a basic understanding of the main subfields and anthropology’s ties to other scientific disciplines on the exit survey.
Methodology 1b: A short exit survey is conducted in the last week of classes in the capstone course. The survey covers self-reported learning items, scheduling, advising, plans for the future, and areas for improvement in the anthropology program.
Results 1b: 91 percent of graduating seniors either agreed or strongly agreed that they have a basic understanding of the main subfields, and 86 percent agreed or strongly agreed that they understood anthropology’s ties to other scientific disciplines on the exit survey.
Use of Results 1b: The target on this outcome measure will be raised to 85 percent next academic year.

Outcome 2: 75 percent of anthropology majors will score 70 percent or above on the cultural portion of the departmental exit examination.
Methodology 2: A locally developed 100 question exit examination is given as the final in the capstone course. The multiple choice examination is graded by the faculty member of record and the grades on each section of the test are brought to the first faculty meeting of each academic year.
Results 2: 76 percent of majors scored 70 percent or above on the cultural portion of the exit examination. However, an analysis of the responses indicated that only 50 percent of students satisfactorily answered the questions on kinship associations.
Use of Results 2: Kinship is taught as part of at least 3 courses in cultural anthropology. It was agreed by the faculty that in ANTH 3340 students will be required to conduct a short project that requires the use of kinship associations and relationships. The department will continue to monitor the results on the exit exam.

Outcome 3: 75 percent of anthropology majors will score 70 percent or above on the physical anthropology portion of the departmental exit examination.
Methodology 3: A locally developed 100 question exit examination is given as the final in the capstone course. The multiple choice examination is graded by the faculty member of record and the grades on each section of the test are brought to the first faculty meeting of each academic year.
Results 3: 80 percent of majors scored 70 percent or above on the physical anthropology portion of the exit examination.
Use of Results 3: No further action is necessary at this time.

Outcome 4: 75 percent of anthropology majors will score 70 percent or above on the linguistics portion of the departmental exit examination.
Methodology 4: A locally developed 100 question exit examination is given as the final in the capstone course. The multiple choice examination is graded by the faculty member of record and the grades on each section of the test are brought to the first faculty meeting of each academic year.
Results 4: 70 percent of majors scored 70 percent or above on the linguistics portion of the exit examination.
Use of Results 4: The linguistics faculty have submitted several revised course descriptions as part of the curriculum review that should increase scores on this examination. We will continue to monitor this outcome.

Outcome 5: 75 percent of anthropology majors will score 70 percent or above on the archeology portion of the departmental exit examination.
Methodology 5: A locally developed 100 question exit examination is given as the final in the capstone course. The multiple choice examination is graded by the faculty member of record and the grades on each section of the test are brought to the first faculty meeting of each academic year.
Results 5: 87 percent of majors scored at or above 70 percent on the archeology portion of the exit examination.
Use of Results 5: No further action is required at this time.

Outcome 6: 75 percent of graduating seniors will receive a 25 or above overall on the research paper rubric of the capstone course and 90 percent of seniors will receive nothing less than a grade of satisfactory on each element of the rubric.
Methodology 6: The culminating project in the capstone course is a major research paper in the student’s area of specialization. Papers in each subfield will be read and scored on the research paper rubric by one or two faculty members in the appropriate subfield. Rubric scores, both aggregated and disaggregated by field, are brought to the first faculty meeting of the fall semester for discussion.
Results 6: Overall, the average score on the rubric for all specializations was 20 or above, indicating that the papers were at least acceptable. However, for those papers in which qualitative research was chosen as the methodology, an overall weakness was identified in the analysis of the data and the use of the data in drawing conclusions.
Use of Results 6: The faculty continues to discuss the best approach to remedy this weakness. It is anticipated that a proposal for strengthening the qualitative methodology components of the program will be brought to the November faculty meeting for discussion. The proposal should result in some form of curricular change for the next academic year. We will continue to monitor this outcome carefully.

Outcome 7: At least 60 percent of graduates will either be employed or in graduate school within 1 year after graduation.
Methodology 7: The results of the alumni survey question on employment and the question on enrollment in graduate school will be brought to the first faculty meeting of the year.
Results 7: 47 percent of the alumni surveyed returned questionnaires. Of that group of participants, 70 percent are either pursuing a graduate degree or are employed.
Use of Results 7: No further action is required at this time.
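Section pass rates like those reported in the results column above can be computed directly from the graded examinations before they are brought to the first faculty meeting. The Python sketch below illustrates the arithmetic; the student scores are hypothetical, and the 70 percent cutoff comes from the outcome statements.

# Minimal sketch: from each major's section scores on the exit examination,
# compute the percent of majors scoring 70% or above on each section.
# The student scores below are hypothetical.

scores = {
    # section name -> list of percent scores, one per graduating major
    "core":     [88, 72, 65, 91, 74],
    "cultural": [70, 81, 66, 93, 58],
}

CUTOFF = 70  # percent score treated as satisfactory in the outcome statements

for section, section_scores in scores.items():
    passing = sum(1 for s in section_scores if s >= CUTOFF)
    rate = 100 * passing / len(section_scores)
    print(f"{section}: {rate:.0f}% of majors scored {CUTOFF}% or above")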
College or School or Division: Academic Affairs
Submitted by: Dr. S. Pearson
Academic Year 20XX-20XX
Department: Institutional Research
Date Submitted: September 14, 20XX
UNIT PLANNING FORM FOR NON-INSTRUCTIONAL DEPARTMENTS 20XX-20XX (Form 5)
Compact elements this year are growth, quality, graduation rate improvement, research, partnerships, and public trust and accountability.
University Mission Statement:
The mission of The University of Texas of the Permian Basin is to provide quality education to all qualified students in a supportive educational
environment; to promote excellence in teaching, research, and service; and to serve as a resource for the intellectual, social, economic, and
technological advancement of the diverse constituency in Texas and the region.
Academic Affairs Mission Statement:
Academic Affairs promotes teaching, learning, inquiry and public service in support of the mission of the University of Texas of the Permian Basin
and The University of Texas System. We are committed to
• Providing an innovative and dedicated faculty and staff;
• Supporting excellence in instruction and enhancing the teaching-learning environment;
• Maintaining academic programs responsive to the needs of learners, the State and the region;
• Supporting faculty research and creative activities of regional, state and national distinction;
• Serving as a resource for the intellectual, cultural, technological and economic advancement of Texas’ citizens, particularly those in West Texas; and
• Enhancing the effectiveness and efficiency of instructional and support programs to achieve the standards of performance Texans expect from their higher educational institutions.
College/School Mission Statement: NA
Office of Institutional Research Mission Statement: The Office of Institutional Research provides decision support services for the President, Vice
Presidents, Deans and other administrative offices. The Office conducts research studies, provides external and internal reporting, completes the
University Fact Book, coordinates the planning and assessment system and acts as the regional accreditation liaison.
Planning Outcomes / Progress Measures

Planning Outcome 1: The IR Office will act as staff support for the Enrollment Management Task Force. (Compact Element: Growth)
Progress Measure 1: All requests for data and data analysis made by the Task Force will be completed on time and in a satisfactory manner.

Planning Outcome 2: A study of retention and progression of students will be completed on time and submitted to the Vice President for Student Affairs. (Compact Element: Graduation Rate Improvement)
Progress Measure 2: The study will be completed and at least some of the recommendations will be approved for implementation.

Planning Outcome 3: Two of the three computers in the department that are 4 or more years old will be replaced.
Progress Measure 3: The computers will be successfully replaced.
College or School or Division: Academic Affairs
Submitted by: Dr. S. Pearson
Academic Year 20XX-20XX
Department: Institutional Research
Date Submitted: May 12, 20XX
UNIT PLANNING FORM FOR NON-INSTRUCTIONAL DEPARTMENTS 20XX-20XX (Form 6)
1. Department/Office Outcome: 95 percent of external reports will be submitted prior to or on the due date.
   Assessment Methodology: The office logs every report in on the day it arrives and notes the due date; if the due date changes, the office logs the revised due date. The office also logs out every report with the day the report was submitted to the requesting agency. At year’s end, the office compares the due date and the submitted date and calculates the percentage of reports submitted by the due date. A justification is required for reports that were submitted late in order to improve the system to the greatest extent possible. (A minimal sketch of this calculation follows the form.)

2. Department/Office Outcome: The University Fact Book will be evaluated as “helpful” or “very helpful” by at least 85 percent of the respondents on the annual satisfaction survey administered by the Office.
   Assessment Methodology: The Office administers a satisfaction survey to all administrative and support offices each year in April. The responses are returned to the Office of the Provost and Vice President. The usefulness of the University Fact Book is routinely evaluated by the instrument. Any comments that suggest improvements are reviewed and changes are made as needed.

3. Department/Office Outcome: The President and vice presidents will evaluate the information provided by the office as “important” or “very important” to their work.
   Assessment Methodology: An evaluation of the usefulness of the information provided to the President and vice presidents is conducted by the Provost each year. The overall evaluation of the importance of the information and any suggestions for improvement are evaluated carefully for implementation by the Office and the Provost.
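The on-time percentage in outcome 1 above reduces to comparing each report’s logged due date with its logged submission date. The following minimal sketch shows the calculation; the report names, dates, and field names are invented for illustration and are not drawn from the Office’s actual log.

from datetime import date

# Hypothetical report log: one entry per external report, recording the
# (possibly revised) due date and the date the report was logged out.
report_log = [
    {"report": "Report A", "due": date(2008, 10, 15), "submitted": date(2008, 10, 10)},
    {"report": "Report B", "due": date(2008, 11, 1), "submitted": date(2008, 11, 3)},
    {"report": "Report C", "due": date(2008, 12, 1), "submitted": date(2008, 12, 1)},
]

# A report counts as on time if it was submitted on or before its due date.
on_time = [r for r in report_log if r["submitted"] <= r["due"]]
late = [r for r in report_log if r["submitted"] > r["due"]]

pct = 100 * len(on_time) / len(report_log)
print(f"{pct:.1f}% of reports were submitted by the due date (target: 95%)")

# Late reports require a justification so the system can be improved.
for r in late:
    print(f"Justification needed: {r['report']} (due {r['due']}, submitted {r['submitted']})")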
UNIT PLANNING FORM FOR NON-INSTRUCTIONAL DEPARTMENTS 20XX-20XX (Form 7)
College or School or Division: Academic Affairs
Department: Institutional Research
Submitted by: Dr. S. Pearson
Date Submitted: May 30, 20XX
Academic Year: 20XX-20XX

The planning outcomes shown below must correspond to those planned for academic year 20XX-20XX, including those required by the University Compact with The University of Texas System.
1. Planning Outcome: The IR Office will act as staff support for the Enrollment Management Task Force. (Compact Element: Growth)
   Progress Measure: All requests for data and data analysis made by the Task Force will be completed on time and in a satisfactory manner.
   Results: All requests for data and data analysis were completed on time, and the satisfaction card that accompanies all completed studies indicated that they were “very satisfactory.”
   Use of Results: No further action is required.

2. Planning Outcome: A study of retention and progression of students will be completed on time and submitted to the Vice President for Student Affairs. (Compact Element: Graduation Rate Improvement)
   Progress Measure: The study will be completed and at least some of the recommendations will be approved for implementation.
   Results: The retention study was completed on schedule, and 3 of the 4 recommendations are contained in the approved retention plan.
   Use of Results: No further action is required.

3. Planning Outcome: Two of the three computers in the department that are 4 or more years old will be replaced.
   Progress Measure: The computers will be successfully replaced.
   Results: One of the three computers was replaced.
   Use of Results: An additional budget request for year-end funds will be submitted to fund the replacement of at least one more office computer.
UNIT PLANNING FORM FOR NON-INSTRUCTIONAL DEPARTMENTS 20XX-20XX (Form 8)
College or School or Division: Academic Affairs
Department: Institutional Research
Submitted by: Dr. S. Pearson
Date Submitted: May 30, 20XX
Academic Year: 20XX-20XX

The outcomes shown below must correspond to those planned for academic year 20XX-20XX.
1. Department/Office Outcome: 95 percent of external reports will be submitted prior to or on the due date.
   Assessment Methodology: The office logs every report in on the day it arrives and notes the due date; if the due date changes, the office logs the revised due date. The office also logs out every report with the day the report was submitted to the requesting agency. At year’s end, the office compares the due date and the submitted date and calculates the percentage of reports submitted by the due date. A justification is required for reports that were submitted late in order to improve the system to the greatest extent possible.
   Results: 93 percent of external reports were submitted prior to or on the due date. Of the 4 reports that were not on time, 3 were the result of extensive reprogramming required by changes in Federal student aid programs.
   Use of Results: The Director has contacted the Office of Student Financial Assistance to explain the importance of reporting requirements and to make sure that those requirements are taken into account in revisions to financial aid programs.

2. Department/Office Outcome: The University Fact Book will be evaluated as “helpful” or “very helpful” by at least 85 percent of the respondents on the annual satisfaction survey administered by the Office.
   Assessment Methodology: The Office administers a satisfaction survey to all administrative and support offices each year in April. The responses are returned to the Office of the Provost and Vice President. The usefulness of the University Fact Book is routinely evaluated by the instrument. Any comments that suggest improvements are reviewed and changes are made as needed.
   Results: The University Fact Book was evaluated as “helpful” or “very helpful” by 92 percent of the respondents to the survey.
   Use of Results: The target in the outcome statement will be raised to 90 percent in next year’s institutional effectiveness planning.

3. Department/Office Outcome: The President and vice presidents will evaluate the information provided by the office as “important” or “very important” to their work.
   Assessment Methodology: An evaluation of the usefulness of the information provided to the President and vice presidents is conducted by the Provost each year. The overall evaluation of the importance of the information and any suggestions for improvement are evaluated carefully for implementation by the Office and the Provost.
   Results: The President and all of the Vice Presidents rated the information provided by the office as “important” or “very important” to their work. A suggestion was made to develop a list of comparable institutions for standard comparisons of university data.
   Use of Results: The Office will select a list of comparable institutions, justify the selection of those institutions on the list, and have the list approved for comparative use in institutional studies within the next academic year.
Appendix E
Examples of Correct and Incorrect Student Learning Outcomes
Biology
Correct: 85% of senior Biology majors will demonstrate their ability to engage in scientific inquiry
by attaining 50 or more points on the research paper rubric.
Incorrect: Senior Biology majors will engage in scientific inquiry by demonstrating competency in
analytical, information and communication skills.
History
Correct: 85% of history seniors will be able to develop and conduct an analytically sound study
within a historiographical context.
Incorrect: Students will be able to conduct historical research.
Kinesiology
Correct: Using case studies appropriate for their concentration, 90% of seniors will be able to apply and integrate kinesiological principles effectively and communicate a plan to enhance the quality of life and encourage healthy lifestyles within the target group.
Incorrect: Seniors will understand the relationship between movement and quality of life.
English
Correct: All seniors will be able to analyze a text using at least two different approaches from literary, rhetorical and/or linguistic theories.
Incorrect: Students will be able to evaluate sources relevant to the English language.
Spanish
Correct: 95% of seniors will score at the “acceptable” level or above on the evaluation of the analysis of literary texts in the research paper in SPAN 4391.
Incorrect: Students will demonstrate familiarity with major literary trends in Spain and Latin America.
Mathematics
Correct: 95% of graduating seniors will be able to model and analyze a real world problem in
Math 4362 by reformulating the problem in mathematical context.
Incorrect: Graduating seniors will have both breadth and depth of knowledge in mathematics.
Computer Science
Correct: 90% of students will be able to analyze a problem and identify and define a set of
appropriate computing requirements for its solution.
Incorrect: Graduates will be able to formulate, analyze, and implement appropriate solutions to
computing problems.
Chemistry
Correct: 90% of seniors will demonstrate the ability to clearly communicate the results of scientific
investigation in writing by obtaining at least an “adequate” on the research paper rubric.
Incorrect: 85% of students will be able to identify and solve chemical problems.
Environmental Science
Correct: 85% of seniors will be able to analyze scientific information and develop an appropriate
management strategy to manage a particular environmental or resource issue.
Incorrect: Students will have the ability to apply scientific principles and technology to resource
and environmental problems.
Geology
Correct: 95% of graduating seniors will demonstrate the ability to write technical reports that
receive at least an “acceptable” in the areas of research, organization, illustration, and writing
skills.
Incorrect: Graduating seniors will understand basic concepts in geological knowledge.
Psychology
Correct: 90% of seniors will apply basic research methods, including research design, data analysis and interpretation, to a research question in PSYC 4392 and receive at least an “acceptable” in all 3 areas.
Incorrect: Graduates will be able to weigh evidence, act ethically and reflect other values in
psychology.
Criminology
Correct: 90% of senior students will demonstrate the ability to read and understand selections from the literature by constructing an acceptable annotated bibliography on an appropriate topic in the discipline.
Incorrect: Students will have intellectual skills adequate to function as a field practitioner.
Leadership
Correct: 95% of students will attain an “adequate” or better on an in-box exercise concerned with
ethical issues related to a leadership dilemma in LEAD 4325.
Incorrect: Students will develop critical thinking skills.
Political Science
Correct: 90% of seniors will receive at least 50 points on the research paper rubric in the areas of 1) ability to conceptualize a research question, 2) formulation of a testable research hypothesis, and 3) application of statistical techniques and drawing appropriate conclusions from the analysis.
Incorrect: Students will be able to critically examine major governmental institutions.
Sociology
Correct: 95% of seniors will demonstrate the ability to understand and apply the fundamental principles of research design and elementary data analysis by receiving no less than an “acceptable” on those two areas of the research paper rubric.
Incorrect: Students will apply a sociological perspective to social problems.
Social Work
Correct: 100% of social work seniors will demonstrate the knowledge and skills necessary to successfully engage in entry-level practice in social work by earning a satisfactory or better on the field practicum evaluation completed by the field supervisors.
Incorrect: Students will understand human behavior.
Appendix F
Action Verbs for Writing Outcome Statements
Knowledge Acquisition and Application
Add, Adjust, Align, Alter, Apply, Arrange, Assemble, Assess, Calculate, Calibrate, Categorize, Change, Chart, Choose, Classify, Clean, Combine, Compare, Compile, Complete, Compose, Compute, Construct, Correct, Count, Create, Criticize, Defend, Define, Demonstrate, Describe, Design, Detect, Differentiate, Discover, Discuss, Dissect, Distinguish, Divide, Dramatize, Draw, Duplicate, Employ, Estimate, Examine, Explain, Express, Formulate, Generate, Graph, Group, Identify, Illustrate, Indicate, Inform, Install, Interpolate, Interpret, Invent, Investigate, Isolate, Judge, Label, List, Locate, Manipulate, Match, Measure, Memorize, Modify, Name, Operate, Order, Organize, Originate, Outline, Plan, Point, Predict, Prepare, Produce, Propose, Quote, Rank, Rate, React, Read, Rearrange, Recall, Recite, Recognize, Record, Relate, Reorganize, Repair, Repeat, Replace, Report, Reproduce, Research, Restate, Review, Rewrite, Select, Separate, Set, Show, Solve, Sort, Specify, State, Stimulate, Subtract, Summarize, Survey, Synthesize, Test, Transfer, Translate, Use, Vary
Higher Order Thinking Skills
Adapt, Analyze, Classify, Combine, Contrast, Create, Devise, Diagram, Evaluate, Explain, Infer, Integrate, Justify, Modify, Prescribe, Produce, Reconstruct, Reflect, Review, Revise, Specify, Summarize, Transform
Psychomotor Skills
Activate, Adapt, Apply, Arrange, Check, Choose, Conduct, Connect, Demonstrate, Describe, Dismantle, Display, Follow, Identify, Make, Manipulate, Perform, Prepare, Relate, Remove, Respond, Revise, Show, Sketch, Troubleshoot, Tune
Attitude, Values, and Dispositions
Accept, Acclaim, Accommodate, Act, Adhere, Adopt, Advocate, Alter, Answer, Applaud, Approve, Arrange, Ask, Assist, Associate, Assume, Attend, Balance, Believe, Challenge, Change, Choose, Classify, Combine, Complete, Comply, Conform, Cooperate, Debate, Defend, Deny, Describe, Develop, Differentiate, Display, Endorse, Enjoy, Establish, Express, Follow, Form, Formulate, Give, Greet, Have, Help, Hold, Identify, Influence, Initiate, Integrate, Interpret, Invite, Join, Judge, Justify, Listen, Obey, Organize, Participate, Perform, Persuade, Practice, Present, Propose, Protest, Qualify, Question, Reflect, Report, Resolve, Respect, Revise, Select, Serve, Share, Show, Solve, Subscribe, Support, Tell, Use, Verify, Volunteer, Weigh, Work
(Adapted from Morningside College, Assessment Handbook, as quoted in The University of Texas at Arlington, Unit Effectiveness Process Assessment Handbook, p. 48)
Appendix G
Core Curriculum: Assumptions and Defining Characteristics (Rev. 1999)
Senate Bill (SB) 148, enacted in 1997 by the 75th Texas Legislature, requires the Texas Higher
Education Coordinating Board to adopt rules that include "a statement of the content, component
areas, and objectives of the core curriculum," which each institution is to fulfill by its own selection
of specific courses. Those rules are included in Chapter 5, Subchapter S, Sections 5.390 through
5.404. The Coordinating Board has adopted this document in order to provide additional guidance
to institutions as they refine their core curricula to comply with SB 148 and the Coordinating
Board rules that implement the statute. The Assumptions, Defining Characteristics of Intellectual
Competencies, Perspectives, and Exemplary Educational Objectives (listed by component area)
contained in this document are derived from the Report of the Advisory Committee on Core
Curriculum (1997-98). That Advisory Committee based its work on the 1989 Report of the
Subcommittee on Core Curriculum, which the Board received and endorsed in accordance with
House Bill 2187 of the 70th Legislature. That legislation required all institutions to adopt, evaluate,
and report on an undergraduate core curriculum. Each institution should consider these guiding
principles carefully as it proceeds with the revision of its core curriculum.
ASSUMPTIONS
In establishing its guidelines for core curricula, the Board has made the following assumptions:
1. Every institution of higher education is required by law to adopt a core curriculum of no
less than 42 semester credit hours which is consistent with the Texas Common Course
Numbering System and the statement, recommendations, and rules issued by The Texas
Higher Education Coordinating Board.
[The Core Curriculum Advisory Committee (1997-1998) has defined "consistent with the
Texas Common Course Numbering System" as meeting one of the following criteria: a)
the course already has a common course number, b) application for a common course
number has been made, or c) the course is not a common course but at least one
common course number that may be accepted in lieu of the course is designated by the
institution.]
2. If a student successfully completes the 42-hour core at an institution of higher education,
that block of courses must be substituted for the receiving institution's core curriculum. A
student shall receive academic credit for each of the courses transferred and may not be
required to take additional core curriculum courses at the receiving institution unless the
Board has approved a larger core curriculum at the receiving institution.
3. Students who transfer without completing the core curriculum shall receive academic
credit in the core curriculum of the receiving institution for each of the courses that the
student has successfully completed in the core curriculum of the sending institution, with
certain exceptions noted in the rules [Chapter 5, Subchapter S, Section 5.403 (h)].
4. The basic intellectual competencies discussed in this document -- reading, writing,
speaking, listening, critical thinking, and computer literacy -- should inform the
components of any core curriculum. Moreover, a core curriculum should contain courses
that provide multiple perspectives about the individual and the world in which he or she
lives; that stimulate a capacity to discuss and reflect upon individual, political, and social
aspects of life so students understand ways in which to exercise responsible citizenship;
and that enable students to integrate knowledge and understand the interrelationships of
the disciplines.
5. There should be no attempt by the state to prescribe a specific set of core courses or a
single core curriculum that would be uniform across all Texas colleges and universities.
6. A core curriculum should be described and assessed by faculty and institutions in terms
of basic intellectual competencies and perspectives, and of specified student outcomes,
rather than simply in terms of specific courses and course content.
DEFINING CHARACTERISTICS OF BASIC INTELLECTUAL COMPETENCIES IN THE CORE
CURRICULUM
The core curriculum guidelines described here are predicated on the judgment that a series of
basic intellectual competencies -- reading, writing, speaking, listening, critical thinking, and
computer literacy -- are essential to the learning process in any discipline and thus should inform
any core curriculum. Although students can be expected to come to college with some
experience in exercising these competencies, they often need further instruction and practice to
meet college standards and, later, to succeed in both their major field of academic study and their
chosen career or profession.
READING: Reading at the college level means the ability to analyze and interpret a variety of
printed materials -- books, articles, and documents. A core curriculum should offer students the
opportunity to master both general methods of analyzing printed materials and specific methods
for analyzing the subject matter of individual disciplines.
WRITING: Competency in writing is the ability to produce clear, correct, and coherent prose
adapted to purpose, occasion, and audience. Although correct grammar, spelling, and
punctuation are each a sine qua non in any composition, they do not automatically ensure that
the composition itself makes sense or that the writer has much of anything to say. Students need
to be familiar with the writing process, including how to discover a topic, how to develop and
organize it, and how to phrase it effectively for their audience. These abilities can be acquired only
through practice and reflection.
SPEAKING: Competence in speaking is the ability to communicate orally in clear, coherent, and
persuasive language appropriate to purpose, occasion, and audience. Developing this
competency includes acquiring poise and developing control of the language through experience
in making presentations to small groups, to large groups, and through the media.
LISTENING: Listening at the college level means the ability to analyze and interpret various
forms of spoken communication.
CRITICAL THINKING: Critical thinking embraces methods for applying both qualitative and
quantitative skills analytically and creatively to subject matter in order to evaluate arguments and
to construct alternative strategies. Problem solving is one of the applications of critical thinking,
used to address an identified task.
COMPUTER LITERACY: Computer literacy at the college level means the ability to use
computer-based technology in communicating, solving problems, and acquiring information.
Core-educated students should have an understanding of the limits, problems, and possibilities
associated with the use of technology, and should have the tools necessary to evaluate and learn
new technologies as they become available.
Some of these intellectual competencies have traditionally been tied to specific courses required
of all students during their first two years of college. For example, courses in college composition,
together with mathematics, have long been the cornerstone experience of the freshman year. But
a single course or two-course sequence in college composition can do little more than introduce
students to the principles and practices of good writing. Within the boundary of three to six
semester credit hours of course work, neither of these sequences can guarantee proficiency.
Moreover, in most curricula there are no required courses specifically dedicated to reading or to
critical thinking. Thus, if a core curriculum is to prepare students effectively, it is imperative that,
insofar as possible, these intellectual competencies be included among the objectives of many
individual core courses and reflected in their course content.
PERSPECTIVES IN THE CORE CURRICULUM
Another imperative of a core curriculum is that it contains courses that help students attain the
following:
1. Establish broad and multiple perspectives on the individual in relationship to the larger
society and world in which he or she lives, and to understand the responsibilities of living
in a culturally and ethnically diversified world;
2. Stimulate a capacity to discuss and reflect upon individual, political, economic, and social
aspects of life in order to understand ways in which to be a responsible member of
society;
3. Recognize the importance of maintaining health and wellness;
4. Develop a capacity to use knowledge of how technology and science affect their lives;
5. Develop personal values for ethical behavior;
6. Develop the ability to make aesthetic judgments;
7. Use logical reasoning in problem solving; and
8. Integrate knowledge and understand the interrelationships of the scholarly disciplines.
INSTRUCTION AND CONTENT IN THE CORE CURRICULUM
Education, as distinct from training, demands a knowledge of various contrasting views of human
experience in the world. Both the humanities and the visual and performing arts deal with the
individual's reaction to the human situation in analytical and creative ways. The social and
behavioral sciences deal with the principles and norms that govern human interaction in society
and in the production of goods and services. The natural sciences investigate the phenomena of
the physical world. Mathematics examines relations among abstract quantities and is the
language of the sciences. Composition and communication deal with oral and written language.
Each of these disciplines, using its own methodology, offers a different perspective on human
experience. Taken together, study in these disciplines provides a breadth of vision against which
students can establish and reflect on their own goals and values.
The outcomes which are specified for the disciplinary areas are thus intended primarily to provide
students with a perspective on their experience through an acquaintance with the subject matter
and methodology of each discipline. They provide students with the opportunity to understand
how these disciplines present varying views of the individual, society, and the world, and of
appreciating the methods by which scholars in a given discipline organize and evaluate data. The
perspectives acquired in these studies describe the potential, as well as the limitations, of each
discipline in understanding the human experience.
The objective of disciplinary studies within a core curriculum is to foster multiple perspectives as
well as to inform and deliver content. Disciplinary courses within a core curriculum should
promote outcomes focused on the intellectual core competencies, as well as outcomes related to
establishing perspectives, and the basic concepts in the discipline -- methods of analysis and
interpretation specific to the discipline.
Institutions are urged to consider development and utilization of appropriate interdisciplinary
courses as a means of helping students develop multiple perspectives on the individual in
relationship to other people and societies. Comparison and contrast of disciplinary perspectives
on an issue within the context of a single course can be a particularly effective instructional
device.
CORE COMPONENTS AND RELATED EXEMPLARY EDUCATIONAL OBJECTIVES
In designing and implementing a core curriculum of at least 42 semester credit hours, each Texas
college and university should select and/or develop courses which satisfy exemplary educational
objectives specified for each component area. The following exemplary educational objectives
should be used as basic guidelines for selected component areas. Exemplary educational
objectives become the basis for faculty and institutional assessment of core components.
Since it is difficult to define exemplary educational objectives for a core curriculum outside of
some framework of the general areas of content, the objectives and outcomes described below
are suggested as those that meet the intent of Senate Bill 148. The outcomes for student learning
provide both guidelines for instruction and a profile of students as they complete each component
of a core curriculum. Although these component areas could easily be "translated" directly into
disciplinary or departmental terms, it is not necessary to restrict the areas to one or a few
departments. These objectives could be met in a number of differing course configurations,
including multi-disciplinary courses.
Colleges and universities across the state have specific missions and different roles and scope.
The way in which colleges and universities achieve these outcomes will thus vary. These outlines
are not intended in any way to impose restrictions on the creativity of the classroom instructor or
to dictate pedagogical methods. The emergent profile of the students, however, will presumably
have common characteristics insofar as they achieve the specified outcomes. A core curriculum
experience will prepare them to learn effectively through the rest of their college years so that
they carry these aptitudes for learning into their life careers.
I. COMMUNICATION (composition, speech, modern language)
The objective of a communication component of a core curriculum is to enable the student to
communicate effectively in clear and correct prose in a style appropriate to the subject, occasion,
and audience.
Exemplary Educational Objectives
1. To understand and demonstrate writing and speaking processes through invention,
organization, drafting, revision, editing, and presentation.
2. To understand the importance of specifying audience and purpose and to select
appropriate communication choices.
3. To understand and appropriately apply modes of expression, i.e., descriptive, expositive,
narrative, scientific, and self-expressive, in written, visual, and oral communication.
4. To participate effectively in groups with emphasis on listening, critical and reflective
thinking, and responding.
5. To understand and apply basic principles of critical thinking, problem solving, and
technical proficiency in the development of exposition and argument.
6. To develop the ability to research and write a documented paper and/or to give an oral
presentation.
II. MATHEMATICS
The objective of the mathematics component of the core curriculum is to develop a quantitatively
literate college graduate. Every college graduate should be able to apply basic mathematical
tools in the solution of real-world problems.
Exemplary Educational Objectives
1. To apply arithmetic, algebraic, geometric, higher-order thinking, and statistical methods to
modeling and solving real-world situations.
2. To represent and evaluate basic mathematical information verbally, numerically,
graphically, and symbolically.
3. To expand mathematical reasoning skills and formal logic to develop convincing
mathematical arguments.
4. To use appropriate technology to enhance mathematical thinking and understanding and
to solve mathematical problems and judge the reasonableness of the results.
5. To interpret mathematical models such as formulas, graphs, tables and schematics, and
draw inferences from them.
6. To recognize the limitations of mathematical and statistical models.
7. To develop the view that mathematics is an evolving discipline, interrelated with human
culture, and understand its connections to other disciplines.
III. NATURAL SCIENCES
The objective of the study of a natural sciences component of a core curriculum is to enable the
student to understand, construct, and evaluate relationships in the natural sciences, and to
enable the student to understand the bases for building and testing theories.
Exemplary Educational Objectives
1. To understand and apply method and appropriate technology to the study of natural
sciences.
2. To recognize scientific and quantitative methods and the differences between these
approaches and other methods of inquiry and to communicate findings, analyses, and
interpretation both orally and in writing.
3. To identify and recognize the differences among competing scientific theories.
4. To demonstrate knowledge of the major issues and problems facing modern science,
including issues that touch upon ethics, values, and public policies.
5. To demonstrate knowledge of the interdependence of science and technology and their
influence on, and contribution to, modern culture.
IV. HUMANITIES AND VISUAL AND PERFORMING ARTS
The objective of the humanities and visual and performing arts in a core curriculum is to expand
students' knowledge of the human condition and human cultures, especially in relation to
behaviors, ideas, and values expressed in works of human imagination and thought. Through
study in disciplines such as literature, philosophy, and the visual and performing arts, students
will engage in critical analysis, form aesthetic judgments, and develop an appreciation of the arts
and humanities as fundamental to the health and survival of any society. Students should have
experiences in both the arts and humanities.
Exemplary Educational Objectives
1. To demonstrate awareness of the scope and variety of works in the arts and humanities.
2. To understand those works as expressions of individual and human values within an
historical and social context.
3. To respond critically to works in the arts and humanities.
4. To engage in the creative process or interpretive performance and comprehend the
physical and intellectual demands required of the author or visual or performing artist.
5. To articulate an informed personal reaction to works in the arts and humanities.
6. To develop an appreciation for the aesthetic principles that guide or govern the
humanities and arts.
7. To demonstrate knowledge of the influence of literature, philosophy, and/or the arts on
intercultural experiences.
V. SOCIAL AND BEHAVIORAL SCIENCES
The objective of a social and behavioral science component of a core curriculum is to increase
students' knowledge of how social and behavioral scientists discover, describe, and explain the
behaviors and interactions among individuals, groups, institutions, events, and ideas. Such
knowledge will better equip students to understand themselves and the roles they play in
addressing the issues facing humanity.
Exemplary Educational Objectives
1. To employ the appropriate methods, technologies, and data that social and behavioral
scientists use to investigate the human condition.
2. To examine social institutions and processes across a range of historical periods, social
structures, and cultures.
3. To use and critique alternative explanatory systems or theories.
4. To develop and communicate alternative explanations or solutions for contemporary
social issues.
5. To analyze the effects of historical, social, political, economic, cultural, and global forces
on the area under study.
6. To comprehend the origins and evolution of U.S. and Texas political systems, with a
focus on the growth of political institutions, the constitutions of the U.S. and Texas,
Federalism, civil liberties, and civil and human rights.
7. To understand the evolution and current role of the U.S. in the world.
8. To differentiate and analyze historical evidence (documentary and statistical) and
differing points of view.
9. To recognize and apply reasonable criteria for the acceptability of historical evidence and
social research.
10. To analyze, critically assess, and develop creative solutions to public policy problems.
11. To recognize and assume one's responsibility as a citizen in a democratic society by
learning to think for oneself, by engaging in public discourse, and by obtaining
information through the news media and other appropriate information sources about
politics and public policy.
12. To identify and understand differences and commonalities within diverse cultures.
VI. INSTITUTIONALLY DESIGNATED OPTION
An institution may wish to include in its core curriculum courses that address exemplary
educational objectives not covered in the preceding broad discipline categories. Such courses
may include computer literacy, kinesiology, health/wellness, interdisciplinary or linked courses, or
other courses that address a specific institutional role and mission.
From The Texas Higher Education Coordinating Board Website
http://www.thecb.state.tx.us/AAR/UndergraduateEd/fos_assumpdef.cfm
Appendix H
General Education Results Form
Course Prefix and Number:
EEO:
Data Collection Semester/Year:
Information Provided By:

1. Student Outcome:
   Methodology:
   Results:
   Use of Results:
Appendix I
General Education Results Form Example
Course Prefix and Number: Biology 1307
Data Collection Semester/Year: Fall 2007
Information Provided By: Donald M. Allen, Ph.D.

EEO 1 for Biology: To understand and apply method and appropriate technology to the study of natural sciences

Student Outcome a: 60% of students will be able to a) correctly develop a testable hypothesis, b) identify appropriate methods for obtaining data and c) analyze/interpret data to support or reject a hypothesis.
Methodology a: Students will show mastery of these elements as in b (below): 1- ability to correctly phrase a testable hypothesis; 2- ability to identify appropriate methodology; 3- ability to foresee, collect and interpret data to support or reject a hypothesis and/or correctly identify an appropriate conclusion.
Results a: On a midterm exam, 54 students answered 5 MC questions at an average correct response rate of 45%.
Use of Results a: Improve delivery of instruction and homework activities to provide the experience necessary to attain the EEO at the desired level.

Student Outcome b: 60% of students will correctly answer test questions which require an understanding of hypothesis formation, data collection and quantitative methodology within sub-disciplines of Biology.
Methodology b: Students will answer several questions (multiple choice or essay) on the mid-term exam and/or final exam. Students must match or exceed a 60% correct response rate in order to demonstrate mastery of this EEO.
Results b: 56 students took the final exam. Of 9 MC questions, the average correct response rate was 56.7%. The average correct response for 3 essay questions was 62.6%.
Use of Results b: Add more time for instruction and recitation in those subject areas in which the number of students deemed proficient falls below 60% correct.

Summary for EEO 1: The EEO was not attained at the desired response rate (60% correct response rate). Specific areas of concern are indicated.

EEO 2 for Biology: To recognize scientific and quantitative methods and the differences between the approaches and other methods of inquiry and to communicate findings, analyses, and interpretation both orally and in writing

Student Outcome a: 60% of students will be able to distinguish between different experimental designs and the appropriate methods for obtaining data under these designs; and will be able to interpret data, display data in graphic or tabular form and communicate their findings in written or oral form.
Methodology a: Student responses will be evaluated for understanding of the EEO using the rubric elements, as in b (below): 1- ability to correctly identify and/or use a quantitative analysis; 2- ability to analyze, interpret and present data in written and graphical formats; and 3- ability to apply the data to support a hypothesis or distinguish between hypotheses.
Results a: 54 students answered 9 MC questions on a midterm exam for a 64% correct response rate.
Use of Results a: Improve delivery of instruction and the homework activities associated with the quantitative methods and other methods of inquiry to provide the experience necessary to attain the EEO at the desired level.

Student Outcome b: 60% of students will understand different quantitative methods and experimental approaches.
Methodology b: Students will answer questions (multiple choice or essay) on a mid-term and/or final exam. Answering these questions correctly would demonstrate attainment of the EEO. Students must show mastery of these elements (1-3, in a above) for 3 of 5 questions to demonstrate proficiency (60%).
Results b: On the final exam, 56 students answered 6 MC questions with a 54.2% correct response rate. The students also answered 3 essay questions with a 62.6% correct response rate.
Use of Results b: Add more time for instruction and recitation in those subject areas in which the student responses fall below a 60% correct response rate.

Summary for EEO 2: Results are mixed. The EEO was attained less than 60% of the time by test-taking students. Areas of concern are being identified.

EEO 3 for Biology: To demonstrate knowledge of the major issues and problems facing modern science, including issues that touch upon ethics, values, and public policies

Student Outcome a: 60% of students will be able to demonstrate informed opinion on one or more major issues in the Biological Sciences which interface with ethics, values and public policy.
Methodology a: Students will be evaluated for understanding of the EEO using the rubric elements, as in b (below): 1- ability to correctly identify and interpret any of several issues facing mankind; 2- ability to understand and/or communicate ideas for solutions to problems based on scientific knowledge; 3- ability to recognize both ethical and political dimensions of any actions that might be taken.
Results a: 54 students answered 8 MC and 2 essay questions for a mean correct response rate of 72%.
Use of Results a: No actions necessary at this time.

Student Outcome b: 60% of students will be asked to state their opinions on various issues. The scientific bases of their responses should permit an assessment of whether they could be considered adequately informed with respect to science (biological) background.
Methodology b: Students will answer questions (multiple choice or essay) on a mid-term and/or final exam which require attainment of the EEO as in a above. Students must show mastery of these elements for 3 of 5 questions to demonstrate proficiency (60%).
Results b: 55 students took the final exam and answered 4 MC questions with a 60.2% correct response rate. They also answered 2 essay questions at an 83% correct response rate. Overall, this EEO is being attained.

EEO 4 for Biology: To demonstrate knowledge of the interdependence of science and technology and their influence on, and contribution to, modern culture

Student Outcome a: 60% of students will demonstrate knowledge of the interdependence of modern biological science and technology, including the computational sciences, and will understand how these relationships affect culture.
Methodology a: Student responses will be evaluated for understanding of the EEO using the rubric elements as in b (below): 1- ability to understand the interdependence of science and technology; 2- ability to communicate this understanding; 3- ability to recognize the role of science and technology as a determinant of modern human culture and society.
Results a: 54 students took the midterm exam and answered 3 MC questions at a 51% correct response rate and answered 2 essay questions at a 63% correct response rate.
Use of Results a: No actions necessary at this time, except that delivery of instruction and the homework activities should continue to attain the EEO at the desired level.

Student Outcome b: 60% of students will be asked to state their knowledge on various issues related to the interdependence of science and technology AND show understanding of how science has affected and will affect human culture (society).
Methodology b: Students will answer questions (multiple choice or essay) on a mid-term and/or final exam which require attainment of the EEO, using the rubric of a (above). Students must show knowledge of these elements for 3 of 5 questions to demonstrate proficiency (60%).
Results b: On the final exam, 55 students answered 3 essay questions at an 80.3% correct response rate.
Use of Results b: Instruction and recitation should be increased in those subject areas in which the student responses fall below a 60% correct response rate.

Summary for EEO 4: Taken together, the data indicated that the EEO is being attained by a sufficient number of students. However, the low response rate on the 3 MC questions must be addressed.

*NOTE: Each question used, and the number of students answering it correctly and incorrectly, will be archived at the department level (Biology Department). A new RRF will be submitted, along with new data and use of data, at the conclusion of each semester during which
the course is offered.
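Because attainment of each EEO above is judged against a 60% correct response rate, the archived question-level counts described in the note can be summarized mechanically. The following minimal sketch shows that summary; the question labels and counts are invented for illustration and are not drawn from the Biology Department's actual archive.

# Hypothetical archive for one EEO: the number of students answering each
# question correctly and incorrectly, as archived at the department level.
questions = [
    {"id": "MC1", "correct": 29, "incorrect": 25},
    {"id": "MC2", "correct": 22, "incorrect": 32},
    {"id": "Essay1", "correct": 35, "incorrect": 19},
]

THRESHOLD = 60.0  # desired correct response rate, in percent

rates = []
for q in questions:
    total = q["correct"] + q["incorrect"]
    rate = 100 * q["correct"] / total
    rates.append(rate)
    note = "" if rate >= THRESHOLD else " (below threshold)"
    print(f"{q['id']}: {rate:.1f}% correct{note}")

average = sum(rates) / len(rates)
status = "attained" if average >= THRESHOLD else "not attained"
print(f"Average correct response rate: {average:.1f}%; EEO {status}")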
Appendix J
Examples of Correct and Incorrect Administrative/Support Office Outcome Statements
University Police Department
Correct: Eighty-five percent of the students will indicate that they are “satisfied” or “very satisfied” with
the campus escort service on the Graduating Senior Survey.
Incorrect: Students will approve of the campus escort service.
Purchasing
Correct: The number of errors in purchase orders submitted to Purchasing will decrease by 10% within one month after the Purchasing Workshop.
Incorrect: The accuracy of information on Purchase Orders will improve.
Physical Plant
Correct: The number of gallons of water used per square foot of landscaping will decrease by 5%
once the new pump controller is installed.
Incorrect: The new pump controller will decrease water usage.
Human Resources
Correct: The number of data input errors will decrease by at least 5% per month in the first two months of the fall 2008 term with the use of the web-based system for data entry.
Incorrect: Data errors will fall when the Office begins using the new web-based system for data entry.
Student Housing
Correct: Overall satisfaction with life in student housing will not fall below 3.5 on the annual housing survey.
Incorrect: Students will be satisfied with student housing.
Compliance
Correct: At least ninety-five percent of graduating seniors answering the Graduation Survey will be “satisfied” or “very satisfied” with the services of the Cashier’s Office.
Incorrect: Students will approve of the Cashier’s Office.
Accounting
Correct: At least seventy-five percent of graduating seniors responding to the Senior Survey will be
“satisfied” or “very satisfied” with the services of the Cashier’s Office.
Incorrect: Students will approve of the Cashier’s Office.
Central Services
Correct: With the exception of computer equipment, ninety-five percent of packages received in
Central Receiving will be either delivered or picked up within 24 hours of the time and date logged in
as received.
Incorrect: Personnel in Central Services will deliver packages as quickly as possible after delivery.
Admissions
Correct: Ninety percent of inquiries will receive a reply within 24 hours of the receipt of the inquiry in
Admissions.
Incorrect: The Office will reply to inquiries quickly.
Career Center
Correct: All the final resumes completed by students in the Sell Yourself Workshop will show at least 3 points of improvement over the initial resume constructed by the students in the Workshop.
Incorrect: 90% of students will be better resume writers after the Sell Yourself Workshop.
Financial Aid
Correct: Eighty-five percent of students will wait less than 30 minutes between the time they log in and the time a counselor calls them into their office.
Incorrect: Students will not experience long wait times in the office.
PASS Office
Correct: Students who attend 3 or more tutoring sessions will, on average, have grades at least 0.05 points higher in the tutored subject than a comparison group who did not attend tutoring.
Incorrect: Students who attend tutoring will do better academically than those who do not.
Student Activities
Correct: At least fifty percent of students will vote in the Student Senate Election.
Incorrect: More students will vote in the Student Senate Election.
Student Recreation
Correct: User complaints about the cleanliness of the weight room will fall by at least 20%.
Incorrect: Complaints about the facilities will decrease.
Counseling Center
Correct: From May 1, 2007 to August 30, 2008, the number of repeat drug and alcohol offenders who were referred to the Counseling Center on their first offense will fall by 5%.
Incorrect: Repeat drug and alcohol offenders will decrease this year.
Dean of Students
Correct: The number of students referred for student discipline will fall by at least 25 between June 1, 2007 and June 1, 2008.
Incorrect: The number of drug and alcohol cases will fall this academic year.
Athletics
Correct: The 4-year graduation rate for student athletes will exceed the 4-year graduation rate for the student population as a whole.
Incorrect: Graduation rates for athletes will be higher than the general student population.
Continuing Education
Correct: 200 or more children will enroll in Camp Falcon during summer 2008.
Incorrect: More children will sign up for Camp Falcon.
Registrar
Correct: Transcript requests will be completed within 24 hours of their receipt.
Incorrect: Transcripts will go out in a timely fashion.
Library
Correct: There will be an increase of 20 percent in the number of students trained to use the Library
in 2007-2008 over the number of students trained in 2006-2007.
Incorrect: Students will be trained in information literacy.
Institutional Research
Correct: 95% of external reports and surveys will be completed and certified (as required) by the agency-established deadline.
Incorrect: External reports will be submitted promptly.
Athletics
Correct: Each year students originally recruited as athletes will have a higher four-year graduation
rate than the general university population.
Incorrect: Athletes will graduate at a higher rate than the general university population.
Appendix K
Institutional Effectiveness Forms Rubric
Organization

Excellent:
• Each goal has at least one objective/outcome statement.
• Each objective has one or more outcome statements.
• Each outcome statement has one or more methods.
• Results and use to improve the program/service are clear.
• Improvements show an understanding of the significance of results.

Adequate:
• 80% to 90% of goals have at least one objective/outcome.
• Each objective has one or more outcome statements.
• Each outcome statement has one or more methods.
• Results and use of results to improve the program/service are clear in most cases.
• Improvements appear useful to the program.

Needs Work:
• Major goals are not addressed by objectives/outcomes.
• Most objectives have an appropriate outcome.
• It may be unclear which outcome is associated with which method.
• It is unclear which results come from which method.
• How results led to use is unclear.
• Improvements are rare, routine and inconsequential.

Unacceptable:
• Major goals are unaddressed by an objective/outcome.
• It is not clear which objectives are associated with which outcomes.
• It is unclear which outcome statement is associated with which method.
• It is unclear how results were obtained.
• Use is unrelated to any of the statements of results.

Clarity

Excellent:
• All statements are logical and easily understood.
• Goals, objectives, outcomes, methods and use of results are stated at an appropriate level.
• Outcome targets are stated.
• It is clear whether or not the outcome targets were attained.

Adequate:
• Statements are basically logical and understandable.
• Statements have enough detail to understand what is happening and why.
• Outcome targets are stated in most cases.
• It is usually clear whether or not the outcomes were attained.

Needs Work:
• At least some statements are confusing or poorly stated.
• Statements do not contain enough information about what is being done or why.
• Some outcomes have targets.
• In some cases it is unclear whether or not the outcomes have been attained.

Unacceptable:
• Statements are difficult to understand.
• Statements have insufficient information to clarify what is being done or why.
• Outcomes lack targets.
• It is unclear whether or not the outcomes are attained.

Goal Statements

Excellent:
• Goals are broad statements of what students should know, do or believe, or of functions and how they operate.
• Goals are appropriate for a college level program or university office.

Adequate:
• Some goals are too specific or too general.
• Goals are appropriate for a college level program or university office.

Needs Work:
• Goals are too specific or too broad.
• Goals are appropriate for a college level program or university office.

Unacceptable:
• Goals are too specific or too broad.
• Goal statements are missing.
• One or more goals are inappropriate for a college-level program or the office.

Objectives (if applicable)

Excellent:
• Objectives are measurable.
• Objectives follow clearly from goals.

Adequate:
• Most objectives are measurable.
• Most of the objectives are derived from the goals.

Needs Work:
• Most objectives are too broad to be measurable.
• It is mostly clear which objectives are derived from which goals.

Unacceptable:
• The objectives are too broad to be measurable.
• It is unclear why objectives were chosen and how they relate to the goals.

Outcomes

Excellent:
• Outcomes have challenging targets for attaining the objective.
• Outcomes are good operational representations of the objectives.

Adequate:
• Outcomes have reasonable targets.
• Outcomes are reasonably valid representations of their associated objective.

Needs Work:
• Outcomes have clear if not always appropriate targets for success.
• Most outcomes are reasonable representations of their associated objective.

Unacceptable:
• Most outcomes have no clear target, or the target is too low to be appropriate.
• Most outcomes poorly represent objectives or are unrelated to an objective.

Methods

Excellent:
• Methods are appropriate for outcomes.
• Description of the method and its implementation are complete.
• Methods are understood from initiation through interpretation of results.
• The methods give diagnostic results that can be used.

Adequate:
• Methods are appropriate.
• Methods and their implementations are basically clear.
• Methods are understood at least through implementation, although the analysis may need work.
• The methods give some diagnostic/useable results.

Needs Work:
• Methods are mostly appropriate.
• Description of methods and implementations needs to be clearer.
• It is not clear how well methods are understood.
• The methods could yield diagnostic/useable results.

Unacceptable:
• Methods are inappropriate for the outcomes.
• The method can only be poorly implemented, if at all.
• The methods in the majority of cases appear to be poorly understood.
• The methods will not yield diagnostic/useable results.

Results

Excellent:
• Pertinent results are stated clearly and concisely.

Adequate:
• Pertinent results are included in the statement of results.

Needs Work:
• Some pertinent results are missing or confusingly worded.

Unacceptable:
• Appropriate results are missing or poorly worded.

Use of Results

Excellent:
• In all cases in which the target was not attained, the results were used.
• In one or more cases the results were used for program/service improvement.
• There is a clear relationship between the results shown and how the results were used.

Adequate:
• Results were used in all cases in which the target was not attained.
• In one or more cases the results were used for improvement.
• There is a relationship between the results shown and their use.

Needs Work:
• It is not clear that in all cases in which the target was not attained the results were used.
• It is not clear that results were used for improvement.
• The relationship between results and use may not be completely clear.

Unacceptable:
• When the target was not attained, the results were not used or an attempt was made to explain them away.
• None of the results were used for improvement.
• The relationship between results and use is unclear.

Comments:
Appendix L
Further Reading
Adelman, C. (Ed.). (1986). Assessment in higher education: Issues and contexts. Washington,
D.C.: Office of Educational Research and Improvement, U.S. Department of Education.
Allen, M.J. (2006). Assessing general education programs. Bolton, MA: Anker Publishing Co.
___________ (2004). Assessing academic programs in higher education. Bolton, MA: Anker
Publishing Co.
Angelo, T.A. (1996). Transforming assessment. AAHE Bulletin, 48 (8), 3-4.
___________ (1995). Reassessing (and defining) assessment. AAHE Bulletin, 48 (3), 7-9.
___________ (1995). Reassessing assessment. AAHE Bulletin, 47 (8), 10-13.
____________ & Cross, K.P. (1993). Classroom assessment techniques (2nd ed.). San
Francisco: Jossey-Bass.
Assessment Forum. (1992). Principles of good practice for assessing student learning.
Washington, DC: American Association of Higher Education.
Astin, A.W. (1993). Assessment for excellence. Phoenix: Oryx Press.
___________ (1993). What matters in college? Four critical years revisited. San Francisco:
Jossey-Bass.
Banta, T.W. et al. (2002). Building a scholarship of assessment. San Francisco: Jossey-Bass.
Banta, T.W. & Palomba, C.A. (Eds.). (2001). Assessing student competence in accredited
disciplines: pioneering approaches to assessment in higher education. Sterling, VA: Stylus.
Banta, T.W. (1999). Assessment essentials: planning, implementing, and improving assessment
in higher education. San Francisco: Jossey-Bass.
Banta, T.W., Lund, J.P., Black, K.E. & Oblander, F.W. (1996). Assessment in practice. San
Francisco: Jossey-Bass.
Banta, T.W. (Ed.). (1988). New directions for institutional research series: implementing
outcomes assessment: promise and perils. San Francisco: Jossey-Bass.
Barr, M.J., Upcraft, M.L. & Associates. (1990). New futures for student affairs. San Francisco:
Jossey-Bass.
Bennion, D.H. & Work, S.D. (1996). Building trust and promoting faculty involvement in a
comprehensive assessment program at a large regional university. Assessment Update 8, 10-11.
Bresciani, M.J. (Ed.) (2007). Assessing student learning in general education. Bolton, MA: Anker
Publishing Co.
Cambridge, B.L. et al. (Eds.). (2001). Electronic portfolios: emerging practices in student, faculty, and institutional learning. Sterling, VA: Stylus.
Curry, L., Wergin, J.F., and Associates (1993). Educating professionals: responding to new
expectations for competence and accountability. San Francisco: Jossey-Bass.
Denny, L. (1994). Institutionalizing assessment. Assessment Update, 6 (2), 8-9.
Dillman, D.A. (2000). Mail and internet surveys: the tailored design method (2nd ed.). New York: John Wiley and Sons.
Donald, J.G. (2002). Learning to think: disciplinary perspectives. San Francisco: Jossey-Bass.
Erwin, T.D. (1991). Assessing student learning and development. San Francisco: Jossey-Bass.
Ewell, P. (1994). A policy guide for assessment. Princeton: Educational Testing Service.
__________ (Ed.). (1985). Assessing educational outcomes. New Directions for Institutional Research. San Francisco: Jossey-Bass.
__________ (Ed.). (1989). Enhancing information use in decision making. New Directions for Institutional Research. San Francisco: Jossey-Bass.
Gray, P.J. (1985). Achieving assessment goals using evaluation techniques. New Directions for Higher Education. San Francisco: Jossey-Bass.
Hart, D. (1994). Authentic assessment: a handbook for educators. White Plains, NY: Dale Seymour Publications.
Juillerat, S. & Schreiner, L.A. (1996). The role of student satisfaction in the assessment of institutional effectiveness. Assessment Update, 8, 8-9.
Keller, P.A. (1994). Using student quality focus groups to assess department climates for
teaching and learning. Mansfield, PA: Mansfield University.
Liddell, D.L. & Lund, J.P. (2000). Powerful programming for student learning: approaches that make a difference. New Directions for Student Services, no. 90. San Francisco: Jossey-Bass.
Light, R. (2001). Making the most of college: students speak their minds. Cambridge, MA: Harvard University Press.
Madison, B.L. (Ed.) (2006). Assessment of student learning in college mathematics: towards
improved programs and courses. Volumes 1 and 2. Tallahassee, FL: Association for Institutional
Research.
Marchese, T.J. (1994). Assessment, quality, and undergraduate improvement. Assessment
Update, 6 (3), 1-2, 12-14.
McKeachie, W.J. & Kaplan, M. (1996). Persistent problems in evaluating college teaching. AAHE
Bulletin, 48, 5-8.
Mentkowski, M. et al. (2000). Learning that lasts: integrating learning, development, and performance in college and beyond. San Francisco: Jossey-Bass.
Morgan, D.L. (1988). Focus groups as qualitative research. Newbury Park: Sage Publications.
Nichols, J.O. (1991). A practitioner's handbook for institutional effectiveness & student outcomes
assessment implementation. New York: Agathon Press.
Nichols, J.O. & Nichols, K.W. (2001). General education assessment for improvement of student academic achievement: guidance for academic departments and committees. New York: Agathon Press.
Nichols, K.W. & Nichols, J.O. (2000). The department head's guide to assessment implementation in administrative and educational support units. New York: Agathon Press.
Pascarella, E.T. & Terenzini, P.T. (1991). How college affects students. San Francisco: Jossey-Bass.
Ratcliff, J.L., Lubinescu, E.S. & Gaffney, M.A. (Eds.). (Spring 2001). How accreditation influences assessment. New Directions for Higher Education, no. 113. San Francisco: Jossey-Bass.
Ratcliff, J.L. (1992). Assessment and curriculum reform. New Directions for Higher Education. San Francisco: Jossey-Bass.
Romberg, E. (Ed.). Outcomes assessment: a resource book. Washington, D.C.: American
Association of Dental Schools.
Schiltz, M.E. (Ed.). (1992). Ethics and standards in institutional research. New Directions for Institutional Research. San Francisco: Jossey-Bass.
Schuh, J.H. & Upcraft, M.L. et al. (2000). Assessment practice in student affairs: an applications manual. San Francisco: Jossey-Bass.
Steen, L.A. (Ed). (2006). Supporting assessment in undergraduate mathematics. The
Mathematical Association of America, Inc.
Stufflebeam, D.L. (Spring 2001). Evaluation models. New Directions for Evaluation, no. 89. San Francisco: Jossey-Bass.
Suskie, L.A. (1992). Questionnaire survey research - what works. Tallahassee: Association for
Institutional Research.
Suzuki, L.A., Ponterotto, J.G. & Meller, P.J. (Eds.). (2001). The handbook of multicultural assessment: clinical, psychological, and educational applications (2nd ed.). San Francisco: Jossey-Bass.
Upcraft, M.L. & Schuh, J.H. (1996). Assessment in student affairs. San Francisco: Jossey-Bass.
Voorhees, R.A. (Ed.). (2001). Measuring what matters: competency-based learning models in higher education. San Francisco: Jossey-Bass.
Walvoord, B.E. (2004). Assessment clear and simple. San Francisco: John Wiley and Sons.
Yancey, B.D. (1988). Applying statistics in institutional research. New Directions for Institutional Research. San Francisco: Jossey-Bass.
Appendix M
Reporting Departments/Offices and Degree Programs*
President’s Division
Athletics
Continuing Education and Outreach
Internal Audit
Publications and Special Projects
Institutional Advancement
Academic Affairs Division
College of Arts and Sciences
Department of Biology
B.S. in Biology
B.A. in Multidisciplinary Studies
M.S. in Biology
Department of History
B.A. in History
B.A. in Humanities
M.A. in History
Department of Kinesiology
B.S. in Kinesiology
B.S. in Athletic Training
B.A.A.S. (tracks in Health Professions, Graphic Arts, and Human and Legal Studies)
M.S. in Kinesiology (collaborative online program)
Department of Literature and Languages
B.A. in English
B.A. in Spanish
M.A. in English
M.A. in Spanish
Department of Mathematics and Computer Sciences
B.S. in Computer Science
B.S. in Information Systems
B.S. in Mathematics
M.S. in Computer Science
Department of Physical Sciences
B.S. in Chemistry
B.S. in Environmental Science
B.S. in Geology
M.S. in Geology
Department of Psychology
B.A. in Child and Family Studies
B.A. in Psychology
M.A. in Clinical Psychology
M.A. in Applied Research in Psychology
Department of Social Sciences
B.A. in Criminology
B.S. in Criminal Justice (collaborative online program)
B.A. in Leadership Studies
B.A. in Political Science
B.S.W. in Social Work
B.A. in Sociology
M.S. in Criminal Justice Administration
MPA in Public Administration-Leadership
Department of Visual and Performing Arts
B.A. in Art
B.F.A. in Art
B.A. in Communication
B.A. in Humanities (Music Track only)
Academic Advising Center
Laboratory Division
Math and Science Center
Writing Center
School of Business
B.A. in Economics
B.B.A. in Accountancy
B.B.A. in Finance
B.B.A. in Management
B.B.A. in Marketing
M.B.A.
M.P.A. (Professional Accountancy)
Department of Engineering and Technology
B.A.A.S. (Industrial Technology Track only)
B.S. in Industrial Technology
B.S. in Mechanical Engineering
Small Business Development Center
School of Education
Department of Curriculum and Instruction
M.A. in Education in Bilingual/ESL
M.A. in Early Childhood Education
M.A. in Special Education
M.A. in Reading
Department of Leadership, Counseling, and Foundations
M.A. in Educational Leadership
M.A. in Professional Education
M.A. in Counseling
Certification Office
Office of Field Experience
Center for Energy and Economic Diversification
Dunagan Library
Graduate Studies and Research
HT3R
Information Resources Division
Institutional Research, Planning, and Effectiveness
JBS Public Leadership Institute
Registrar’s Office
REACH
Business Affairs Division
Accounting
Compliance
Human Resources
Purchasing
Central Stores
Physical Plant
Student Housing
University Police
Student Services Division
Admissions
Career Services
Financial Aid
Hispanic Serving Institutions Programs**
Mentor Program
PASS Office
Student Engagement/Dean of Students (Student Discipline)
Retention
Student Life
Intramurals
Gym
Counseling Center
*Degree programs are shown in italics. Assessment forms must be turned in for each degree program except shared online programs through the UT TeleCampus.
**Grant programs are temporary and submit effectiveness information to the granting agency.
4-12-09