Assessment Q & A, Examples, and Tools

What is a student learning outcome (SLO)?
Student Learning Outcome (SLO): An SLO identifies the measurable knowledge, skills, behaviors, or
attitudes of the learner as the result of engaging in a learning activity or program. Typically, SLOs are
composed with the stem, “The student will…”.
What is Assessment?
The systematic collection and analysis of information to improve student learning and program
viability. Assessment is “…the process of gathering evidence to make inferences about…how
students are progressing toward specific goals” (National Standards, quoted in Pennington,
2001, p. 206).
What is Value-Added Assessment?
Value-Added is an analytical strategy to determine the degree to which students change from
the beginning to the end of a program. Astin (1985) referred to this type of change as talent
development.
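As a rough illustration of the arithmetic behind a value-added analysis (a hypothetical sketch only, not a PDCCC procedure; the student IDs and scores below are invented), the change can be computed by pairing each student’s entry and exit scores:

    # Hypothetical sketch: a simple value-added (pre/post) computation.
    # Student IDs and scores are invented for illustration only.
    pre_scores = {"S01": 52, "S02": 61, "S03": 47}   # beginning-of-program assessment
    post_scores = {"S01": 78, "S02": 74, "S03": 70}  # end-of-program assessment

    gains = [post_scores[s] - pre_scores[s] for s in pre_scores]
    print(f"Average value-added gain: {sum(gains) / len(gains):.1f} points")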
Aren’t the SLOs (Student Learning Outcomes) essentially the same thing as the SOLs
(Standards of Learning) that are creating havoc in the public schools?
No, actually they are quite different. The SOLs focus on student assessment, whereas our
SLOs are meant for course/program assessment. The goal of the SOLs is to evaluate individual
student achievement in a state-imposed curriculum and determine whether students are ready to
go on to the next grade. SOLs are also used to evaluate whether teachers or schools are
successful in getting all their students where they need to be to proceed to the next level.
Our SLOs are of our own choosing, within the parameters of the VCCS course/program guide.
They are the specific, measurable skills our faculty have stated they want students to achieve in
a particular course/program. It is our job to develop assessment tools that measure these, in a
way that does not hinder individual teaching styles or methods and promotes sharing of best
practice and good ideas.
Why aren’t grades enough?
When faced with the news that it’s your discipline’s turn for outcomes assessment, it is tempting
to ask why you can’t just look at final grades to determine whether a course is successful.
Although counting letter grades is easy, it provides neither consistent nor meaningful
information about student success in a multi-section course.
In outcomes assessment, the terms “scoring” and “grading” have different meanings. Scoring
refers to the process of marking an assessment instrument to get data about how well the course
has done at achieving its outcomes. Grading is the process of marking an assessment instrument
for the purpose of assigning a student a grade for the course. Scoring needs to be done
consistently across all sections; grading can be done differently in each section if instructors
desire. In no way does the outcome assessment scoring process infringe on an instructor’s
grading.
Unless every instructor teaching a particular course assigns final course grades in exactly the
same way (same assignments, same exams, same weights, same grading approach), you cannot
be confident that one section’s A is the same as another section’s A. More significantly, final
grades are an aggregate assessment of a student’s entire work for the course, often including
attendance and class participation. Consequently, looking at a distribution of grades will provide
little, if any, useful information about the degree to which students are learning those things that
instructors deem most important in the course.
Course Grades versus Course Assessment
Course grades do not provide the same insight that a course assessment does.
Grades give a global evaluation but do not provide sufficiently detailed information about
which course outcomes students are mastering well and which are giving them trouble.
Course grades alone don’t stimulate faculty discussions about how to improve student
learning of particular course outcomes.
Grades sometimes are based on more than mastery of course content; for example,
participation, attendance, bonus points.
Grading standards often vary widely among different instructors and do not indicate the
same degree of mastery of course outcomes.
Grade inflation (easy tests, generous grading, extra-credit bonuses) sometimes presents a
misleading indicator of student mastery of course outcomes.
The list below shows additional differences between assessment and grades; each pairing gives the assessment trait first and the grading trait second:

Formative vs. Summative
Formative refers to the formation of a concept or item, whereas summative refers to an “adding-up” or summary stage. Assessments usually occur in mid-progress, when corrections can be made. Grades are usually recorded at the end of a project or class in order to summarize academic quality.

Diagnostic vs. Final

Non-Judgmental vs. Evaluative
Assessment is non-judgmental in the sense that it focuses on learning, which is the outcome of many influences, including teaching style, student motivation, time on task, study intensity, and background knowledge. Therefore, no one element can be reasonably singled out for praise or blame for a particular learning outcome. In contrast, grades carry evaluative weight as to the worthiness of student achievement and are applied, for good or ill, directly to students.

Private vs. Administrative
Assessment tends to be used in private and become public only under the assessor’s control. Grades, while not truly public, are part of the administrative record available throughout an educational institution.

Often Anonymous vs. Identified
Assessment is almost always collected in anonymous fashion, and the results are released in the aggregate. Grades are identified with specific students.

Partial vs. Integrative
To use a metaphor from calculus, assessment more resembles a partial derivative, whereas grades are more recognizable as an integrative process.

Specific vs. Holistic
Assessment tends to look at specific parts of the learning environment. Grades are holistic in the sense that they record academic achievement for a whole project. Final grades, of course, can reduce academic achievement for an entire semester to a single mark.

Mainly Subtext vs. Mostly Text
The text of a course is its disciplinary content; grades tend to focus on that. The subtext of a course involves transferable baccalaureate skills, such as critical thinking, creative thinking, writing, and analysis. For example, the “text” of a course in anatomy and physiology includes the names of bones and the functions of muscles. The “subtext” of such a course might include scientific thinking, problem solving, and memory improvement. Grades tend to focus on text; assessment tends to emphasize subtext.

Suggestive vs. Rigorous
Assessment findings tend to be suggestive and have pedagogical significance. That is, assessment findings shift pedagogy for reasons that need not be justified statistically, but can be justified when even one student learns better. In contrast, grades are recorded in a rigorous manner that does have statistical significance.

Usually Goal-Directed vs. Usually Content-Driven
As with text and subtext mentioned above, grades tend to reflect student control of disciplinary course content, whereas assessment usually aims at the goals for all baccalaureate students, such as synthetic thinking and esthetic appreciation.
How do you write SLOs for a course or program?
A student learning outcome statement needs to specify who is to perform (the student), what action
they are to take, and what result must come from that action. Student learning outcomes
(SLOs) for a course/program should:
Be written in terms of what the student/graduate will be able to do at the end of the
course/program
Be limited to 2-5 outcomes
Keep them short and simple (KISS)
Make them specific, measurable, attainable, realistic, and timely (S.M.A.R.T.)
Establish a target performance level for success (e.g., 70% will …)
Keep the assessment process manageable and meaningful (M&M)
Use Bloom’s Taxonomy and active verbs (create, analyze, demonstrate, etc.)
Be written in the positive instead of the negative
Reflect measurable standards (benchmarks) or reflect the basic knowledge and skills that
the student will be held accountable for
Reflect a combination of higher order thinking skills and supporting or enabling skills
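To make the target-performance-level item above concrete, here is a minimal sketch (assuming a hypothetical 4-point rubric with 3 as the satisfactory cutoff; all scores are invented) of checking whether a “70% will …” target was met:

    # Hypothetical sketch: checking an SLO target such as
    # "70% of students will score satisfactory (3 of 4) or better on the rubric."
    scores = [4, 3, 2, 3, 4, 1, 3, 3]  # one invented rubric score per student
    TARGET = 0.70       # target performance level stated in the SLO
    SATISFACTORY = 3    # rubric cutoff for "satisfactory"

    met = sum(s >= SATISFACTORY for s in scores) / len(scores)
    print(f"{met:.0%} satisfactory; criterion {'met' if met >= TARGET else 'not met'}")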
What is the difference between course assessment and class assessment?
Course assessment measures the student learning that takes place in ALL sections of the course
for the entire college. It is not to be confused with assessment of instructors or employment
evaluation.
A course assessment consists of all the classes (sections) being taught; for example, ENG 111. A
class assessment is one section of a course, ENG 111-51A or ENG 111-61B.
How important is it to design Course Assessment processes to include students from all
locations at which a course is taught?
Inclusion of students from all locations at which a course is taught is crucial to the process of Course
Assessment. This means that if a course is taught through dual enrollment, on the web, at
both campuses, or off-site, the mechanism for gathering data for assessment of that course
must be designed to reasonably include students regardless of the location or delivery method
of the specific section in which they were enrolled.
Therefore, if an end-of-course or beginning-of-course activity is developed for the purposes of
course assessment, the activity must take place in all sections of the applicable course during
the semester that data is being collected. The practical logistics of this requirement may
influence some choices of such activities. It is important when designing course assessment to
consider the logistics of gathering college-wide data.
Must data towards Course Assessment be limited to data collection at the end of the
semester?
Absolutely not! In fact, some of the most valuable data can be captured in creative ways from
students who completed the course being assessed during the previous semester.
For example, students who are beginning Chemistry 112 could be given a first-day-of-class
assessment covering the course objectives from Chemistry 111. The CHM 112 instructor can
review the results to get an idea of what the students have retained, and then pass those
assessment forms to the individual responsible for collating the CHM 111 assessment data. This
strategy obviously can work for any required two-course sequence.
Finally, instructors may find that they can make arrangements with colleagues, either in their
same discipline or across disciplines, to collaborate in activities for course assessment. An
example of this could be in a nursing course, where it might be appropriate to include either a
formative or a summative assessment of students’ knowledge of infant to adolescent
developmental psychology. Aggregate student performance information on this assessment
could then be turned over to the psychology faculty for use in their assessment of the
Developmental Psychology course.
What is the difference between course assessment and program assessment?
Whereas course assessment focuses on the question “How can the course be strengthened based on
how well students are mastering course objectives?”, program assessment focuses on student learning
outcomes for the program as a whole, as well as productivity measures related to the viability and
effectiveness of a degree or certificate program. Student learning outcome assessments are done
annually in all programs, and each program is reviewed on a five-year cycle.
What is the difference between direct and indirect assessment?
Direct Assessment Methods: Direct assessment methods give instructors measurable data to
study. Some examples are written exams, oral exams, performance assessments, standardized
tests, licensure exams, oral presentations, projects, demonstrations, case studies, simulations,
portfolios, and juried activities with outside panels.
Indirect Assessment Methods: Indirect assessment methods provide extra information that may
be used to make changes. Examples include questionnaires, interviews, focus groups, employer
satisfaction studies, observations of advisory boards, and job/transfer school placement data.
Example Course Syllabus Using Student Learning Outcomes
Example Revision to Course Syllabus to Show Student Learning Outcomes

Old Course Objectives:
– To introduce students to PDCCC and Virginia Community College System policies and procedures
– To promote support services available to students
– To familiarize students with skills necessary for successful college adjustment
– To apply the organization skills necessary for college success (time management, stress management, note-taking, etc.)
– To familiarize students with the Learning Resources Center web page and information literacy
– To apply effective study skills and memory techniques
– To explore career possibilities and formulate a tentative career plan
– To understand the curricular planning process
– To understand the college transfer process
– To teach interpersonal communication skills
– To provide essential facts concerning AIDS, alcohol and substance abuse
– To familiarize students with PDCCC’s Web page and e-PDCCC

New Course SLO Objectives (Students will …):
– Possess knowledge of Paul D. Camp Community College’s policies, procedures, and resources
– Demonstrate necessary survival skills for college success (critical thinking, financial planning, memory techniques, note-taking, study skills, and time management)
– Demonstrate ability to use a computer to access the Internet, the college website, and the Blackboard site, and to send and reply to email
– Demonstrate communication skills (oral and written)
– Demonstrate knowledge of personal development areas, such as essential facts concerning AIDS, alcohol, and substance abuse
– Demonstrate an understanding of the career planning process
Creating Student Learning Outcomes
To model writing student learning objectives in a straightforward and non-threatening manner,
the following chart uses levels of understanding from Bloom’s Taxonomy, combines them with
action verbs, and provides examples for a variety of disciplines.
Example: Student Learning Objectives (SLO) Using Bloom’s Taxonomy
If I want to measure knowledge outcomes, I might write…
The student/graduate will…
– Describe the basic components of empirical research.
– Give examples of major themes or styles in music, art, or theatre.
– Recognize in complex text local, rhetorical, and metaphorical patterns.

If I want to measure comprehension outcomes, I might write…
The student/graduate will…
– Correctly classify a variety of plant specimens.
– Explain the scientific method of inquiry.
– Summarize the important intellectual, historical, and cultural traditions in music, art, or theatre from the Renaissance to modern times.

If I want to measure application outcomes, I might write…
The student/graduate will…
– Demonstrate in the laboratory a working knowledge of lab safety procedures.
– Apply oral communication principles in making a speech.
– Compute the area of a room.
– Use editing symbols and printers’ marks.

If I want to measure analysis outcomes, I might write…
The student/graduate will…
– Distinguish between primary and secondary literature.
– Diagram a sentence.
– Listen to others and analyze their presentations.
– Differentiate between historical facts and trivia.

If I want to measure synthesis outcomes, I might write…
The student/graduate will…
– Revise faulty copy for a news story.
– Formulate a hypothesis to guide a research study.
– Create a poem, a painting, or a design for a building.

If I want to measure evaluation outcomes, I might write…
The student/graduate will…
– Compare art forms of two diverse cultures.
– Critically assess an oral presentation.
– State traditional and personal criteria for evaluating works of art.
– Draw conclusions from experimental results.
Bloom’s Taxonomy: Action Verb List (Partial List)
Students/Graduates will …

Cognitive

Knowledge: will be able to acquire, collect, define, distinguish, examine, identify, label, list, name, quote, recall, recognize, show, tabulate, tell

Comprehension: will be able to associate, change, conclude, contrast, demonstrate, describe, determine, differentiate, discuss, distinguish, draw, estimate, explain, extend, extrapolate, fill in, give in own words, illustrate, infer, interpolate, interpret, make, predict, prepare, read, rearrange, reorder, rephrase, represent, restate, summarize, transform, translate

Application: will be able to apply, calculate, change, choose, classify, complete, demonstrate, develop, discover, employ, examine, experiment, generalize, illustrate, modify, organize, relate, restructure, show, transfer, use

Analysis: will be able to analyze, arrange, categorize, classify, compare, connect, contrast, deduce, detect, discriminate, distinguish, divide, explain, identify, infer, order, recognize, select, separate

Synthesis: will be able to classify, combine, compose, constitute, create, deduce, derive, design, develop, document, formulate, generalize, integrate, invent, modify, organize, originate, plan, prepare, produce, propose, rearrange, relate, rewrite, specify, substitute, synthesize, tell, transmit, write

Evaluation: will be able to appraise, argue, assess, compare, conclude, consider, contrast, convince, decide, discriminate, explain, grade, judge, measure, rank, recommend, select, standardize, summarize, support, test, validate

Affective

Receiving: will choose to accept, accumulate, combine, control, differentiate, listen (for), posturally respond to, select, separate, set apart, share

Responding: will choose to acclaim, applaud, approve, augment, commend, comply (with), discuss, follow, play, practice, spend leisure time in, volunteer

Valuing: will choose to assist, debate, deny, help, increase, increase numbers of, protest, relinquish, specify, subsidize, support

Organization: will choose to abstract, argue, balance, compare, define, discuss, formulate, organize, theorize (on)

Characterization by Value: will choose to avoid, be rated high by peers in, be rated high by superiors in, change, complete, manage, require, resist, resolve, revise
Watch Out for Verbs that are not Measurable
In order for an objective to give maximum structure to instruction, it should be free of vague or
ambiguous words or phrases. The following list contains notoriously ambiguous words and phrases that should
be avoided so that the intended outcome is concise and explicit.
48 Bad Words or Phrases
Avoid using verbs that are difficult to measure objectively. The following are examples of verbs that are
difficult to assess and should be used with caution:
Appreciate
Comprehend
Experience
Realize
Be aware
Cover
Have faith in
Recognize
Memorize
Enjoy
Internalize
Study
Conceptualize
Familiarize
Know
Understand
Believe
Feel
Learn
Values
Hear
Capacity
Intelligence
See
Think
Listen
Self-Actualize
Depth
Be comfortable with …
Be acquainted with …
Grasp significance of…
Perceive
Gain knowledge of …
Appreciation for …
Acquainted with …
Attitude of …
Adjusted to…
Awareness of…
Capable of …
Comprehension of …
Cognizant of …
Enjoyment of …
Conscious of …
Feeling for …
Familiar with …
Interest in …
Knowledge of …
Self-Confident in …
Evaluation Method to Measure Outcomes
Method A
Writing Effective and Measurable Objectives: The A-B-C-D Model

Each element below is illustrated with the same example objective, with the relevant part noted in parentheses: “Following completion of the Science program, the student should be able to plot a quadratic equation using a graphing calculator in two minutes or less.”

A = Audience. Who is performing the action? Learning objectives are always stated in terms of student outcomes. (“…the student…”)

B = Behavior. What will the student be able to do? Use an action verb that describes an accomplishment that is measurable. Be specific. Choose a verb that expresses the skill and cognitive level that you want the student to exhibit. (See Bloom’s Taxonomy.) (“…plot a quadratic equation…”)

C = Condition. Give the conditions under which the performance will occur. Be specific. Conditions should communicate the situation, tools, references, or aids. (“…using a graphing calculator…”)

D = Degree. Describe the minimum criteria for acceptable student performance. Define expectations regarding accuracy, quality, and speed. Be specific. (“…in two minutes or less.”)

Note: Current educational practices recommend that the audience (student) and the behavior be connected with the words “should be able to,” since faculty cannot promise that everyone will accomplish the stated objective.

Writing objectives isn’t creative writing: just follow a formula!
Given [Conditions], the [Audience] will [Behavior] by [Degree].
[Audience] will [Behavior] to [Standard] when provided [Conditions].
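Since the formula is literally fill-in-the-blank, the second pattern can even be expressed as a string template. The sketch below is a toy illustration only (the function name and parameter choices are my own), reusing the quadratic-equation objective from the table above:

    # Hypothetical sketch: assembling an objective from its A-B-C-D parts using
    # the pattern "[Audience] will [Behavior] to [Standard] when provided [Conditions]."
    def abcd_objective(audience: str, behavior: str, standard: str, conditions: str) -> str:
        return f"The {audience} will {behavior} {standard} when provided {conditions}."

    print(abcd_objective(
        audience="student",
        behavior="plot a quadratic equation",
        standard="in two minutes or less",
        conditions="a graphing calculator",
    ))
    # -> The student will plot a quadratic equation in two minutes or less
    #    when provided a graphing calculator.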
11
Method B
S.M.A.R.T. Objectives to Generate Outcomes

Specific: A specific objective has a much greater chance of being accomplished than a general goal (Who, What, Where, When, Which, and Why).
General goal: This year I am going to get into shape.
Specific objective: This year I am going to join a health club and work out three days a week.

Measurable: Establish concrete criteria for measuring progress toward the attainment of each objective you set. Stay on track, reach target dates, and experience achievement. How much? How many? How will I know when it is accomplished?

Attainable: When you identify the objectives that are most important to you, you begin to figure out ways you can make them come true. You develop the attitudes, abilities, skills, and financial capacity to reach them. You can attain almost any goal you set when you plan your steps WISELY and establish a time frame that allows you to carry out those steps.

Realistic: To be realistic, an objective must represent something toward which you are both WILLING and ABLE to work. Your objective is probably realistic if you truly BELIEVE that it can be accomplished.

Timely: An objective should be grounded within a timeframe. With no timeframe tied to it, there’s no sense of urgency. When you set a timeframe, you have set your unconscious mind into motion to begin working on the goal.
Methods of Assessment
“How do I assess thee, let me count the ways.”
Writing: Essay; Report; Journal/reflective writing; Book review; Letter of advice; Newspaper article; Lab report; In-class writing exercise; Annotated bibliography; Evaluate accuracy of …; Research paper; Abstract; Internship/field experience/clinical report; Position paper; Critique; Log

Performing: Demonstration; Role play; Experiment; Simulation exercises; Performance; Presentation; Debate; Interviews

Creating/Developing: Video; Poster; Manual or brochure; Portfolio; Make a list; Experiment/hypothesis test; Concept map; Capstone course/project/experience assignments; Fieldwork/internship/lab/clinical evaluation; Survey; Projects: group or individual

Testing: Written tests: objective; Written tests: essay; Oral test; Problem set; Quizzes; Standardized assessment test of subject; Certification tests; Lab practical

Analyzing: Case study; Product analysis

Discussing: Discussion: classroom or online

Many assessment methods are applicable to more than one category.
ELEMENTS OF ASSESSMENT AND PROGRAM REPORT
I. Assessment of Student Outcomes
Assessment is the process of gathering evidence of student learning, reviewing the evidence to
determine if students are learning what they are expected to learn, and using this evidence to
alter the direction of your course.
For example, you might “map” certain questions on a test to specific learning objectives. After
administering the test, you would examine the students’ performance on those questions to
determine how well the students are grasping the intended learning outcomes. If you determine
the performance is satisfactory, then you have evidence that the learning objective is being met.
If you determine the students’ performance is below your expectations, you should use the
feedback to reevaluate the way the material is presented or review the concepts with students. It
is important to remember that the purpose of the assessment is to create a better teaching and
learning experience.
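A minimal sketch of that mapping idea (the question numbers, objective labels, and counts are all invented for illustration): group the test questions by the objective they map to, then report the percentage answered correctly for each objective:

    # Hypothetical sketch: "mapping" test questions to learning objectives
    # and summarizing class performance per objective. All data are invented.
    question_to_objective = {1: "Obj 1", 2: "Obj 1", 3: "Obj 2", 4: "Obj 2", 5: "Obj 2"}
    correct_counts = {1: 27, 2: 24, 3: 15, 4: 18, 5: 12}  # correct answers per question
    N_STUDENTS = 30                                        # students who took the test

    totals: dict = {}
    for q, obj in question_to_objective.items():
        right, possible = totals.get(obj, (0, 0))
        totals[obj] = (right + correct_counts[q], possible + N_STUDENTS)

    for obj, (right, possible) in sorted(totals.items()):
        print(f"{obj}: {right / possible:.0%} correct")  # flag objectives below expectations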
Students who know what is expected of them in terms of their learning have a framework for
learning and are more successful. Faculty who have a clear idea of what they want their students
to learn are able to align their instructional activities to these outcomes. In these two ways,
clearly articulated outcomes are essential to student learning. Outcomes assessment allows us to
systematically examine the alignment among student learning, instructional or institutional
expectations, and instructional activities. To this end, we begin planning for outcomes assessment
with student learning outcomes. A student learning outcome (SLO) is defined as a specific,
measurable competency (knowledge, skills, values, or attitudes) that your students should be able
to demonstrate as a result of participation in a learning activity. SLOs reflect a shift in focus
from “What am I teaching?” to “What are my students learning?” SLOs can be expressed and
measured at the course, program, or institutional level.
Course Assessments
Where do we start?
Every course should have a set of college-wide, common, core expectations for student learning.
These expectations are the most important things a student who passes the course should take
away from any section of the course. While individual instructors may add to this core, there
should be a shared understanding of the core skills and knowledge upon which the course is
based. It is these expectations which should be reflected on each course syllabus and which
should be used to determine student learning outcomes for the outcome assessment process.
Student learning outcomes are statements that specify what you want your students to know and
be able to do at the end of the course. For example, student learning outcomes can refer to
knowledge, practical skills, critical thinking skills, etc. that students are expected to develop or
learn.
What makes a good learning outcome?
A well-defined student learning outcome specifies actions by students that are observable and
measurable and that must be performed by the students themselves. The crucial factor in determining whether
your learning outcome is well-defined is whether or not the action taken by the students can be
measured. Do not focus on small details, but rather on general knowledge and/or skills you
expect your students to acquire through your course. Do not merely describe activities or lessons
from the course, but rather articulate the learning that will result from the course. Make sure your
statement is centered not on what you are going to teach, but rather on what the student will
do: for example, “Upon completion of this course, students will be able to identify all the critical
elections in 20th-century America,” as opposed to “One objective of this course is to teach about
the critical elections in 20th-century America.”
Generally speaking, good learning outcomes are:
Learner centered
Key to the course’s mission
Meaningful for faculty and students
Representative of a range of thinking skills
Measurable
First, and most importantly, good learning outcomes focus on what students can do instead of
the effort we put into teaching them. Second, college-wide outcomes must be essential to the
course’s mission, something that everyone teaching the course agrees is important. Avoid
outcomes that are idiosyncratic or tied to a particular instructor’s approach to a course. Third,
design outcomes that are meaningful for faculty and students. If you cannot explain why a
certain outcome is important, it probably isn’t very meaningful. Finally, outcomes often reflect a
range of thinking skills, from low-level identification to higher-level application of knowledge
or skills.
Good outcomes are measurable in some way; they communicate what student learning will be
evaluated in the course. Often courses will have two levels of outcomes: some broader-based
outcomes which reflect higher order thinking skills and broad topics, and some narrower
outcomes, reflecting lower level thinking skills, which are essential to reaching the broader outcomes.
If the course doesn’t have expectations for student learning formulated as student learning
outcomes, the development of college-wide common core student learning outcomes may be one
of the first outcomes of this process. The outcomes should become a standard part of the
syllabus.
When defining SLOs to assess, it is tempting to take the easy route and think only in terms of
learning outcomes that represent lower order skills, because they will be simpler to evaluate.
Instead, concentrate on the skills and knowledge which are essential for a student to be
considered competent at the end of the semester. While some lower order types of learning
outcomes may be essential to reaching higher level outcomes, make sure that you define a range
of outcomes which reflect higher order, complex application tasks in addition to any essential
supporting learning outcomes which may reflect lower order thinking skills.
Lower order vs. higher order thinking skills
While basic recall of facts is important to any course, your assessment results will be more
meaningful if you have chosen a more complex skill. Moreover, such a skill will likely reflect what is truly
important in your course. Often facts are important because we want students to be able to do
something with that information.
For SLOs which reflect higher order thinking skills, use action verbs that are observable and
measurable and that reflect higher order skills. Examples of such verbs are solve,
design, write, compare, apply, decide, draw, persuade, investigate, and evaluate.
Refer to the following possible outcomes for an information technology course:
Students will be able to correctly summarize the key differences between open and
closed source software development models.
Students will be able to evaluate the strengths and weaknesses of open and closed source
software development models.
While the first outcome is certainly easier to achieve, the second one better represents what
students would have to do with the information in the real world. You will get more useful
information about student learning with the second SLO.
How do you write SLOs for a course or program?
A student learning outcome statement needs to specify who is to perform (the student), what action
they are to take, and what result must come from that action. Student learning outcomes
(SLOs) for a course/program should:
Be written in terms of what the student/graduate will be able to do at the end of the
course/program
Be limited to 2-5 outcomes
Keep them short and simple (KISS)
Make them specific, measurable, attainable, realistic, and timely (S.M.A.R.T.)
Establish a target performance level for success (e.g., 70% will …)
Keep the assessment process manageable and meaningful (M&M)
Use Bloom’s Taxonomy and active verbs (create, analyze, demonstrate, etc.)
Be written in the positive instead of the negative
Reflect measurable standards (benchmarks) or reflect the basic knowledge and skills that
the student will be held accountable for
Reflect a combination of higher order thinking skills and supporting or enabling skills
What are some basic examples of well-defined student learning outcomes?
Unclear student learning outcome statements:
The students will understand democracies.
The students will appreciate art from other cultures.
The students will learn about the law of relativity.
The above statements are not well-defined learning outcomes since they are not measurable.
However, these statements can be modified to become well-defined learning outcomes as
follows:
The students will be able to describe the major theories of democracy.
The students will be able to identify the characteristics of art from other cultures.
The students will be able to explain the major tenets of the law of relativity.
Sample Course Assessment Plan for SDV 108 College Survival Skills

Goal/Objective Being Assessed (SLO): Students will demonstrate effective time management skills 70% of the time.
Evaluation Method (Expected Outcomes): Keep schedule in Daily Planner.
Findings: 79.1% of students show satisfactory or better planner usage.
Action to be Taken: Criterion has been met, but important dates need to be included in the planner.

Goal/Objective Being Assessed (SLO): Students will demonstrate appropriate comprehension of course material 75% of the time.
Evaluation Method: Weekly portfolio assessment of reading, writing, and other exercises.
Findings: 85.5% of students exhibit satisfactory or better comprehension of course material.
Action to be Taken: Criterion has been met. Continue to monitor assignments and exercises.

Goal/Objective Being Assessed (SLO): Students will utilize effective study skills to successfully complete course work 80% of the time.
Evaluation Method: Class exercises and research essay/presentation completion in portfolio.
Findings: 85% of students exhibit satisfactory or better exercise completion.
Action to be Taken: Criterion has been met. Continue to monitor assignments and exercises.

Goal/Objective Being Assessed (SLO): Students will exhibit effective note-taking skills 80% of the time.
Evaluation Method: Class note-taking exercises and evidence of actual class notes in portfolio.
Findings: 86.5% of students exhibit satisfactory or better note-taking skills.
Action to be Taken: Criterion has been met. Continue to monitor note-taking.

Goal/Objective Being Assessed (SLO): Students will exhibit and demonstrate personal behavior that prepares them for success with 80% proficiency.
Evaluation Method: Class attendance and participation; exhibition of personal responsibility using rubric.
Findings: 84% of students regularly attended class, participated in class activities, turned in assignments, and brought required materials to class.
Action to be Taken: Criterion has been met. Continue to monitor attendance and participation.

Goal/Objective Being Assessed (SLO): Students will demonstrate ability to use a computer and information literacy skills 75% of the time.
Evaluation Method: Technology assignment completion, library resources, and email sent to instructor.
Findings: 78% of students accessed library databases, completed the library orientation worksheet assignment, and sent email to the instructor using Blackboard.
Action to be Taken: Criterion has been met. Continue to monitor Internet technology assignments and library research activities.
General Education/Core Learning Outcomes and Corresponding English 111 Outcomes

General Education/Core Learning Outcome: Communicate effectively orally and in writing standard English
English 111 Outcomes: Upon successful completion of the course students will be able to:
– Formulate restricted, unified, and precise thesis statements
– Organize essay content into introduction, body, and conclusion paragraphs
– Compose restricted, unified, and precise topic sentences for paragraphs
– Write unified and coherent paragraphs that are well-developed with supporting materials drawn from the literary text
– Apply grammar and usage rules correctly
– Choose appropriate diction
– Write clear, precise sentences

General Education/Core Learning Outcome: Apply appropriate methods of mathematics to solve problems
English 111 Outcomes: (none listed)

General Education/Core Learning Outcome: Comprehend and interpret reading materials
English 111 Outcomes: Explain basic literary terms in the genre of poetry, fiction, and drama (for example, theme, imagery, rhythm, figurative language, tone, character, plot, etc.)

General Education/Core Learning Outcome: Understand and apply the methods, principles, and concepts of the natural and social sciences and the humanities
English 111 Outcomes: See above.

General Education/Core Learning Outcome: Understand the nature and value of the fine and performing arts
English 111 Outcomes: English 112 requires this.

General Education/Core Learning Outcome: Use computer technology for communication and information retrieval
English 111 Outcomes: Write research-based essays using secondary sources to:
– Synthesize several different sources into an essay to support its thesis
– Quote, summarize, and paraphrase responsibly within that paper

General Education/Core Learning Outcome: Recognize and appreciate cultural diversity
English 111 Outcomes: Students study the world’s literature and write and discuss a diversity of ideas.
Example of Action Plan
(Each entry pairs an action a department may take after assessment with how a specific course planned to change after assessment.)

Department action: Change syllabi to prepare students for the rigor of the course.
Course example: English 150 Children’s Literature professors decided to emphasize the intellectual rigor and copious reading in the class in the syllabus, to make students “aware” that the assignments and papers would be difficult.

Department action: Revise the course outcomes to include more higher-order thinking, greater intellectual rigor, and/or sufficiency.
Course example: Many courses have merged similar outcomes, omitted outcomes based on their lack of intellectual rigor, and/or added language to outcomes based on Bloom’s Taxonomy of higher-order thinking.

Department action: Based on results from assessment, add or reduce certain elements of the classroom exercises.
Course example: Using the equivalent of an item analysis, the ELE 135 faculty noticed that many of the questions answered incorrectly on their assessment test were missed because students could not “unlock meaning of unknown words” based on prefixes and suffixes. Hence, the faculty will investigate how to emphasize word parts in ELE classes.

Department action: Obtain more consistency in large multi-section courses.
Course example: ITE 115 noticed that consistency in multi-section courses is difficult, given that the Franklin Campus and the Smithfield site do not have the same resources. Although this analysis delivers a negative truth, it is one worth noting.

Department action: Reduce grade inflation by linking test and course grades to mastery of all outcomes.
Course example: Assessment and analysis of Math 163 showed that students’ scores on the portion of the exam that was common among all students were not predictive of their final grades. This portion, however, did not count toward the final exam grade. Thus, it was speculated that some students did not take that part of the exam as seriously as the weighted part.

Department action: Increase contact with adjunct faculty.
Course example: Math 151 instructors also suggested that the master syllabus may not communicate the timing in which certain skills ought to be taught, and this would present problems, especially for adjunct instructors who are not in contact with faculty as much as full-time instructors are.

Department action: Explore active learning strategies and other teaching methods.
Course example: In Physical Sciences 111, the instructor has:
– Changed the sequence of course topics for better flow
– Introduced additional worksheets for practice on skills
– Spent more time discussing processes
– De-emphasized memorization

Department action: Explore other ways of assessing outcomes.
Course example: The ENG 05 Developmental Reading/English faculty decided that since they encourage students to annotate their texts, the same strategy ought to be applied when students are being assessed. Because students were not aware of this possibility, the faculty hypothesized, they did not perform to their potential.

Department action: Explore technological enhancements (labs, equipment, CD tutorials, etc.), using the assessment evidence to support a request for increased funding.
Course example: MKT 100 has discussed organizing and cataloguing a library of videos relevant to the course to better support visual learners.

Department action: Conduct a retreat or workshop for instructors.
Course example: Biology 101 examined their course and came up with a plethora of questions. Based on this analysis, the faculty desire to contact an expert in assessment to find where and how to proceed. The faculty emphasize that their desire to seek further help is linked to their belief in assessment and its ability to enhance student learning.
Program Assessments
Assessing student outcomes for programs is the most effective way to determine whether
PDCCC’s programs are accomplishing the goals and objectives set forth in each program. A
careful analysis of the results of the students' assessment lets faculty and administration know
where improvements need to be made.
When doing annual student outcomes assessments, programs are asked to use multiple
assessment measures, of which at least one must be a direct measure. Faculty from each major are
asked to select any assessment methods that they believe will be effective in measuring whether
students achieved the goals of the program. The assessment of a major will give faculty vital
information concerning the program to be incorporated into the program review. Advisory
committees for OT programs are involved in reviewing curriculum. Some of the assessment
methods used are tests, competency checklists, rubrics, portfolio reviews, job placement rates,
employer surveys, oral examinations, written examinations, external certification examinations,
skills examinations, student surveys, and panel reviews.
In setting up student outcome assessments for any program or discipline, there are several
steps to follow:
1. State the program and general education goals/objectives.
2. Determine how and where each goal will be assessed using multiple methods (one
being a direct measure).
3. Record your results/findings.
4. Analyze the data collected by explaining the results. Explain how your strategies to
improve your program worked or did not work. This may include a narrative
which describes the assessment process, including how the data was analyzed and
the process by which assessment strategies were implemented.
5. Determine whether the goals of the program and general education goals are being
met, and state the actions taken (use past tense) or to be taken to address any concerns
or deficiencies. Close the loop.
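One informal way to keep the five steps organized (a sketch only; the record layout below is hypothetical, and its sample values loosely echo the fictional matrix that follows) is a simple record per program goal:

    # Hypothetical sketch: one record per program goal, mirroring the
    # goal / methods / findings / actions matrix used for program assessments.
    from dataclasses import dataclass, field

    @dataclass
    class GoalAssessment:
        goal: str                                    # step 1: stated goal/objective
        methods: list = field(default_factory=list)  # step 2: incl. one direct measure
        findings: str = ""                           # step 3: results
        analysis: str = ""                           # step 4: why met or not met
        actions: str = ""                            # step 5: actions taken/to be taken

    record = GoalAssessment(
        goal="Students will diagnose and repair computer system problems 75% of the time",
        methods=["task list test (direct)", "employer survey (indirect)"],
    )
    record.findings = "75% of students received a satisfactory grade"
    record.actions = "Provided a handout on the requirements for a satisfactory grade"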
Examples of Course, Program, and Administrative Unit Assessments
Direct Assessment Methods: Direct assessment methods give instructors measurable data to
study. Some examples are written exams, oral exams, performance assessments, standardized
tests, licensure exams, oral presentations, projects, demonstrations, case studies, simulations,
portfolios, and juried activities with outside panels.
Indirect Assessment Methods: Indirect assessment methods provide extra information that may
be used to make changes. Examples include questionnaires, interviews, focus groups, employer
satisfaction studies, observations of advisory boards, and job/transfer school placement data.
SAMPLE of the matrix used for Program Assessments
(Note: Analysis requires some explanation as to why the objective was met or not met.)
Program Goal 1 (Goal/Objective Being Assessed): Computer Analysis: Students will be able to diagnose, troubleshoot, and repair computer system problems 75% of the time.
Evaluation Methods: Faculty observation during class and completion of task list test.
Findings: 75% of the students received a satisfactory grade.
Actions Taken or To Be Taken: A handout on the requirements for a satisfactory grade was provided to students.
Analysis of Results (explain how your strategies improved student success or did not): This was an increase of 5% over last year (75% vs. 70%). This improvement appears to be due to the addition of simulation sessions on troubleshooting and repair.

Program Goal 2 (Goal/Objective Being Assessed): Students will demonstrate the ability to work effectively on a team 75% of the time.
Evaluation Methods: Faculty observation during class time devoted to group project work in (list course); student written self-evaluation in (list course).
Findings: Faculty observed that 75% of the students demonstrated the ability to work effectively on a team; 90% of the self-evaluations indicated good understanding of effective teamwork.
Actions Taken or To Be Taken: Have students critique a video showing a team at work and have them indicate which principles were well-employed and which were not.
Analysis of Results: Group project proficiency improved 8% over last year (75% vs. 67%). Part of this improvement appears to be due to using a rubric for group projects to identify strengths and weaknesses, and to early feedback to students, so that students know where they need to focus their attention to improve on their next group project.

Program Goal 3 (Goal/Objective Being Assessed): Students will demonstrate the ability to work effectively on a team 75% of the time.
Evaluation Methods: Project assignments in ALL IST courses are completed in a timely manner. 80% of students will receive a favorable review of portfolios; the portfolios will be evaluated by program heads and the advisory committee, and where possible will be available online.
Findings: All assignments have been completed in a timely manner per IST faculty. Portfolios were reviewed at the Spring IST meeting (IST 226/129 – web sites; IST 179/180/216 – written procedures for troubleshooting, assembly, and safety; IST 202/CS 200 – engineering journals). The Advisory Committee approved that all portfolios met IST standards.
Actions Taken or To Be Taken: No further action required at this time. Faculty continue to monitor and assure that assignments are completed in a timely manner.
Analysis of Results: This year showed an improvement (80% vs. 75% last year) in students demonstrating the ability to work effectively in teams. Part of the improvement appears to be due to an increased emphasis on, and review by faculty of, student portfolios in various program courses. This goal has met the benchmark goal of 75%.
Course Assessment
Course Prefix and Number: BIO 101
Course Name: General Biology I
Instructor: All BIO 101 Instructors
Term and date: Fall 2010
(Fictional)
Part I
(Completed sections of Part I should be submitted by August 27 to the Academic Dean for
review.)
List All Student Learning Outcomes Objectives from Course Syllabus
(Note: Your course objectives should come from your course syllabus)
Objective 1: Students will utilize the LRC for gathering information and completing four writing
assignments to help them review current events.
Objective 2: Students will be exposed to good laboratory techniques with regard to the proper care and
use of laboratory equipment and supplies.
Objective 3: Students will be able to identify cell types and label parts and know functions of each cell
part.
Objective 4: Students will be able to describe heredity components of living things.
Objective 5: Students will be able to explain the process of protein synthesis.
Objective 6: Students will be able to describe energy transformation as it applies to organisms.
Objective 7: Students will be able to classify organisms.
Objective 8: Students will be able to describe organic and inorganic chemical makeups of organisms.
List Any Primary Core Competencies Objectives
Gen-Ed Objective 1: written communication (If you list any general education or core competencies
here, you need to also show them below with some level of proficiency, evaluation method, and
learning skill level or state which of the above objectives relate to the Gen-Ed Objective)
Gen-Ed Objective 2: scientific reasoning
Note: At least 60-70% of student learning outcome objectives above should be assessed.
For each objective being assessed, list the objective and measure of success (e.g., for objective #3: students will be able to … with 70% proficiency), the evaluation method, and the learning skill level (Bloom).

Obj. 1: Students will be able to complete 4 writing assignments with 70% proficiency.
Evaluation Method: Read current event and write summary.
Learning Skill (Bloom): Knowledge, Comprehension, Application, Analysis, Synthesis, Evaluation. (Note: Do not list all skill levels for each objective, just the main one or two that you are evaluating.)

Obj. 2: Students will be able to identify lab safety rules and lab equipment with 100% proficiency. (Note: Do not set your objectives at 100% proficiency. If you do, you cannot improve.)
Evaluation Method: Write the name of the equipment shown in a picture and write what’s done wrong in the picture.
Learning Skill (Bloom): Knowledge

Obj. 3: SWBAT label cells with 70% proficiency.
Evaluation Method: Labeling parts of cell worksheet.
Learning Skill (Bloom): Knowledge

Obj. 4: SWBAT describe heredity and genetics. (Note: Needs some level of proficiency. If you do not have some type of measurement, you cannot tell if you have met your objective.)
Evaluation Method: Create Punnett squares.
Learning Skill (Bloom): Synthesis

Obj. 5: SWBAT describe and explain protein synthesis with 70% proficiency.
Evaluation Method: Write the steps of protein synthesis; draw and label each step and part.
Learning Skill (Bloom): Comprehension

Obj. 6: SWBAT complete energy conversions and transformations with 70% proficiency.
Evaluation Method: Worksheet with pictures and energy transformation identification.
Learning Skill (Bloom): Knowledge

Obj. 7: SWBAT put organisms into groups with 70% proficiency.
Evaluation Method: Test questions 4-6 on Test 1.
Learning Skill (Bloom): Knowledge

Obj. 8: SWBAT list chemical symbols and the number of protons, neutrons, and electrons for those elements essential to life with 70% proficiency.
Evaluation Method: Test questions 20-25 on Test 1.
Learning Skill (Bloom): Knowledge
Actions Implemented:
[What specific actions or new initiatives (if any) did you implement this year to improve your course?
Why? ]
Adopted a new textbook which focuses more on the course objectives and provides software that
should be helpful to the students. (Note: You should be trying new things to improve your course)
General Education (Core Competencies):
(1) Oral Communication, (2) Written Communication, (3) Critical Thinking, (4) Scientific Reasoning,
(5) Quantitative Reasoning, and (6) Information Literacy
Sample Generic Grid for Mapping the Assessment
(Make sure your grid shows a good balance of outcomes and enough attention to higher learning skills.)
Measure of success: Objective 2: Students will … with 70% proficiency
Evaluation Method: Common questions 1-8 on Test 1
Learning Skill (Bloom): Knowledge

Measure of success: Objective 4: Students will … 80% of the time
Evaluation Method: Common questions 20-35 on Test 2
Learning Skill (Bloom): Knowledge

Measure of success: Objective 5: Students will … with 75% proficiency
Evaluation Method: 30-minute exam essay question, scored by rubric
Learning Skill (Bloom): Comprehension, analysis

Measure of success: Objective 8: Students will … with 95% accuracy
Evaluation Method: Scored by rubric
Learning Skill (Bloom): Comprehension, analysis, synthesis
Evaluation Method: (rubric, embedded test questions, project, lab test, journal, certification test,
portfolio)
Bloom’s Taxonomy:
(Note: higher level courses should have higher level thinking.)
Knowledge: Recall of previously learned facts
Words to use to assess recall: identify, define, describe, state, label, list, match, reproduce
Comprehension: Understanding what is meant
Words to use to assess comprehension: give examples of, classify, explain, describe,
summarize, outline, trace
Application: Use of previous knowledge to approach new situations or problems
Words to use to assess application: predict, construct, prepare, produce, show, use,
implement, design, show how
Analysis: Separate into component parts
Words to use to assess analysis: list the component parts of, break down, differentiate,
distinguish, diagram, illustrate, outline, subdivide, interpret, compare/contrast
Synthesis: Putting elements together so as to form a new concept
Words to use to assess synthesis: adapt, design, compare/contrast, categorize, compile,
assemble, rearrange, give evidence for, give reasons for, formulate, infer, generate,
integrate, plan
Evaluation: Judging by criteria
Words to use to assess evaluation: Develop criteria for, rank, prioritize, explain why you agree or
disagree, which is better, appraise, defend, judge, compare and contrast by criteria, review.
Complete at the end of the academic term
PART II
(Completed sections of Part II should be submitted by January 7 to the Academic Dean. The Dean will
review and send to Director of Assessment & IR by January 7)
Findings/Results:
(At the end of the academic term, list each objective number that was assessed and the results):
Objective __:
Objective __:
Objective __:
Objective __:
Objective __:
Objective __:
Objective __:
Objective __:
Analysis & Evidence of Improvement:
(To what factors do you attribute your findings/results for each objective? Overall, what evidence of
course improvement did you find based on your analysis of the results?)
Objective __:
Objective __:
Objective __:
Objective __:
Objective __:
Objective __:
Objective __:
Objective __:
Action Taken to Modify Course to Improve Student Learning (based on results) and Why?
(What will you do differently? Describe how the results obtained from the assessment will be
used to improve student learning for objectives assessed. Why?)
Objective __:
Objective __:
Objective __:
Objective __:
Objective __:
Objective __:
Objective __:
Objective __:
Summary of Course Changes/Needs to Improve Student Learning:
Changes made or needs for the course are the following:
(1) Teaching methods (more homework, additional exercises, more emphasis on teamwork, providing
review sessions, more hands-on, etc.) changes and/or changes to course syllabi:
(2) Resources needed (equipment, software, student activities support for speakers, field trips, new
textbook, tutors, etc.):
(3) Policy change(s) (attendance, course pre-requisites, etc.) that are needed to improve student
learning outcomes:
Course Assessment
Course Prefix and Number: ENG 111
(Fictional)
Course Name: COLLEGE COMPOSITION I
Instructor: All ENG 111 Instructors
Term and date: FALL 2010
Part I
(Completed sections of Part I should be submitted by August 27 to the Academic Dean for
review.)
List All Student Learning Outcomes Objectives from Course Syllabus
Objective 1:
The student will employ the writing process to compose various compositions (narrative,
expository, descriptive, and argumentative) that are satisfactory at the collegiate level in
focus, content, organization, style, and conventions.
Objective 2:
The student will read, interpret, analyze, and evaluate essays based on numerous human
experiences.
Objective 3:
The student will integrate their sources correctly in their research papers based on the revised
2009 Modern Language Association rules.
Objective 4:
The student will synthesize researched information to develop a well-documented research
paper.
Objective 5:
The student will define and apply the elements of logic and critical thinking to develop
argumentative writing.
Objective 6:
The student will use the computer to keyboard writings, exchange e-mails, and complete
assignments.
Objective 7:
The student will evaluate his/her own writing and peers’ writing using various writing strategies.
List Any Primary Core Competencies Objectives
Gen-Ed Objective 1:
The students will be able to employ written communication skills to develop compositions that are
satisfactory in content, focus, organization, style and conventions 70% of the time.
Gen-Ed Objective 2:
The student will be able to demonstrate verbal and nonverbal effectiveness, appropriateness, and
responsiveness 70% of the time.
Note: At least 60-70% of student learning outcome objectives above should be assessed.
For each objective being assessed, list the objective & measure of success (e.g., for objective #3:
students will be able to … with 70% proficiency), the evaluation method, and the learning skill (Bloom).

For objective #1:
Students will be able to compose a variety of compositions with 70% proficiency in the areas of
focus, content, organization, style, and conventions.
Evaluation Method: Instructor Generated Writing Rubric
Learning Skill (Bloom): Knowledge, Comprehension, Application

For objective #2:
Students will be able to read, interpret, analyze, and evaluate essays based on human experiences
with 70% proficiency.
Evaluation Method: Instructor Generated Quizzes/Tests
Learning Skill (Bloom): Comprehension, Analysis, Evaluation

For objective #3:
Students will integrate their sources correctly in their research papers based on the revised 2009
Modern Language Association rules with 80% proficiency.
Evaluation Method: Instructor Generated MLA Rubric
Learning Skill (Bloom): Knowledge, Application

For objective #4:
Students will be able to synthesize researched information to develop a well-documented research
paper with 70% proficiency.
Evaluation Method: Instructor Generated Argumentative Research Paper Rubric
Learning Skill (Bloom): Synthesis

For objective #5:
Students will be able to define and apply the elements of logic and critical thinking to develop
argumentative writing with 80% proficiency.
Evaluation Method: Journal Responses - Argumentative Rubric
Learning Skill (Bloom): Knowledge, Application

For objective #6:
Students will be able to use the computer to keyboard writings, exchange e-mails, and complete
assignments 85% of the time.
Evaluation Method: Instructor Generated Writing and Computer Rubric
Learning Skill (Bloom): Knowledge, Comprehension, Application

For objective #7:
Students will be able to evaluate their own writing and peers' writing using various writing
strategies 85% of the time.
Evaluation Method: Peer and Self-Evaluation Writing Rubrics
Learning Skill (Bloom): Evaluation
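To make the measures of success above concrete, the tally behind a statement such as "88% of students were proficient" can be sketched in a few lines of code. The following Python fragment is purely illustrative; the scores, point total, and function name are invented, not part of the form:

def percent_proficient(scores, max_points, cutoff=0.70):
    # Share of students scoring at or above the proficiency cutoff.
    proficient = sum(1 for s in scores if s / max_points >= cutoff)
    return 100.0 * proficient / len(scores)

# Hypothetical rubric totals (out of 20 points) for one section:
rubric_totals = [18, 15, 12, 19, 14, 16, 9, 17]
print(f"{percent_proficient(rubric_totals, 20):.0f}% of students met the 70% proficiency target")

Run against these invented scores, the sketch reports 75%, which would then be compared with the objective's stated target to decide Met or Below Proficiency.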
Actions Implemented:
[What specific actions or new initiatives (if any) did you implement this year to improve your course?
Why?]
I have implemented Service Learning into this course to allow students more opportunities to connect
real life to real learning, writing, reading, responses, and research.
I have also embedded a collaborative teaching assignment in which students become experts on the five
characteristics of writing (focus, content, organization, style, and tone). In turn, they teach the
concepts to the class in their assigned learning-styles groups.
PART II
(Completed sections of Part II should be submitted by January 7 to the Academic Dean. The Dean will
review and send to Director of Assessment & IR by January 7)
Findings/Results:
(List each objective number that was assessed and results):
Objective 1: Using a writing rubric, 88% of students employed the writing process to produce
coherent, unified, and effectively developed compositions.
Objective 2: Based on quizzes and tests, 85% of students were proficient in reading,
interpreting, analyzing, and evaluating essays.
Objective 3: Based on an MLA rubric developed by the instructor, 75% of students were
proficient in integrating their sources correctly in their research papers using the revised 2009
Modern Language Association (MLA) rules.
Objective __:
Objective 5: Using an argumentative rubric, 90% of students were able to compose an
argumentative essay that addressed both sides and that was free of written expression and
grammatical errors.
Objective __:
Objective __:
Analysis & Evidence of Improvement:
(To what factors do you attribute your findings/results for each objective? Overall, what evidence of
course improvement did you find based on your analysis of the results?)
Objective 1: Met Proficiency: The writing rubric showed that for Fall 2010, 88% of students
were proficient in composing a variety of compositions. This was an increase of 4% over Fall
2009. The improvement appears to be due to more in-class practice writings that incorporate
the writing process.
Objective 2: Met Proficiency: Based on quizzes and tests, 85% of students were proficient in
reading, interpreting, analyzing, and evaluating essays, compared to 80% in fall 2009. This
increase in student success appears to be due to adding weekly quizzes to the course. For the
past few years this objective has been met or surpassed.
Objective 3: Below Proficiency: Based on an MLA rubric developed by the instructor, 75% of
students were proficient in integrating their sources correctly in their research papers using the
revised 2009 Modern Language Association (MLA) rules. This was an improvement over last
year's 65% proficiency; however, it is still below the 80% proficiency level set by the
department. The improvement appears to result from assigning specific exercises that employ
the revised 2009 MLA documentation style for each class assignment, allowing students to
practice the new methods on a weekly basis.
Objective __:
Objective 5: Met Proficiency: Using an argumentative rubric, 95% of fall 2010 students were
able to compose an argumentative essay that addressed both sides and that was free of written
expression and grammatical errors. This compares to 90% for fall 2009. The improvement
appears to result from a strategy added in fall 2010: having the instructor teach the necessity
of reviewing and applying teacher comments, along with the peer- and self-evaluation process.
Objective __:
Objective __:
Action Taken to Modify Course to Improve Student Learning (based on results) and Why?
(What will you do differently? Describe how the results obtained from the assessment will be
used to improve student learning for the objectives assessed, and why.)
Objective 1: No modifications are needed at this time.
Objective 2: No modifications are needed at this time. Continue the weekly quizzes.
Objective 3: A group PowerPoint presentation and class discussion on the use of MLA for research
papers will be added to the course. Students are now receiving immediate feedback on their
papers, using the MLA rubric as a guide to where their weaknesses are.
Objective __:
Objective 5: No modifications are needed at this time.
Objective __:
Objective __:
Summary of Course Changes/Needs to Improve Student Learning:
Changes made or needs for the course are the following:
(1) Changes to teaching methods (more homework, additional exercises, more emphasis on teamwork,
providing review sessions, more hands-on work, etc.) and/or changes to course syllabi:
Added weekly quizzes to increase student involvement and application
(2) Resources needed (equipment, software, student activities support for speakers, field trips, new
textbook, tutors, etc.):
We adopted a new textbook in fall 2010 that has more chapters focusing on the course
objectives.
(3) Policy change(s) (attendance, course pre-requisites, etc.) that are needed to improve student
learning outcomes:
Completing ENG 03 was added as a pre-requisite to taking ENG 111.
Administration of Justice (400) Assessment Plan (Fictional)
Program Student Learning Outcomes Assessment Plan
(Note: Be sure to use multiple measures with at least one being a direct measure)
Objective Being Assessed:
Objective 1: Graduating ADJ students will formulate an appreciation of ethical standards through
their experiences (classroom/non-classroom) 70% of the time.
Evaluation Method:
Student proficiency of this learning outcome is formatively assessed throughout the program by
using multiple measures such as midterm examinations, final examinations, and numerous Blackboard
discussions. The primary measures are the final exam in ADJ 133, Ethics and the Criminal Justice
Professional, and an assessment portfolio presented to the ADJ advisory committee and evaluated
using a rubric (pilot May 2010 and 2011), plus an annual exit interview (starting April 2012).
Analysis Status (Met, Mixed, Below Proficiency Level):
2008-2009: Met Proficiency
2009-2010: Met Proficiency
Findings/Results:
Previous Year:
For summer 2009, proficiency was 94% based on a comprehensive exam vs. 92% for summer 2008.
2010-2011 (enter the data results that you collected in the current year):
For summer 2010, proficiency was 98% based on test scores and a partial portfolio review. For
2009-10, the Advisory Committee evaluation of portfolios showed a 95% proficiency level. The
rubric used showed no specific areas of weakness.
Analysis & Evidence of Improvement:
[To what factors do you attribute your Findings/Results? (For example, why did a particular objective
improve from 57% in 2008 to 64% in 2009? Was this 7% increase due to a new initiative, pedagogy, new
textbook, rubric, tools or activities, collaborations among program faculty or other discipline faculty?)
Which factors will be discontinued, modified, or expanded? (For example, if the 7% increase was due in
part to the use of a communication rubric designed in collaboration with ENG faculty … will AST faculty
continue to use that rubric? If so, in which courses and how applied? Do AST faculty plan to continue
collaboration next year to further refine the rubric? Etc…)
Overall, what evidence of program improvement based on your analysis of results did you find?]
2008-2009: Proficiency improved from 92% (summer 2008) to 94% (summer 2009). This
improvement appears to have been enhanced by the increased use of Blackboard discussions on ethics.
Using rubrics to assess students' performance was also helpful, providing quick feedback and
pointing out areas of weakness to improve. The rubric showed no area across graduates that
needed program modifications. In addition, students had to consider professional ethical
standards holistically because of the comprehensive exam, which was not required in 07/08. For
09/10, the addition of 35 ethical dilemmas will enhance students' ability to add a practical aspect
to their learning.
2009-2010: Program improvement was evidenced by a partial portfolio review using a rubric that
satisfied the Program Head and the Advisory Committee. The review showed a 95% proficiency level.
This improvement was attributed to the addition of the 35 dilemmas, which added another practical
dimension to the course and program. The advisory committee's rubric used during the evaluation
aided in pointing out any weaknesses that graduates had. The rubric showed no modifications
needed to the program at this time.
Action Plan Modifications
[What specific actions or new initiatives are you implementing to improve your education program?
Why?]
2008-2009:
Added 35 ethical dilemmas to the Ethics and the Criminal Justice Professional course to enhance
graduates' ability to add a practical aspect to their learning.
2009-2010:
Although the course and program goals (70%) were met for both years, the portfolio will continue
to be presented to the ADJ advisory committee and evaluated using the course/portfolio rubric
(pilot May 2010 and 2011), along with an annual exit interview (starting April 2012).
Objective Being Assessed:
Objective 2: Communication: Graduating ADJ students will engage in listening skill improvement
through their experiences (classroom/non-classroom) 70% of the time.
Evaluation Method:
The primary measure is in ADJ 227 and an assessment portfolio (case briefs and lecture summaries)
presented to the ADJ advisory committee and evaluated using a rubric (pilot May 2010 and 2011,
and annual exit interview starting April 2012). An ADJ Program Survey was also administered
using a 3.0 as the proficiency benchmark.
Status:
2008-2009: Met Proficiency
2009-2010: Met Proficiency
Findings/Results:
Previous Year:
2008-2009: For spring 09, proficiency was 100% based on test scores and a partial portfolio
review, compared to 90% for the previous year. On an ADJ Program developed Survey, the 2009
ADJ graduates rated their proficiency as 3.2 on a 5-point scale with one being low. The benchmark
for this objective was 3.0.
2009-2010: For spring 2010, proficiency was 88% based on test scores and a partial portfolio
review using a rubric by the Advisory Committee. On an ADJ Program developed Survey, the 2010
ADJ graduates rated their proficiency as 3.5 on a 5-point scale with one being low. The benchmark
for this objective was 3.0.
2010-2011:
Analysis & Evidence of Improvement:
2008-2009:
The case briefs were effective and showed 100% proficiency vs. 90% in the previous year. The ADJ
Survey also showed 2009 ADJ graduates' proficiency as 3.2, which was higher than the benchmark of
3.0. This suggested that students' listening skills were improving. However, students did not regard
the lecture summaries as significant to their test grade. For 2009-2010 the lecture summaries will
have a course test grade value.
2009-2010:
Lecture summaries were reviewed (88% proficiency vs. 100% for 2008-09). Some students still
seemed not to value the course grade attached to the summaries. A review of the class makeup
showed that non-major and undecided students accounted for some of the decline in the total, but
not enough to fall below the 70% benchmark. The ADJ Survey showed an increase in proficiency for
this objective (3.5 vs. 3.2 for 2009 ADJ graduates). These scores are above the benchmark of 3.0.
Action Plan Modifications
[What specific actions or new initiatives are you implementing to improve your education program?
Why?]
2009-2010:
Although the course and program goals (70%) were met for both years, the portfolio will continue
to be presented to the Advisory Committee and evaluated using the course/portfolio rubric.
Continue to monitor the non-major and undecided students in the class (pilot May 2010 and 2011,
and annual exit interview starting April 2012).
Objective Being Assessed:
Objective 3: Graduating ADJ students will formulate/improve supervisory skills through their
experiences (classroom/non-classroom) 70% of the time.
Evaluation Method:
The primary measure is an assessment portfolio (of case studies and poster presentations)
presented to the ADJ advisory committee and evaluated using a rubric. The primary examples will
be obtained in ADJ 111, 112, and 231 for Police Science graduates and from ADJ 245 for
Corrections graduates (pilot May 2010 and 2011, and an annual exit interview starting April
2012). An ADJ Program Survey was also administered using a 3.0 as the proficiency benchmark.
Focus groups are administered throughout the program.
Status:
2008-2009: Below Proficiency
2009-2010: Met Proficiency
Findings/Results:
Previous Year:
2008-2009: For fall 2008, proficiency was not met (62%) based on test scores and a partial
review of portfolio material using the rubric. On an ADJ Program Developed Survey, the 2009 ADJ
graduates rated their level of proficiency as 2.5 on a 5-point scale with one being low. The
benchmark for this objective was 3.0.
2009-2010: For fall 2009, proficiency was met (81%) based on test scores and a partial review
of portfolio material using the rubric. On an ADJ Program Developed Survey, the 2010 ADJ
graduates rated their level of proficiency as 3.2 on a 5-point scale with one being low. The
benchmark for this objective was 3.0.
2010-2011:
The ADJ focus group of graduates was conducted. It identified a need for more checking of
portfolios by faculty to provide more feedback to graduates to keep them on track.
Analysis & Evidence of Improvement:
2008-2009:
Based on the partial portfolio material, students performed below expectations: only at a 62%
level compared to the 70% benchmark set by the program head. On the ADJ Program Survey, the 2009
graduates scored 2.5 on this objective, below the benchmark of 3.0. The focus group showed a need
for greater feedback as students go through the program. Because of the incompleteness and lack
of depth of the case studies, the program head will randomly check portfolios during 2009-2010.
Also, for fall 2009 the courses will require poster presentations to be completed. Overall,
program improvement should show an increase in supervisory skills.
2009-2010: The increase in competency test scores from 2008-2009 (62% proficiency) to 2009-2010
(81% proficiency), the increased proficiency score on the ADJ Survey (from 2.5 for 2009 graduates
to 3.2 for 2010 graduates), and the very positive comments about developing portfolios from the
2010 graduate focus group were attributed to the addition of poster presentations, which gave
students the in-depth analysis needed in the area of supervision. In the future, poster
presentations will require all group members to participate in the presentation. Overall program
improvement was documented by the depth of the material in the portfolios. The uninterrupted
germane reading component needs more thought about how to measure and value it. There were no
interviews for spring 2010 since there were no graduates yet under this assessment plan.
Action Plan Modifications
[What specific actions or new initiatives are you implementing to improve your education program?
Why?]
2008-2009:
Poster presentations are incorporated into the ADJ courses listed above to promote supervisory
skills. Students develop portfolios. Instructors review the material as part of test scores, and
then the advisory panel/dean will review the material.
2009-2010:
During the pilot phase of the assessment plan for the Administration of Justice Program, the
program head has been reading about cognitive load theory and its relationship to memory. Since
one of the main outcomes for Program students is thoughtfulness/deep thinking, further readings on
working, short-term, and long-term memory have convinced the program head to consider the
following requirement for this course: substitute some of the course projects with uninterrupted
germane reading by Program students. This is based on the concept of memory leaching to external
loading/storage devices and the distractions those devices create for Program students. After a
lengthy review of the literature on memory, the program head has come to the realization that a
majority of Program students may be losing the ability to think deeply because of these devices
and distractions. Because memory leaches into these devices, the brain modifies itself: long-term
memory is not being used as much as in the past, so a student may not rely on long-term memory for
the development of deep thoughts. For example, before these devices many people relied on their
memory for telephone numbers, and this exercised their memory just as reading has done for many
years. For deep thought to occur, the brain needs a great deal of deposited information to make
the connections necessary for deep thought; these devices have made it more difficult to acquire
and maintain information that has been externalized instead of internalized.
…
Objective Being Assessed:
Objective 9: Students will demonstrate an understanding of the concepts of normal and abnormal
behavior, including focuses on the psychological and sociological aspects of criminal and other
deviant behavior patterns, with 80% proficiency.
Evaluation Method:
Student proficiency of this learning outcome is formatively assessed throughout the program by
using multiple measures such as midterm examinations and final examinations. The primary measure
is the final exam in ADJ 247, Criminal Behavior. An ADJ Program Survey was also administered
using a 3.0 as the proficiency benchmark.
Status:
2008-2009: Met Proficiency
2009-2010: Below Proficiency
Findings/Results:
Previous Year:
On an ADJ Program Developed Survey, the 2009 ADJ graduates rated their level of proficiency as
3.8 on a 5-point scale with one being low. The benchmark for this objective was 3.0.
2010-2011:
For spring 2010, proficiency on a comprehensive test was 76% vs. 92% for spring 2009 vs. 90% for
spring 2008. On an ADJ Program Developed Survey, the 2010 ADJ graduates rated their level of
proficiency as 2.9 on a 5-point scale with one being low. The benchmark for this objective was
3.0.
Analysis & Evidence of Improvement:
2008-2009:
For this objective, student proficiency was 92% compared to 90% for 2007-08. This was supported
by the ADJ Survey of 2009 graduates, who rated their proficiency as 3.8. The benchmark for this
objective was 3.0. Strategies used to improve performance from the previous year were an added
class presentation and a class paper on psychological and sociological aspects of criminal
behavior in the ADJ 247 course.
2009-2010:
Proficiency was only 76% for this year compared to 92% for 2008-09 and 90% for 2007-08. This drop
was supported by the 2010 ADJ Survey of graduates, which showed a proficiency of 2.9 compared to
3.8 for 2009 ADJ graduates. This is below the benchmark of 3.0. The drop appears to be due to the
adoption of a new textbook from a different publisher for ADJ 247, which does not address certain
skills required of students in this program.
Action Plan Modifications
[What specific actions or new initiatives are you implementing to improve your education program?
Why?]
2008-2009:
To improve public speaking, research skills, and knowledge about a specific aspect of criminal justice, a
second presentation was added.
2009-2010:
The program head has gone back to the original textbook, but with the new edition. Students will
complete lecture summaries after each class; those summaries will then be evaluated for areas of
weakness, and these areas will receive additional attention in the courses ADJ 107/247.
Program Improvement Summary Report:
[Based on the data analysis, how has your program improved this year? What are the strengths
and weaknesses of your program? What new action plan initiatives are you planning for next year
(For example, at any time in the recent past did the AST faculty discuss using new office simulation
software instead of the traditional use of textbook activities? Are faculty planning to implement
something like that next year? Etc…)? What equipment/resource needs do you have for improving
your program?]
2009-2010:
After reviewing the findings/results for the year, we found that the Program has had some success
but could use some improvement in Objective 9. Students will complete lecture summaries after
each class; those summaries will then be evaluated for areas of weakness, and these areas will
receive additional attention in the courses ADJ 107/247. Time constraints will not allow the
Advisory Committee to review the 'lecture summaries' for this report; however, the review will
take place as soon as the course is complete and the Advisory Committee can meet. The Program
Head has seen, as a result of the 'lecture summary' requirement, an increase in student questions,
attention, and recall. The program head has also approved going back to the original publisher
for the textbook used in ADJ 247.
Since this program subscribes to the philosophy of andragogy, the assessment strategy focuses
mainly on the affective domain, which includes empirical research/observation to a great extent.
Based on the data, the Program will continue treating Program students as adults, confer with the
Advisory Committee about the assessment test pilot, and make the necessary changes to the Program
with consequential testing.
Example: Annual Program Assessment - Administration of Justice (Fictional)
For Academic Year: 2010-2011
Program: Administration of Justice
Submission Date: May 15, 2011
MISSION:
Administration of Justice: The Administration of Justice program at ________ will provide
a foundation that combines a Christian values base with the theories, principles, and
practices necessary for a successful career in Administration of Justice.
OUTCOMES: Graduates will be able to:
1. Provide educational, remedial, and rehabilitative services to families and communities;
2. Organize communities and neighborhoods for social action;
3. Promote family and community interests in public, private, and governmental settings;
4. Maintain community resources of information, instruction, and assistance to all members of a
community;
5. Apply creative problem-solving on behalf of community members, especially those at risk of
violence in high crime areas;
6. Identify, analyze, and respond to problem situations involving civil rights, law enforcement, and
legal issues; and
7. Show competence in written and oral communications.
ASSESSMENTS:
Employer Internship Survey (Conducted in spring of Sr. internship)—Outcomes 1,2,3,&4
Student Internship Survey (Conducted in spring of senior internship)—Outcomes 1,2,3,& 4
Major Field Test (annually, senior capstone course)—Outcomes 5 & 6
Senior Project/Presentation/Portfolio (annually, senior capstone course)—Outcomes 1-7
LOCATION:
All results of surveys, tests, and portfolio will be maintained in the office of the ADJ Program Head.
DISSEMINATION/DISCUSSION:
The senior Project/Portfolio is evaluated according to a rubric by a team of evaluators including
at least one department professor of Administration of Justice, one local police officer, and the
director of security on campus.
Results of the surveys, major field tests, and portfolios are compiled, distributed, and discussed
with the faculty and dean at department meetings.
Results are also distributed to the VP of Instruction & Student Development.
The ADJ Advisory Committee considers results at their fall annual meeting.
RESULTS:
All ratings on the Employer Internship Survey met or exceeded last year's results, in the range of
90-96% for 2010-2011 as compared to 88-94% for the 2009-2010 administration, except in one area:
business supervisors rated the interns lower than in prior years (80%) in the area of
communication regarding community resources. (Outcomes 1-4)
Ratings on the Student Internship Survey were similar and met or exceeded the prior year, in the
range of 95-100% as compared to 94-99% for the prior year. However, students' comments confirmed
that some of them did not understand how to communicate the resources available in the
communities of their internships. (Outcomes 1-4)
Major Field Test scores were lower than the prior year (88% for 2010-2011; 92% for 2009-2010);
however, they are within the department's established standard of success, set at 80%. (Outcomes 5-6)
Summary scores for the rubric scoring of the random sampling of Portfolios indicated that 80% of the
students achieved at the acceptable or better levels for all of the performance criteria except in the area
of “innovative education” for high-crime-prone communities. This area continued to have a low
score (50% acceptable or better) for the second year in a row. In addition, students' oral
presentations of projects indicated a less-than-acceptable overall score in oral communication
(2.5 on a scale of 5 with one being low) for the second year in a row.
MODIFICATIONS:
For Outcomes #1-3—No modifications are recommended at this time.
For Outcomes #4, 5, and 7: The department has met and revised the curriculum to include a field trip
as a part of a capstone course (prior to their internship experiences) to a high crime area to collect
information from a designated, high-need constituency in the community (seniors, disabled citizens,
etc). Students will work as groups to design and communicate an “innovative education” plan regarding
the safety resources available to the appointed constituency.
TIMELINE, REQUIREMENTS, AND APPROVALS:
The proposed capstone course curriculum and design (see attached outline) will go before the academic
policies committee for approval in fall 2011. If approved by the committee, the faculty senate, the
assistant VPAA, and the VPAA, and if funding is provided for the field trip and training experience
(see attached budget estimates), the additional course will be added for seniors in fall 2011.
Approval by Academic Dean __________________ Date: ____________________
Transfer AA&S Degree Programs
(Business Administration, Education, Science, General Studies)
Assessment Plan
WRITTEN COMMUNICATION (INSTRUCTORS)
Objective: Given prior instruction, students will develop compositions that are satisfactory in
content, focus, organization, style, and conventions 70% of the time.
Status:
2008-2009: Mixed Proficiency
2009-2010: Mixed Proficiency
2010-2011:
Evaluation Method:
Student proficiency of this learning outcome is formatively assessed throughout the program by
using multiple measures such as midterm examinations, final examinations, numerous Blackboard
discussion questions, as well as written papers across the program. The primary measures are the
final exam in ENG 111 (College Composition I) using a rubric, the STAGE test developed by the
VCCS, and PHI 115 (Capstone Course), as well as a Writing Prompt and rubric developed by the
Virginia Community College System (VCCS) using a six-point scale. Student proficiency of this
outcome is formatively assessed in the ENG 111 class by using the argumentative research essay,
which is evaluated using a modified VCCS rubric.
Findings/Results:
Previous Years:
For 2008-2009, 76% of students enrolled in English 111 courses performed at a rate of 70% or
better on their argumentative essays in the areas of focus, content, organization, style, and
conventions. For 2008-09, pending graduates scored 74.2% proficiency for written communication in
the capstone course. For fall 2008, proficiency was 64% based on the final exam vs. 76% for fall
2007.
For 2009-2010, 84% of students enrolled in English 111 courses performed at a rate of 70% or
better on their argumentative essays in the areas of focus, content, organization, style, and
conventions. For fall 2009, proficiency was 65% based on the final exam.
For 2008 graduates, the weakest area on the writing prompt using the VCCS rubric was Organization
(4.55) and the highest was Conventions (5.35).
2010-2011 (add your new data for 2010-2011):
2010 graduates had a mean score of 6.89 (ten-point scale with 1 being low) on the communication
section of the STAGE test. Using item analysis, the weakest area was demonstrating the ability to
use standard English.
2010 graduates were 85% proficient (based on 80% accuracy) in the capstone course module on
written communication. Using a rubric, the weakest area (30% proficiency) was Organization:
organize content with effective transitions and effective beginning and ending paragraphs.
Using a rubric developed by English faculty and the VCCS, 2010 graduates had mean scores (from
1-6 with 1 being low) of 4.28 on rhetorical knowledge, 4.06 on critical thinking, 4.10 on
organization, 4.08 on content & clarity, 4.10 on style, and 3.88 on conventions. The weakest area
(3.88) is conventions.
Analysis & Evidence of Improvement:
[To what factors do you attribute your Findings/Results? (For example, why did a particular objective
improve from 57% in 2008 to 64% in 2009? Was this 7% increase due to a new initiative, pedagogy, new
textbook, rubric, tools or activities, collaborations among program faculty or other discipline faculty?)
Which factors will be discontinued, modified, or expanded? (For example, if the 7% increase was due in part to
the use of a communication rubric designed in collaboration with ENG faculty … will AST faculty continue to use
that rubric? If so, in which courses and how applied? Do AST faculty plan to continue collaboration next year to
further refine the rubric? Etc…)
Overall, what evidence of program improvement based on your analysis of results did you find?]
2008-2009:
Students have a mastery of focus, style, and content development, providing more collegiate-level
writing throughout the program. Proficiency was 76% on the argumentative essays using a rubric.
This was supported by the 74% proficiency on the writing module in the capstone course. Both of
these measures meet the proficiency level of 70% set by the department. However, on comprehensive
tests, student proficiency was only 64% in fall 2008.
This competency is being expanded by having students write in non-English courses in the areas of
focus, content, organization, style, and conventions. Non-English faculty (history, religion,
technology, and psychology) incorporated writing into their disciplines and generated methods of
evaluation, such as rubrics, to provide students feedback on their writing in addition to its
content.
The English faculty will revise the departmental rubric that was used to evaluate final exams and
the argumentative research essay so that common rubrics are used among all English faculty for
each type of essay assigned.
2009-2010:
Students have a mastery of focus, content, style, and organization, providing more collegiate-level
writing throughout the program.
The 8-point proficiency improvement for students enrolled in English 111 on their argumentative
essays in the areas of focus, content, organization, style, and conventions (84% proficiency in
2009-10 vs. 76% in 2008-09) and the increase in performance in the capstone course (85% in 2009-10
vs. 74% proficiency in 2008-09) were a result of collaborative argumentative writing exercises
embedded in English 111 courses, adoption of new textbooks for English 111, use of journal writing
in English 111, incorporation of peer and instructor evaluations, and learning resource
presentations that showed students where to obtain information to improve their writing. Students'
proficiency on tests still appears to be an issue, with only 65% proficiency in fall 2009. This
is, however, a 1% improvement over last year.
To improve the areas of organization noted in 2008 graduates' writing, English instructors used
practice exercises to teach students how to embed transitional words into their writing, how to
analyze writing with and without transitions to examine the effectiveness of transitions in aiding
organization, and how to develop topic sentences that help structure the essay. In addition,
instructors demonstrated how to use graphic organizers to help develop ideas in an organized
manner, and new textbook adoptions for English 111 provided exemplars of each genre of writing,
which students used to enhance their organization.
To improve the use of Standard American English, Modern Language Association formatting of
documents was required for final submissions of work in all English courses, and the writing
handbook and newly adopted text with exemplars were used in English 111 to help students improve
word choice and use of Standard American English.
The 11-point increase in graduates' proficiency in the capstone course module on written
communication (85% in 2009-10 vs. 74% in 2008-09) was a result of requiring students in English
111 to use argumentation and critical thinking skills to develop their writing. In addition, in
2008, English 03 - Preparing for College Writing II was added as a bridge course between English
01 and English 111 to teach more complex foundational writing and grammar skills so that students
would be better equipped to handle the rigor of English 111.
2010-2011:
Action Plan Modifications
[What specific actions or new initiatives are you implementing to improve your education program? Why?]
2008-2009:
The English faculty added English 03, a bridge developmental course between English 01 and English 111, to
provide students with more complex skills in writing, grammar, and research.
The English faculty and psychology faculty conferred with students about their writing and areas
of needed improvement at least once a semester.
In all courses at PDC, faculty have the option to have a Library Help tab added to their
Blackboard course, which contains specific materials and resources that aid students in developing
college-level writing for various disciplines.
2009-2010:
The English faculty adopted new textbooks that contain exemplars of each genre of writing, along
with essay readings, to strengthen each student's focus, content, organization, style, and
conventions.
The English faculty added journals as a means of reflection, critical thinking, and writing to
provide more opportunities for students to practice their writing in English 111.
English faculty used graphic organizers and other visuals to improve writers' organizational
skills.
English faculty obtained more external resources, such as books and multimedia materials, to aid
students in developing strong argumentative writing skills.
English faculty conferred with each student at least once a semester about how to improve his/her
collegiate-level writing to the level of proficiency.
In the capstone course, assignments were redesigned to demonstrate clearly the correlation between
organization and transitioning of concepts.
2010-2011:
Example Administrative Unit: Assessment and IR
(Use multiple measures, direct and indirect)
(Note: This example does not show action plans for any objective not meeting benchmark)
Goal/Objective:
Research and Assessment is adequate & appropriate to the college mission, with a benchmark
performance rating of 3.0 or higher.
Assessment Measurements:
Faculty & Staff Survey based on a 5-pt. scale from 1-5 with 1 being low; GAP Analysis; Focus
Groups.
Results:
For 2006-07 the performance rating was 3.7 compared to 4.0 for 2010-11.
A GAP Analysis of the college's policies and procedures was done in 2006, which identified a need
for better ways to track student groups for course completion rates, graduation rates, and grade
distribution.
Two Focus Groups composed of (1) support services staff and (2) classified staff were conducted
in 2006-07 to identify college weaknesses. None were reported for assessment and research.
Evidence of Improvement & Analysis (Analysis Status: Met, Mixed, Below Benchmark):
Met Benchmark. This goal has improved when comparing 2006-07 survey data with 2010-11 data, by
+0.3 points. Strategies that contributed to this improvement included: (1) the Office of
Assessment & IR wrote an Achieving the Dream grant for over $450,000 to help the college in its
strategies to improve student success based on data and assessment support; (2) the Assessment &
IR office updated its SAS statistical software package from 8.2 to 9.0; (3) the Office of
Assessment acquired a color printer to better present data to its various stakeholders; (4)
various queries and crystal reports were developed in PeopleSoft to provide the data needed; and
(5) the VCCS and SCHEV worked together to make data more available to colleges in regard to
transfer stakeholders.
Action Plan/Modifications:
No modifications are recommended at this time.
Goal/Objective:
Assessment results are used for the improvement of programs and services, with a benchmark
performance rating of 3.0 or higher.
Assessment Measurements:
Faculty & Staff Survey based on a 5-pt. scale from 1-5 with 1 being low.
Results:
For 2006-07 the performance rating was 3.2 compared to 3.7 for 2010-11.
Evidence of Improvement & Analysis:
Met Benchmark. This goal has improved when comparing 2006-07 survey data with 2010-11 data, by
+0.5 points. Strategies used to make the improvements included: (1) placing program/services data
on the college web page, (2) placing program working documents on a common drive for faculty to
access and to enter data, (3) providing professional development training to faculty on
assessment, as well as face-to-face mentoring, and (4) having program faculty do annual program
assessments instead of waiting for the 5-year Program Assessment Report.
Action Plan/Modifications:
No modifications are recommended at this time.

Goal/Objective:
Relevant information from IR, such as that required for decision-making, is readily available,
with a benchmark performance rating of 3.0 or higher.
Assessment Measurements:
Faculty & Staff Survey based on a 5-pt. scale from 1-5 with 1 being low.
Results:
For 2006-07 the performance rating was 3.4 compared to 3.7 for 2010-11.
Evidence of Improvement & Analysis:
Met Benchmark. This goal has improved when comparing 2006-07 survey data with 2010-11 data, by
+0.3 points. Strategies used to make the improvements included: (1) developing an Assessment & IR
web page, (2) placing the Fact Book on the college Assessment & IR web site, (3) placing the
Benchmark Report on the college Assessment & IR web site, (4) using Achieving the Dream data in
making changes such as adding another level in Developmental English and increasing the placement
test cut-scores, and (5) placing a number of resource documents and assessment tools on the
Assessment & IR web site to assist the administration in making decisions.
Action Plan/Modifications:
No modifications are recommended at this time.

Goal/Objective:
Any request made to Assessment and IR was completed in a timely manner, with a benchmark
performance rating of 3.0 or higher.
Assessment Measurements:
Faculty & Staff Survey based on a 5-pt. scale from 1-5 with 1 being low; percent of IR reports
completed by the requested time; zero complaints to supervisor or president.
Results:
For 2006-07 the performance rating was 3.6 compared to 3.8 for 2010-11. For 2007, 92% (22/24) of
reports were completed by the requested date, compared to 97% (28/29) for 2009. For 2009-10,
there were no complaints made to the supervisor or president for not completing data requests in
a timely manner.
Evidence of Improvement & Analysis:
Met Benchmark. This goal has improved when comparing 2006-07 survey data (3.6 rating) with
2010-11 data (3.8 rating), by +0.2 points. The percentage of assessment and IR reports completed
by the requested time also improved, from 92% (22/24) in 2007 to 97% (28/29) in 2009. There were
also no complaints for 2009-10.
Action Plan/Modifications:
No modifications are recommended at this time.

Goal/Objective:
Assessment and IR plays a significant role in making college improvements, with a benchmark
performance rating of 3.0 or higher.
Assessment Measurements:
Faculty & Staff Survey based on a 5-pt. scale from 1-5 with 1 being low.
Results:
For 2006-07 the performance rating was 3.6 compared to 3.8 for 2010-11.
Evidence of Improvement & Analysis:
Met Benchmark. This goal has improved when comparing 2006-07 survey data with 2010-11 data, by
+0.2 points.
Action Plan/Modifications:
No modifications are recommended at this time.
Example Administrative Unit: Admissions (Fictional)
Goal/Objective:
The College's admission policies are efficient for students and staff. A benchmark of 3.0 was
established.
Assessment Measurements:
Student Survey based on a 5-pt. scale from 1-5 with 1 being low; multiple assessments: Focus
Groups and Gap Analysis.
Results:
Although an overall rating of 3.2 was achieved, surveyed students indicated the College's
admissions process is cumbersome and should be streamlined.
Community members were invited to participate in an Admissions focus group. Findings indicate
that most community members are unaware of the steps required to enter the College. When apprised
of the steps involved, participants indicated that the process should be streamlined.
Admissions staff reviewed all policies and procedures relating to College admission. Possible
efficiencies could be achieved by better use of available technologies.
Evidence of Improvement & Analysis (Analysis Status: Met, Mixed, Below Benchmark):
Mixed Benchmark. The achieved survey scores exceeded the benchmark. Although the overall
benchmark of 3.0 was exceeded, respondents indicated that the College's admission process could
be improved and streamlined.
Strategies that led to these results:
1) Customer service training provided to all Admissions Office staff.
2) A computer workstation was added to the Admissions Office for students to input application
data.
3) A brochure detailing the admissions process was prepared and disseminated locally.
Action Plan/Modification:
(If status is mixed or below benchmark, some type of action should occur.)
Assessment Techniques
There are many techniques that may be used to assess student learning outcomes. In a number of
cases, these assessment techniques may be embedded in course assignments or activities as
measures of students' achievement of program goals as well as their attainment of the college's
general education goals.
Capstone Courses
Capstone courses are designed to enable students to review, evaluate, integrate, and synthesize
information and skills gained from other courses in the program or major. These courses are the
optimum place to assess many program or major goals and general education goals. A capstone
course is one that students nearing completion take as a culminating experience, giving them the
opportunity to tie together the knowledge and skills of other program courses. If your program
has such a course, you may want to consider the performance in this type of capstone course as
an assessment method. Likewise, some programs assign a capstone project which can be
evaluated.
Criteria-Referenced Statement of Summative Learning
These statements show that graduates are learning a skill that's important in their disciplines.
For example, a statement might note that "89 percent of our AAS degree recipients in Allied
Health/Medical Tech solved 20 simulated tasks concerning drug side-effects using the
Physician's Desk Reference."
Internships, Field experiences, Clinical Evaluations
Internships, field, or clinical experiences are also ideal for assessing many program or major and
general education goals. When these occur at the end of the program or major, they often serve
as capstone experiences. It is especially useful to have external experts assess the performance
of your students.
Authentic Assessment
In some courses, opportunities can be found to ask students to engage in a simulation of a
real-life problem that they must solve using the knowledge and skills they have gained in the course.
A single project can be structured to assess both mastery of course content and attainment of
program or major goals as well as certain general education goals such as communication skills,
life-long learning skills, critical thinking skills, and social and education values. For example,
students might be asked to assume the role of a city council member who must make a decision
concerning a controversial issue. Students might then be asked to research both sides of the
issue and to deliver a persuasive speech or to write an action plan.
Ill-defined or Ill-structured problems
An ill-defined problem is one that is not highly structured and cannot be resolved with a high
degree of certainty. Experts may disagree about the best solution. Examples: determining what
really happened at Waco, solving the nuclear waste storage problem, predicting the effects of
global warming, or deciding whether there is such a thing as global warming. Dealing with
ill-defined problems requires the integration of many skills, abilities, and areas of knowledge.
Portfolios
An accumulation of student-produced work, a portfolio may be designed to assess a student's
attainment of program or major goals. The same portfolio may also be used to assess general
education goals such as communication skills or the development of skills to enhance life-long
learning, such as the ability to use the library and other appropriate sources to retrieve
information. Portfolios that contain early or unrevised work as well as later or revised work can
assess the growth of skill development. Rubrics to judge portfolios must be clear and shared
with the student.
The Advisory Committee (who are working professionals in the field) judged the work in the
portfolios using detailed criteria. This process assessed the individual student's work so that
the student could remedy any problem areas during the last semester, and the analysis of the
portfolios as a group indicated areas of concern for the program. The students then had something
tangible to take with them on job interviews to showcase their work.
Curriculum Analysis Review
This is a common assessment activity used by a number of occupational/technical programs.
The Advisory Committee is particularly useful in curriculum review because they are generally
practicing in the field and are aware of advances or changes. Often the advisory committee can
give valuable insight by reviewing the goals and objectives to help plan future directions of a
program. Tying a curriculum to a national standard may be a particularly valuable assessment
technique.
The advantage of using this as one aspect of a program's assessment is that, through the advisory
committee, local business and industry get a voice in whether the curriculum is meeting their
needs. It is also an inexpensive assessment tool.
However, keep in mind that although we need to be sensitive to the needs of local business and
industry, it is our obligation to prepare students to work outside our service area as well as within
our own region. Generally, we can assume that the skills and knowledge needed in a certain
field in our own region will serve a student well anywhere, but there may be instances where that
does not prove to be the case.
Grades
Grades can be used to assess student learning by using primary trait analysis (PTA) to identify
the factors that count for scoring and by explicitly stating the criteria for the evaluation of
the assignment, project, presentation, or product in the form of a rubric.
Course-embedded Assessment
Program or major goals and general education goals may be assessed through assignments
embedded in required courses. For example, writing assignments, such as summaries or reports,
and oral presentations may be used to assess students' mastery of course content as well as their
writing, reading, critical thinking or speaking skills and use of the library or other information
source. With some planning, a single assignment or project can be designed to assess a number
of different program or major goals as well as general education goals.
Critical Incidents
Students can be asked to describe an incident, either real or imagined, that illustrates or
illuminates key concepts or principles. An explanation of the concepts or principles illustrated
should accompany the description of the incident.
Case Study
Presented with a realistic example of an application in the field, students must
respond with an analysis that demonstrates their mastery of course content and their ability to
apply the information and skills they have learned. A case study is an examination of a specific
phenomenon such as a program, an event, a person, a process, an institution, or a social group.
The end product of a case study is a rich, thick description of the phenomenon being studied that
illuminates the student's understanding of the phenomenon through the application of the
knowledge and skills they have gained.
Journals
Journals or learning logs have been used in composition courses for years as a tool for increasing
student writing and motivation for writing and for assessing students' writing skills. However, a
journal that focuses on students' social and educational attitudes and values may also be useful to
assess students' achievement of general education goals. Journals may also be used to assess
student attainment of program or major goals.
Writing Samples
Writing assignments can be used as a measure of students' mastery of course content
and attainment of program or major goals. Such assignments may also be used as a direct
measure of the general education communication skills goal as well as an indirect assessment
of critical thinking skills. Examples of writing samples include essays, research or term papers,
answers to essay questions on tests, book reports, summaries, lab reports, and the like.
Oral Presentations/ Oral Exams
Depending on the nature and content of the course, oral presentations can be tailored not only to
assess students' mastery of course content but also their attainment of general education goals
such as critical thinking, general knowledge and historical consciousness, understanding the
impact of science and technology, and educational and social values. Oral presentations based
on course content can be used as a direct measure of students' communication skills.
Certification Tests
Programs in which a student must pass a certification examination in order to be certified to
work in the field, such as nursing, may want to consider using the results of that test as an
assessment technique.
One advantage of doing that is that successful results demonstrate credibility of the curriculum.
One disadvantage is that many organizations will not disclose students' results to the college
(although individual students might).
Exit Interviews
There are different types of exit interviews, but they commonly fall into two categories. In one
type of exit interview the program head and students discuss topics similar to those found on
student surveys. Topics can be very detailed and may result in information that you hadn't
thought to request. Sometimes students will say things that they do not wish to put in writing.
The other type of exit interview is actually more like an oral examination. (Calling it an exit
interview has the advantage of not scaring students to death.) This method has been used very
successfully by the Administration of Justice program, where the interviews are conducted by a
panel made up of advisory committee members. It has the advantage of giving students practice in
the kind of interviews they face in the hiring process and before future promotion boards, and it
also assesses their proficiency in both oral communication and knowledge of their subject area.
Focus Groups
Focus groups are structured but informal discussions with small groups of students. Students
may be asked about issues that are pertinent to the program. Focus groups can also be conducted
with faculty, advisory committees, administrators and other employees.
External Evaluation/Review
This is a type of peer review where a consultant(s) from either business or another institution
examines a program from an outside perspective. This may involve such things as visiting
classes, interviewing faculty and students, interviewing advisory committee members, examining
curriculum goals and objectives, reviewing final exams, and interviewing local business and
industry. This method provides the opportunity for the exchange of ideas with a faculty member
of another institution.
Course Tests and Exams
Common test questions drawn from course content and included on tests and exams in all
sections of the course can be used to assess both program or major goals and some general
education objectives. A locally developed test gives you the opportunity to determine if specific
desired outcomes are being successfully attained. It can be tailored to meet the objectives of
your program. However, preparing a good test takes a great deal of work and study.
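As a sketch of how common test questions can feed outcome data, the Python fragment below pools responses to shared items across two sections and flags items falling under a target percent correct. It is a hypothetical illustration (the item names, response data, and 70% flag threshold are assumptions, not a prescribed procedure):

# 1 = correct, 0 = incorrect, per student, for items common to all sections.
section_a = {"item1": [1, 1, 0, 1], "item2": [0, 1, 0, 0]}
section_b = {"item1": [1, 0, 1, 1], "item2": [1, 0, 0, 1]}

for item in section_a:
    pooled = section_a[item] + section_b[item]   # combine sections
    pct = 100.0 * sum(pooled) / len(pooled)
    note = "  <-- outcome needs attention" if pct < 70 else ""
    print(f"{item}: {pct:.0f}% correct{note}")

Because scoring is pooled across sections, the report describes the course's outcomes rather than any one instructor's grading.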
Rubrics
For scoring consistency with longer open-ended assignments such as essays, research papers, or
performances, a rubric should be developed. A rubric is a criterion-based scoring tool that
specifies levels of achievement (e.g., exemplary, satisfactory, and unsatisfactory) for each
dimension of the outcome. As part of the rubric, criteria are provided that describe what
constitutes the different levels of achievement.
There are two major types of rubrics: holistic and dimensional (analytic), the latter also known
as a primary trait rubric. Both detail the particular qualities that separate excellent from poor
student work along a spectrum, but the first groups the dimensions together, while the second
keeps them separate.
The holistic rubric looks at the instrument as a whole; students receive one overall score based
on a predetermined scheme used by everyone. The dimensional (analytic) rubric yields sub-scores
for each dimension, as well as a cumulative score, which is the sum, either weighted or
unweighted, of the dimensional scores.
Each type of rubric has its strengths and weaknesses. Holistic rubrics allow you to look at a
student's overall performance, and often the holistic score corresponds better to the grade that
pops into our heads immediately after we finish looking at the student work. The dimensional
(analytic) rubric provides more information about what is working and what is not. For example,
perhaps students are doing a good job of learning the mechanics of writing, but not so well with
writing development. A dimensional rubric will provide information at this level of detail,
whereas a holistic rubric will not.
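The arithmetic difference between the two types can be shown in a short sketch. Here a dimensional (analytic) rubric yields sub-scores that roll up into a weighted cumulative score; the dimension names, weights, and scale are hypothetical, chosen only for illustration:

# Dimensional (analytic) scoring: one sub-score per dimension, then a
# weighted sum for the cumulative score (weights sum to 1.0).
weights = {"focus": 0.2, "organization": 0.3, "conventions": 0.5}
sub_scores = {"focus": 5, "organization": 3, "conventions": 4}   # 1-6 scale

cumulative = sum(weights[d] * sub_scores[d] for d in weights)
print(f"Sub-scores: {sub_scores}")
print(f"Cumulative score: {cumulative:.1f} on the 1-6 scale")
# A holistic rubric would instead record a single overall score directly.

The sub-scores are what reveal, for example, a weakness in organization that the single cumulative (or holistic) score would hide.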
Regardless of the type of rubric, it is important that it be shared with students well before the
assessment is administered. It is unreasonable to expect students to perform well on an
assessment if they do not have a clear understanding of the standards being used to evaluate it.
Surveys
Surveys may be used to assess the degree to which students perceive that they have attained
program or major goals as well as certain general education goals. Items that elicit this
information may be included on surveys developed by program or major faculty and
administered to current and/or prior students and on surveys sent to employers of program or
major graduates.
The use of surveys is a way to gain information that may directly impact a program. There are
many types of surveys. The ones most often used are graduate surveys, employer surveys and
student surveys. Surveys allow you to get direct feedback from a number of perspectives such as
employers and graduates. Results sometimes raise issues that would not be apparent in other
types of assessment.
One disadvantage is that surveying is often time-consuming and expensive. It requires careful planning, since a survey that is not thought through thoroughly may yield little useful information.
Standardized Tests
Standardized tests are nationally normed and may also be used to assess students' attainment of general education goals. These tests best assess reading comprehension, critical thinking, scientific reasoning, the ability to solve math problems, and writing skills such as knowledge of grammar and correct usage. Additionally, there are major field tests which may be used to assess student learning.
When administered pre and post, standardized tests can be an effective way to measure achievement in a particular area. They have the advantage of credibility since they are nationally normed. However, these tests are often expensive and do not always match the curriculum well. Our use of standardized tests in assessment has been limited in the past. We have found that although they are good for detecting general problem areas, it is sometimes quite difficult to discern more specific areas needing attention.
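As a minimal illustration of the pre/post use of a standardized test, the following Python sketch computes each student's gain and the group's mean gain. All score values are invented, and a real analysis would also take the test's norms and measurement error into account.

    # Illustrative pre/post (value-added) gain calculation; scores are fictional.
    pre_scores = {"student_a": 52, "student_b": 61, "student_c": 47}
    post_scores = {"student_a": 66, "student_b": 63, "student_c": 58}

    gains = {s: post_scores[s] - pre_scores[s] for s in pre_scores}
    mean_gain = sum(gains.values()) / len(gains)

    for student, gain in gains.items():
        print(f"{student}: gain = {gain}")
    print(f"Mean gain: {mean_gain:.1f}")  # (14 + 2 + 11) / 3 = 9.0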
How to Design Rubrics for Scoring Essays, Projects, and Performances
Follow These Steps
1. Decide whether you want a holistic or analytic rubric.
2. Construct a primary trait scale (a rubric).
3. Obtain consistency in instructions and conditions.
4. Norm the scorers (a simple consistency check is sketched after the next paragraph).
A scoring rubric applied consistently by faculty teaching the course is a good way to assess
essays, projects, and performances. A rubric describes the primary traits of a high-level essay or
project, a poor essay or project, and the levels in between. That is, a rubric lists the criteria for an
A, a B, a C, etc., or for a score of 6, 5, 4, etc.—depending on how many levels of differentiation
are desired. Instructors use the rubric to score the essay, project, or performance.
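Norming the scorers (step 4) is commonly checked by having raters score the same sample papers and comparing their results. The sketch below, written in Python with invented ratings, computes two simple consistency checks: exact agreement and adjacent agreement (scores within one level of each other).

    # Two raters' scores for the same eight sample papers, on a 1-4 rubric scale.
    # Ratings are fictional.
    rater_1 = [4, 3, 3, 2, 4, 1, 3, 2]
    rater_2 = [4, 3, 2, 2, 4, 2, 3, 2]

    exact = sum(a == b for a, b in zip(rater_1, rater_2))
    print(f"Exact agreement: {exact / len(rater_1):.0%}")  # 75% here

    # Adjacent agreement (within one level) is a common, looser criterion.
    adjacent = sum(abs(a - b) <= 1 for a, b in zip(rater_1, rater_2))
    print(f"Adjacent agreement: {adjacent / len(rater_1):.0%}")  # 100% here

If agreement is low, scorers typically discuss the discrepant papers and refine the rubric language before scoring the full set.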
1. Decide whether you want a holistic or analytic rubric.
An analytic rubric measures each part of the student work separately; a holistic rubric combines them. To illustrate, here are analytic and holistic rubrics for assessing Spanish journals in a beginning Spanish course.
Analytic Rubric for Spanish Journal

Comprehensibility
4. Entries are completely understandable.
3. Entries are usually understandable.
2. Entries are difficult to understand.
1. Majority of entries are incomprehensible.

Usage
4. Although there are a few errors, verb tenses, sentence structure, and vocabulary are correctly used.
3. Some use of appropriate verb tenses and correct sentence structure and vocabulary, but incorrect usage or vocabulary interferes.
2. Many errors make comprehension difficult.
1. The majority of entries are incomprehensible.

Risk Taking
4. Student has taken some chances, employing sentence structures on the edge of what we have been studying.
3. Student writes mostly safe entries, but is generally current with the textbook.
2. Student writes only safe entries, and is not current with the textbook.
1. Student writes only simple structures.

Variety
4. Entries are highly varied in subject and form.
3. Entries are somewhat varied in subject and form.
2. Entries show only a little variety in subject and form.
1. Entries show no variety in subject and form.

Holistic Rubric for Spanish Journal

Note that several traits (comprehensibility, usage, risk taking, and variety of subject and form) have been combined into a single scale.

4. The content of the journal is comprehensible. Although there are errors, verb tenses, sentence structure, and vocabulary are correctly used. The author has taken some chances, employing sentence structures or expressing thoughts that are on the edge of what we have been studying. The entries are varied in subject and form.
3. There is some use of appropriate verb tenses and correct Spanish sentence structure and vocabulary, but incorrect usage or vocabulary interferes with the reader's comprehension.
2. The reader finds many of the entries difficult to understand, or many entries are simplistic or repetitious.
1. The majority of entries are incomprehensible.

Source of holistic rubric: Barbara Walvoord and Virginia Anderson, Effective Grading: A Tool for Learning and Assessment, 1998.
EXAMPLE: MTH 163 Proficiency Learning Objectives Rubric Using Comprehensive Test

Sample size consisted of ______ students.

For each objective, record the number of students scoring Below Proficiency (1 point), At Proficiency (2 points), and Above Proficiency (3 points):

1. Student's ability to use mathematical logic and reasoning to solve content-related problems
2. Student's ability to interpret and use content-related formulas
3. Student's ability to make inferences based on interpretation of graphs, tables, and/or schematics
4. Student's ability to solve content-related problems by using algebra, geometry, and/or statistics
5. Student's ability to determine reasonableness based on estimated answers
6. Student's ability to recognize and communicate appropriate methods to solve content-related problems
7. Student's ability to represent mathematical information numerically, symbolically, and/or visually with graphs and charts

Above Proficiency indicates that a student correctly answered 90% or more of the Quantitative Reasoning Objectives items. At Proficiency indicates that a student correctly answered at least 70% but less than 90% of the items. Below Proficiency indicates that a student correctly answered less than 70% of the items.
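A minimal Python sketch of how these tallies could be produced from item-level results for one objective. The percent-correct values are invented; the band function simply encodes the boundaries defined above.

    # Tally students into proficiency bands for one objective.
    # Percent-correct values are fictional; band edges follow the rubric above.
    percent_correct = [95, 88, 72, 64, 91, 70, 83]

    def band(pct):
        if pct >= 90:
            return "above"   # Above Proficiency, 3 points
        if pct >= 70:
            return "at"      # At Proficiency, 2 points
        return "below"       # Below Proficiency, 1 point

    counts = {"below": 0, "at": 0, "above": 0}
    for pct in percent_correct:
        counts[band(pct)] += 1

    print(counts)  # {'below': 1, 'at': 4, 'above': 2}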
Public Speaking Assessment Rubric

Verbal Effectiveness – 50 points
Idea development, use of language, and the organization of ideas are effectively used to achieve a purpose.

Advanced
A. Ideas are clearly organized, developed, and supported to achieve a purpose; the purpose is clear.
B. The introduction gets the attention of the audience.
C. Main points are clear and organized effectively.
D. Supporting material is original, logical, and relevant.
E. Smooth transitions are used.
F. The conclusion is satisfying.
G. Language choices are vivid and precise.
H. Material is developed for an oral rather than a written presentation.

Developing
A. The main idea is evident, but the organizational structure may need to be strengthened; ideas may not always flow smoothly.
B. The introduction may not be well-developed.
C. Main points are not always clear.
D. Supporting material may lack originality or adequate development.
E. Transitions may be awkward.
F. The conclusion may need additional development.
G. Language is appropriate, but word choices are not particularly vivid or precise.

Emerging
A. Idea "seeds" have not yet germinated; ideas may not be focused or developed; the main purpose is not clear.
B. The introduction is underdeveloped or irrelevant.
C. Main points are difficult to identify.
D. Inaccurate, generalized, or inappropriate supporting material may be used.
E. Transitions may be needed.
F. The conclusion is abrupt or limited.
G. Language choices may be limited, peppered with slang or jargon, too complex, or too dull.

Nonverbal Effectiveness – 50 points
The nonverbal message supports and is consistent with the verbal message.

Advanced
A. The delivery is natural, confident, and enhances the message; posture, eye contact, smooth gestures, facial expressions, volume, pace, etc. indicate confidence, a commitment to the topic, and a willingness to communicate.
B. The vocal tone, delivery style, and clothing are consistent with the message.
C. Limited filler words ("ums") are used.
D. Clear articulation and pronunciation are used.

Developing
A. The delivery generally seems effective; however, effective use of volume, eye contact, vocal control, etc. may not be consistent; some hesitancy may be observed.
B. Vocal tone, facial expressions, clothing, and other nonverbal expressions do not detract significantly from the message.
C. Filler words are not distracting.
D. Generally, articulation and pronunciation are clear.
E. Overdependence on notes may be observed.

Emerging
A. The delivery detracts from the message; eye contact may be very limited; the presenter may tend to look at the floor, mumble, speak inaudibly, fidget, or read most or all of the speech; gestures and movements may be jerky or excessive.
B. The delivery may appear inconsistent with the message.
C. Filler words ("ums") are used excessively.
D. Articulation and pronunciation tend to be sloppy.
E. Overdependence on notes may be observed.
SACS Standards Relating to Assessment
• 2.5 The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that incorporate a systematic review of programs and services that (a) results in continuing improvement and (b) demonstrates that the institution is effectively accomplishing its mission.
• 2.7.3 The institution requires in each undergraduate degree program the successful completion of a general education component at the collegiate level that
(1) is a substantial component of each undergraduate degree,
(2) ensures breadth of knowledge, and
(3) is based on a coherent rationale.
• 3.3.1 The institution identifies expected outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.
• 3.4.1 The institution demonstrates that each educational program for which academic credit is awarded (a) is approved by the faculty and the administration, and (b) establishes and evaluates program and learning outcomes.
• 3.4.12 The institution places primary responsibility for the content, quality, and effectiveness of its curriculum with its faculty.
• 3.4.13 For each major in a degree program, the institution assigns responsibility for program coordination, as well as for curriculum development and review, to persons academically qualified in the field.
• 3.5.1 The institution identifies college-level competencies within the general education core and provides evidence that graduates have attained those competencies.
• 3.7.2 The institution regularly evaluates the effectiveness of each faculty member in accord with published criteria, regardless of contractual or tenured status.
• 3.7.3 The institution provides evidence of ongoing professional development of faculty as teachers, scholars, and practitioners.
• 4.2 The institution maintains a curriculum that is directly related and appropriate to its purpose and goals and to diplomas, certificates, or degrees awarded.
PDCCC Library
Teaching Resources & Assessment Bibliography
Angelo, Thomas A., and K. Patricia Cross. Classroom Assessment
Techniques. San Francisco, CA: Jossey-Bass, 1993.
Blythe, Hal, and Charlie Sweet. It Works for Me!: Shared Tips for Teaching.
Stillwater, OK: New Forums, 1998.
Blythe, Hal, and Charlie Sweet. It Works for Me, Too!: More Shared Tips for
Effective Teaching. Stillwater, OK: New Forums, 2002.
Boylan, Hunter. What Works: Research-Based Best Practices in
Developmental Education. Boone, NC: Appalachian State U, 2002.
Cushman, Kathleen. First in the Family: Advice about College from First-Generation Students; Your College Years. Providence, RI: Next Generation, 2006.
D'Errico, Deanna, ed. Effective Teaching: A Guide for Community College
Instructors. Washington: The American Association of Community
Colleges, 2004.
Farnsworth, Kent, and Teresa Bevis. A Fieldbook for Community College
Online Instructors. Washington: Community College Press, 2006.
Friday, Bob. Create Your College Success: Activities and Exercises for Students. Belmont, CA: Wadsworth, 1988.
Gabriel, Kathleen F. Teaching Unprepared Students: Strategies for Promoting Success and Retention in Higher Education. Sterling, VA: Stylus Publishing, 2008.
Gallien Jr., Louis B., and Marshalita S. Peterson. Instructing and Mentoring
the African American College Student: Strategies for Success in Higher
Education. Boston: Pearson, 2005.
Holkeboer, Robert. Right from the Start: Managing Your Way to College Success. Belmont, CA: Wadsworth, 1993.
Jewler, A. Jerome, John N. Gardner, and Mary-Jane McCarthy, eds. Your College Experience: Strategies for Success. Concise ed. Belmont, CA: Wadsworth, 1993.
Johnson, Elaine B. Contextual Teaching and Learning: What It Is and Why It's Here to Stay. Thousand Oaks, CA: Corwin P, 2002.
Kanji, Gopal K. 100 Statistical Tests. 3rd ed. London: Sage, 2006.
Leamnson, Robert. Thinking About Teaching and Learning: Developing
Habits of Learning with First Year College and University Students.
Sterling, VA: Stylus, n.d.
Lieberg, Carolyn. Teaching Your First College Class: A Practical Guide for
New Faculty and Graduate Student Instructors. Sterling, VA: Stylus,
2008.
Linehan, Patricia. Win Them Over: Dynamic Techniques for College Adjuncts
and New Faculty. Madison, WI: Atwood Publishing, 2007.
Magnan, Robert. 147 Practical Tips for Teaching Professors. Madison, WI:
Atwood Publishing, 1990.
Mamchur, Carolyn. A Teacher's Guide to Cognitive Type Theory and Learning Style. Alexandria: Association for Supervision & Curriculum Development, 1996.
McGlynn, Angela P. Successful Beginnings for College Teaching: Engaging
Your Students from the First Day. Madison, WI: Atwood Publishing,
2001.
Nilson, Linda Burzotta. Teaching at Its Best: A Research-Based Resource for College Instructors. 2nd ed. Bolton, MA: Anker, 2003.
Palloff, Rena M., and Keith Pratt. The Virtual Student: A Profile and Guide to Working with Online Learners. San Francisco: Jossey-Bass, 2003.
Palomba, Catherine A., and Trudy W. Banta. Assessment Essentials: Planning, Implementing, and Improving Assessment in Higher Education. San Francisco: Jossey-Bass, 1999.
Pregent, Richard. Charting Your Course: How to Prepare to Teach More
Effectively. Madison, WI: Atwood Publishing, 2000.
Roueche, John E., and Suanne D. Roueche. High Stakes, High Performance: Making Remedial Education Work. Washington: Community College Press, 1999.
Roueche, John E., Eileen E. Ely, and Suanne D. Roueche. In Pursuit of
Excellence: The Community College of Denver. Washington: Atwood,
2001.
Sarasin, Lynne C. Learning Style Perspectives: Impact in the Classroom. Madison, WI: Atwood, 1999.
Schuh, John H., M. Lee Upcraft, et al. Assessment Practice in Student Affairs: An Applications Manual. San Francisco: Jossey-Bass, 2001.
Sims, Ronald R., and Serbrenia J. Sims, eds. The Importance of Learning Styles: Understanding the Implications for Learning, Course Design and Education. Westport, CT: Greenwood, 1995.
Stevens, Dannelle D., and Antonia J. Levi. Introduction to Rubrics: An Assessment Tool to Save Grading Time, Convey Effective Feedback and Promote Student Learning. Sterling, VA: Stylus Publishing, 2005.
Student Survival Guide. New York: College Entrance Exam Board, 1991.
Taylor, Terry. 100% Information Literacy Success. Clifton Park, NY:
Thomson, 2007.
Vernoy, Mark, and Diana Kyle. Behavioral Statistics in Action. Boston:
McGraw-Hill, 2002.
Walvoord, Barbara E. Assessment Clear and Simple: A Practical Guide for
Institutions, Departments and General Education. San Francisco:
Jossey-Bass, 2004.
Walvoord, Barbara E. and Virginia Johnson Anderson. Effective Grading: A
Tool for Learning and Assessment. San Francisco, CA: Jossey-Bass,
1998.
Weimer, Maryellen, and Joan L. Parrett. How Am I Teaching? Forms and Activities for Acquiring Instructional Input. Madison, WI: Atwood Publishing, 2002.
Weimer, Maryellen, and Rose Ann Neff. Teaching College: Collected Readings for the New Instructor. Madison, WI: Atwood Publishing, 1998.
Internet Resources for Assessment
General Principles of Assessment: http://www.tcc.edu/welcome/collegeadmin/OIE/SOA/principles.htm
Writing Measurable Learning Outcomes: http://www.adprima.com/objectives.htm
Evaluation Methods to Measure Outcomes by Programs: http://www.unf.edu/acadaffairs/IE/alc/
Types of Measures: http://www.provost.wisc.edu/assessment/manual/manual2.html
Action Strategies to Closing the Loop: http://www.siue.edu/~deder/assess/catmain.html
Rubric Creations: http://rubistar.4teachers.org/index.php , http://pareonline.net/getvn.asp?v=7&n=3 ,
http://imet.csus.edu/imet2/nicher/toohotwebquest/evaluation.html ,
http://pareonline.net/getvn.asp?v=7&n=25 ,
http://www.teach-nology.com/web_tools/rubrics/
Rubric Generator-Rubistar: http://rubistar.4teachers.org/
Towson Assessment Resources: http://pages.towson.edu/assessment/office_of_assessment.htm
NC State Assessment Resources: http://www2.acs.ncsu.edu/UPA/assmt/resource.htm
College of Du Page Resources: http://www.cod.edu/outcomes
Assessment Peer Review Electronic Journal: http://PAREonline.net
Virginia Assessment Group (VAG): http://virginiaassessment.org/RPAJournal.php
National Council on Measurement in Education (NCME): http://ncme.org
American Educational Research Association: http://aera.net
Examples of Critical Thinking Scoring Rubrics:
Holistic Critical Thinking Scoring Rubric: http://66.132.144.88/pdf_files/rubric.pdf
Analytic Critical Thinking Rubric: http://www.neiu.edu/~neassess/pdf/CriThinkRoger-long.pdf
Writing Learning Objectives
Basic Guidelines (and Examples) for Writing Learning Objectives
http://www.mapnp.org/library/trng_dev/lrn_objs.htm
How do I write an instructional objective?
http://edtech.tennessee.edu/~bobannon/objectives.html
How to Write Clear Objectives
http://tlt.its.psu.edu/suggestions/research/Write_Objectives.shtml
How to Write Learning Objectives in Behavioral Form
http://www.adprima.com/objectives.htm
Understanding Objectives
http://edweb.sdsu.edu/courses/EDTEC540/objectives/ObjectivesHome.html
Guidelines for writing learning objectives in librarianship, information science and
archives administration
http://www.unesco.org/webworld/ramp/html/r8810e/r8810e00.htm#Contents
Quick Guide to Writing Learning Objectives
http://www.nwlink.com/~donclark/hrd/templates/objectivetool.html
Writing Learning Objectives
http://www.arl.org/training/ilcso/objectives.html
Writing good work objectives
http://home.att.net/~nickols/workobjs.htm
Writing instructional objectives: The what, why, how, and when.
http://www.sogc.org/conferences/pdfs/instructionalObj.PDF
Bloom's Taxonomy
Affective Domain
http://www.itc.utk.edu/~jklittle/edsmrt521/affective.html
Assessing Learning Objectives Bloom's Taxonomy
http://www.ion.uillinois.edu/resources/tutorials/assessment/bloomtaxonomy.asp
Bloom's Taxonomy
http://www.officeport.com/edu/blooms.htm
Cognitive Domain
http://www.itc.utk.edu/~jklittle/edsmrt521/cognitive.html
Psychomotor Domain
http://www.itc.utk.edu/~jklittle/edsmrt521/psychomotor.html
Instructional Design
http://carbon.cudenver.edu/~mryder/itc_data/idmodels.html#isd
Assessment
Curriculum Development Performance Criteria
http://its.foxvalleytech.com/iss/curric-assessment/CRITCOND.html
How to Write an Assessment Based on an Objective
http://www.adprima.com/assessment.htm
Multiple Choice Questions and Bloom's Taxonomy
http://web.uct.ac.za/projects/cbe/mcqman/mcqappc.html