Definitions

Student Learning Outcomes (SLO)
- A specific, measurable competency (Knowledge, Skills, Values, or Attitudes) that your students should be able to demonstrate as a result of participation in a learning activity.
- SLOs can be expressed and measured at the class, program, or institutional level.
- SLOs are not grades, retention rates, graduation rates, enrollments, FTES, or completion rates.
- SLOs reflect a shift in focus from "What am I teaching?" to "What are my students learning?"
- A good strategy in working with SLOs is using rubrics.
Definitions
Rubric
- A rubric is a set of categories which define and describe the important components of the work being completed, critiqued, or assessed.
- Each category contains a gradation of levels of completion or competence, with a score assigned to each level and a clear description of what performance needs to be met to attain the score at each level.
Why Use a Rubric?
- Saves faculty time in assessment, which provides timely feedback.
- Provides meaningful, detailed feedback that the student, instructor, and college can act on.
- Because the rubric is discussed with students at the beginning, they have a better understanding of the instructor's expectations.
- Encourages critical thinking:
  - Rubrics encourage self-assessment and self-improvement by students, prompting them to think, reason, and make judgments based on data.
- Facilitates communication with others:
  - Helps new faculty be consistent not only in teaching assignments identified by the course syllabus, but also in the expectations for student performance.
  - Allows faculty teaching the same course to share rubrics, which promotes grade consistency.
  - Helps student support services assist students with specific learning problems identified by the rubric.
- Helps us refine our teaching methods:
  - Rubrics speed up grading time enormously, allowing assignment of more complex tasks instead of focusing on rote memory skills using multiple-choice and short-answer questions.
  - Rubrics allow faculty to close the loop by identifying problem areas so that improvements to their course/program can be made.
- Levels the playing field for diverse populations:
  - Rubrics act as a translation device in our new diverse environment by spelling out in detail what is expected.
,·:
I
TyPes,·.or ···.Rw br!Cs
;;;i,'
'
(What do you want it to do?)
~!llllll!ll!!!lilllllll!lllllllllllilllillllillllll!llil!mllllll~liillli!lllllll~~111~111mllll~ma~··~~mm1r-~~~~~~~~~~~~~~--'~~~~-
- Analytic: information to/about individual student competence
  - Communicate expectations
  - Diagnose for the purpose of improvement and feedback
  - Provide specific feedback along several dimensions
- Holistic: overall examination of the status of the performance of a group of students
  - Provide a single score based on an overall impression of a student's performance on a task
How Many Points on a Rubric Scale?
- Consider both the nature of the performance and the purpose of scoring.
- Recommend 3 to 6 points to describe student achievement at a single point in time.
- If focused on developmental curriculum (growth over time), more points are needed.
A rubric grid pairs each performance criterion with a scale (numeric with descriptor); each cell describes an identifiable performance characteristic reflecting that level, and a score is recorded per criterion:

Performance | Scale (numeric w/descriptor) | Scale (numeric w/descriptor) | Scale (numeric w/descriptor) | Scale (numeric w/descriptor) | Score
Example: a teamwork rubric scored 1 (Unsatisfactory), 2 (Developing), 3 (Satisfactory), 4 (Exemplary), with an average score across criteria.

Research & Gather Information
  1 - Unsatisfactory: Does not collect any information that relates to the topic.
  2 - Developing: Collects very little information; some relates to the topic.
  3 - Satisfactory: Collects some basic information; most relates to the topic.
  4 - Exemplary: Collects a great deal of information; all relates to the topic.

Fulfill Team Role's Duties
  1 - Unsatisfactory: Does not perform any duties of assigned team role.
  2 - Developing: Performs very few duties.
  3 - Satisfactory: Performs nearly all duties.
  4 - Exemplary: Performs all duties of assigned team role.

Share Equally
  1 - Unsatisfactory: Always relies on others to do the work.
  2 - Developing: Rarely does the assigned work; often needs reminding.
  3 - Satisfactory: Usually does the assigned work; rarely needs reminding.
  4 - Exemplary: Always does the assigned work without having to be reminded.

Listen to Other Teammates
  1 - Unsatisfactory: Is always talking; never allows anyone else to speak.
  2 - Developing: Usually doing most of the talking; rarely allows others to speak.
  3 - Satisfactory: Listens, but sometimes talks too much.
  4 - Exemplary: Listens and speaks a fair amount.

Average Score: ____
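The "Average Score" row of a rubric like the one above is simple arithmetic: the mean of the per-criterion scores. A minimal sketch in Python (criterion names taken from the table; the equal weighting of criteria is an assumption, not something the handout specifies):

```python
# Average an analytic teamwork rubric: each criterion is scored 1-4
# (1 Unsatisfactory, 2 Developing, 3 Satisfactory, 4 Exemplary).
def average_score(scores: dict) -> float:
    """Return the mean of the criterion scores (the 'Average Score' row)."""
    for name, s in scores.items():
        if not 1 <= s <= 4:
            raise ValueError(f"{name}: score {s} is outside the 1-4 scale")
    return sum(scores.values()) / len(scores)

ratings = {
    "Research & Gather Information": 3,
    "Fulfill Team Role's Duties": 4,
    "Share Equally": 2,
    "Listen to Other Teammates": 3,
}
print(average_score(ratings))  # 3.0
```

A spreadsheet does the same job; the point is only that an analytic rubric reduces to one number per criterion plus an agreed way of combining them.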
Student Learning Outcomes Using Rubrics
How do you get the evidence?
- Look at your course/program objectives in your course syllabus or program.
- Design tasks (or "questions" or "test items") that will produce the evidence you need to determine whether or not a student has met the student learning outcome (SLO).
- Develop rubric(s) using multiple levels of performance.
SLO: Students will demonstrate critical thinking skills.

Evidence / Performance Task
- Students can recognize assumptions in an argument: ask students to identify assumptions in an argumentative essay.
- Students can distinguish between fact and opinion: provide a newspaper column and ask students to identify statements that express opinions.
- Students can produce a well-reasoned argument: ask students to write an essay defending or opposing a given position.
- Students can generate hypotheses: give an example of a scientific result and ask students to provide possible explanations for why the result occurred.
Holistic:
- Advantages: quick scoring; provides an overview of student achievement.
- Disadvantages: does not provide detailed information; may be difficult to provide one overall score.

Analytic:
- Advantages: more detailed feedback; scoring more consistent across students and graders.
- Disadvantage: time-consuming to score.
Step 1: Re-examine the SLOs to be assessed by the task.
Step 2: Identify specific observable attributes that you want to see (as well as those you don't want to see) your students demonstrate in their product, process, or performance.
Step 3: Brainstorm characteristics that describe each attribute.
Step 4: Write thorough narrative descriptions for excellent work and poor work, incorporating each attribute into the description.
Step 5: Complete the rubric by describing the other levels on the continuum that ranges from excellent to poor work for the collective attributes.
Step 6: Collect samples of student work that exemplify each level.
Step 7: Revise the rubric as necessary. Be prepared to reflect on the effectiveness of the rubric and revise it prior to its next implementation.
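Step 5 asks for a description at every level for every attribute before the rubric is used. If a draft rubric is kept as structured data, that completeness check can be automated; a small sketch (the criterion and level names here are illustrative, not from the handout):

```python
# A draft rubric stored as {criterion: {level: description}}.
# missing_cells() lists the (criterion, level) pairs still to be
# written, i.e. the work Step 5 asks for.
LEVELS = ("Unsatisfactory", "Developing", "Satisfactory", "Exemplary")

def missing_cells(rubric: dict) -> list:
    gaps = []
    for criterion, cells in rubric.items():
        for level in LEVELS:
            if not cells.get(level, "").strip():
                gaps.append((criterion, level))
    return gaps

draft = {
    "Research & Gather Information": {
        "Unsatisfactory": "Does not collect any relevant information.",
        "Exemplary": "Collects a great deal of relevant information.",
    },
}
print(missing_cells(draft))
# [('Research & Gather Information', 'Developing'),
#  ('Research & Gather Information', 'Satisfactory')]
```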
Score of 3
The response is successful in the following ways:
- It demonstrates an ability to analyze the stimulus material thoughtfully and in depth.
- It demonstrates a strong knowledge of the subject matter relevant to the question.
- It responds appropriately to all parts of the question.
- It demonstrates facility with conventions of standard written English.

Score of 2
The response demonstrates some understanding of the topic, but it is limited in one or more of the following major ways:
- It may indicate a misreading of the stimulus material or provide superficial analysis.
- It may demonstrate only superficial knowledge of the subject matter relevant to the question.
- It may respond to one or more parts of the question inadequately or not at all.
- It may contain significant writing errors.

Score of 1
The response is seriously flawed in one or more of the following ways:
- It may demonstrate weak understanding of the subject matter or of the writing task.
- It may fail to respond adequately to most parts of the question.
- It may be incoherent or severely underdeveloped.
- It may contain severe and persistent writing errors.

Score of 0
The response is blank, off-topic, totally incorrect, or merely rephrases the question.
Step 1: Re-examine the SLOs to be assessed by the task.
Step 2: Identify specific observable attributes that you want to see (as well as those you don't want to see) your students demonstrate in their product, process, or performance.
Step 3: Brainstorm characteristics that describe each attribute.
Step 4: Write thorough narrative descriptions for excellent work and poor work (and steps in between) for each individual attribute.
Step 5: Complete the rubric by describing the other levels on the continuum that ranges from excellent to poor work for each attribute.
Step 6: Collect samples of student work that exemplify each level.
Step 7: Revise the rubric as necessary. Be prepared to reflect on the effectiveness of the rubric and revise it prior to its next implementation.
Points by dimension:

Analysis
  3: Analyzes stimulus material thoroughly
  2: Provides superficial analysis
  1: Does not provide analysis

Subject Matter
  3: Demonstrates strong knowledge of subject matter
  2: Demonstrates some knowledge of subject matter
  1: Demonstrates weak understanding of subject matter

Mechanics
  3: Demonstrates facility with conventions of written English
  2: Contains significant writing errors not affecting comprehension
  1: Contains severe & persistent errors affecting comprehension
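An analytic grid like this one yields a score per dimension and a total. A sketch of the bookkeeping (dimension names follow the grid; the 1-3 range per dimension is as shown above, and the simple unweighted sum is an assumption):

```python
# Score an essay on the three-dimension analytic rubric above:
# Analysis, Subject Matter, and Mechanics, each scored 1-3,
# so totals run from 3 to 9.
DIMENSIONS = ("Analysis", "Subject Matter", "Mechanics")

def total_score(scores: dict) -> int:
    missing = [d for d in DIMENSIONS if d not in scores]
    if missing:
        raise ValueError(f"unscored dimensions: {missing}")
    for d in DIMENSIONS:
        if not 1 <= scores[d] <= 3:
            raise ValueError(f"{d}: score {scores[d]} is outside the 1-3 scale")
    return sum(scores[d] for d in DIMENSIONS)

print(total_score({"Analysis": 3, "Subject Matter": 2, "Mechanics": 2}))  # 7
```

Unlike the holistic guide above, this keeps the feedback per dimension: a student can see that Mechanics, not Analysis, cost the points.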
Rubric Template
(Describe here the task or performance that this rubric is designed to evaluate.)

Columns: Beginning | Developing | Accomplished | Exemplary | Score

Each row is a stated objective or performance. Across the row, the cells read:
- Beginning: description of identifiable performance characteristics reflecting a beginning level of performance.
- Developing: description of identifiable performance characteristics reflecting development and movement toward mastery of performance.
- Accomplished: description of identifiable performance characteristics reflecting mastery of performance.
- Exemplary: description of identifiable performance characteristics reflecting the highest level of performance.

The template repeats this "Stated Objective or Performance" row for as many objectives as the task requires, with a Score column at the right.
VCCS Writing Sample Scoring Grid

Scale: 6 - clear & consistent competence; 5 - reasonably consistent competence; 4 - adequate competence; 3 - developing competence; 2 - inadequate competence; 1 - incompetence.

Purpose
  6: clearly-stated purpose that addresses the writing task in a thoughtful way
  5: effectively addresses the writing task and shows depth
  4: addresses the writing task but may lack complexity
  3: inconsistent sense of purpose; loose relation to writing task
  2: confused sense of purpose; no evidence of connection to writing task
  1: absence of any purpose or relation to writing task

Organization
  6: well-organized content with effective transitions; effective beginning and ending paragraphs
  5: generally well-organized, with beginning, middle, and end, transitions between ideas, and effective use of paragraph structure
  4: some signs of logical organization, with a beginning and ending
  3: inadequate organization; may have abrupt or illogical shifts and ineffective flow of ideas
  2: confused organization; transitions do not relate to content; inadequate or inappropriate paragraphing
  1: no evidence of an organizational plan or intent to develop paragraphing

Development of Content
  6: substantial, logical, and concrete development of ideas
  5: adequately and thoughtfully developed content with specific details or examples
  4: partially developed content with some details or examples
  3: incomplete development of content; may be vague, simplistic, or stereotypical
  2: superficial development; redundant details; inadequate paragraphing
  1: weak ideas with no supporting details and inappropriate conclusions

Language and Word Choice
  6: appropriate and precise word choice; language and sentence structure are alive, mature, and varied
  5: adequate language use, with some imprecise word choice; some sentence variety
  3: inappropriate, imprecise, or inadequate language; limited sentence variety
  2: inadequate and simplistic language, with no variety and errors in word choice
  1: no control over word choice; excessive errors in meaning

Mechanics and Usage
  6: facility with language; mature range of vocabulary and control of mechanics and usage
  5: few mechanical and usage errors; evidence of superior control of diction
  4: some mechanical, proofreading, or usage errors; errors do not interfere with meaning
  3: repeated weaknesses in mechanics and usage; pattern of flaws
  2: mechanical and usage errors that interfere with the writer's purpose
  1: errors so severe that the writer's ideas are hidden

Documentation and Use of Sources* (cells largely illegible in the source; the row addresses appropriate use, integration, and citation of outside sources, ranging from well-integrated, appropriately documented material to unacknowledged borrowed material)

*Objective on outside documentation recommended by participants of the pilot to be omitted.
Non-Scoreable (NS)
A response is non-scoreable if it:
- is illegible, i.e., includes so many undecipherable words that no sense can be made of the response; or
- is incoherent, i.e., words are legible but syntax is so garbled that the response makes no sense; or
- is a blank paper.
Report of the VCCS Task Force on Assessing Core Competencies, p. 36
Oral Communication Assessment Rubric

Verbal Effectiveness
Idea development, use of language, and the organization of ideas are effectively used to achieve a purpose.

Advanced (5)
A. Ideas are clearly organized, developed, and supported to achieve a purpose; the purpose is clear.
B. The introduction gets the attention of the audience.
C. Main points are clear and organized effectively.
D. Supporting material is original, logical, and relevant (facts, examples, etc.).
E. Smooth transitions are used.
F. The conclusion is satisfying.
G. Language choices are vivid and precise.
H. Material is developed for an oral rather than a written presentation.

Developing (3)
A. The main idea is evident, but the organizational structure may need to be strengthened; ideas may not always flow smoothly.
B. The introduction may not be well-developed.
C. Main points are not always clear.
D. Supporting material may lack in originality or adequate development.
E. Transitions may be awkward.
F. The conclusion may need additional development.
G. Language is appropriate, but word choices are not particularly vivid or precise.

Emerging (1)
A. Idea "seeds" have not yet germinated; ideas may not be focused or developed; the main purpose is not clear.
B. The introduction is undeveloped or irrelevant.
C. Main points are difficult to identify.
D. Inaccurate, generalized, or inappropriate supporting material may be used.
E. Transitions may be needed.
F. The conclusion is abrupt or limited.
G. Language choices may be limited, peppered with slang or jargon, too complex, or too dull.
Nonverbal Effectiveness
The nonverbal message supports and is consistent with the verbal message.

Advanced (5)
A. The delivery is natural, confident, and enhances the message; posture, eye contact, smooth gestures, facial expressions, volume, pace, etc. indicate confidence, a commitment to the topic, and a willingness to communicate.
B. The vocal tone, delivery style, and clothing are consistent with the message.
C. Limited filler words ("ums") are used.
D. Clear articulation and pronunciation are used.

Developing (3)
A. The delivery generally seems effective; however, effective use of volume, eye contact, vocal control, etc. may not be consistent; some hesitancy may be observed.
B. Vocal tone, facial expressions, clothing, and other nonverbal expressions do not detract significantly from the message.
C. Filler words are not distracting.
D. Generally, articulation and pronunciation are clear.
E. Over-dependence on notes may be observed.

Emerging (1)
A. The delivery detracts from the message; eye contact may be very limited; the presenter may tend to look at the floor, mumble, speak inaudibly, fidget, or read most or all of the speech; gestures and movements may be jerky or excessive.
B. The delivery may appear inconsistent with the message.
C. Filler words ("ums") are used excessively.
D. Articulation and pronunciation tend to be sloppy.
E. Over-dependence on notes may be observed.
Appropriateness
Idea development, use of language, and the organization of ideas for a specific audience, setting, and occasion are appropriate. Communication is respectful.

Advanced (5)
A. Language is familiar to the audience, appropriate for the setting, and free of bias; the presenter may "code-switch" (use a different language form) when appropriate.
B. Topic selection and examples are interesting and relevant for the audience and occasion.
C. Delivery style and clothing choices suggest an awareness of expectations and norms.

Developing (3)
A. Language used is not disrespectful or offensive.
B. Topic selection and examples are not inappropriate for the audience, occasion, or setting; some effort to make the material relevant to audience interests, the occasion, or setting is evident.
C. The delivery style, tone of voice, and clothing choices do not seem out-of-place or disrespectful to the audience.

Emerging (1)
A. Language is questionable or inappropriate for a particular audience, occasion, or setting; some biased or unclear language may be used.
B. Topic selection does not relate to audience needs and interests.
C. The delivery style may not match the particular audience or occasion; the presenter's tone of voice or other mannerisms may create alienation from the audience; clothing choices may also convey disrespect for the audience.
Responsiveness
Communication may be modified based on verbal and nonverbal feedback. Speakers/listeners demonstrate active listening behaviors.

Advanced (5)
A. The presenter uses materials to keep the audience engaged.
B. Material is modified or clarified as needed given audience verbal and nonverbal feedback.
C. Reinforcing verbal listening responses, such as paraphrasing or restating, are used if needed when answering questions; responses to audience questions are focused and relevant.
D. Nonverbal behaviors are used to keep the audience engaged, such as maintaining eye contact, modifying delivery style if needed, and using reinforcing nonverbal listening responses (nodding, leaning forward, etc.) when answering questions.

Developing (3)
A. The presenter is able to keep the audience engaged most of the time.
B. When feedback indicates a need for idea clarification, the speaker makes an attempt to clarify or restate ideas.
C. Responses to audience questions are generally relevant, but little elaboration may be offered.
D. Generally, the speaker demonstrates audience awareness through such nonverbal behaviors as tone, movement, and eye contact with the whole audience; some reinforcing nonverbal listening responses are periodically used when answering questions.

Emerging (1)
A. The presenter is not able to keep the audience engaged.
B. The verbal or nonverbal feedback from the audience may suggest a lack of interest or confusion.
C. Responses to audience questions may be undeveloped or unclear.
D. The nonverbal aspects of delivery do not indicate an awareness of audience reactions; reinforcing nonverbal listening responses, such as using eye contact and facing the person, are not used when answering questions.
E. Poise or composure is lost during any distractions.
Courtesy of Northwest Regional Educational Laboratory, 1998; Paula (Blanck) Usrey, (503) 275-9577
Objective 4 Rubric - Information Literacy Assessment

Objective 4: The information literate student, individually or as a member of a group, uses information effectively to accomplish a specific purpose.

Part 1: The student applies new and prior information to the planning and creation of a particular product or performance.

Rubric for Part 1:

4: There is a clear, concise purpose statement and outline of the project. The materials include strong evidence that all of the following were addressed in an excellent manner:
- The individual considered information sources of a variety of resource types, examined a substantial number of sources, and included sources of an appropriate scholarly level.
- The sources are appropriately presented, and there is a clear indication that appropriate sources have been selected and inappropriate sources have been excluded.

3: There is a concise purpose statement and a satisfactory outline of the project. The materials include evidence that most of the following were adequately addressed:
- The individual considered information sources of a variety of resource types, examined a substantial number of sources, and included sources of an appropriate scholarly level.
- The sources are appropriately presented, and there is a clear indication that appropriate sources have been selected and inappropriate sources have been excluded.

2: A purpose statement is given, but it does not clearly identify the product that is to be produced by the project; the outline of the project is sketchy at best. The materials include limited evidence that less than half of the following were reasonably addressed:
- The individual considered information sources of a variety of resource types, examined a substantial number of sources, and included sources of an appropriate scholarly level.
- The sources are appropriately presented, and there is a clear indication that appropriate sources have been selected and inappropriate sources have been excluded.

1: A superficial purpose statement is given; the outline of the project is unrelated to the purpose statement. The materials include little evidence that any of the following were even superficially addressed:
- The individual considered information sources of a variety of resource types, examined a substantial number of sources, and included sources of an appropriate scholarly level.
- The sources are appropriately presented, and there is a clear indication that appropriate sources have been selected and inappropriate sources have been excluded.
Part 2: The student revises the development process for the product or performance.

Rubric for Part 2:

4: The student's log of information gathering/evaluating activities follows the purpose statement; the log shows the student's clear evaluation of the usefulness of the information gathered for the project. The log indicates the student's reflection on past successes and failures in the development of the project.

3: The student's log shows evidence of research geared toward the purpose statement, using sources from the course and 1-2 sources outside the course; the log shows a listing of resources used and abandoned, but no explanation about how these decisions were made.

2: The student's log reflects that only encyclopedias and in-course texts were used as information sources; the log reflects heavy emphasis on the student's personal experiences as the major source of information for the project.

1: The student's log indicates the amount of time spent by the student on the project, and only the student's personal experiences are used as sources of information for the project.
Part 3: The student communicates the product or performance effectively to others.

Rubric for Part 3:

4: The delivery method for the project is appropriate to the discipline/program context and the intended audience; the delivery incorporates a variety of applications, which enhances the communication of the purpose and results of the project. Information technology, when utilized, provides a major enhancement.

3: The delivery method for the project is appropriate to the discipline/program context and is satisfactory for the intended audience; the delivery incorporates an adequate variety of applications, which enhances the communication of the purpose and results of the project. Information technology, when utilized, provides some enhancement.

2: The delivery method for the project distracts from the purpose or results of the project. The delivery incorporates a limited number of applications, which marginally enhance the communication of the purpose and results of the project. Information technology, when utilized, provides little enhancement.

1: The delivery method for the project is inappropriate for the purpose or results of the project. The delivery incorporates a limited number of applications, which detract from the communication of the purpose and results of the project. Information technology, when utilized, provides no enhancement.
Student Outcome: Critical Thinking (Holistic)

Scoring levels, with descriptors for Interpretation, Analysis & Evaluation, and Presentation:

4 - Accomplished
  Interpretation: Analyzes insightful questions; refutes bias; critiques content; examines inconsistencies; values information.
  Analysis & Evaluation: Examines conclusions; uses reasonable judgment; discriminates rationally; synthesizes data; views information critically.
  Presentation: Argues succinctly; discusses issues thoroughly; shows intellectual honesty; justifies decisions; assimilates information.

3 - Competent
  Interpretation: Asks insightful questions; detects bias; categorizes content; identifies inconsistencies; recognizes context.
  Analysis & Evaluation: Formulates conclusions; recognizes arguments; notices differences; evaluates data; seeks out information.
  Presentation: Argues clearly; identifies issues; attributes sources naturally; suggests solutions; incorporates information.

2 - Developing
  Interpretation: Identifies some questions; notes some bias; recognizes basic content; states some inconsistencies; selects sources adequately.
  Analysis & Evaluation: Identifies some conclusions; sees some arguments; identifies some differences; paraphrases data; assumes information valid.
  Presentation: Misconstructs arguments; generalizes issues; cites sources; presents few options; overlooks some information.

1 - Beginning
  Interpretation: Fails to question data; ignores bias; misses major content areas; detects no inconsistencies; chooses biased sources.
  Analysis & Evaluation: Fails to draw conclusions; sees no arguments; overlooks differences; repeats data; omits research.
  Presentation: Omits argument; misrepresents issues; excludes data; draws faulty conclusions; shows intellectual dishonesty.
Student Outcome: Critical Thinking (Analytic)

Levels: 1 - Beginning / 2 - Developing / 3 - Competent / 4 - Accomplished

Interpretation
  Questions: Fails to question data / Identifies some questions / Asks insightful questions / Analyzes insightful questions
  Recognizes bias: Ignores bias / Notes some bias / Detects bias / Refutes bias
  Understands content: Misses major content areas / Recognizes basic content / Categorizes content / Critiques content
  Identifies inconsistencies: Detects no inconsistencies / States some inconsistencies / Identifies inconsistencies / Examines inconsistencies
  Understands context: Chooses biased sources / Selects sources adequately / Recognizes context / Values information

Analysis and Evaluation
  Reaches conclusions: Fails to draw conclusions / Identifies some conclusions / Formulates conclusions / Examines conclusions
  Develops arguments: Sees no arguments / Sees some arguments / Recognizes arguments / Uses reasonable judgment
  Discriminates: Overlooks differences / Identifies some differences / Notices differences / Discriminates rationally
  Synthesizes data: Repeats data / Paraphrases data / Evaluates data / Synthesizes data
  Gathers information: Omits research / Assumes information valid / Seeks out information / Views information critically

Presentation
  Makes arguments: Omits arguments / Misconstructs arguments / Argues clearly / Argues succinctly
  Identifies issues: Misrepresents issues / Generalizes issues / Identifies issues / Discusses issues thoroughly
  Gives attribution: Excludes data / Cites sources / Attributes sources naturally / Shows intellectual honesty
  Reaches conclusions: Draws faulty conclusions / Presents few options / Suggests solutions / Justifies decisions
  Incorporates information: Shows intellectual dishonesty / Overlooks some information / Incorporates information / Assimilates information
'
Score
r
/ifornia.
cademic
1rss
\.
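An analytic rubric like the one above is, structurally, a set of criteria grouped by category, each rated on the same four-level scale. A minimal sketch of that structure, assuming a simple summed total; the data layout, the helper function, and its name are illustrative choices of mine, not part of the source rubric:

```python
# The analytic critical thinking rubric above as a plain data structure.
# Category and criterion names come from the rubric; the scoring helper
# is an illustrative assumption (the source does not prescribe a total).

ANALYTIC_RUBRIC = {
    "Interpretation": [
        "Questions", "Recognizes bias", "Understands content",
        "Identifies inconsistencies", "Understands context",
    ],
    "Analysis and Evaluation": [
        "Reaches conclusions", "Develops arguments", "Discriminates",
        "Synthesizes data", "Gathers information",
    ],
    "Presentation": [
        "Makes arguments", "Identifies issues", "Gives attribution",
        "Reaches conclusions", "Incorporates information",
    ],
}

LEVELS = {1: "Beginning", 2: "Developing", 3: "Competent", 4: "Accomplished"}


def total_score(ratings):
    """Sum ratings keyed by (category, criterion); each rating must be 1-4."""
    for key, level in ratings.items():
        if level not in LEVELS:
            raise ValueError(f"{key}: rating must be a whole level 1-4")
    return sum(ratings.values())


# Example: a student rated Competent (3) on every criterion.
ratings = {(cat, crit): 3
           for cat, crits in ANALYTIC_RUBRIC.items() for crit in crits}
print(total_score(ratings))  # 15 criteria x 3 = 45
```

Keying ratings by (category, criterion) pairs matters here because the criterion label "Reaches conclusions" appears in two categories.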
Holistic Critical Thinking Scoring Rubric
Dr. Peter A. Facione, Santa Clara University
Dr. Noreen C. Facione, R.N., FNP, University of California, San Francisco
(c) 1994, Peter A. Facione, Noreen C. Facione, and The California Academic Press, 217 La Cruz Ave., Millbrae, CA 94030. Permission is hereby granted to students, faculty, staff, or administrators at public or nonprofit educational institutions for unlimited duplication of the critical thinking scoring rubric, rating form, or instructions herein for local teaching, assessment, research, or other educational and noncommercial uses, provided that no part of the scoring rubric is altered and that "Facione and Facione" are cited as authors. (PA F49:R4.2:062694)

Holistic Critical Thinking Scoring Rubric
Facione and Facione
4
Consistently does all or almost all of the following:
Accurately interprets evidence, statements, graphics, questions, etc.
Identifies the salient arguments (reasons and claims) pro and con.
Thoughtfully analyzes and evaluates major alternative points of view.
Draws warranted, judicious, non-fallacious conclusions.
Justifies key results and procedures, explains assumptions and reasons.
Fair-mindedly follows where evidence and reasons lead.
3
Does most or many of the following:
Accurately interprets evidence, statements, graphics, questions, etc.
Identifies relevant arguments (reasons and claims) pro and con.
Offers analyses and evaluations of obvious alternative points of view.
Draws warranted, non-fallacious conclusions.
Justifies some results or procedures, explains reasons.
Fair-mindedly follows where evidence and reasons lead.
2
Does most or many of the following:
Misinterprets evidence, statements, graphics, questions, etc.
Fails to identify strong, relevant counter-arguments.
Ignores or superficially evaluates obvious alternative points of view.
Draws unwarranted or fallacious conclusions.
Justifies few results or procedures, seldom explains reasons.
Regardless of the evidence or reasons, maintains or defends views
based on self-interest or preconceptions.
1
Consistently does all or almost all of the following:
Offers biased interpretations of evidence, statements, graphics,
questions, information, or the points of view of others.
Fails to identify or hastily dismisses strong, relevant counter-arguments.
Ignores or superficially evaluates obvious alternative points of view.
Argues using fallacious or irrelevant reasons, and unwarranted claims.
Does not justify results or procedures, nor explain reasons.
Regardless of the evidence or reasons, maintains or defends views
based on self-interest or preconceptions.
Exhibits close-mindedness or hostility to reason.
Holistic Critical Thinking Rating Form

Rater's Name: ______________    Date: ______________
Project/Assignment/Activity Evaluated: ______________

ID or Name    Score        ID or Name    Score
Instructions for Using the Holistic Critical Thinking Scoring Rubric

1. Understand the construct.
This four-level rubric treats critical thinking as a set of cognitive skills supported by certain personal dispositions. To reach a judicious, purposive judgment a good critical thinker engages in analysis, interpretation, evaluation, inference, explanation, and meta-cognitive self-regulation. The disposition to pursue fair-mindedly and open-mindedly the reasons and evidence wherever they lead is crucial to reaching sound, objective decisions and resolutions to complex, ill-structured problems. So are the other critical thinking dispositions, such as systematicity, reasoning self-confidence, cognitive maturity, analyticity, and inquisitiveness. [For details on the articulation of this concept refer to Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction. ERIC Document Number: ED 315 423.]

2. Differentiate and focus.
Holistic scoring requires focus. In any essay, presentation, or clinical practice setting many elements must come together for overall success: critical thinking, content knowledge, and technical skill (craftsmanship). Deficits or strengths in any of these can draw the attention of the rater. However, in scoring for any one of the three, one must attempt to focus the evaluation on that element to the exclusion of the other two.

3. Practice, coordinate, and reconcile.
Ideally, in a training session with other raters one will examine sample essays (videotaped presentations, etc.) which are paradigmatic of each of the four levels. Without prior knowledge of their level, raters will be asked to evaluate and assign ratings to these samples. After comparing these preliminary ratings, collaborative analysis with the other raters and the trainer is used to achieve consistency of expectations among those who will be involved in rating the actual cases. Training, practice, and inter-rater reliability are the keys to a high-quality assessment.

Usually, two raters will evaluate each essay/assignment/project/performance. If they disagree, there are three possible ways that resolution can be achieved: (a) by mutual conversation between the two raters, (b) by using an independent third rater, or (c) by taking the average of the two initial ratings. The averaging strategy is strongly discouraged. Discrepancies between raters of more than one level suggest that detailed conversations about the CT construct and about project expectations are in order. This rubric is a four-level scale; half-point scoring is inconsistent with its intent and conceptual structure. Further, at this point in its history, the art and science of holistic critical thinking evaluation cannot justify asserting half-level differentiations.

If working alone, or without paradigm samples, one can achieve a greater level of internal consistency by not assigning final ratings until a number of essays/projects/performances/assignments have been viewed and given preliminary ratings. Frequently, natural clusters or groupings of similar quality soon come to be discernible. At that point one can be more confident in assigning a firmer critical thinking score using this four-level rubric. After assigning preliminary ratings, a review of the entire set assures greater internal consistency and fairness in the final ratings.
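The two-rater step above amounts to a simple screening rule: any disagreement needs resolution, and a gap of more than one level signals a deeper conversation about the construct. A minimal sketch under those assumptions; the function name and return convention are my own, and averaging is deliberately omitted because the instructions discourage it:

```python
# Screening two holistic ratings per the procedure described above.
# Any disagreement should be resolved by conversation or a third rater;
# a gap of more than one level flags a discussion of the CT construct
# and project expectations. Half points are rejected outright, since
# the rubric is a four-level scale.

def reconcile(rating_a: int, rating_b: int):
    """Return (needs_resolution, needs_construct_discussion)."""
    for r in (rating_a, rating_b):
        if r not in (1, 2, 3, 4):
            raise ValueError("ratings must be whole levels 1-4 (no half points)")
    disagreement = rating_a != rating_b
    large_gap = abs(rating_a - rating_b) > 1  # more than one level apart
    return disagreement, large_gap


print(reconcile(3, 3))  # (False, False): ratings agree
print(reconcile(4, 2))  # (True, True): discuss expectations in detail
```

A one-level disagreement, e.g. `reconcile(4, 3)`, flags only the first condition: it still needs resolution by conversation or a third rater, but not the detailed construct discussion.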
General Education Critical Thinking Rubric (Short Version), Northeastern Illinois University

Rating for each criterion: 1 = No/Limited Proficiency, 2 = Some Proficiency, 3 = Proficiency, 4 = High Proficiency

1. Identifies & Explains Issues
No/Limited Proficiency: Fails to identify, summarize, or explain the main problem or question. Represents the issues inaccurately or inappropriately.
Some Proficiency: Identifies main issues but does not summarize or explain them clearly or sufficiently.
Proficiency: Successfully identifies and summarizes the main issues, but does not explain why/how they are problems or create questions.
High Proficiency: Clearly identifies and summarizes main issues and successfully explains why/how they are problems or questions; and identifies embedded or implicit issues, addressing their relationships to each other.

2. Recognizes Stakeholders and Contexts (i.e., cultural/social, educational, technological, political, scientific, economic, ethical, personal experience)
No/Limited Proficiency: Fails accurately to identify and explain any empirical or theoretical contexts for the issues. Presents problems as having no connections to other conditions or contexts.
Some Proficiency: Shows some general understanding of the influences of empirical and theoretical contexts on stakeholders, but does not identify any specific ones relevant to the situation at hand.
Proficiency: Correctly identifies all the empirical and most of the theoretical contexts relevant to all the main stakeholders in the situation.
High Proficiency: Not only correctly identifies all the empirical and theoretical contexts relevant to all the main stakeholders, but also finds minor stakeholders and contexts and shows the tension or conflicts of interests among them.

3. Frames Personal Responses and Acknowledges Other Perspectives
No/Limited Proficiency: Fails to formulate and clearly express own point of view, (or) fails to anticipate objections to his/her point of view, (or) fails to consider other perspectives and positions.
Some Proficiency: Formulates a vague and indecisive point of view, or anticipates minor but not major objections to his/her point of view, or considers weak but not strong alternative positions.
Proficiency: Formulates a clear and precise personal point of view concerning the issue, and seriously discusses its weaknesses as well as its strengths.
High Proficiency: Not only formulates a clear and precise personal point of view, but also acknowledges objections and rival positions and provides convincing replies to these.

4. Evaluates Assumptions
No/Limited Proficiency: Fails to identify and evaluate any of the important assumptions behind the claims and recommendations made.
Some Proficiency: Identifies some of the most important assumptions, but does not evaluate them for plausibility or clarity.
Proficiency: Identifies and evaluates all the important assumptions, but not the ones deeper in the background, the more abstract ones.
High Proficiency: Not only identifies and evaluates all the important assumptions, but also some of the more hidden, more abstract ones.

5. Evaluates Evidence
No/Limited Proficiency: Fails to identify data and information that counts as evidence for truth-claims and fails to evaluate its credibility.
Some Proficiency: Successfully identifies data and information that counts as evidence but fails to thoroughly evaluate its credibility.
Proficiency: Identifies all important evidence and rigorously evaluates it.
High Proficiency: Not only identifies and rigorously evaluates all important evidence offered, but also provides new data or information for consideration.

6. Evaluates Implications, Conclusions, and Consequences
No/Limited Proficiency: Fails to identify implications, conclusions, and consequences of the issue, or the key relationships between the other elements of the problem, such as context, assumptions, or data and evidence.
Some Proficiency: Suggests some implications, conclusions, and consequences, but without clear reference to context, assumptions, data, and evidence.
Proficiency: Identifies and briefly discusses implications, conclusions, and consequences, considering most but not all the relevant assumptions, contexts, data, and evidence.
High Proficiency: Identifies and thoroughly discusses implications, conclusions, and consequences, considering all relevant assumptions, contexts, data, and evidence.
RUBRIC CONSTRUCTION AND USE IN DIFFERENT CONTEXTS
[Figure shows a completed rubric for a class on the history of early Japan. The left column lists assignment objectives (Content: names, dates, and events are accurate / mostly accurate / inaccurate, and are used appropriately / mostly appropriately / inappropriately; Research: sources used, including Internet, books, journals, databases, and primary documents; Historiography: recognizing authorial biases and different schools; Writing skills: understanding and writing a book critique and a research paper, knowing when and how to cite sources). The middle column tallies what students did on each aspect of the assignment, and the right column holds the instructor's handwritten notes on what to change next time, such as limiting the number of Internet sources, adding a library session on databases, and a class exercise using primary sources.]

Figure 6.8 Rubric used by instructor to summarize how students completed the assignment.
Program Outcomes and Performance Criteria

Performance criteria are a means to focus on specific expectations of a program. They guide curriculum delivery strategies and assessment procedures. There is an important first step that must come before the development of performance criteria: deciding on program outcomes. These are usually communicated to students in the program description, and are stated in terms that inform the students about the general purpose of the program and the expectations of the faculty. The primary difference between program outcomes and performance criteria is that program outcomes are intended to provide general information and thus are not measurable, while performance criteria indicate concrete, measurable expectations. Performance criteria are developed from program outcomes.

Sample program outcomes:
• Students will have an understanding of the social influences that affected technology in culture.
• Students will work effectively as a member of a team.
• Students can apply the principles of math and science to a technical problem.
• Students will have appreciation for the need to be lifelong learners.

Performance criteria indicate what concrete actions the student should be able to perform as a result of participation in the program, and state a minimum criterion for evaluation. Once program outcomes have been identified, the knowledge and skills necessary for the mastery of these outcomes should be listed. This will allow the desired behavior of the students to be described, and will eliminate ambiguity concerning demonstration of expected competencies. Performance criteria are made up of at least two main elements: an action verb and content (referent). The expected behavior must be specified by name, using an observable action verb such as demonstrate, interpret, discriminate, or define.

Sample performance criteria:
• Students will know of a professional code of ethics. (knowledge)
• Students will be able to locate technical information independently. (comprehension)
• Students will solve research problems through the application of scientific methods. (application)
Sources: Cunningham, G.K. (1986). Educational and Psychological Measurement. New York: Macmillan Publishing. McBeath, R.J. (Ed.). (1992). Instructing and Evaluating in Higher Education: A Guidebook for Planning Learning Outcomes. Englewood Cliffs, NJ: Educational Technology Publications.

COGNITIVE learning is demonstrated by knowledge recall and the intellectual skills: comprehending information, organizing ideas, analyzing and synthesizing data, applying knowledge, choosing among alternatives in problem-solving, and evaluating ideas or actions.
Knowledge (remembering previously learned information)
Verbs: arrange, define, describe, duplicate, identify, label, list, match, memorize, name, order, outline, recognize, relate, recall, repeat, reproduce, select, state
Example: Memory of specific facts, terminology, rules, sequences, procedures, classifications, categories, criteria, methodology, principles, theories, and structure

Comprehension (grasping the meaning of information)
Verbs: classify, convert, defend, describe, discuss, distinguish, estimate, explain, express, extend, generalize, give examples, identify, indicate, infer, locate, paraphrase, predict, recognize, rewrite, report, restate, review, select, summarize, translate
Example: Stating a problem in one's own words, translating a chemical formula, understanding a flow chart, translating words and phrases from a foreign language

Application (applying knowledge to actual situations)
Verbs: apply, change, choose, compute, demonstrate, discover, dramatize, employ, illustrate, interpret, manipulate, modify, operate, practice, predict, prepare, produce, relate, schedule, show, sketch, solve, use, write
Example: Taking principles learned in math and applying them to figuring the volume of a cylinder in an internal combustion engine

Analysis (breaking down objects or ideas into simpler parts and seeing how the parts relate and are organized)
Verbs: analyze, appraise, break down, calculate, categorize, compare, contrast, criticize, diagram, differentiate, discriminate, distinguish, examine, experiment, identify, illustrate, infer, model, outline, point out, question, relate, select, separate, subdivide, test
Example: Discussing how fluids and liquids differ, detecting logical fallacies in a student's explanation of Newton's first law of motion

Synthesis (rearranging component ideas into a new whole)
Verbs: arrange, assemble, categorize, collect, combine, comply, compose, construct, create, design, develop, devise, explain, formulate, generate, integrate, manage, modify, organize, plan, prepare, propose, rearrange, reconstruct, relate, reorganize, revise, rewrite, set up, summarize, synthesize, tell, write
Example: Writing a comprehensive report on a problem-solving exercise, planning a program or panel discussion, writing a comprehensive term paper

Evaluation (making judgments based on internal evidence or external criteria)
Verbs: appraise, argue, assess, attach, choose, compare, conclude, contrast, defend, describe, discriminate, estimate, evaluate, explain, judge, justify, interpret, relate, predict, rate, select, summarize, support, value
Example: Evaluating alternative solutions to a problem, detecting inconsistencies in the speech of a student government representative
Sources: Gronlund, N.E. (1981). Measurement and Evaluation in Teaching, 4th ed. New York: Macmillan Publishing. McBeath, R.J. (Ed.). (1992). Instructing and Evaluating in Higher Education: A Guidebook for Planning Learning Outcomes. Englewood Cliffs, NJ: Educational Technology Publications.

AFFECTIVE learning is demonstrated by behaviors indicating attitudes of awareness, interest, attention, concern, and responsibility; the ability to listen and respond in interactions with others; and the ability to demonstrate those attitudinal characteristics or values which are appropriate to the test situation and the field of study.
Receiving (willingness to receive or attend)
Verbs: asks, chooses, describes, follows, gives, holds, identifies, locates, names, points to, selects, sits erect, replies, uses
Example: Listening to discussions of controversial issues with an open mind, respecting the rights of others

Responding (active participation indicating positive response or acceptance of an idea or policy)
Verbs: answers, assists, complies, conforms, discusses, greets, helps, labels, performs, practices, presents, reads, recites, reports, selects, tells, writes
Example: Completing homework assignments, participating in team problem-solving activities

Valuing (expressing a belief or attitude about the value or worth of something)
Verbs: completes, describes, differentiates, explains, follows, forms, initiates, invites, joins, justifies, proposes, reads, reports, selects, shares, studies, works
Example: Accepting the idea that integrated curricula are a good way to learn, participating in a campus blood drive

Organization (organizing various values into an internalized system)
Verbs: adheres, alters, arranges, combines, compares, completes, defends, explains, generalizes, identifies, integrates, modifies, orders, organizes, prepares, relates, synthesizes
Example: Recognizing own abilities, limitations, and values and developing realistic aspirations

Characterization by a value or value complex (the value system becomes a way of life)
Verbs: acts, discriminates, displays, influences, listens, modifies, performs, practices, proposes, qualifies, questions, revises, serves, solves, uses, verifies
Example: A person's lifestyle influences reactions to many different kinds of situations
PSYCHOMOTOR learning is demonstrated by physical skills: coordination, dexterity, manipulation, grace, strength, speed; actions which demonstrate the fine motor skills such as use of precision instruments or tools; or actions which evidence gross motor skills such as the use of the body in dance or athletic performance.
Perception (using sense organs to obtain cues needed to guide motor activity)
Verbs: chooses, describes, detects, differentiates, distinguishes, identifies, isolates, relates, selects, separates
Example: Listening to the sounds made by guitar strings before tuning them, recognizing sounds that indicate malfunctioning equipment

Set (being ready to perform a particular action: mental, physical, or emotional)
Verbs: begins, displays, explains, moves, proceeds, reacts, responds, shows, starts, volunteers
Example: Knowing how to use a computer mouse, having an instrument ready to play and watching the conductor at the start of a musical performance, showing eagerness to assemble electronic components to complete a task

Guided response (performing under guidance of a model: imitation or trial and error)
Verbs: assembles, builds, calibrates, constructs, dismantles, displays, dissects, fastens, fixes, grinds, heats, manipulates, measures, mends, mixes, organizes, sketches
Example: Using a torque wrench just after observing an expert demonstrate its use, experimenting with various ways to measure a given volume of a volatile chemical

Mechanism (being able to perform a task habitually with some degree of confidence and proficiency)
Verbs: (same list as for guided response)
Example: Demonstrating the ability to correctly execute a 60-degree banked turn in an aircraft 70 percent of the time

Complex or overt response (performing a task with a high degree of proficiency and skill)
Verbs: (same list as for guided response)
Example: Dismantling and re-assembling various components of an automobile quickly with no errors

Adaptation (using previously learned skills to perform new but related tasks)
Verbs: adapts, alters, changes, rearranges, reorganizes, revises, varies
Example: Using skills developed learning how to operate an electric typewriter to operate a word processor

Origination (creating new performances after having developed skills)
Verbs: arranges, combines, composes, constructs, creates, designs, originates
Example: Designing a more efficient way to perform an assembly line task
Program Review Evaluation Report by Curriculum Committee
Name of Program: ______________________

___ Individual Member Evaluation
___ Educational Programs Committee Summary Report

Date: ______________________

Rate the program on the criteria below using the attached three-level rubric. Enter comments for each criterion to provide helpful feedback to the program, pointing out strengths, weaknesses, and recommendations for improvement.

Criteria (indicate rating: 3 = strong, 2 = moderate, 1 = weak; add comments for each)
1. Evidence of effective strategic planning
2. Documented need for the program
3. Extent to which student outcomes assessment is used to improve learning
4a. Level at which performance indicator benchmarks are attained
4b. Degree to which decision-making is influenced by performance indicators
5. Level at which strengths, weaknesses, and opportunities are identified, addressed, and written into an action plan
6. Evidence of student learning
7. Evidence that students are successful after leaving the program
8. Evidence that program is using its advisory committee for feedback and suggestions
Total Score: ______

Note: The following ranges can be used in determining the overall recommendation given a program:
Effective Program = 21-24; Moderately Effective Program (Satisfactory with Improvements) = 16-20; Program Needs Improvement = 10-15. (Modified from JSRCC.)
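As a quick illustration of how the ranges above combine into an overall recommendation, here is a minimal sketch; the function name and the fallback for totals outside the stated ranges are my own assumptions, not part of the form:

```python
# Mapping a program's total rubric score to the recommendation ranges
# stated in the note above (21-24, 16-20, 10-15). Totals outside those
# ranges get a catch-all message, since the form does not define them.

def overall_recommendation(total: int) -> str:
    if 21 <= total <= 24:
        return "Effective Program"
    if 16 <= total <= 20:
        return "Moderately Effective Program (Satisfactory with Improvements)"
    if 10 <= total <= 15:
        return "Program Needs Improvement"
    return "Outside rubric ranges; review scoring"


print(overall_recommendation(22))  # Effective Program
```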
Program Evaluation Rubric for Educational Programs Committee

1. Evidence of effective strategic planning
Strong:
• Five years of annual plans which support the college mission
• Clear and measurable goals/objectives
• Annual assessment plan integrated in program plan
• Evidence of use of planning and assessment results for program improvements
Moderate:
• Some annual plans not evident
• Moderate support for college mission
• Some clear and measurable goals/objectives
• Some assessment plan integration in annual program plan
• Some evidence of use of results for program improvements
Weak:
• Little or no evidence of annual program planning
• Little or no connection to college mission
• Few or immeasurable goals/objectives
• Little or no evidence of assessment integration into annual program planning
• Little or no evidence of results being used for program improvements

2. Documented need for the program
Strong:
• Analysis of current labor market data shows strong need (local/regional data preferred)
• Analysis of future industry changes and need
• Sources are cited
• Enrollment trends are increasing
Moderate:
• Analysis of current labor market data shows moderate need (local/regional data preferred)
• Some analysis of future industry changes and need
• Some sources are cited
• Relatively flat or slightly declining enrollment trends
Weak:
• Analysis of current labor market data shows little or no need (local/regional data preferred)
• Little or no analysis of future industry changes or need
• Few or no sources cited
• Declining enrollment trends

3. Extent to which student outcomes assessment is used to improve learning
Strong:
• Five years of assessment planning and results
• Outcomes objectives are appropriate and measurable
• Outcomes objectives are specific and defined
• Evidence of results used for program improvements
Moderate:
• Some evidence of assessment planning and results
• Outcomes objectives are somewhat appropriate and measurable
• Outcomes objectives are somewhat specific and defined
• Some evidence of results being used for program improvements
Weak:
• Little or no evidence of assessment planning and results
• Outcomes objectives are not appropriate and/or measurable
• Outcomes objectives are not specific and/or defined
• Little or no evidence of results being used for program improvements

4a. Level at which performance indicator benchmarks are attained
Strong:
• Majority of benchmarks are met or exceeded
Moderate:
• Some benchmarks are met or exceeded
Weak:
• Few or no benchmarks are met or exceeded

4b. Degree to which decision-making is influenced by performance indicators
Strong:
• Strong evidence that benchmarks are used in program planning
Moderate:
• Some evidence that benchmarks are used in program planning
Weak:
• Little or no evidence that benchmarks are used in program planning

5. Level at which strengths, weaknesses, and opportunities are identified, addressed, and written into an action plan
Strong:
• Action plan clearly addresses identified strengths, weaknesses, and opportunities from assessment and planning results
Moderate:
• Action plan somewhat addresses identified strengths, weaknesses, and opportunities from assessment and planning results
Weak:
• No action plan, or action plan does not adequately address identified strengths, weaknesses, and opportunities from assessment and planning results

6. Evidence of student learning
Strong:
• Outcomes objectives provide clear and strong evidence of student learning
Moderate:
• Outcomes objectives provide some evidence of student learning
Weak:
• Outcomes objectives provide little or no evidence of student learning

7. Evidence that students are successful after leaving the program (such as a 75% employment-in-related-field benchmark)
Strong:
• Strong evidence of related job placements (O/T)
• Related employment benchmarks met or exceeded
• Employer benchmarks met or exceeded
• Strong certification pass rates (if applicable)
• Strong evidence of successful transfer (AA&S)
Moderate:
• Some evidence of related job placements (O/T)
• Related employment benchmark somewhat met
• Employer benchmarks somewhat met
• Moderate certification pass rates (if applicable)
• Some evidence of successful transfer (AA&S)
Weak:
• Little or no evidence of related job placements (O/T)
• Related employment benchmark not met
• Employer benchmarks not met
• Low certification pass rates (if applicable)
• Little or no evidence of successful transfer (AA&S)

8. Evidence that program is using its advisory committee for feedback and suggestions
Strong:
• Minutes show that committee meets at least twice a year
• Minutes reflect strong evidence that committee is involved in ongoing program review
• Strong evidence that committee recommendations are being addressed
Moderate:
• Minutes show committee meets at least once a year
• Minutes reflect some evidence that committee is involved in ongoing program review
• Some evidence that committee recommendations are being addressed
Weak:
• Limited or no minutes to document an active committee
• Little or no evidence that committee is involved in ongoing program review
• Little or no evidence that committee recommendations are being addressed
9 Principles of Go-od Practice fo~ Assessing Student Learning
'
··
'·
.
1. The assessment of student learning begins with educational values. Assessment is
not an end in itself bt,Jt a vehicle for educational improvement. Its effectlve practice, then,
begins with and enacts a vision of the kinds of learning we most value for students and
strive to help them achieve. Educational values should drive not only what we choose to
assess but also how we do so. Where questions about educational mission and values
are skipped over, assessment threatens to be an exercise in measuring what's easy,
rather than a process of improving what we really care about.
2. Assessment is most effective when it reflects an understanding of lea~ning as
multidimensional, Integrated, and revealed in performance over time. Learning is a
complex proce·ss. It entails _not only what students know but what they can do with what
they know; it involves not only knowledge and abilities but values, attitudes, and habits
of mind .that affect both academic success and performance beyond the classroom.
Assessment should reflect these understandings by employing a diverse array of
methods, including those that call for actual performance, using them <JVer time so as to
reveal change, growth, and increasing degrees of integration. Such an approach aims
for a more complete and accurate picture of learning, and therefore firmer bases for
improving our students' educational experience.
3. Assessment works best when the programs it seeks to improve have clear,
explicitly stated purposes. Assessment is a goal-oriented process. It entails comparing
educational ·performance with educational purposes and expectations -- those derived
from the institution's mission, from faculty intentions In program and course design, and
from knowledge of students' own goals. Where program purposes lack specificity or
agreement, assessment as a process pushes a campus toward clarity about where to
aim and what standards to apply; assessment also prompts attention lo where and how
program goals will be taught and learned. Clear, shared; implementable goals are the
cornerstone for assessment that is focused and useful. '
4. Assessment requires attention to outcomes but also and equally to the
experiences that lead to those outcomes. Information about outcomes is of high
importance; where students "end up" matters greatly. But to improve outcomes, we need
to know about student experience along the way -- about the curricula, teaching, and
kind of student effort that lead to particular outcomes. Assessment can help us
understand which students learn best under what conditions; with such knowledge
comes the capacity to improve the whole of their learning.
5. Assessment works best when it is ongoing, not episodic. Assessment is a process
whose power is cumulative. Though isolated, "one-shot" assessment can be better than
none, improvement is best fostered when assessment entails a linked series of activities
undertaken over time. This may mean tracking the progress of individual students, or of
cohorts of students; it may mean collecting the same examples of student performance
or using the same instrument semester after semester. The point is to monitor progress
toward intended goals in a spirit of continuous improvement. Along the way, the
assessment process itself should be evaluated and refined in light of emerging insights.
6. Assessment fosters wider improvement when representatives from across the
educational community are involved. Student learning is a campus-wide
responsibility, and assessment is a way of enacting that responsibility. Thus, while
assessment efforts may start small, the aim over time is to involve people from across
the educational community. Faculty play an especially important role, but assessment's
questions can't be fully addressed without participation by student-affairs educators,
librarians, administrators, and students. Assessment may also involve individuals from
beyond the campus (alumni/ae, trustees, employers) whose experience can enrich the
sense of appropriate aims and standards for learning. Thus understood, assessment is
not a task for small groups of experts but a collaborative activity; its aim is wider,
better-informed attention to student learning by all parties with a stake in its improvement.
7. Assessment makes a difference when it begins with issues of use and illuminates
questions that people really care about. Assessment recognizes the value of
information in the process of improvement. But to be useful, information must be
connected to issues or questions that people really care about. This implies assessment
approaches that produce evidence that relevant parties will find credible, suggestive,
and applicable to decisions that need to be made. It means thinking in advance about
how the information will be used, and by whom. The point of assessment is not to gather
data and return "results"; it is a process that starts with the questions of decision-
makers, that involves them in the gathering and interpreting of data, and that informs
and helps guide continuous improvement.
8. Assessment is most likely to lead to improvement when it is part of a larger set of
conditions that promote change. Assessment alone changes little. Its greatest
contribution comes on campuses where the quality of teaching and learning is visibly
valued and worked at. On such campuses, the push to improve educational performance
is a visible and primary goal of leadership; improving the quality of undergraduate
education is central to the institution's planning, budgeting, and personnel decisions. On
such campuses, information about learning outcomes is seen as an integral part of
decision making, and avidly sought.
9. Through assessment, educators meet responsibilities to students and to the
public. There is a compelling public stake in education. As educators, we have a
responsibility to the publics that support or depend on us to provide information about
the ways in which our students meet goals and expectations. But that responsibility goes
beyond the reporting of such information; our deeper obligation -- to ourselves, our
students, and society -- is to improve. Those to whom educators are accountable have a
corresponding obligation to support such attempts at improvement.
Authors
Alexander W. Astin; Trudy W. Banta; K. Patricia Cross; Elaine El-Khawas; Peter T. Ewell; Pat
Hutchings; Theodore J. Marchese; Kay M. McClenney; Marcia Mentkowski; Margaret A. Miller;
E. Thomas Moran; Barbara D. Wright. This document was developed under the auspices of the
AAHE Assessment Forum (Barbara Cambridge is Director) with support from the Fund for the
Improvement of Post-Secondary Education, with additional support for publication and
dissemination from the Exxon Education Foundation. Copies may be made without restriction.
AAHE site maintained by: Mary C. Schwarz mjoyce@aahe.org
Modification Date: Thursday, July 25, 1996.
PDCCC library
Teaching Resources & Assessment
Bibliography
Blythe, Hal, and Charlie Sweet. It Works for Me!: Shared Tips for
Teaching. Stillwater, OK: New Forums, 1998.
Blythe, Hal, and Charlie Sweet. It Works for Me, Too!: More Shared
Tips for Effective Teaching, Stillwater, OK: New Forums, 2002.
Boylan, Hunter. What Works: Research-Based Best Practices in
Developmental Education. Boone, NC: Appalachian State U, 2002.
Cushman, Kathleen. First in the Family: Advice about College from
First-Generation Students; Your College Years. Providence, RI:
Next Generation, 2006.
D'Errico, Deanna, ed. Effective Teaching: A Guide for Community
College Instructors. Washington: The American Association of
Community Colleges, 2004.
Farnsworth, Kent, and Teresa Bevis. A Fieldbook for Community
College Online Instructors. Washington: Community College
Press, 2006.
Friday, Bob. Create Your College Success : Activities and Exercises for
Students. Belmont, CA: Wadsworth, 1988.
Gallien Jr., Louis B., and Marshalita S. Peterson. Instructing and
Mentoring the African American College Student: Strategies for
Success in Higher Education. Boston: Pearson, 2005.
Jewler, A. Jerome, John N. Gardner, and Mary-Jane McCarthy, eds.
Your College Experience: Strategies for Success. Concise ed.
Belmont, CA: Wadsworth, 1993.
Holkeboer, Robert. Right from the Start : Managing Your Way to
College Success. Belmont, CA: Wadsworth, 1993.
Johnson, Elaine B. Contextual Teaching and Learning : What It Is and
Why It's Here to Stay. Thousand Oaks, CA: Corwin P, 2002.
Kanji, Gopal K. 100 Statistical Tests. 3rd ed. London: Sage, 2006.
Leamnson, Robert. Thinking About Teaching and Learning:
Developing Habits of Learning with First Year College and
University Students. Sterling, VA: Stylus, n.d.
Lieberg, Carolyn. Teaching Your First College Class: A Practical Guide
for New Faculty and Graduate Student Instructors. Sterling, VA:
Stylus, 2008.
Linehan, Patricia. Win Them Over: Dynamic Techniques for College
Adjuncts and New Faculty. Madison, WI: Atwood Publishing,
2007.
Mamchur, Carolyn. A Teacher's Guide to Cognitive Type Theory and
Learning Style. Alexandria: Association for Supervision &
Curriculum Development, 1996.
McGlynn, Angela P. Successful Beginnings for College Teaching:
Engaging Your Students from the First Day. Madison, WI: Atwood
Publishing, 2001.
Nilson, Linda Burzotta. Teaching at Its Best: A Research-Based
Resource for College Instructors. 2nd ed. Bolton, MA: Anker,
2003.
Palloff, Rena M., and Keith Pratt. The Virtual Student: A Profile and
Guide to Working with Online Learners. San Francisco: Jossey-Bass,
2003.
Palomba, Catherine A., and Trudy W. Banta. Assessment Essentials:
Planning, Implementing, and Improving Assessment in Higher
Education. San Francisco: Jossey-Bass, 1999.
Roueche, John E., and Suanne D. Roueche. High Stakes, High
Performance: Making Remedial Education Work. Washington:
Community College Press, 1999.
Roueche, John E., Eileen E. Ely, and Suanne D. Roueche. In Pursuit of
Excellence: The Community College of Denver. Washington:
Atwood, 2001.
Sarasin, Lynne C. Learning Style Perspectives: Impact in the
Classroom. Madison, WI: Atwood, 1999.
Sims, Ronald R., and Serbrenia J. Sims, eds. The Importance of
Learning Styles : Understanding the Implications for Learning,
Course Design and Education. Westport, CT: Greenwood, 1995.
Taylor, Terry. 100% Information Literacy Success. Clifton Park, NY:
Thomson, 2007.
Schuh, John H., M. Lee Upcraft, et al. Assessment Practice in Student
Affairs: An Applications Manual. San Francisco: Jossey-Bass,
2001.
Student Survival Guide. New York: College Entrance Exam Board,
1991.
Upcraft, M. Lee, and John H. Schuh. Assessment Practice in Student
Affairs: An Applications Manual. San Francisco: Jossey-Bass,
2000.
Vernoy, Mark, and Diana Kyle. Behavioral Statistics in Action. Boston:
McGraw-Hill, 2002.
Walvoord, Barbara E. Assessment Clear and Simple: A Practical Guide
for Institutions, Departments, and General Education. San
Francisco: Jossey-Bass, 2004.
Rubrics on the Internet -- A Selection of Twenty-Two Possibly Helpful Sites
"Far better an approximate answer to the right question, which is often vague, than an
exact answer to the wrong question, which can always be made precise." John Tukey
(Annals of Mathematical Statistics, 1962, vol. 33, pp. 1-67)
1. California State University, Fresno
http://www.csufresno.edu/cetl/assessment/assmnt.html
Site begins with the quote above and has two sections of interest -- Assessment
Links, which has a subheading for rubrics, and, on the home page, General
Education Scoring Guides, with its PDF section on Using Scoring Guides.
2. Eastern Illinois University
http://www.eiu.edu/~acaffair/2000assessmentplan.rtf
Site includes EIU's Plan for the Assessment of Student Learning. Within that PDF
file are some very nice Primary Trait Analysis rubrics.
3. Florida Atlantic University
http://iea.fau.edu/insl/airOO.pdf
While not rubrics per se, the Institutional Effectiveness Checklists (pp. 18-22) within
this PDF file can be very helpful in terms of assessing the scope of assessment!
4. North Dakota State University
http://www.ndsu.nodak.edu/ndsu/manncdon/assessment/assessment_techniques/homework_assignments.htm
For individual faculty, the Homework Assignments short article may be useful in
thinking about questions behind rubrics and includes bullets under Critical Thinking
that are essentially criteria that could easily anchor scoring in a rubric.
5. Northern Virginia Community College
http://www.nv.cc.va.us/assessment/authentic%20assessment.htm
This is a short introductory page on Performance Assessment, Authentic
Assessment, and Primary Trait Analysis. If you click on rubric, you go to a nice
template for rubric creation from San Diego State University's College of Education:
http://edweb.sdsu.edu/triton/july/rubrics/Rubric_Template.html
Many people like the notion of four scoring levels, which approximate four years of
college or four grades above failing.
6. Rose-Hulman Institute of Technology
http://www.rose-hulman.edu/irpa/Gloria/Revised_self-assessment_Summer2004.pdf
This is not a rubric for scoring student work, but it is a rubric for the assessor's work.
http://www.rose-hulman.edu/irpa/Gloria/Curriculum%20Map.PDF
This is not a rubric for scoring student work, but it is a checklist for understanding the
degree to which faculty are supporting the progression of student learning. This is
part of Rose-Hulman's impressive e-portfolio approach.
7. Southeast Missouri State University
http://www2.semo.edu/provost/assmt/rubric.htm
This is a tool for evaluating assessment plans/reports.
8. Southern Illinois University Edwardsville
http://www.siue.edu/~deder/assess/cats/pta.html
http://www.siue.edu/~deder/assess/cats/rubex.html
http://www.siue.edu/~deder/partrub.html
The first link gives a basic discussion of Primary Trait Analysis (PTA). The second is
a PTA example for scoring a science paper. The third is a rubric for evaluating
student participation, using both positive and negative attributes.
9. Towson State University
http://pages.towson.edu/assessment/rubrics_external_links.htm
Here is a robust array of helpful links and examples, including a number of rubrics
developed by Towson State faculty.
10. Higher Learning Commission (formerly North Central Association)
http://www.ncahigherlearningcommission.org/resources/assessment/AssessMatrix03.pdf
This is a matrix for evaluating an institution's assessment culture.
11.AAHE -American Association of Higher Education
http://www.aahe.org/assessment/web.htm#Rubrics
Although a few links inside this site section don't work, others are quite useful in
understanding how to create and use rubrics. For a quick course in rubrics, this site
is one of the most helpful. Included are these two meta-sites, with many examples:
http://members.tripod.com/~ozpk/01rubric
http://intranet.cps.k12.il.us/Assessments/Ideas_and_Rubrics/Intro_Scoring/intro_scoring.html
12. University of Washington
http://faculty.washington.edu/krumme/guides/bloom.html
Bloom's Taxonomy is a frequent source for differentiation and categorization of
verbs used to define and assess student learning. This is a great site on Bloom's.
13. From Ephraim Schechter's meta-site, the section on Assessment Rubrics:
• Examples of rubrics for general education outcomes from Bowling Green State
University. Also has links to other information about assessment rubrics.
• Examples of rubrics for general education outcomes from Brenau University.
Select Forms and Rubrics and follow the drop-down menus. Note: this site's
pages "lock" your browser: you can't use the Back button to return to the
Internet Resources list.
• Examples of rubrics for general education outcomes from California State
University, Fresno, with suggestions for developing and using rubrics. (Scroll
down to General Education Scoring Guides.)
• Hints for developing/designing rubrics, and links to examples, from Kansas State
University.
• Washington State University rubric for critical thinking.
• AAHE's list of rubric tools and guidelines.
• Sites designed for K-12 education, but useful as models and adaptable for higher
education performance assessments.
o Assessment Matters! has lots of rubric examples, plus other K-12-oriented
assessment links.
o Assessment and Rubric Information: Examples of evaluation
scales/rubrics for various student and faculty activities.
o Rubric generators create rubrics for various topics.
o Steps in creating an assessment rubric, from the WebQuest site at San
Diego State University.
14. NW Regional Educational Laboratory
http://www.nwrel.org/assessment/pdfRubrics/ReadingGrades4-12Rubrics.pdf
This is a good example set for how a rubric can be formulated to help students
self-assess. The rubrics are on Traits of an Effective Reader.
15.RubiStar
http://rubistar.4teachers.org/index.php
"RubiStar is a free tool to help a teacher make quality rubrics." Great site!
16. Chicago Public Schools
http://intranet.cps.k12.il.us/Assessments/Ideas_and_Rubrics/ideas_and_rubrics.html
This is a truly wonderful site, with drill-down how-to sections and The Rubric
Bank collection of actual rubrics. Even though it is K-12 in focus, the concepts
and categories are often applicable to higher education as well. Be sure to browse
this one!
17. University of Wisconsin - Stout
http://www.uwstout.edu/soe/profdev/rubrics.shtml
This one has a lot of links, a number of which are broken. The examples come from
both postsecondary and K-12 sources, with a focus on assignment rubrics. Includes
instructions for "How to Score a Rubric on a Computer Using Any Decent
Database" and "Assessment of Electronic Portfolios."
18. Mount Royal College, Alberta, Canada
http://www.mtroyal.ab.ca/cr/resources.php?mode=6&menu=3
Within this site, there are some enticing links: Creating a Rubric and Rubric
Template, Custom Rubric Generator, and The Rubric Machine.
19. Insight Assessment
http://www.insightassessment.com/pdf_files/rubric.pdf
This is a PDF file of the Peter A. and Noreen C. Facione Holistic Critical Thinking
Rubric, including a good explanation of what is involved in training raters.
20.Stylus Publishing
http://styluspub.com/resources/introductiontorubrics.aspx
This new site expands on Stevens and Levi's Introduction to Rubrics. It shows
several rubrics, but also invites readers to share their own rubrics and join in online
discussions of rubrics. The authors are faculty at Portland State University and their
e-mail addresses are listed.
21. College Board
http://www.collegeboard.com/student/testing/sat/prep one/essay/pracStart.html
Check out the new SAT Writing sample. The site offers both a sample prompt and
the Scoring Guide (rubric). What kind of rubric is it?
22. Washington State University
http://wsuctproject.wsu.edu/ctr.htm
You can review the WSU Critical Thinking Rubric at this site.