Eastern Michigan University
Peggy Liggit – Director of Academic Assessment (I)
Howard Booth – Professor of Biology
Mary Brake – Professor, Mechanical Engineering Technology
1
Peggy Liggit
2
Shared Assumption:
Conscious Competent Learning Model
1. Unconscious Incompetent
2. Conscious Incompetent
3. Conscious Competent
4. Unconscious Competent
Most faculty who achieve tenure are operating at this last level (unconscious competent) in their discipline.
3
“The connections made by good teachers are held not in their methods
but in their hearts – meaning heart in its ancient sense, as a place where
intellect and emotion and spirit and will converge in the human self.”
--Parker Palmer, Courage to Teach.
“You should never worry about your good ideas being stolen in
educational reform, because even when people are sincerely motivated
to learn from you, they have a devil of a time doing so.”
--Michael Fullan, Change Forces: The Sequel.
“One important feature of embedded assessment is that it “blurs the
lines” between teaching and assessment.”
--James J. Gallagher, Improving Science Teaching and Student
Achievement through Embedded Assessment.
4
Why There are Few ‘Good’ Models
Michael Fullan (1999, 2006)
Innovative ideas are “difficult to disseminate and replicate.”
Not easily transferable - difficult to capture “subtleties of the
reform practice.”
The inability to replicate another's model: "replicating the
wrong thing" – "the reform itself, instead of the conditions
which spawned the success."
Problems with scale: what works on a small scale may not work well on a
wider scale.
5
Please get out your pen and paper.
6
Reflect and write about a
misconception/problem in student learning
that you did something about.
What/how were you teaching at the time?
What activity were you doing to determine students'
thinking/reasoning?
What did students understand/not understand about
what you were teaching?
What did you do to help students better understand?
7
Care to share?
8
You have just documented a presumably
unconscious-competent activity.
Teach,
Assess,
Analyze, and
Adjust
= Embedded Assessment
9
Embedded Assessment Cycle: Teach → Assess → Analyze → Adjust (repeating)
Teach: Include an activity that intentionally focuses on a learning outcome and reveals what students are "thinking" – not just focusing on a final or correct answer.
Assess: Gather information about learning through work samples that expose students' ideas/skills, e.g., constructing representations of what they know.
Analyze: Analyze what students struggle with as learning happens in real time.
Adjust: Decide and implement next instructional moves.
"Blurring the lines" between instruction and assessment.
10
• Read how others have found misconceptions
• Try it yourself!
11
Embedded Assessment:
Identifying Misconceptions Exercise
We located ~10 diverse articles about misconceptions in
varied disciplines and looked for these trends:
1) What was the misconception?
2) How was the misconception discovered?
3) What was the suggestion on how to improve student
understanding about the misconception?
12
Example Articles:
Mapping the structures of resonance
(Malde. 2009. National Association of Teachers of Singing)
“Body Mapping helps people discover and correct
misconceptions about the way their bodies are built…”
13
Misconceptions Articles: Results
Misconceptions found through various activities:
performance or application activities,
disciplinary writing,
problem solving “show your work” exercises
Most remedies included instruction with an activity
(hands-on or some kind of problem set).
14
Embedded assessment supports the
“conscious competent” learning model
AND
serves as documentation for reporting
on accountability,
…thus humanizing the process.
15
Writing Better Exam Essays
in Human Physiology
Howard Booth
16
HUMAN PHYSIOLOGY TEACHING IMPROVEMENT PROJECT
[Teach–Assess–Analyze–Adjust cycle graphic]
History and evolution into a SOTL project
My conviction: science students need to communicate in their discipline.
Revealing misconceptions: Student writing is a prime place to assess their
understanding of content, but poor writing skills distort this
assessment and lower exam scores disproportionately.
(Good students were losing many essay points due to poor writing.)
The challenge: good writing is hard work for teachers and students.
Traditional resolutions: ignore it, require it, or teach it.
Hypothesis: If students understood and applied good organizational and
proofreading techniques to their writing, essay scores would move up to be
comparable to the rest of the exam and provide a truer assessment of their learning.
Goal of the project: to improve the Big Essay scores (worth 25% of the exam; 4 exams per semester)
- through a focused effort to teach students better essay-writing skills
- through a secondary focus on improving organization and coherence
Student Learning Outcome 2: Students will communicate scientific knowledge and concepts in written form.
Student Learning Outcome 3: Students will synthesize scientific knowledge in higher-level thinking problems.
17
TEACH Phase 1: Getting Organized – The SOTL Experiment
Control group: student exam and essay scores from 2008 (before the essay focus)
Experimental group: Fall 2009 and Winter 2010 classes
- All students: essay scores and exam scores
- Subgroup: student scores from the Rewrite Workshop
Timeline – Methods and materials
Pre exam 1: all-lecture instructions on "how to take the exam" (5 min)
Post exam 1:
- All-lecture illustration of an excellent essay answer (10 min)
- Essay rewrite workshop, for those "performing poorly" or who "want to do better":
  instruction and discussion on writing & organization skills (40 min)
  rewrite of the exam 1 essay: take-home, self-timed (30 min)
  re-grade: scores, written comments & personal discussion
Post exam 2: lecture discussion/illustration of a concept map of an essay (15 min)
Pre exam 4: opinion survey on the rubric & essay scoring;
inclusion of an additional rubric statement
18
ASSESS: Essay 1 Example
"Bill", a good student, earned 89% = "A-" on the non-essay part of the exam,
but a poor essay score: 16 of 28 = 57% = "D".
19
TEACH Phase 2: The Workshop – Doing the "Experiment"
Workshop Highlights
• Blueprint – how to write a good paragraph: theme introduced in the
topic sentence, supporting sentences in logical order, a conclusion.
Science writing is concise, detailed, accurate, and covers the breadth of the theme.
• Building blocks – analyze the question (see printout) to find out what to include:
the main theme, the mentioned areas of interest, the required 9 terms/phrases.
• Organizing the construction site – organize a logical sequence:
outline areas of interest and terms; use a flow diagram & concept map.
• Building it – apply the above; also, some hints on good construction:
set aside enough time; avoid "wordy" superficial statements;
use the space provided – not more, not less; write neatly (14–18 point:
too small is hard to read, too large wastes space); spell correctly.
• Building inspection – proofread: are all "areas of interest" included?
Are terms used correctly, explained, circled and numbered?
Match your essay with the rubric – avoid losing unnecessary points.
• Avoid building disasters – study! If your blocks crumble (if you don't know the content), good writing won't help much.
20
ASSESS: "Bill's" rewrite scored 92% on the rewrite –
34.9% better, from a "D" up to an "A-".
21
ANALYZE: What we were looking for and what we found – The Results
• The quality of the rewrites improved: F'09, from 13/28 (46%) to 23/28 (82%), a gain of 36 percentage points.
• Improvement carried into exam 2 and beyond: 13/28 to 19.8/28 (up 70.1%) on exam 2.
• Overall exam scores moved up over 5%, from 74.4% to 79.7%.
• Numerous positive student comments about the experience and their increased
comfort with essay writing: "I loved the essay format – very helpful" (F'09).
• Participation increased from 15/60 (25%) in F'09 to 38/61 (62%) in W'10, up 37 percentage points.
• A rubric opinion survey assessed student misconceptions or concerns about scoring.
• Change was greatest in the poorest writers; about 65% improved dramatically.
Data Chart of Essay Score Averages before (W'08) and now (W'10)

Group (essay average)           Exam 1   Exam 2   Exam 3   Exam 4
W'10 all students (/28)          17.8     18.3     18.5     19.8
W'10 all students (%)            63.5%    65.2%    66.1%    70.8%
W'10 workshop students (/28)     16.8     21.0     22.0     22.0
W'10 workshop students (%)       59.9%    75.5%    78.6%    77.8%
W'08 control, all essays (%)     61.7%    63.3%    63.0%    66.1%

(Scores are the number out of 28 possible and the corresponding percentage.)
And see the graph on the next slide.
22
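As a quick arithmetic sketch of how the chart's columns relate (assuming each percentage is simply points out of 28; the slide's own percentages appear to come from unrounded averages, so a few differ by a tenth of a point):

```python
# Quick check of the essay-score chart: percentages are points out of 28.
# Values are the W'10 averages reported on the slide (out of 28 possible).
all_students = [17.8, 18.3, 18.5, 19.8]   # W'10 all students, exams 1-4
workshop = [16.8, 21.0, 22.0, 22.0]       # W'10 workshop students, exams 1-4

def to_percent(points, possible=28):
    """Convert a raw essay score to a percentage of points possible."""
    return round(100 * points / possible, 1)

all_pct = [to_percent(p) for p in all_students]
workshop_pct = [to_percent(p) for p in workshop]

# Percentage-point gain from exam 1 to exam 4 for each group
gain_all = all_pct[-1] - all_pct[0]            # modest improvement
gain_workshop = workshop_pct[-1] - workshop_pct[0]  # largest improvement
```

This reproduces the chart's pattern: the workshop group starts lowest on exam 1 and finishes highest, with most of its gain between exams 1 and 2.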
ANALYZE: Results Graph and Summary

[Graph of essay score averages (0–90%) across exams 1–4 for three series:
W'08 control, W'10 all students, W'10 workshop students.]

RESULTS SUMMARY
1. W'10 workshop participants improved most, from a D+ to a B.
2. W'10 all students improved modestly, from a C- to a C+.
3. W'08 control students improved the least, from a C- to a C.
4. Workshop participants improved most from exam 1 to exam 2
and continued to improve in exams 3 & 4.
5. All W'10 students benefited compared to the W'08 control.
6. Essay scores are no longer "disproportionately low":
workshop students 5% above, all W'10 students at, overall exam averages.
23
ANALYZE: Rubric Survey Results

Opinion Survey on Big Essay Points Distribution (scoring rubric)
"I completed the survey for a bonus pt. Name: ________" (tear off or cut here)
Please circle the letter before the statement to indicate your level of agreement with it:
a = strongly agree (5 pt), b = agree (4 pt), c = neither agree nor disagree (3 pt),
d = disagree (2 pt), e = strongly disagree (1 pt).
(Any comments explaining your thoughts would also be very helpful. Thanks.)
40 students responded (out of 53 still attending near the end of the semester).

Conclusions:
1. Understanding the essay scoring is not much of a problem.
2. It's easy to fix it for the few who were concerned (add a statement to the exam).

1. (a b c d e) As I read the essay question itself, I think I have a clear idea of how I can earn full points.
Results: 4 @ 2 pt, 3 @ 3 pt, 24 @ 4 pt, 9 @ 5 pt = average 4.0 out of 5.0
Analysis: Most didn't see rubric scoring as a problem, but 7 (20%) did have concerns here.
Comments: "very clear"; "the essay questions are usually straightforward and you know how they are supposed to be answered"; "once I went to the workshop I was able to better understand how the essay was to be written"

2. (a b c d e) A separate checklist and explanation, not part of the question itself, explaining point distribution (scoring rubric), would make this process more understandable.
Results: 1 @ 1 pt, 5 @ 2 pt, 9 @ 3 pt, 18 @ 4 pt, 7 @ 5 pt = average of 3.6 out of 5.0
Analysis: Quite a strong feeling this would help. But reading the comments left me thinking this wasn't very important to them. ("can't hurt")
Comments: "good idea, this will allow students to have an exact account for all points"; "your scoring method is already clear"; "helpful for first exam, I felt very uncomfortable writing it because I had no clue what to expect. A grading rubric would let me know what you were expecting"

3. (a b c d e) Confusion about how I earn the essay points concerns me as I write the essay ("not knowing" where you might gain or lose points interferes with your thought progression as you are planning and writing the essay).
Results: 2 @ 1 pt, 13 @ 2 pt, 9 @ 3 pt, 8 @ 4 pt, 5 @ 5 pt = average of 2.9 out of 5
Analysis: Near neutral; no strong feeling or comments. Not much of an issue for most.
Comments: "While I'm writing the essay I'm just trying to do the best I can. I'm not worried about the points."; "No, depth of answer and lack of time are my concerns"; "This can be troubling because the point system isn't cut and dry"

4. (a b c d e) I think more information on the scoring rubric would improve my essay scores.
Results: 3 @ 1 pt, 7 @ 2 pt, 14 @ 3 pt, 14 @ 4 pt, 1 @ 5 pt = average of 3.1 out of 5
Analysis: Again almost neutral overall, but about 30% at least favored the idea.
Comments: "more information never hurts"; "a scoring rubric doesn't enhance or inhibit my writing."

5. Any other suggestions?
"A grading rubric at the beginning of the semester would be helpful, I felt very nervous about the first essay; after that it gets easier"
24
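The survey's reported averages can be reproduced from the tallies (for example, question 1: 4×2 + 3×3 + 24×4 + 9×5 = 158 points over 40 responses, which rounds to 4.0). A minimal sketch of that weighted-mean computation:

```python
# Check of the reported Likert averages (a = 5 pt ... e = 1 pt) from the
# rubric survey. Tallies are taken from the survey's "Results" lines.
def likert_average(tallies):
    """Weighted mean of a {point_value: response_count} tally."""
    total = sum(pts * n for pts, n in tallies.items())
    n_resp = sum(tallies.values())
    return total / n_resp

q1 = {2: 4, 3: 3, 4: 24, 5: 9}        # reported average: 4.0
q2 = {1: 1, 2: 5, 3: 9, 4: 18, 5: 7}  # reported average: 3.6

avg1 = likert_average(q1)  # 158 / 40 = 3.95, rounds to 4.0
avg2 = likert_average(q2)  # 145 / 40 = 3.625, rounds to 3.6
```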
ADJUST: Continuous fine-tuning
1. Already have incorporated a series of changes:
• Three "editions" of workshop notes: a half-page outline of my
talking points; a full page on writing; a two-page handout on
writing, organizing, proofreading and the rubric.
• Added a concept map to the lecture presentation (W'10).
• Expanded the "workshop" to a take-home option (W'10).
• Last week, upgraded the handout and sent e-copies to all
of my spring students as a pre-exam-1 study sheet.
• Plan to subdivide workshop participants into working
groups this spring.
25
Conclusions – Yes, teach it again!
The experiment was a success:
-hypothesis supported
-embedded data collected
-students engaged & learning outcome goals were met
-essay scores improved and now more accurately
reflect learning (no longer disproportionately low)
-improved essays: better assessment of misconceptions
Cost/Benefit Analysis
-costs: clocked about 10 hours per semester
-benefits:
  students write better – they realize & appreciate it – "you care"
  the course & the department get quantified assessment
  I followed my convictions – teaching "good" writing in science
  improvement keeps teaching interesting and enjoyable
Bottom line: It was, and is, a success at many levels.
26
Open Discussion/Questions
27
Mary Brake
28
IMPLEMENT
1.
Program Review
The model: Continuous Improvement
The goal:
Establish student learning outcomes for program, and
Show that students have met these outcomes.
2.
Method
Link program outcomes to courses
Combine multiple program outcomes into groups
Use multiple methods to evaluate program outcomes
Use embedded assessment to evaluate course objectives
29
ASSESS
Use embedded assessments to:
- Assess whether students can apply concepts learned in previous
courses that are needed to understand more complex
concepts in follow-on courses.
- Assess for concepts learned in a particular course.
- Assess ability to combine multiple concepts to
show that students have mastered “the knowledge,
techniques, skills and modern tools of MET”
30
ANALYZE CONCEPTS TAUGHT IN
PREVIOUS COURSE(s)
Example: In Fluid Mechanics students must remember how to use
certain concepts from Statics and Dynamics.
HW Question: See next slide
Concept: In addition to applying the general concepts of “submerged
bodies”, students must use the concept of taking a moment about a
particular point. (Think torque)
Assessment: use a rubric to determine students' ability to apply previous,
often theoretical, concepts to this real problem. This gives an idea of
the overall class understanding.
Note: The grade on this question may or may not match “grade” the student receives on the rubric.
31
Example Problem using Prior
Knowledge
A rectangular gate is installed in a vertical wall of a reservoir. Compute the magnitude
of the resultant force on the gate and the location of the center of pressure. Also
compute the force on each of the two latches.
From “Applied Fluids Mechanics” by Robert L. Mott (2006) Prentice Hall
32
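The slide does not reproduce the textbook's dimensions, so the sketch below assumes an illustrative gate (0.6 m wide by 0.8 m high, top edge 1.5 m below the surface, hinged along the bottom edge with the two latches at the top). The method itself is the standard one for this kind of problem: resultant hydrostatic force F = γ·h̄·A, center-of-pressure offset I_c/(h̄·A), then a moment balance about the hinge:

```python
# Hydrostatic force on a vertical rectangular gate, and latch forces found
# by taking a moment about the hinge. Dimensions are illustrative only.
GAMMA = 9810.0          # specific weight of water, N/m^3
w, H = 0.6, 0.8         # gate width and height, m (assumed)
d_top = 1.5             # depth of the gate's top edge, m (assumed)

A = w * H                       # gate area, m^2
h_c = d_top + H / 2             # depth of the gate's centroid, m
F_R = GAMMA * h_c * A           # resultant hydrostatic force, N

I_c = w * H**3 / 12             # second moment of area about the centroid
h_p = h_c + I_c / (h_c * A)     # depth of the center of pressure, m

# Moment balance about the hinge (bottom edge, depth d_top + H):
# F_R acts at h_p; the two latches at the top edge supply the balancing force.
arm = (d_top + H) - h_p         # lever arm of F_R above the hinge, m
F_latches = F_R * arm / H       # total latch force, N
F_each = F_latches / 2          # force on each of the two latches, N
```

This is exactly the "take a moment about a particular point" step the slide highlights: the fluid mechanics gives F_R and h_p, but the latch forces only fall out of the statics.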
Example of student who knows the fluid
mechanics but not how to take a moment
The student did not find the force on the latches because they did not
remember or did not know how to take the moment.
Used with permission
33
Example of student who knows the fluid
mechanics and prior knowledge
Used with permission
34
ASSESS PRIOR KNOWLEDGE

Rubric:
0 – Doesn't understand the fluid concepts, nor that a moment is needed to solve the problem.
1 – Can solve the fluid mechanics but doesn't see that the moment equation is needed.
2 – Can set up the fluid mechanics and knows the moment equation is needed, but can't remember how to solve it.
3 – Knows that the moment equation is needed and knows how to apply it in solving fluid mechanics problems.

[Tally table: students 1 through 15 each marked with an x at one rubric level.]
35
ANALYZE AND ADJUST
ANALYZE
Class average = 2
Conclusion: In general, students knew the fluid mechanics and how to set up
the problem, but they either remembered the moment equation and how to solve it
or did not remember the moment equation at all.
ADJUST
Review how to apply the moment equation to a simple problem, then assign
practice problems using the moment equation. Discuss with colleagues how to
spend more time on this concept in prerequisite courses.
Just a thought – carrots work better than sticks:
give a small amount of extra credit for the homework. It is amazing what
students will do for extra credit!
36
ANALYZE CONCEPTS TAUGHT IN
CURRENT COURSE(s)
Example: Students think they understand turbulent flow and where it
occurs, but do they?
Exam Question: Draw the flow lines and turbulence (if it exists) for the
following situations.
Concept: When turbulence occurs, it occurs around back edges of objects
(usually).
Assessment: Have students draw the flow lines around two simple objects for
laminar and turbulent flow.
37
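The slide leans on students' visual intuition for laminar versus turbulent flow; the quantitative companion (standard in introductory fluid mechanics, though not shown on the slide) is the Reynolds number. A minimal sketch for pipe flow:

```python
# Classify pipe flow as laminar or turbulent using the Reynolds number:
# Re = v * D / nu  (velocity x diameter / kinematic viscosity).
def reynolds(v, D, nu):
    """Reynolds number for flow at velocity v (m/s) in a pipe of diameter D (m)."""
    return v * D / nu

def regime(Re):
    """Conventional pipe-flow cutoffs: <2000 laminar, >4000 turbulent."""
    if Re < 2000:
        return "laminar"
    if Re > 4000:
        return "turbulent"
    return "transitional"

# Water (nu ~ 1.0e-6 m^2/s) at 2 m/s in a 50 mm pipe -> strongly turbulent
Re = reynolds(2.0, 0.05, 1.0e-6)
```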
Student who understands the difference between
turbulent and laminar flow and where turbulence occurs.
38
Student who does not understand turbulent flow
or even how to draw an air foil
Used with permission
39
ASSESS COURSE CONTENT KNOWLEDGE

Rubric:
0 – Can't draw correct flow lines or areas of turbulence.
1 – Can draw flow lines but not air foils.
2 – Can correctly draw flow lines but does not know where turbulence occurs.
3 – Can correctly draw flow lines, air foils, and the position of turbulence.

[Tally table: students 1 through 14 each marked with an x at one rubric level.]
40
ANALYZE AND ADJUST
ANALYZE
Class average = 2.5
Conclusion: Students have a fundamental grasp of the
concept of flow lines and how the flow travels around an
object. A few don't understand where turbulence occurs.
ADJUST
Implement additional experiments to increase visual
knowledge.
41
ASSESS MASTERY OF SEVERAL
CONCEPTS IN A SENIOR COURSE
Example/Concept: In "Mechanical Vibrations" students need to
combine their knowledge of statics, dynamics, and strength of
materials to learn how to analyze systems that vibrate.
Essay : Please type a one to two page essay describing a vibration problem
you see at work (or have seen in the past) and answer the following:
“If you were in charge of fixing this problem, what would you do or if the
problem was fixed, what was done?”
Assessment: Does their essay illustrate a knowledge of vibrations and the
underlying concepts?
42
Gather Evidence of What a Student Knows
“CMM stands for Coordinate Measuring Machine and it is a quick and
accurate means of dimensionally checking manufactured parts to be within the
tolerances specified by the correct controlled print…. During plant layout
planning I was requested for input on needs for the CMM lab. I did a layout of
our lab that included … a request (for) … CMM on an isolated foundation.
Nothing was done as I requested.
“We got (a) … a stamping press. The press is a large mechanical
slanted press and is just outside of the CMM lab. The press stamps parts in a
zigzag direction, which means that it moves in the vertical and horizontal
directions during operation. This caused an increase in vibration of the plant
floor.
“I (did) a study on how much the vibration (affected) the CMM … when
calibration of probe … a variance of target measurement of 0.2 mm (was
observed and) when (the stamping) press was not running, a variance of
0.00005 mm (was measured).
“I proposed the solution was to move the CMM and cut the floor
where it stood. Remove the concrete. Dig a new foundation 6 feet deep ... A gap
would be formed and filled with a rubber isolator between the new foundation
and the plant floor. Then add Anti Vibration Mounts to the points of contact of
CMM. The results of this would be very low natural frequency (2–3 Hz) and the
disturbing frequency (greater than) 4Hz would (be) reduced by 90% …”
Used with permission
43
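The essay's isolation strategy (pushing the natural frequency well below the disturbing frequency) can be sketched with the classic undamped transmissibility formula T = 1 / |1 - r²|, where r = f_disturb / f_natural. The numbers below are illustrative, not the student's:

```python
# Undamped vibration transmissibility: fraction of base vibration transmitted.
# Isolation only helps when the frequency ratio r exceeds sqrt(2).
def transmissibility(f_disturb, f_natural):
    """Undamped transmissibility T = 1 / |1 - r^2| with r = f_disturb / f_natural."""
    r = f_disturb / f_natural
    return 1.0 / abs(1.0 - r * r)

# A softer mounting lowers f_natural and improves isolation (illustrative values):
T_stiff = transmissibility(4.0, 2.5)  # r = 1.6  -> most vibration still transmitted
T_soft = transmissibility(4.0, 1.2)   # r ~ 3.3 -> roughly 90% reduction
```

This is why the proposed isolated foundation and rubber isolators target a natural frequency far below the press's disturbing frequency.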
ANALYZE AND ADJUST
ANALYZE
All students gave great examples and were able to describe
the problem and a solution.
All students were so enthusiastic about their essay that each
and every one of them requested to read their essay aloud to
the class and give additional commentary.
ADJUST
No adjustments necessary, yet. Use this assignment again!
44
Take Home – Embedded Assessment:
• Fosters professional development: shifts the unconscious competent to
conscious reflection;
• As a narrative, it "captures" improvements for accountability;
… thus, humanizing assessment.
Unconscious Competent → Conscious Reflection
Awareness of continuous improvement of student learning
45
Contact Information
Peggy Liggit pliggit@emich.edu
Director of Academic Assessment (I)
Office of Institutional Effectiveness and Accountability
234 McKenny
Ypsilanti, MI 48197
734-487-0199
Howard Booth hbooth@emich.edu
Professor of Biology
Department of Biology
316 Mark Jefferson
Ypsilanti, MI 48197
734-487-4391
Mary L. Brake mbrake@emich.edu
Professor, Mechanical Engineering Technology
School of Engineering Technology
118 Sill Hall
Eastern Michigan University
Ypsilanti, MI 48197
734-487-2040
46
Open Discussion/Questions
47
Embedded Assessment: Catching Misconceptions in Student Learning
AND
Capturing Program Improvement Efforts
SOTL Panelists 5/18/2010 Session E2
Panel Agenda
Introductions and Overview
The Conscious Competent Learning Model/Embedded
Assessment Complement
Participants share how they have captured student
misconceptions
(use the space below for your notes):
Embedded assessment applied to an individual course
Open Discussion/Questions
Embedded Assessment used to capture program improvements
Open Discussion/Questions
*If you would like an electronic copy of our presentation, please contact Peggy via email.
48