C16 – Tools for Assessing Learning in Engineering
Teri Reed-Rhoads
Purdue University, School of Engineering Education
Inventions and Impact 2: Building Excellence in Undergraduate Science, Technology, Engineering, and Mathematics (STEM) Education
13-15 August 2008, Washington, DC

Engineering Education Grant Writing Workshop
ASEE Section Meeting, April 3, 2008
Teri Reed-Rhoads, Purdue University
trhoads@purdue.edu
Assessment
• Formative – checking along the way
• Summative – final analysis of achieving research goals
• Consider an outside evaluator – unbiased, measurable feedback
• Consider an external advisory board
• Give details, actual tools, timelines, etc.
Assessment
• Course
• Curriculum
• Program
• Project
Types of Assessment
– Cognitive Domain
  • Content knowledge (Concepts)
  • Process knowledge, e.g., teamwork (Skills, Abilities)
– Student interest, perceptions, and attitudes (Affective)
– Rate change (Achievement)
PD’s Response – Goals on Cognitive Changes
• Goals on cognitive changes
  – Increase understanding of concepts
    • Ability to solve statics problems
    • Ability to construct free-body diagrams
    • Ability to describe verbally the effect of external forces on a solid object
  – Increase processing skills
    • Ability to solve out-of-context problems
    • Ability to visualize 3-D problems
    • Ability to communicate technical problems
Connie Della-Piana, Russ Pimmel, Bev Watford, “Project Evaluation”, Workshop for Faculty from Minority Serving Institutions, Feb. 8–10, 2006.
Concept Inventories
https://engineering.purdue.edu/SCI/workshop/tools.html
• Engineering Related – 20 instruments
• Science and Mathematics Related – 8 instruments
• Physics and Astronomy Specific – 12 instruments
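Concept inventories like those listed on the next slides are usually given before and after instruction so that the cognitive changes named on the previous slide can be measured. As a minimal sketch of how class-level change might be summarized, assuming a 30-item instrument and using invented scores (nothing below comes from the slides):

```python
# Minimal sketch (invented data): summarizing pre/post scores on a concept
# inventory for one class section. Assumes a 30-item instrument; the scores
# are illustrative only.

pre_scores  = [11, 14, 9, 17, 13, 10, 15, 12, 8, 16]    # items correct out of 30, pre-course
post_scores = [18, 22, 15, 25, 20, 17, 23, 19, 14, 24]  # same students, post-course

n_items = 30

def mean(xs):
    return sum(xs) / len(xs)

pre_pct = 100 * mean(pre_scores) / n_items
post_pct = 100 * mean(post_scores) / n_items

print(f"Class average, pre:  {pre_pct:.1f}% correct")
print(f"Class average, post: {post_pct:.1f}% correct")
print(f"Average change:      {post_pct - pre_pct:.1f} percentage points")
```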
Engineering Related CI
1. Chemistry
2. Circuits
3. Computer Engineering
4. Computer Science
5. CS (Intro)
6. Discrete Math (CS)
7. Dynamics
8. Electromagnetics
9. Electronics
10. Fluid Mechanics
11. Heat Transfer
12. Materials
13. Nanotechnology
14. Signals and Systems
15. Statics
16. Statistics
17. Strength of Materials
18. Thermal and Transport Sciences
19. Thermodynamics
20. Waves
Science and Mathematics CI
1. Biology
2. Calculus - Nelson
3. Calculus - Epstein
4. Chemical Equilibrium
5. Geosciences
6. Natural Selection
7. Organic Chemistry
8. Physics*
Physics and Astronomy CI*
1. Force Concept Inventory
2. Mechanics Baseline Test
3. Astronomy Diagnostic Test
4. Brief Electricity and Magnetism Assessment
5. Conceptual Survey in Electricity and Magnetism
6. Diagnostic Exam Electricity and Magnetism
7. Determining and Interpreting Resistive Electric Circuits Concept Test
8. Energy and Motion Conceptual Survey
9. Force and Motion Conceptual Evaluation
10. Lunar Phases Concept Inventory
11. Test of Understanding Graphs in Kinematics
12. Light and Color
*Rebecca S. Lindell, Elizabeth Peak and Thomas M. Foster, “Are They All Created Equal? A
Comparison of Different Concept Inventory Development Methodologies”, 2006 Physics Education
Research Conference Proceedings, American Institute of Physics.
Goals on Process Knowledge
• Teamwork
– Team Effectiveness (Imbrie)
– Team Developer (McGourty)
• Communications
– 3-D Visualization
– Etc.
• Leadership
PD’s Response – Goals on Affective Changes
• Goals on affective changes
  – Improve students’ attitude about
    • Profession
    • Curriculum
    • Department
  – Improve students’ confidence
  – Improve students’ intellectual development
Connie Della-Piana, Russ Pimmel, Bev Watford, “Project Evaluation”, Workshop for Faculty from Minority Serving Institutions, Feb. 8–10, 2006.
Affective Measures
• Pittsburgh Freshman Engineering Survey (Besterfield-Sacre)
• Attitudes Toward Statistics (Wise)
• Computer Science Attitude Survey (Moskal)
• Structured Interviews
Table 4: Sample Factors from Pilot Interviews
Sample factors identified in pilot study interviews with ten students majoring in Industrial Engineering at OU:

I. Student's Background
  a. strong desire for career with people orientation
  b. strong interest in math and science
  c. exposure to science and engineering via family
  d. exposure to computers at home and in school
II. Attributes of the Institution
  a. scholarship opportunities
  b. student organizations and networking
  c. OU is close to home
  d. like OU
III. Attributes of the Field
  a. switch from chemical or petroleum engr
  b. people oriented and human factors
  c. useful to public
  d. challenges, opportunities, and flexibility
IV. Pedagogy/Curriculum
  a. interaction with faculty
  b. interactive and group work
  c. feedback from faculty
  d. student work has relation to real life
V. Department Culture
  a. interaction between faculty
  b. women role models
  c. faculty invite participation in projects
  d. faculty are approachable
VI. Student's Future
  a. graduate school
  b. balance between family and work
  c. seeks work for company w/ community involvement and flexibility
  d. developing other interests, e.g., music
Consider a Graphic
[Graphic omitted: diagram of interrelationships among the factor categories (I–VI) and factors (a–d) from Table 4, grouped into (A) factors related to people interests and (B) factors related to STEM interests.]
Figure 1: Complex Interrelationships Among Factors, Based on Pilot Interviews
PD’s Response – Goals on Achievement Rate Changes
• Goals on achievement rate changes
  – Improve
    • Recruitment rates
    • Retention or persistence rates
    • Graduation rates
Connie Della-Piana, Russ Pimmel, Bev Watford, “Project Evaluation”, Workshop for Faculty from Minority Serving Institutions, Feb. 8–10, 2006.
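The rate goals above are computed from cohort records. As a minimal sketch, assuming hypothetical enrollment counts (none of the numbers or field names below come from the slides):

```python
# Minimal sketch (hypothetical data): computing retention/persistence and
# graduation rates for an entering engineering cohort. The counts below are
# invented for illustration only.

cohort = {
    "entering_fall_2002": 300,   # first-time, full-time engineering majors
    "enrolled_fall_2003": 246,   # still in an engineering major one year later
    "graduated_by_2008": 165,    # earned an engineering BS within 6 years
}

def rate(part: int, whole: int) -> float:
    """Return a percentage, guarding against an empty cohort."""
    return 100.0 * part / whole if whole else 0.0

one_year_retention = rate(cohort["enrolled_fall_2003"], cohort["entering_fall_2002"])
six_year_graduation = rate(cohort["graduated_by_2008"], cohort["entering_fall_2002"])

print(f"1-year retention rate:  {one_year_retention:.1f}%")   # 82.0%
print(f"6-year graduation rate: {six_year_graduation:.1f}%")  # 55.0%
```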
Know the Motivation or Background
Table 1: Proportion of Women in Selected STEM Fields

Participating Departments | OU students(b)   | Students nationwide(d) | OU faculty(e) | Faculty nationwide
IE                        | 47/84 (55%)      | 25%                    | 4/10 (40%)    | 10% (f)
Chem E(a)                 | 126/325(c) (39%) | 33%                    | 1/15 (6%)     | 8% (f)
Mathematics               | 24/75 (32%)      | 46%                    | 3/29 (10%)    | 19% [13]
Physics                   | 16/75 (21%)      | 20%                    | 4/28 (14%)    | 8% (g)
CS(a)                     | 46/312 (15%)     | 24%                    | 2/12 (17%)    | 11% (h)
AME(a)                    | 45/338 (13%)     | 12%                    | 1/15 (6%)     | 6% (f)


• URMs: 30% of population, few in labor force
• Women: 50% of population, 10% in labor force
# of BS Engineering Degrees and % of Freshman Engineering Class

                             | # of BS Engineering Degrees                 | % of Freshman Engineering Class
                             | 1966       | 1977         | 2002           | 1975 | 2001
Under Represented Minorities |            | 1,915 (4.8%) | 7,971 (11.6%)  | 17.4 | 15.8
Women                        | 146 (0.4%) | 1,961 (4.9%) | 14,102 (20.5%) | 19.9 | 18.3

Chubin and Babco (2003), Walking the Talk in Retention to Graduation
Engineering Disciplines Retention Rates

Engineering Discipline | Proportion Women | Proportion URM | Retention Rate (URM) | Retention Rate (Non-URM)
Chemical               | 32.4             | 11.5           | 35.0                 | 48.1
Civil                  | 20.1             | 13.2           | 37.9                 | 44.2
Computer               |                  | 8.8            | 47.6                 | 58.7
Electrical             | 11.8             | 14.3           | 36.3                 | 47.2
Industrial             | 29.4             | 17.7           | 44.2                 | 51.3
Mechanical             | 12.0             | 11.0           | 38.0                 | 48.7
Total                  | 19.0             | 11.7           | 38.9                 | 49.5
Chubin and Babco (2003), Walking the Talk in Retention to Graduation
National Science Foundation. Women, Minorities, and Persons with Disabilities in Science and Engineering: 2000. Arlington, VA, 2000 (NSF 00-327)
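One way to read the retention table above is to compute the non-URM vs. URM gap for each discipline. The sketch below does that arithmetic using the Chubin and Babco figures from the slide; the code itself is only an illustration:

```python
# Sketch: retention gap (non-URM minus URM, in percentage points) per
# engineering discipline, using the Chubin and Babco (2003) figures above.

retention = {
    # discipline: (URM retention %, non-URM retention %)
    "Chemical":   (35.0, 48.1),
    "Civil":      (37.9, 44.2),
    "Computer":   (47.6, 58.7),
    "Electrical": (36.3, 47.2),
    "Industrial": (44.2, 51.3),
    "Mechanical": (38.0, 48.7),
    "Total":      (38.9, 49.5),
}

for discipline, (urm, non_urm) in retention.items():
    gap = non_urm - urm
    print(f"{discipline:<11} URM {urm:4.1f}%  non-URM {non_urm:4.1f}%  gap {gap:4.1f} pts")
```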
Table 2-3 Evaluation Table

Goal: Increased retention of women students in engineering
Objective: Develop courses with interdisciplinary focus to connect engineering with societal concerns
Evaluation Technique: Retention data after each semester & for cohorts after 4 and 5 years
Anticipated Outcome: Increased percent of women remaining in any engineering major
Authentic Teaching Alliance Assessment Cycle
(Annual cycle running Aug/Sept through the Fall and Spring semesters to the following Summer)
1. Attitude & Experience surveys, pre-year administration to K-12 students
2. Activity Attitude Assessments by K-12 students following each activity in the semester
3. External Evaluation Oversight Committee
4. Activity Analysis by teachers and fellows, reflecting following each lesson
5. Portfolio collection: fellows' reflection on ATA experiences
6. Activity Attitude Assessments by K-12 students following each activity in the semester
7. Activity Analysis by teachers and fellows, reflecting following each lesson
8. Communication Skills Survey given to Fellows
9. Teacher Exit Interviews, Fellow Exit Interviews, Faculty Adviser Interviews
10. Attitude & Experience surveys, post-year administration to K-12 students
11. Data entered, compiled, summarized, and reported to stakeholders in Summer Workshops
Group Questions
1. What are the known instruments/methods in each of the three areas (Cognitive, Affective, and Rate Change)?
2. Where have these been used successfully/unsuccessfully?
3. Where are the gaps in each area?
Process
• One note taker
• One reporter
• Everyone discusses
• We will report out with 15 minutes left
"There is no failure. Only feedback."
– Anonymous
Happy grant writing!
trhoads@purdue.edu
https://engineering.purdue.edu/SCI/Workshop
Affective
• National Academy of Engineering website with book of surveys
Rate Change
• Longitudinal measurements: courses, rate changes for minorities, retention, enrollment, GPA as performance measures
• Use Associate Deans' offices as resources for statistics, changes, etc.
Wish
• Critical Thinking – one that is free
• Definition of engineering changes for students as they go through the curriculum
• Digital Systems within computer engineering
• Teaming attitude by individual
• Measuring how well the student can deal with an out-of-box situation
• Clearinghouse that explains what already exists, how long it is, how it was used in the past, a "cliff notes" version
Wish
• ABET (or someone) could sponsor or post tools that are being used
• More CIs
  https://engineering.purdue.edu/SCI/workshop/tools.html
• Tools for writing and communication
• Life-long learning tools (continuous learning)
• Procedural knowledge on engineering design
• FREE Critical Thinking
Wish
• Meta-cognition
• Societal context of knowledge, global competencies, service learning, research skills, discourse and collaborative dialogue, oral communication, problem solving, leadership (some instruments from Business schools), ability to integrate knowledge, longitudinal data, qualitative data analysis
• Evaluation of case studies
Cognitive
• Math placement test
• English placement test
• Concept Inventories
  https://engineering.purdue.edu/SCI/workshop/tools.html
• FE, GRE – standardized tests
• Critical thinking
  – Criticalthinking.org – has tools and a rubric developed by Gary Brown at Washington State; Watson-Glaser (also not free); U of South Carolina EFFECTS (web-based)
  – http://www.ce.sc.edu/effects
• California Critical Thinking
• Collegiate Learning Assessment
• CAAP – from ACT
• SALG – Student Assessment of Learning Gains
• IDEA – teaching assessments
• MSLQ – Motivated Strategies for Learning Questionnaire
• NSF User Friendly Guide to Assessment
• Rogers and Sando, Rose-Hulman
• Learning Styles Instruments (Kolb, Felder)
• Career Instruments (Holland)
• fMRI, caps measuring brain activity
• Experience sampling method – student wears a bracelet and when it goes off the student writes down what they are doing (engagement measures); see the sketch after this list
• Calibrated Peer Review
• Teamwork effectiveness (Imbrie, Smith, Layton)
• National Center for Case Studies (Buffalo)
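For the experience sampling bullet above, a rough sketch of how random prompts might be scheduled and responses logged is shown below; the prompt window, prompt count, and log fields are assumptions, not part of the method described in the slides:

```python
# Sketch of an experience sampling method (ESM) schedule: generate random
# prompt times during a day, then record what the student reports doing at
# each prompt. The window, prompt count, and log format are illustrative.

import random
from datetime import datetime, timedelta

def daily_prompts(day: datetime, n_prompts: int = 6,
                  start_hour: int = 9, end_hour: int = 21) -> list[datetime]:
    """Pick n random prompt times between start_hour and end_hour."""
    window_start = day.replace(hour=start_hour, minute=0, second=0, microsecond=0)
    window_minutes = (end_hour - start_hour) * 60
    offsets = sorted(random.sample(range(window_minutes), n_prompts))
    return [window_start + timedelta(minutes=m) for m in offsets]

def record_response(prompt_time: datetime, activity: str, engaged: bool) -> dict:
    """One ESM log entry: time of signal, self-reported activity, engagement flag."""
    return {"time": prompt_time.isoformat(timespec="minutes"),
            "activity": activity,
            "engaged": engaged}

if __name__ == "__main__":
    prompts = daily_prompts(datetime(2008, 4, 3))
    # In a real study the student fills these in when the signal goes off;
    # here we just log a placeholder response for each prompt.
    log = [record_response(t, "statics homework", True) for t in prompts]
    for entry in log:
        print(entry)
```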