Colorado School of Mines
Annual Assessment of Student Learning Outcomes Report
For Academic Year 2013-14 Undergraduate Programs
The purpose of assessment is to promote excellence in student learning and educational practices by
fostering a campus culture of self-evaluation and improvement. The annual assessment report enables
CSM to document engagement in continuous improvement efforts. Provide responses to the following
questions/items and email this completed document to kmschnei@mines.edu by September 26. Feel free
to expand the tables below as needed.
The Assessment Committee will provide written feedback in response to the department annual report,
using this rubric as the basis for their feedback.
Department/Program: PHYSICS
Person Submitting This Report: Todd Ruskell
Phone: x2080
Email address: truskell@mines.edu
1. Describe your assessment plan (list not only the activities completed in the past year, but your entire ongoing plan).
We use the ABET “a-k” criteria for our student outcomes.
The assessment measures listed below in many cases represent only a sampling of how each criterion is
assessed.
At the course level, the Department maintains an “ABET Notebook” for every required course in our
curriculum, and for many of our elective courses. This notebook contains a syllabus, course grades,
sample assignments and student work for each semester. The notebooks indicate how individual student
outcomes are assessed in each course. In addition, faculty compose a “Post-Delivery Review” report
which includes an overall assessment of the semester, changes that were made since the previous
offering of the course, and suggestions for the future. If the assessment committee would like more
details regarding the specifics of the “course dependent” assessment items reported here related to
Physics courses, we invite you to the Physics office to peruse our roughly 12 linear feet of course
notebooks, and our most recent ABET report.
At the Department level, faculty are encouraged to review the individual assessment notebooks, especially
when they receive a new course assignment. Results from senior exit and alumni surveys are distributed
to faculty either at faculty meetings, via email, or during our annual Departmental retreats. Every second
year our Departmental retreat focuses on issues related to our undergraduate program. In May 2014, we
took a close look at the longitudinal alignment of our individual course learning objectives, and how they
integrate to address our student outcomes and program objectives. We are in the process of distilling this
information and turning it into actionable items.
In situations where we list “course completion” of courses in other departments, we rely entirely upon the
Department in which the course is taught to ensure adequate assessment of the published outcomes for
the course. If we detect an apparent deficit with regard to a particular course, we report that to the Department in question. However, in general we understand that the campus Assessment Committee is the appropriate body for overseeing the assessment plans of individual units.
Student Outcome 1: An ability to apply knowledge of mathematics, science, and engineering

Direct assessment measure(s)/method(s) | Performance criteria | Population/sample/recruitment strategies | Frequency/timing of assessment
Assessments contained in course notebooks for all Engineering Physics upper-division core courses (includes, for example, homework, quizzes, exams, lab reports, written and oral reports, etc.) | course dependent; see relevant course notebook for specifics | all courses | each course offering
Senior design reports (PHGN472/482) | 90% B or better | each report | annual

Indirect assessment measure(s)/method(s) | Performance criteria | Population/sample/recruitment strategies | Frequency/timing of assessment
Senior exit survey | 4/5 on relevant question | 90% of seniors | annual
Faculty feedback during UG Faculty Retreat | consensus satisfaction | all faculty | biennial
GRE examination | 600 or better | 10% of graduating seniors | annual
Student Outcome 2: An ability to design and conduct experiments, and analyze and interpret data

Direct assessment measure(s)/method(s) | Performance criteria | Population/sample/recruitment strategies | Frequency/timing of assessment
Assessments contained in course notebooks for lab courses (PHGN215, 384, 315, 317, 326) | course dependent; see relevant course notebook for specifics | lab courses | each course offering
Senior design (PHGN471/472; PHGN 481/482) | 90% B or better | each report | annual

Indirect assessment measure(s)/method(s) | Performance criteria | Population/sample/recruitment strategies | Frequency/timing of assessment
Senior exit survey | 4/5 on relevant survey question | 90% of seniors | annual
Student Outcome 3: An ability to design a system, component or process

Direct assessment measure(s)/method(s) | Performance criteria | Population/sample/recruitment strategies | Frequency/timing of assessment
Assessments contained in course notebooks for lab courses (PHGN215, 384, 315, 317, 326) | course dependent; see relevant course notebook for specifics | lab courses | each course offering
Senior design reports (PHGN471/472; PHGN 481/482) | 90% B or better | 50% | annual

Indirect assessment measure(s)/method(s) | Performance criteria | Population/sample/recruitment strategies | Frequency/timing of assessment
Senior exit survey | 4/5 on relevant survey question | 90% of seniors | annual
EPICS I & II | Students successfully complete courses | All students | When students take these courses
Student Outcome 4: An ability to function on multidisciplinary teams

Direct assessment measure(s)/method(s) | Performance criteria | Population/sample/recruitment strategies | Frequency/timing of assessment
Assessments contained in course notebooks for lab courses involving teams (PHGN215, 384, 315, 317, 326) | course dependent; see relevant course notebook for specifics | courses with team component | each course offering

Indirect assessment measure(s)/method(s) | Performance criteria | Population/sample/recruitment strategies | Frequency/timing of assessment
Senior design survey (PHGN471/472; PHGN 481/482) | 4/5 on relevant survey question | 50% | annual
Senior exit survey | 4/5 on relevant survey question | 90% of seniors | annual
EPICS I & II | Students successfully complete courses | All students | When students take these courses
Student Outcome 5: An ability to identify, formulate and solve engineering problems

Direct assessment measure(s)/method(s) | Performance criteria | Population/sample/recruitment strategies | Frequency/timing of assessment
Assessments contained in course notebooks for PHGN341, 384, 315, 317, 326 | course dependent; see relevant course notebook for specifics | identified courses | each course offering
Senior design reports (PHGN471/472; PHGN 481/482) | 90% B or better | 50% | annual
Exams in all science and engineering courses | Students pass exams | All students | each course offering

Indirect assessment measure(s)/method(s) | Performance criteria | Population/sample/recruitment strategies | Frequency/timing of assessment
Senior exit survey | 4/5 on relevant survey questions | 90% of seniors | annual
Student Outcome 6: An understanding of professional and ethical responsibility

Direct assessment measure(s)/method(s) | Performance criteria | Population/sample/recruitment strategies | Frequency/timing of assessment
Assessments contained in course notebooks for PHGN215, 315, 317, 326 | course dependent; see relevant course notebook for specifics | identified courses | each course offering
Senior design report on ethics (PHGN471/472; PHGN 481/482) | 90% with B or better | 50% | annual

Indirect assessment measure(s)/method(s) | Performance criteria | Population/sample/recruitment strategies | Frequency/timing of assessment
Senior exit survey | 4/5 on relevant survey question | 90% of seniors | annual
NHV | Students successfully complete course | All students | When taken
Student Outcome 7: An ability to communicate effectively

Direct assessment measure(s)/method(s) | Performance criteria | Population/sample/recruitment strategies | Frequency/timing of assessment
Assessments contained in course notebooks for identified writing-intensive courses, PHGN315 and 326 | course dependent; see relevant course notebook for specifics | identified courses | each course offering
Oral reports during Senior design (PHGN471/472; PHGN 481/482) | 90% with B or better | all reports | annual
Written senior design reports (PHGN471/472; PHGN 481/482) | 90% with B or better | all reports | annual

Indirect assessment measure(s)/method(s) | Performance criteria | Population/sample/recruitment strategies | Frequency/timing of assessment
Senior exit survey | 4/5 on relevant survey questions | 90% of seniors | annual
Student Outcome 8: Broad education necessary to understand the impact of engineering solutions in a global and societal context

Direct assessment measure(s)/method(s) | Performance criteria | Population/sample/recruitment strategies | Frequency/timing of assessment
Assessments contained in course notebooks for PHGN215, 384, 317, 341, 350, 361, 462 | course dependent; see relevant course notebook for specifics | identified courses | each course offering
Senior design (PHGN471/472; PHGN 481/482) | 90% recognize social/global context | 50% | annual

Indirect assessment measure(s)/method(s) | Performance criteria | Population/sample/recruitment strategies | Frequency/timing of assessment
Senior exit survey | 4/5 on relevant survey question | 90% of seniors | annual
Completion of core requirements and required LAIS electives | Students successfully complete courses | | When taken
Student Outcome 9: A recognition of the need to engage in lifelong learning

Indirect assessment measure(s)/method(s) | Performance criteria | Population/sample/recruitment strategies | Frequency/timing of assessment
Senior exit survey | 4/5 on relevant survey questions | 90% of seniors | annual
Student Outcome 10: A knowledge of contemporary issues

Direct assessment measure(s)/method(s) | Performance criteria | Population/sample/recruitment strategies | Frequency/timing of assessment
Senior design reports (PHGN471/472; PHGN 481/482) | 50% address contemporary issue | 50% | annual

Indirect assessment measure(s)/method(s) | Performance criteria | Population/sample/recruitment strategies | Frequency/timing of assessment
Senior exit survey | 4/5 on relevant survey question | 90% of seniors | annual
Student Outcome 11: An ability to use modern engineering tools necessary for engineering practice

Direct assessment measure(s)/method(s) | Performance criteria | Population/sample/recruitment strategies | Frequency/timing of assessment
Assessments contained in course notebooks for PHGN 384, 317, and 326 | course dependent; see relevant course notebook for specifics | 90% | each course offering
Senior design (PHGN471/472; PHGN 481/482) | 90% B or better | 50% | annual

Indirect assessment measure(s)/method(s) | Performance criteria | Population/sample/recruitment strategies | Frequency/timing of assessment
Senior exit survey | 4/5 on relevant survey question | 90% of seniors | annual
2. Map your assessment methods to your outcomes.
Not required to complete as per email communication with Kay Schneider Sep 12-14.
Table 1 (left blank): rows for Student Outcomes 1-11, columns for Assessment Methods 1-6.
3. Map the student outcomes to courses and to the ABET outcomes. You may use check marks or
designate P=primary emphasis and S=secondary emphasis. If your assessment plan only includes
the ABET outcomes and you have no additional outcomes, you do not need to complete table #3.
Table 2: Student Outcomes 1-11 mapped to courses PHGN100, 200, 215, 300, 310, 311, 315, 317, 320, 326, 341, 361, 462, 471, 472, 481, and 482.

Table 3: Student Outcomes 1-11 mapped to ABET Outcomes A-K.
4. Identify the assessment activities that your program has implemented in the past year.
Direct measures:
All direct assessments mentioned above were implemented in AY 2013-14.
Indirect measures:
Graduating senior exit survey
5-year Alumni survey
Faculty retreat in May 2014
5. Describe how you have shared assessment results with faculty. Describe how faculty have
used assessment results to improve student learning, including the specific actions you have
taken or will take to facilitate students’ attainment of the student learning outcomes. (ABET
Criterion 4C.)
Our assessments indicate that, overall, we are educating our students effectively.
Table 4
Mechanisms for sharing assessment results with faculty: Results from our graduating senior and alumni surveys are collated by the Department Head and shared with faculty via email and at our annual retreat.
Table 5
Action #1 taken: Used the annual faculty retreat to examine, on a longitudinal basis, course content and learning objectives and how they relate to the expectations of subsequent courses and the curriculum as a whole.
Date action taken: May 2014
Basis for this action: This has not been done in some time.
Student outcome impacted: All
Specific assessment measure(s) that motivated action (if not described above): None. It was simply time to take a look at the integration of individual elements within our program.
Measurement/assessment of the impact of the action that was taken: Faculty are better informed as to how the individual parts of our curriculum contribute to the program as a whole. We are still distilling the results to determine if specific action items will be identified.
6. Describe any changes that you are planning to make in your assessment plan for next year.
Table 6
Planned changes:
We will merge our alumni survey with the campus alumni survey to minimize duplication and improve our ability to collect quantitative data. This was planned for this year, but the timing did not work out.
We will also examine the recommendations for assessing the ethics criteria and explore ways in which it might make sense to incorporate them into our curriculum.
7. Describe how you have used the feedback from the assessment committee in response to last
year’s report to improve your assessment efforts.
Table 7
Use of committee’s
feedback:
The feedback on last year’s report was largely positive (ratings of either “Established” or “Exemplary”), so few changes were implemented.