Annual Assessment Plan

Division of Academic Affairs
Annual Assessment Report AY 10-11
Annual Assessment Plan AY 11-12
For Undergraduate & Graduate Degree Programs
Program Information
Name of Program: Mathematics
College: CMS
Prepared By (Department Chair/Program Coordinator): Olaf Hansen
Date: 09/29/2011
Email Address: ohansen@csusm.edu
Extension: 8005
PART A: Annual Assessment Report AY 10-11
Due by May 27, 2011
Please note: Should you need additional time to analyze data gathered in Spring 2011, please send an email requesting an extension
to Jennifer Jeffries (jjeffrie@csusm.edu).
1) Please describe:
A. the program student learning outcomes you focused on for assessment this year.
B. the assessment activities you used to measure student learning in these areas.
C. the results of your assessment(s).
D. the significance of your results.
A. We focused our assessment on SLO 1, Demonstrate mastery of the core concepts in algebra and analysis.
This SLO is also related to SLO 3, Develop and write mathematical proofs, because part of the mastery is
the ability to prove mathematical statements and to understand the connection between results. These
connections become most visible in the proof of theorems, where deductive logic is used to derive new
results from known true statements.
B. In Fall 2010 the Mathematics Department decided to use Knowledge Surveys to assess SLO 1. Knowledge
surveys ask students to rate their confidence in answering a set of questions. Students do not actually answer
the questions in the survey; they only indicate their ability to answer each one. Typically a small number of
choices is offered (for example, three), and we used the following choices for each question in the survey:
1. I am unable to provide a complete, correct, and detailed response.
2. I have partial knowledge or know where to quickly (20 minutes or less) obtain a complete, correct,
and detailed response.
3. I can provide a complete, correct, and detailed response with my present knowledge.
When I started developing the questions in May 2011 I decided to assess SLO 1 in the two core classes
most closely related to SLO 1: MATH430 Foundations of Analysis and MATH470 Introduction to
Abstract Algebra. MATH430 provides the theoretical background for Calculus and gives students a deeper
understanding of the derivative and the integral beyond the mechanical skills that students learn in Calculus
classes. MATH470 teaches students the theory behind the solution of equations and how the solution
of equations is related to the number systems we use.
When I selected the topics for the questions in each class I followed the course descriptions in the university
catalog, which themselves closely follow the sections and the order of most standard textbooks on these subjects.
For MATH430 I developed 126 questions distributed in the following way:
Sets, Numbers, and Inequalities: 24 questions [1-24]
Limits of Functions: 8 questions [25-32]
Sequences and Limits: 9 questions [33-41]
Continuous Functions: 15 questions [42-56]
Differentiable Functions: 20 questions [57-76]
Integration: 19 questions [77-95]
Series: 18 questions [96-113]
Taylor’s Theorem: 13 questions [114-126]
The expectation is that students should be fairly familiar with the first topic from their Calculus classes and
from MATH378, which is a prerequisite for MATH430.
For MATH470 I developed 108 questions in the following areas:
Binary Operations and Groups: 47 questions [1-47]
Rings: 36 questions [48-83]
Fields: 5 questions [84-88]
The Polynomial Ring F[x]: 20 questions [89-108]
Here we also expect students to be familiar with the first topic because of the MATH378 prerequisite;
simple examples of rings and fields already appear in MATH378.
The survey was distributed with the help of SurveyGizmo, and the idea was that students would take the
surveys online outside of class time. Knowledge surveys are usually given several times during a
semester to study the changes as students progress through the course. I decided to give the survey twice,
once in the second week and once in the fifteenth week of the semester.
C. The two instructors for MATH430 and MATH470 announced the knowledge survey in class during the first
week and encouraged students to take it. Dr. Puha, the instructor of MATH430, offered some extra credit
for completing the survey as an additional incentive.
The first lesson we learned was that this kind of motivation is needed. Of the 13 students in MATH470,
only 3 took the survey; in Dr. Puha’s class, 21 out of 23 students took it. At the end of the semester we
will therefore repeat the survey only for the MATH430 class. The following graph shows the results from
the first MATH430 survey.
The graph shows the average of the answers for each question. One can see that the averages are higher
for the first questions, which is expected: at the time the survey was closed, Dr. Puha was covering continuous
functions. Question 50, where the averages seem to drop, asks about the continuity of the absolute
value function. If we calculate the average over the per-question averages of the first 50 questions, we get 2.25,
compared to 1.74 for the remaining 76 questions. This seems to indicate a difference in the average values.
Within the first 50 questions, questions 9-13 and question 21 appear to have low averages. Questions 9-13 are
about infimum and supremum, difficult concepts that students might need to see more often
during the course in order to gain a better understanding.
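For reference, the two values above (2.25 and 1.74) are simply the averages of the per-question averages over questions 1-50 and 51-126. A minimal sketch of this calculation is given below; the file name and layout (one row per student, one column per question, confidence ratings 1-3) are assumptions about the SurveyGizmo export, not its actual format.

```python
# Sketch only: assumes a hypothetical CSV export "math430_survey1.csv" with one
# row per student and 126 columns of confidence ratings (1, 2, or 3).
import csv

N_QUESTIONS = 126

def per_question_averages(path, n_questions=N_QUESTIONS):
    sums = [0.0] * n_questions
    counts = [0] * n_questions
    with open(path, newline="") as f:
        for row in csv.reader(f):
            for i, cell in enumerate(row[:n_questions]):
                if cell.strip():          # skip questions a student left blank
                    sums[i] += float(cell)
                    counts[i] += 1
    return [s / c if c else None for s, c in zip(sums, counts)]

averages = per_question_averages("math430_survey1.csv")
first = [a for a in averages[:50] if a is not None]
rest = [a for a in averages[50:] if a is not None]
print("average over questions 1-50:  ", round(sum(first) / len(first), 2))
print("average over questions 51-126:", round(sum(rest) / len(rest), 2))
```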
For questions 21 and 92 there was a problem with the survey: the web browser did not display these
questions correctly, so they were only partially visible.
D. Because the main point of a Knowledge Survey is the repetition of the survey to assess the progress
students make during the course, we cannot draw any conclusions at the moment. But it is encouraging to
see that the current data already reflect the material that has been covered in class. Also, the relatively low
results for the questions about supremum and infimum, which are somewhat expected, show that the collected
data contain useful information.
2) As a result of your assessment findings, what changes at either the course- or program-level are being
made and/or proposed in order to improve student learning? Please articulate how your assessment
findings suggest the need for any proposed changes.
As mentioned in 1.D, the department has to wait for the outcomes of the second knowledge survey before we can
analyze the data for MATH430.
The only insight we gained from the first round of surveys is the clear need to give students extra credit for
completing a survey with more than 100 questions. Answering about 100 questions is a large time commitment
for students (students report about 20-60 minutes), and we have to provide some incentive for them to spend this much time.
If the department decides to run the surveys again next Fall, the faculty who teach these classes must be willing
to give extra credit; otherwise we cannot expect a sufficient number of completed surveys.
3) If you used the resources that were given to you as stated in your plan, please check here.
If you used them differently, please provide specifics.
As far as I know, no plan for the current year was submitted.
PART B: Planning for Assessment in 2011-2012
Required by October 3, 2011
1) Describe the proposed PSLO activities for AY 2011-12. (Note that assessing PSLOs can take many forms.
Programs may find the attached list of sample assessment activities helpful. Please check the Assessment
website for additional resources at www.csusm.edu/assessment/resources/).
As indicated above, we will assess SLO 1 in AY 2011-12 with the help of a Knowledge Survey. At the beginning
of the Spring semester I will collect and prepare the data, so the department can discuss the results. This might lead
to adjustments of course content or changes to the prerequisites.
In Spring 2012 we will assess SLO 6, Understand, produce, and critique mathematical models and algorithms
appropriate to their fields of specialty, utilizing appropriate software where necessary:
MATH 523 Cryptography and Computational Number Theory (Shahed Sharif) for
"Understand, produce, and critique algorithms appropriate to the field of specialty, using appropriate software
where necessary."
MATH 620 Seminar in Advanced Mathematics (Amber Puha) for
"Understand, produce, and critique mathematical models appropriate to the field of specialty, using appropriate
software where necessary."
MATH 563 Numerical Solution of Ordinary Differential Equations (Olaf Hansen) for
"Understand, produce, and critique algorithms appropriate to the field of specialty, using appropriate software
where necessary."
We will grade one homework assignment at the beginning and one towards the end of each course with
special consideration of the above PLO. I am supposed to develop two rubrics for the assessments (one for the
mathematical models, one for the algorithms) by January. The homework will be graded based on these rubrics, and we
will then analyze to what extent incoming students already had some exposure to mathematical models or algorithms,
and to what extent the students’ abilities improve over the course of the semester. Maybe our expectations here are too
high?
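To illustrate the kind of comparison we have in mind, a hypothetical sketch follows; the rubric scale and the scores are made-up placeholder values, not data.

```python
# Hypothetical example: rubric scores (say on a 1-4 scale) for the same students
# on the early and the late homework assignment in one of the courses.
early_scores = [2, 1, 3, 2, 2]   # placeholder values
late_scores = [3, 2, 4, 3, 3]    # placeholder values

mean_early = sum(early_scores) / len(early_scores)
mean_late = sum(late_scores) / len(late_scores)

print(f"mean rubric score, early homework: {mean_early:.2f}")
print(f"mean rubric score, late homework:  {mean_late:.2f}")
print(f"average change per student:        {mean_late - mean_early:+.2f}")
```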
2) What specific assessment activities will you conduct this year in order to measure student
achievement of these outcomes?
The Knowledge Survey for MATH430 will be administered for the second time in December 2011.
For our graduate program I will develop a rubric for the assessment of understanding, producing, and critiquing
mathematical algorithms and models. Then we will collect results from graded homework and exams with respect to
this SLO during Spring 2012. The courses where we will assess this SLO are MATH523, MATH563, and MATH620.
3) Please describe how the assessment support of $750 will be used.
Sample of assessment activities:
- Development (or review and refinement) of measurable program student learning outcomes.
- Development of a program student learning outcome matrix, showing in what courses each PSLO is assessed, or where each PSLO is introduced, reinforced, and expected to be mastered.
- Identification and assessment of one or two program SLOs.
- Development of rubrics for assessing PSLOs.
- Development of commonly agreed upon “signature assignments” designed to assess mastery of PSLOs.
- Faculty assessment retreat.
- Dissemination and/or discussion of PSLOs with students in the program.
- Development of course learning outcomes and delineation of their relationship to PSLOs.