School of Mathematics, Meteorology and Physics

Proposed School Module Evaluation System
Robin Hogan
Version 2: 4 Nov 2004
This document describes a proposed unified module evaluation system for the School of Mathematics,
Meteorology and Physics. The requirements of the new system are that it should be:
1. Able to serve the needs of all three departments while being simple and concise.
2. Able to provide all the main information as numerical summaries of multiple-choice questions, while
welcoming further written comments.
3. Open: the information should be easily available to both staff and students.
4. Efficient to administer across the whole school.
5. Acted upon: the information should be used by Heads of Department and Head of School to reward
good teaching as well as remedy bad teaching.
As with current forms, the new forms will consist of a list of multiple-choice questions followed by
space to write further comments. Separate forms are required for the different teaching environments:
lectures, practicals and academic tutorials.
Collation of the information
Ideally the new system should be manageable by a single school secretary. The main work involved is to go
through the completed questionnaires for a module and to add up the number of entries in each of the
possible replies to each question, summarising the results in a spreadsheet. The job also involves maintaining
a database of modules, printing off copies of the questionnaires for the lecturers and disseminating the
information. In the Department of Mathematics, a lot of administrative time has in the past been spent
typing up the hand-written comments to guarantee student anonymity.
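To illustrate the tallying step, the counts could be produced along the following lines (a hypothetical sketch: the question labels and responses are placeholders, not the actual questionnaire):

```python
def tally(forms, questions, n_boxes=5):
    """Count how many students ticked each box (1..n_boxes) for each question."""
    counts = {q: [0] * n_boxes for q in questions}
    for form in forms:
        for question, box in form.items():
            counts[question][box - 1] += 1
    return counts

# Three illustrative completed questionnaires (question -> box ticked).
forms = [
    {"Interesting?": 2, "Too difficult/easy?": 3},
    {"Interesting?": 1, "Too difficult/easy?": 3},
    {"Interesting?": 2, "Too difficult/easy?": 4},
]
summary = tally(forms, ["Interesting?", "Too difficult/easy?"])
# summary["Interesting?"] is [1, 2, 0, 0, 0]: one student ticked box 1, two box 2.
```

Each resulting list is the frequency distribution for one question, which goes into the summary spreadsheet.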
Currently in the Department of Meteorology only around 60% of modules are evaluated when they are
given: a particular module is evaluated only once every two years (except for all new modules and all
modules given by a different lecturer from last time, which are also evaluated). Even so, the collation of the
statistics from the multiple-choice questions takes around a week of clerical time per term. The workload
scales approximately with the number of students, and there are around twice as many students in each of the
Mathematics and Physics departments as in Meteorology, so the evaluation of 100% of modules from all
three departments would be expected to occupy around 8 weeks of clerical time per term, or around half of a
full-time person.
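The estimate above can be checked with a back-of-envelope calculation (the weightings below are the approximate ratios quoted here, not exact student numbers):

```python
# Meteorology: ~1 week of clerical time per term covers ~60% of its modules.
met_weeks_per_term = 1.0
met_coverage = 0.6

# Relative student numbers: Mathematics and Physics each have roughly twice
# as many students as Meteorology.
relative_students = {"Meteorology": 1, "Mathematics": 2, "Physics": 2}

# Assuming the workload scales with student numbers, covering 100% of the
# modules in all three departments would take:
weeks = met_weeks_per_term / met_coverage * sum(relative_students.values())
# weeks is about 8.3, i.e. roughly 8 weeks per term, or half a full-time post.
```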
This is a lot of work for one person, so initially I propose that we follow Meteorology practice and evaluate
only half of the modules in a given year (but also evaluate all new modules and those given by a different
lecturer from last time). The staff-student liaison committee exists to ensure that problems with unevaluated
modules can still be picked up. It would also be too much work for the handwritten comments to be typed up. As
Meteorology already has a secretary allocated to module evaluation (Della Fitzgerald) I propose that for the
first year the school secretary handles only the evaluation of Mathematics and Physics, but using the same
questionnaires and summary spreadsheets. In the second year the school secretary would ideally take on
module evaluation for all three departments.
An option to consider for the future is the acquisition of an optical form reader that would enable the
responses to the multiple-choice questions to be read automatically into the spreadsheet, thus making the
whole process much more efficient. Facilities are available in Psychology and elsewhere, but there is little
spare capacity so the school would have to purchase a system (cost around £1200). It might also make it
possible to evaluate every module every year. The situation will be reviewed after the first year of the
information being collated manually.
Multiple-choice questions
Care must be taken in deciding the questions that are asked in the multiple-choice section. As this is the
information that is disseminated through the school (as frequency distributions for each question), it must be
comprehensive enough to pick up the main aspects of good lecturing as well as the main pitfalls of bad
lecturing. On the other hand, some things are minor enough (e.g. the size of a lecturer's handwriting) that a
specific box is not required.
There are three main types of multiple-choice question in the context of module evaluation:
A. Those for which the perfect lecture would score at one end of the scale, such as “How interesting
were the lectures?” from 1 (very) to 5 (not at all). These questions are probably better with 4 rather
than 5 classes, as they then don’t allow the student to “sit on the fence” by going for the middle
value, although for uniformity with questions of type B, 5 may be preferable. They may also be
phrased as a statement to which the student is asked for a response between “strongly agree” and
“strongly disagree”.
B. Those for which the perfect lecture would score in the middle of the scale, such as “How well paced
were the lectures?” from 1 (too fast) through 3 (just right) to 5 (too slow).
C. Those for which each category indicates a range of numerical values, such as “How much work did
you spend on the practicals outside class?” where each box consists of a different range of hours
from, say, “0-1 hours” to “>5 hours”.
Note that the agree-disagree form does not cope efficiently with questions of type B or C above: two
questions would be required to diagnose whether the lectures were too fast or too slow, and for determining
time spent outside class the form is quite inappropriate.
I now summarise the questions that are asked at the moment (based on sample questionnaires that I have
received) and the questions that I propose to ask in a School questionnaire. The letters A-C indicate the type
of question, followed by the number of multiple-choice boxes available, with a “+1” indicating that a “not
applicable” box is also present. The “School” column also gives brief reasons why I have not included some
questions. For uniformity there are 5 possible responses to all questions.
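The shorthand used in the tables can be made precise with a small parser (an illustrative sketch, not part of the proposal itself):

```python
import re

def parse_code(code):
    """Parse a code such as 'A4+1' into (question type, boxes, N/A box present?).

    Type A: the ideal answer sits at one end of the scale.
    Type B: the ideal answer sits in the middle of the scale.
    Type C: each box covers a numerical range (e.g. hours of work).
    A trailing '+1' means a "not applicable" box is also present.
    """
    match = re.fullmatch(r"([ABC])(\d)(\+1)?", code)
    if match is None:
        raise ValueError(f"unrecognised question code: {code}")
    qtype, boxes, na = match.groups()
    return qtype, int(boxes), na is not None

# parse_code("A4+1") gives ("A", 4, True): type A, 4 boxes plus an N/A box.
```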
Lectures
Question
Too difficult/easy?
Interesting?
Lecturer clear?
Lecturer audible?
Lecturer legible?
Lecturer punctual?
Room suitable?
Too much/little work involved?
Notes/handouts helpful?
Tutorials helpful?
Want more/fewer tutorials?
Reasonable pre/co-requisites
Module well structured?
Lecturer enthusiastic?
Lecturer made clear the direction heading in?
Feedback on work useful?
Work returned promptly?
Mathematics
A5
A5
A5
A3
A3
A3
A3
Meteorology
B5
A4
A4
Physics
A4
A4
A4
School
B5
A5
A5
Minor issues: if a problem they will be written down in the “comments” section
B5
A5
A5
A4+1
A4+1
B5
A5
A5
A5
A5
A4
A4
A4
A4
A5
A5
Lecturing too fast/slow?
Student interaction encouraged?
Your attendance at lectures?
Your attendance at workshops?
You used recommended texts?
Your work outside class.
B5
A4
A4
A4
A4
Synonymous with “too fast/difficult”?
A4+1
A4+1
B5
A5
A5
A5
Synonymous with “well structured”
A4+1
Available in practical questionnaire
B5
A5 (new)
We want information about the lectures, not about the students
Practical classes
Question
Too difficult/easy?
Interesting?
Too much/little work for scheduled time?
Notes/handouts helpful?
Reasonable pre/co-requisites
Help understanding of concepts?
Work returned promptly?
Feedback useful?
Time spent outside class
Supervision helpful?
Your attendance at workshops?
You used recommended texts?
Your work outside class.
Meteorology
B5
A5
Physics
A4
A4
A4+1
A4
A4
A4
A4
A4+1
C4
A4
A4
A4
A4
A4
A4
School
B5
A5
B5 (new)
A5
One for the lecture eval.
A5
A4+1
A4+1
C5
A5
We want information about the lectures, not about the students
Academic tutorials

Question                           Mathematics   School
Tutorials helpful?                 A5            A5
Clear explanations?                A5            A5
Enthusiastic instructor?           A5            A5
Encouraging of participation?      -             A5 (new)
Responsive to problems?            A5            A5
Work returned in good time?        A5            A4+1
Useful feedback on marked work?    A5            A4+1
The Word “mail merge” facility is used so that the details of the module may be easily changed using the red
buttons at the top of the page. Customisation is necessary for each department: the department name is
mentioned near the top of the first page, and the name of the module evaluation coordinator for the
department and the place where questionnaires should be deposited are indicated at the bottom of the second
page. The module database file will need to be kept up-to-date for each department.
There will be some differences in the nature of practicals and tutorials in each department. Where a module
includes both a lecturing and practical component (where a practical could be a laboratory or computer
class), it is envisaged that separate questionnaires would be handed out for the two parts. Only Mathematics
seem to evaluate their tutorials, so they may wish to customise the new tutorial form; for example, it may not
be feasible to have a different entry in the database for each tutor, in which case there would be an entry for
the student to fill in the name of the tutor.
Space for open comments
These are phrased to encourage positive and constructive responses – the first question is “Which aspect of
this module/practical class/tutorial did you like most?” The second is “How could this module/practical
class/tutorial be improved?” This is much better than “Which aspect did you like least?”
Use of the information
At the start of each academic year the staff member for module evaluation in each department provides a list
of modules to be evaluated that year to the school secretary, including the maximum number of students
likely to attend each module. The secretary prints out a sufficient number of questionnaires at the appropriate
time and the lifetime of the questionnaires is then as follows:
1. Given to students at the end of a module, with 5 minutes allocated to fill it in. They put it in the box
provided in the lecture theatre or in a general box elsewhere in their department.
2. These boxes are emptied after the end of term (and after week 5 in the case of 5-week meteorology
modules) by a secretary in that department, and sent to the School secretary responsible for module
evaluation.
3. The school secretary collates the numerical information from the multiple-choice part of the forms,
then sends the questionnaires back to the staff member responsible for module evaluation in that
department, along with the numerical summaries.
4. The staff member checks the written responses for major issues, if necessary alerting the Director of
Teaching and Learning or the Head of School.
5. The questionnaires are then sent to the lecturer who can make use of the information to improve the
module next year.
6. Questionnaires may be returned to the school office for storage. The numerical data are published on
noticeboards in the departments and on the school web site (e.g. as downloadable Excel files).
I am not at all clear by what means this information will be used to highlight particularly good teaching so
that it may be rewarded, or to highlight consistently bad teaching so that action may be taken. I would hope
that the Director of Teaching and Learning and the Heads of Department take an interest in the results of the
questionnaires, otherwise there is little point in giving them out.