Policy Board Minutes—Oct. 9, 2013
3:30 p.m.
UC 503
Present: Nola Agha, Yvonne Bui, Robin Buccheri, Joe Garity, Devon Holmes, Richard Johnson III, Keith
Hunter, Jack Lendvay, Paul Lorton, Gabe Maxson, Willie MeLaugh, Elliot Neaman, Mary Jane Niles,
Julia Orri, Brian Weiner, Peter Williamson
Guests: Ed Munnich from Psychology; Rebecca Seeman (for McGoldrick)
Minutes from 9/25 were approved.
I. Announcements
A. New Faculty dinner 10/19 at Postrio Restaurant at the Prescott Hotel. All new faculty
(within 3 years) are welcome.
B. Bui reported on parking (via Dan Lawson): double stickers are not available until spring—
take one and switch; use of the SOE lot for faculty/staff only will be investigated.
C. A conference phone is available for PB meetings in UC 503.
D. Provost Turpin will attend the next meeting—address questions to Neaman.
II. New Business
A. Ed Munnich from Psychology—Munnich is co-chair of the new teaching evaluations
(USF-Blue). They go live next fall. He is encouraged by the program's potential for
success and expects good online feedback. Neaman asked about a “sunset clause”,
similar to the new parking system. Munnich said that it would need to be negotiated in
the CBA. We have agreed to the implementation process, and objections need to be raised
now. Both systems are in use this fall (only SUMMA counts). Twenty questions (out of
the current 40) will eventually be used. The deans will not have access to the new survey
this year. Four constructs will be used. All classes (except online classes) will be included
this semester. Neaman asked what happens if we agree later in the year. Munnich replied that
SUMMA is not designed around formal constructs. A new construct is “student
engagement”. Lendvay mentioned that SUMMA gives national stats which are
comparative and helpful. Munnich replied that there has been a national movement,
supported by the literature, away from national norms and towards evaluation that
reflects each school's values. For example, national norms reflect schools with wide
variation of class sizes and it is not clear how meaningful the comparison is. Buccheri
asked if we would see two sets of results. Munnich said that some new questions may not
perform well, but it may be too early to release results. Should a chair see results for
part-time faculty? (No.) Lendvay said that we should see our results—is this a replacement?
He would like information before agreeing to a change.
Johnson III asked about access to SUMMA scores during ACPs. Neaman saw no
problem with seeing information. Munnich said that the new results would not carry any
weight now. Bui asked how soon the results would be ready—basically at grade time.
Feedback would be available quickly. Bui asked if results could be made available
earlier—can we give the evaluations in October? (no). Perhaps in the future for midterm
evaluations. Johnson III asked about the company. Any platform would be possible for
responses (smart phone, tablet). The questions are formatted for the screen—whatever
platform they are on. Munnich mentioned concern about bias. Using class time for
evaluations may decrease this. Bui asked about the number of questions—40 currently
(then 20). There will be a 6-point scale. Drawings for prizes for students will be available
as an incentive for completing the evaluations.
Neaman asked about P and T. Munnich added that there are no national norms—if
you are above the national norm, is this “superior” for P and T? We need to compare to
the USF average according to the contract. Munnich mentioned that the SUMMA
questions create bias. The new system could focus on learning outcomes and reduce the
bias. The 4 constructs are: design, practices, student engagement, and learning. There
will be 4 numbers that emerge from these 4 constructs. We will know what the USF
average is. Lendvay said that he liked the idea of 4 constructs.
Neaman added that the P and T committees may have a more informed discussion.
Munnich said that if a faculty member is low in 1 construct—help may be available
(Center for Teaching Excellence). Johnson III added that bias may still occur—that this
may be a greater concern online. Neaman said that SUMMAs are anonymous.
Williamson added that people may apply different norms when responding to online
surveys and may be influenced by websites or social media used just before completing
the evaluation. Neaman said
that this could work both ways. Munnich said that this is pilot information and bias could
be evident. Bui stressed the importance of the evaluations and that students should
respond in a responsible way. Munnich mentioned that good feedback has been received
from students. Students enter through the USF portal to answer. Neaman asked about prior
research on online evaluations. Seeman added that perhaps students can read about the
importance of the evaluations. Maxson asked about comparing SUMMA and the new
questions. Agha asked about timing—some of the Sport Management classes end early
(September). Munnich replied that for the fall, only classes that end in December will be
included. Once the instrument is developed, the evaluations will be sent out at the same time.
Munnich continued to respond to concerns. The feedback has been good, though the
environment was a concern. Hunter added that the online environment may induce bias.
Other concerns:
Can we keep SUMMA running? We need to choose our battles.
How do we increase the response rate? Negative responses may increase, and we don't
want grievances. Blue sends out reminders. There may be incentives. Johnson III added
that grades could be blocked. Munnich added our longer grading period may be a
deterrent. Perhaps students can see their grades once they have submitted their
evaluations. Bui was concerned about small classes—what happens with small sample sizes
and skewed data? Students should be required to complete the evaluations. Munnich
suggested using class time for the evaluations to increase response rate and focus. He
added that faculty who talk about evaluations get increased response rates. Johnson III
asked about multiple sections with different scores.
Munnich mentioned that faculty can add their own optional questions. We should think
about the questions we want: a bank of supplemental questions alongside the standardized
questions. He suggested emailing him (emunnich@usfca.edu) with questions. We will
own the instrument and will evaluate it for three years. The new survey is up to date with new
teaching practices.
How do we ask a specific question for each class? Our students need to learn our
learning outcomes. Faculty need to refer specifically to their learning outcomes. He
stressed the importance of using similar language—for example, “assignment”. Bui
asked about the option of leaving questions blank. Explain the language clearly on
your syllabus. Lendvay stressed the importance of a consistent vocabulary. Faculty will
receive an average score—data will be in Excel or PDF format. Unit mean will be
reported.
Munnich added that WASC needs course evaluations. We own the teaching
evaluations. We need to make sure that students understand the difference between the
two. What happens to part-time faculty who are given a pre-designed syllabus? How are
they evaluated on design? We need to be mindful of this.
We would save money and time with the new evaluation. What would we spend the
money on? We have five years to pay for the initial outlay—then money (and staff time)
will be saved. How will the staff be re-assigned? These are some questions to be
addressed. Bui asked about deans’ involvement. Munnich mentioned that the deans were
concerned also. She asked about comparing the two systems—what is a “good” score in Blue?
Munnich added that SUMMA data are available. Weiner asked if there was a faculty
survey question about syllabi (like SUMMA). Munnich said that these are concerns for P
and T narrative. Weiner added that these questions could be added at the beginning.
Lorton suggested changing the contract to reflect the actual questions used for P and T.
Neaman replied that this may make it harder on the applicant (part of the article can be
opened during negotiations).
Munnich finished by stating that we would like more information (faculty only).
Weiner asked about the validation process—the 40 items will go into a correlation
matrix. Some items may turn out to belong to a different construct, and the items that
best predict their constructs will be used.
The meeting adjourned at 4:58 p.m.
Submitted by Julia Orri
Policy Board Secretary
http://www.usffa.net/