2010 Faculty Climate Survey
University of Houston
Executive Summary
Methodology
The 2010 Faculty Senate climate survey was administered to more than 490 tenure-track and non-tenure-track faculty and librarians, who responded to a series of questions regarding the performance of central administration, the President, the Provost, and their respective Deans and Department Chairs.
Further, they were asked to rate their level of job satisfaction and their likelihood of leaving UH, to give their opinions regarding safety on campus and the posting of merit salary data, and to indicate their awareness of UH emergency management procedures and the 2010 budget cuts.
Evaluation of Central Administration
Overall, evaluation of central administration was quite positive. Still, significant differences emerged
across colleges. Faculty in Hotel & Restaurant Management were most positive in evaluating central
administration’s performance, while Engineering faculty evaluated central administration least
favorably.
President’s Approval Ratings
In comparison to ratings provided for the overall job performance of the Provost, the respective
College Deans and the Department Chairs, respondents evaluated the President most positively.
The vast majority of faculty approved of the President’s job performance.
Provost’s Approval Ratings
The Provost’s approval ratings were, on average, somewhat lower than the President’s and the Department Chairs’ approval ratings. These differences were small overall but more pronounced across colleges. Specifically, librarians approved more strongly of the Provost’s performance than faculty in other colleges.
Dean’s Approval Ratings
In comparison to the Provost, President, and Department Chairs, Deans’ approval ratings were lowest. Strong differences emerged across colleges; in Education and Engineering in particular, faculty approval of their Dean’s job performance was rather low.
Department Chair Approval Ratings
Overall, department chairs were evaluated positively. Still, business school and pharmacy faculty
perceived their department chairs’ performance in a significantly less favorable light than faculty in
other colleges.
Faculty Members’ Overall Satisfaction and Intentions to Leave UH
On average, faculty members reported comparatively high levels of job satisfaction and low
intentions to quit. Comparative analyses across colleges indicate that UH’s risk of losing faculty is
highest in Engineering and lowest in Social Work.
Safety, Diversity Climate, and Opinions Regarding Posting of Merit Raise Information
Faculty raised significant concerns regarding safety on campus. Although the majority of
respondents felt safe in their offices or workspaces, a large percentage of respondents did not feel
safe walking to their cars during evening and weekend hours.
About half of respondents felt that merit raise data should be posted publicly, while almost 30% felt
such data should not be made available publicly.
Most respondents felt UH had a diversity-friendly work climate.
Emergency Management and Budget Cut Awareness
Although the majority of respondents were aware of UH emergency management procedures, about 27% of respondents were unsure about or unfamiliar with these procedures. In contrast, more than 75% were aware of the budget cuts and/or UH Central Administration’s response to the cuts.
Results of Analyses of Qualitative Comments
A total of 153 respondents provided qualitative comments. Themes addressed in these comments included concerns about the national competitiveness of faculty salaries and about salary compression, the value UH places on funded versus non-funded research, and the somewhat lower emphasis on high-quality teaching in comparison to high-quality research.
Survey Methodology and Results
Survey Development
The survey tool for the 2010 faculty climate survey was developed by the Faculty Senate’s Climate Survey committee with the goal of continuing to assess faculty perceptions of administration’s overall performance, as well as evaluating faculty opinions on issues such as the posting of merit raise data and faculty concerns regarding safety on campus.
Survey Data Collection
Survey data were collected in May 2010 using a web-based survey tool, and invitations were sent to all faculty members. Note that in contrast to prior climate surveys, data in 2010 were collected from tenure-track and non-tenure-track faculty and instructors, as well as librarians.
Responses by Rank and Job Title:
                      Frequency   Percent   Valid Percent   Cumulative Percent
Assistant Professor          75      15.2            15.2                 15.2
Associate Professor         119      24.2            24.2                 39.4
Full Professor              155      31.5            31.5                 70.9
Librarian                    16       3.3             3.3                 74.2
Other                       127      25.8            25.8                100.0
Total                       492     100.0           100.0
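As a reading aid, the Percent column is each row’s frequency divided by the 492 total respondents, and the Cumulative Percent column is the running sum of the Percent column; for example, for Associate Professors:

\[
\text{Percent} = \frac{119}{492} \approx 24.2\%, \qquad \text{Cumulative Percent} = 15.2\% + 24.2\% = 39.4\%
\]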
Detailed Break-up by Job Title:
                                                     Frequency   Percent
Tenured/Tenure track Full Professor                        155      31.2
Non-tenure track Clinical Associate Professor               11       2.2
Non-tenure track Clinical Assistant Professor               22       4.4
Non-tenure track Clinical other                              3        .6
Non-tenure track Instructional Full Professor                5       1.0
Non-tenure track Instructional Associate Professor           3        .6
Non-tenure track Instructional Assistant Professor          17       3.4
Non-tenure track Instructional other                        11       2.2
Librarian                                                    2        .4
Senior Associate Librarian                                   1        .2
Associate Librarian                                          4        .8
Tenured/Tenure track Associate Professor                   119      23.9
Assistant Librarian                                          9       1.8
Tenured/Tenure track Assistant Professor                    75      15.1
Tenured/Tenure-track other                                   1        .2
Non-tenure track Research Full Professor                     8       1.6
Non-tenure track Research Associate Professor                4        .8
Non-tenure track Research Assistant Professor               13       2.6
Non-tenure track Research other                              3        .6
Non-tenure track Clinical Full Professor                     1        .2
Responses by College:
All but 25 study participants reported their College. The largest number of respondents worked in CLASS, with the second highest number working in NSM. One Honors College faculty member responded, but was excluded from further analyses and reporting to preserve anonymity.
Overall performance by central administration:
When asked to evaluate the overall performance of central administration, a plurality of respondents felt performance was strong or very strong (45.5% in total), and another 34.2% of respondents evaluated performance as “normal”. Only about 20% of respondents evaluated central administration’s performance as “weak” or “very weak”.
Overall performance by central administration, evaluations by college:
Respondents were asked to rate central administration’s overall performance, with a rating of 1 corresponding to “very weak”, 2 to “weak”, 3 to “normal”, 4 to “strong”, and 5 to “very strong”. The graph below displays averages across colleges. On average, faculty in Hotel/Restaurant Management viewed performance most favorably, while respondents in Engineering were most concerned about central administration’s overall performance, with average ratings for Engineering faculty falling between “weak” and “normal”.
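For readers interested in how such college averages are obtained, the short script below sketches one way to compute them from the raw responses. This is a minimal sketch under assumed names: the file responses.csv and the columns college and rating (on the 1-5 scale above) are hypothetical and are not taken from the committee’s actual workflow.

    # Illustrative sketch only; not the committee's actual analysis code.
    # Assumes a hypothetical file "responses.csv" with one row per respondent
    # and columns "college" and "rating" (1 = "very weak" ... 5 = "very strong").
    import pandas as pd

    responses = pd.read_csv("responses.csv")

    # Exclude colleges with only one respondent, mirroring the report's
    # practice of omitting single-respondent colleges to preserve anonymity.
    counts = responses.groupby("college")["rating"].count()
    eligible = counts[counts > 1].index

    # Mean rating per remaining college, sorted from least to most favorable.
    college_means = (
        responses[responses["college"].isin(eligible)]
        .groupby("college")["rating"]
        .mean()
        .sort_values()
    )
    print(college_means.round(2))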
Comparative job approval ratings for President, Provost, Deans and Department Chairs
Respondents were asked to indicate whether they “strongly disapproved” (1), “somewhat disapproved” (2), “slightly disapproved” (3), felt “neutral” (4), “slightly approved” (5), “somewhat approved” (6), or “strongly approved” (7) of the performance of the President, the Provost, and their respective Deans and Department Chairs.
Analyses demonstrate that overall, faculty were most inclined to evaluate the President’s
performance positively, with the performance of College Deans being evaluated most negatively.
Job approval ratings for President, Provost, Deans and Department Chairs by College
The chart below displays the President’s approval ratings by college. Colleges not listed had only one individual who completed the survey item. Note that on average, librarians were most satisfied with the President, while faculty in Engineering were least likely to approve of the President’s job performance.
All responses were scored on the following scale: “strongly disapprove” (1), “somewhat disapprove” (2), “slightly disapprove” (3), “neutral” (4), “slightly approve” (5), “somewhat approve” (6), or “strongly approve” (7).
The chart below displays the Provost’s approval ratings by college. Colleges not listed had only one individual who completed the survey item. Note that on average, librarians were most satisfied with the Provost, while faculty in Engineering were least likely to approve of the Provost’s job performance.
All responses were scored on the following scale: “strongly disapprove” (1), “somewhat disapprove” (2), “slightly disapprove” (3), “neutral” (4), “slightly approve” (5), “somewhat approve” (6), or “strongly approve” (7).
The chart below displays Deans’ approval ratings by college. Colleges not listed had only one individual who completed the survey item. Note that on average, faculty in Optometry and librarians were most satisfied with their Dean, while faculty in Education and Engineering were least likely to approve of their Dean’s job performance. In Business, Engineering, and Education, Deans on average received ratings indicative of disapproval from their respective faculty. Variability in ratings of Dean performance was also high across colleges, suggesting that the quality of leadership at the Dean level varies significantly across colleges.
All responses were scored on the following scale: “strongly disapprove” (1), “somewhat disapprove” (2), “slightly disapprove” (3), “neutral” (4), “slightly approve” (5), “somewhat approve” (6), or “strongly approve” (7).
The chart below displays Department Chairs’ approval ratings by college. To protect participant confidentiality, the 2010 FS climate survey did not ask respondents to indicate what department they worked in; hence, scores for Department Chairs are aggregated at the College level. Approval ratings for department chairs were lowest at the business school and in pharmacy, and highest in Law and Education. Note that although the scores for the library appear particularly low, they should not be interpreted that way, since the library’s organizational structure does not use Department Chairs in a way comparable to other colleges.
All responses were scored on the following scale: “strongly disapprove” (1), “somewhat disapprove” (2), “slightly disapprove” (3), “neutral” (4), “slightly approve” (5), “somewhat approve” (6), or “strongly approve” (7).
Overall job satisfaction and intentions to leave UH and seek alternate employment among faculty members
On a scale from 1-5, where 1 represented “strongly disagree” and 5 represented “strongly agree”, participants were asked to rate their job satisfaction, to indicate whether they were currently actively seeking alternative employment, and to indicate whether they anticipated leaving UH in the near future.
Overall, respondents were satisfied with their jobs, and mean ratings for intentions to leave UH indicate that on average, UH faculty and librarians are committed to continuing their work at UH.
Response scale for these items ranged from 1-5, where 1 represented “strongly disagree” and 5
represented “strongly agree”.
Overall job satisfaction by College
On average, librarians were most satisfied with their jobs, while faculty in Engineering were least
satisfied with their jobs at UH.
Response scale for job satisfaction ranged from 1-5, where 1 represented “strongly disagree” and 5
represented “strongly agree”.
Overall intentions to leave UH among faculty members, by college
Although on average respondents indicated they were not currently looking for positions elsewhere, an interesting pattern of responses emerged across colleges. Faculty in Engineering were most likely to report that they were currently seeking alternative employment, while faculty in Social Work were least likely to be seeking employment elsewhere.
Response scale for this item ranged from 1-5, where 1 represented “strongly disagree” and 5
represented “strongly agree”.
“Hot Topics”: Safety, Diversity Climate and Opinions regarding Posting of Merit Raise Information:
On a scale from 1-5, where 1 represented “strongly disagree” and 5 represented “strongly agree”,
participants were asked to rate whether they currently felt safe on campus, whether they felt UH
provided a diversity-friendly work environment, and whether they were generally in favor of posting
raise/salary data for UH faculty.
On average, respondents were unsure as to whether they felt safe on campus during weekend and evening hours. A closer look at the data indicates that approximately 37% of respondents did not feel safe walking to their cars during evening and weekend hours, and another 23% of respondents were unsure whether they felt safe. In contrast, examination of another item, which asked whether respondents felt safe in their offices/workspaces, revealed that most study participants felt their offices or workspaces were quite safe (only about 7% did not feel that was the case).
With regard to diversity on campus, about 10% of respondents felt UH did not have a diversity-friendly climate, while more than 65% of respondents felt UH provided a diversity-friendly work environment.
About 51% of respondents were in favor of posting merit raise and other salary data. However, about 29% of respondents felt such data should not be made available publicly. The remaining 20% were unsure or did not have a preference either way.
Response scale for these items ranged from 1-5, where 1 represented “strongly disagree” and 5
represented “strongly agree”.
Respondents’ Awareness of Emergency Management Procedures and Budget Cut Issues
Respondents were asked, again on a 1-5 scale ranging from “strongly disagree” to “strongly agree”, whether they were aware of UH emergency management procedures, and whether they were aware of state-mandated budget cuts and the processes used by Central Administration to respond to these cuts.
Approximately 27% of respondents indicated they were not aware of UH emergency management procedures, or were unsure as to whether they were aware. About 63% agreed or strongly agreed that they were aware of UH emergency management procedures.
With regard to budget cuts, slightly more than 76% were aware of the state-mandated budget cuts and/or the processes used by Central Administration to respond to the cuts. The graph below shows the percentage of respondents who disagreed, were neutral, or agreed with the two statements.
Themes emerging from qualitative comments: 2010
Open-ended comments were received from 153 respondents in 2010, compared to comments
from 219 respondents in 2007. Among these comments are several recurring themes, summarized
below.

• The most common recurring theme continues to be that faculty salaries are not nationally competitive, and that continuing faculty experience the effects of salary compression and inversion.

• Another frequent comment concerned the increasing emphasis on externally funded research as a means of academic advancement. An apparent shift from the 2007 comments, however, is that far fewer comments in 2010 reflect displeasure with institutional support for faculty research efforts.

• Numerous comments in 2010 indicated, however, that research and scholarship that do not result in external funding are still perceived by respondents to be less valued by the institution and to receive little or no institutional support.

• In a similar vein, a number of comments expressed perceptions that teaching continues to be valued less, despite the strong emphasis on six-year graduation rates in the guidelines that must be met to access state funds supporting emerging research universities. It may be noteworthy that the number, and dollar value, of teaching excellence awards has increased since these data were gathered.

• The one-day mandated furlough to meet requirements for the return of state funds was not warmly received, with some respondents perceiving that more funds than necessary to meet the state mandate were produced and redistributed within the University.

• A number of comments dealt with issues that appear to be department- or college-specific. Of particular note, the 2010 survey did not allow participants to differentiate between the effectiveness of departmental and college administration. A number of respondents indicated that their ratings would be very different between the two levels, and therefore simply did not respond to those items.

• Comments from several colleges reflect a perception that administrative costs at the college level were growing at the same time that central administration was reducing staff and amid calls for higher teaching loads. The cost of college administration is likely an area best examined by the Provost with an eye to equity.
Overall Recommendations for Organizational Interventions – Senate and Administration Initiatives
Altogether, the climate survey committee recommends Faculty Senate leadership and Administration
consider the following interventions on the basis of the results of the current survey:

• Evaluations/approval ratings of Deans were most negative in the 2010 survey. In prior climate surveys, Department Chairs and Deans were generally evaluated more positively than the Provost and/or President. This pattern was broken in the 2010 data, suggesting that the President’s and Provost’s communication with faculty has improved significantly. In contrast, communication and leadership at the Dean’s level were evaluated critically in a number of Colleges, while evaluations of Department Chairs remained positive. The data support the Faculty Senate Executive Committee’s approval of the Faculty Affairs Committee’s recommendation to have the Faculty Senate administer an off-cycle review of Academic Deans by college faculty.

• The committee recommends further steps be taken at the College and Department levels to communicate UH emergency management procedures in order to reduce the percentage of individuals who are unsure about current emergency management procedures and their implementation.

• Significantly more faculty members were in favor of posting merit raise data than were against it. We hence recommend that the recently established practice of publishing merit raise data be continued by the Senate and Administration in the future.

• We recommend that further attention be devoted to addressing safety on campus during evening and weekend hours. The proportion of faculty who do not feel safe walking to their cars, or who are unsure as to whether they feel safe, poses a significant concern for the Senate.
Recommendations for Future Senate Climate Surveys
Altogether, the climate survey committee recommends the following considerations be taken into
account as future climate surveys are developed and deployed:

• In order to enhance response rates, we recommend that Dillman’s Tailored Design Method for survey design and deployment be adapted for future climate surveys. Specifically, we recommend that potential respondents receive information regarding the survey before its actual deployment, and that reminders come from different constituents. We further recommend that Senators for each college send separate notes informing their constituents about the climate survey and its objectives.

• We recommend that Senators for each College be charged with reporting results derived from the 2010 Climate Survey back to their constituents, to ensure that the credibility of future climate surveys is high.

• We recommend that a standard web-based survey package (e.g., SurveyGizmo or SurveyMonkey) be used for data collection to ensure that data extraction, analyses, and reporting can be completed within approximately ten weeks after completion of data collection. Utilization of a standard web-based survey package will also allow multiple committee members to extract and work with the data.

• The structure of the 2010 climate survey does not allow for department-level analyses (respondents did not indicate what department they worked in). Future climate surveys could potentially include a section on department-specific issues, as well as department chair evaluations that can be tied to departments through demographic information. However, we recommend that specific language be used to state that completing that section is voluntary and that participants should only complete it if they feel comfortable responding.