2009 Student Satisfaction Survey: Key Findings
Presented to the Management Committee, 19 May 2009
Professor George Subotzky, Executive Director: Information & Strategic Analysis

2009 Student Satisfaction Survey
• Key indicator of management and operational service delivery, and of priorities for improvement
• 5th annual survey: allows tracking of institutional performance and service delivery over time – particularly significant around 5-year reviews
• 5 indices and a composite USSI:
  – General Unisa Student Satisfaction Index (GUSI)
  – Unisa Registration Efficiency Index (UREI)
  – Unisa Student Support Service Index (USSSI)
  – Unisa Academic Performance Index (UAPI)
  – Unisa Administrative and Professional Services Index (UAPSI)
  – These combine to form the composite Unisa Student Satisfaction Index (USSI)

Scores of 5 Indices & Composite USSI, 2005-9

Year            GUSI       UREI            USSSI              UAPI        UAPSI                    USSI
                (General)  (Registration)  (Student Support)  (Academic)  (Admin & Professional)   (Composite)
2005            76.31      75.91           67.91              71.13       74.96                    73.24
2006            62.37      73.30           66.77              69.47       71.79                    68.74
2007            73.09      73.90           64.63              69.07       70.23                    70.18
2008            71.68      64.07           63.13              67.53       75.14                    68.31
2009            63.84      62.44           59.11              62.79       67.56                    63.15
Change 2008-9   -7.84      -1.63           -4.02              -4.74       -7.58                    -5.16
Change 2005-9   -12.47     -13.47          -8.80              -8.34       -7.40                    -10.09

Overview of Main 1-year Trends: 2008-9
All indices down
• Composite Unisa Student Satisfaction Index (USSI): down 5.16 points to an unprecedented low of 63.15
• General Unisa Student Satisfaction Index (GUSI): down 7.84 points to 63.84
• Unisa Admin & Professional Services Index (UAPSI): down 7.58 points to 67.56
• Unisa Academic Performance Index (UAPI): down 4.74 points to 62.79
• Unisa Student Support Service Index (USSSI): down 4.02 points to 59.11 – the 1st time any index has fallen below 60
• Unisa Registration Efficiency Index (UREI): down 1.63 points to 62.44 (following a 9-point drop last year)

Overview of Main 5-year Trends: 2005-9
All indices down
• Composite Unisa Student Satisfaction Index (USSI): down 10.09 points to an unprecedented low of 63.15
• Unisa Registration Efficiency Index (UREI): down 13.47 points to 62.44
• General Unisa Student Satisfaction Index (GUSI): down 12.47 points to 63.84
• Unisa Student Support Service Index (USSSI): down 8.80 points to 59.11
• Unisa Academic Performance Index (UAPI): down 8.34 points to 62.79
• Unisa Admin & Professional Services Index (UAPSI): down 7.40 points to 67.56

Top 10 Satisfaction Items, 2009

Rank  Item                                                                                  Index score
1     Change of examination centre                                                          76.55
2     Clarity on examination centre location where you will write your examinations in 2009 76.48
3     Unisa Internet Website                                                                75.33
4     Change of address                                                                     74.93
5     myUnisa e-learning environment                                                        74.55
6     Account/balance enquiries                                                             74.27
7     Clarity on method and process of payment                                              73.46
8     Information and availability of examination timetables                                73.29
9     Statements of courses (modules) passed                                                72.37
10    Usefulness of assignments                                                             71.92

Bottom 10 Satisfaction Items, 2009

Rank  Item                                                                                  Index score
1     Contact Centre (Call Centre)                                                          49.76
2     Parking                                                                               51.70
3     Assistance and guidance from Help Desk/Ask Me's                                       52.04
4     Curricula advice                                                                      52.27
5     Efficiency of student advisors                                                        53.54
6     General organisation of the registration process                                      53.95
7     Efficiency of 'Check Point'                                                           54.00
8     Unisa Regional Office                                                                 54.12
9     Office of Experiential Learning (Work Integrated Learning – WIL)                      54.33
10    Student Representative Council (SRC): National Executive Council                      54.68

Items reflecting largest declines in satisfaction between 2005 and 2009

Implications for Planning
• Along with other review and reflective sources, the annual student satisfaction survey identifies areas requiring serious attention in order to improve service delivery, success, excellence, quality and relevance
• It therefore represents a key set of indicators of institutional performance and improvement initiatives
• The 5-year longitudinal trends indicate a disturbing steady decline in student satisfaction across all indices
• Clearly, the University has not responded
adequately to these indications over the years, and has not been able to effect the required changes in the operational areas concerned in a coordinated and integrated way

Conclusion
• A hallmark of an effective learning organisation is its ability to learn from its intelligence sources and to rapidly effect the strategic or operational changes required – this is the role of actionable intelligence
• The shorter the feedback loop, the more effective the learning and change/improvement process
• Within the integrated strategic management framework, these insights must generate effective, concerted, integrated and coordinated change/improvement efforts

Main Recommendation
To achieve an effective, integrated solution, the following 2 steps are recommended:
1. The primary responsibility and process for managing the improvement process must be confirmed
  • Clearly, primary responsibility for this belongs to the DSPQA
  • The process must integrate related initiatives, including:
    – Quality Improvement Plans
    – Ongoing monitoring & evaluation / organisational performance management in relation to the IOP and the 2015 SP
    – Strategic project reviews
    – Service excellence
    – Risk management & internal audit initiatives
  • It will also have to draw from other performance indicators and sources of intelligence, such as the monitoring of student complaints

Main Recommendation (2)
2. Utilising these various sources, an annual Improvement Action Plan is drawn up, comprising:
  – Clear identification of the problem areas/issues
  – Clear identification of the responsible operational units
  – A clearly defined process of formally referring the problem areas/issues to the operational units concerned
  – This would include requesting the operational units to draw up detailed improvement plans, with clearly specified internal responsibilities, targets, performance measures and timelines
  – These would be submitted to the DSPQA by a specified date (prior to the finalisation of the IOP) for approval in terms of planning consistency and practicability (adequate resources, time and identified dependencies)
  – Once approved, these would be integrated into the IOP, which would then be monitored and evaluated as part of the IOP reviews
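The index arithmetic behind the scores table can be checked mechanically. The published composite USSI values are consistent with a simple (unweighted) mean of the five component indices; that weighting is an assumption inferred from the figures, not stated in the survey itself, and the variable names below are illustrative:

```python
# Component index scores by year, taken from the scores table above.
scores = {
    2005: {"GUSI": 76.31, "UREI": 75.91, "USSSI": 67.91, "UAPI": 71.13, "UAPSI": 74.96},
    2008: {"GUSI": 71.68, "UREI": 64.07, "USSSI": 63.13, "UAPI": 67.53, "UAPSI": 75.14},
    2009: {"GUSI": 63.84, "UREI": 62.44, "USSSI": 59.11, "UAPI": 62.79, "UAPSI": 67.56},
}

def composite(year_scores):
    """Composite USSI as the unweighted mean of the five indices
    (assumption: the published USSI matches a simple average)."""
    return round(sum(year_scores.values()) / len(year_scores), 2)

# Year-on-year and 5-year changes, as reported in the trends slides.
ussi_2009 = composite(scores[2009])                                  # 63.15
one_year_drop = round(composite(scores[2009]) - composite(scores[2008]), 2)   # -5.16
five_year_drop = round(composite(scores[2009]) - composite(scores[2005]), 2)  # -10.09
```

Under the simple-average assumption, the computed composites (73.24 for 2005, 68.31 for 2008, 63.15 for 2009) and the changes (-5.16 and -10.09) reproduce the published figures exactly, which supports reading the USSI as an equally weighted aggregate of the five indices.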