Practical customer service skills delivery in Higher Education – Background perspectives Executive summary The project has researched widely to identify empirical evidence, HE staff perceptions and understanding, along with working practices within and beyond the HE sector, to support approaches to customer service delivery and service quality in HE. This report presents evidence that HEIs need to focus on a number of key areas in order to work towards the goal of delivering quality services in line with customer expectations. Options are presented that build on applied service logic theory, the service quality construct and successful initiatives in the public and private sectors. Key areas for attention are: Getting students involved Making sure staff know every interaction is important – with students and with colleagues Measuring quality not just targets Measuring impact Managing behaviours and performance Keeping expectations married to reality – research, engage Introduction Scope of this document This document represents one half of the LFHE project “Practical customer service skills delivery in Higher Education”. The document recounts the findings from an extensive literature search and background reading from a wide range of sources, contextualised with information from: interviews and correspondence with other HEI staff around the UK customer service professionals in retail and industry commercial training providers who have specialised in customer service training interventions. Overview for the project (abstract from proposal) The Dearing report in 19971 commented on Higher Education’s need to ‘push towards delivering a more “customer-oriented” service and a focus on “value for money”’. While this statement has given rise to much contentious debate, increasing student “savvy”, relentless financial pressure and increased public and media awareness has driven Universities to address the issue of customer service head-on. There are three distinct areas in HE where services are provided, each of which require service management alongside client interaction. The first of these areas is the University Administration (also known as Professional Services in many HEIs) who administer the greatest diversity of services to students, their parents and to academic departments (plus the internal service provision between different administrative departments). There will also be responsibilities for services to external bodies, authorities and partners in the community (eg HESA, Research Authorities, and the general public). Aligned with these are the devolved administrative services based in academic departments whose customers include their immediate academic colleagues as well as most, if not all, of the other customers with which the ‘central’ administration deal. The third area includes the academic staff who deliver academic programmes and research. This area is the most contentious as academic colleagues are not quick to recognize their role as service providers or indeed their students as customers, although there is increasing recognition of the blurring of boundaries and responsibilities particularly in the realms of staff and service management 2. Why do we need customer service skills in HE? There has been some debate on this topic in recent years, largely from the perspective of the threat to collegiality, “academe3” and “scholarly communities4” that an increasingly “commercial culture and consumer mentality5” brings to universities. 
Paul Ramsden, Chief Exec of the Higher Education Academy presses the issue by stating6: “Students are not customers in any conventional sense of the word. The product or service they acquire derives value from their striving to achieve it and their achievement of it is determined to a Page 1 of 21 significant extent by those who supply it. Most students want to be treated as collaborators in a process of developing understanding.” However there is also a strongly felt opinion in the HE community that students are “…undoubtedly consumers of what universities provide.” 7 and that universities need to encompass all aspects of their responsibilities to staff, students, parents and external communities within which they exist 8. This inevitably involves some level of service provision, implying a customer-provider relationship for every one of these areas of responsibility.9 Humes, although writing about the US HE system, captures the zeitgeist in UKHE 10: “Yes, most of our academic institutions today have departments devoted to providing student services. But true customer service must involve more than a department or a handful of individuals. Providing a true service-centred environment is everyone’s job and the inspiration for delivering more than lip service must begin at the top.” We can conclude, therefore that all staff working in HE need to have at least some of the skills required to manage and deliver services to customers, whether they work in administrative or in academic roles. This is evidenced by the range of interventions already being applied within the community by commercial providers (eg Mentor11) and professional bodies (eg the AUA12). Furthermore there has recently been much online traffic on the Staff Development Forum regarding approaches to providing staff with these skills. Again, even here, arguments have arisen about the relevance and particular characteristics of customer service skills for academic staff. Colleagues from around the UK approach customer service skills delivery in different ways with varying degrees of success and at vastly different levels of cost and measurable effectiveness. The approach used by this project In conclusion, there is no doubt that an injection of concerted effort is now required to determine: (a) what are the necessary knowledge, skills and understanding that require attention; (b) what are the most effective forms of intervention to deliver this; (c) what is the relevance of the different customer service quality standards and how are they related to each other and to other quality initiatives and standards. The most pressing concern is that leaders and managers of teams and divisions within HEIs do not have the necessary information at their disposal to make the right decisions regarding these items. Consequently service delivery is compromised: managers and team leaders are disconnected from the services they manage13; responsibility for service provision is fragmented and vague; and unnecessary or inappropriate training interventions are made, sometimes at considerable cost, with little impact on the services and the staff concerned. Theory and practice Prologue: the student as customer/consumer Palfreyman (2007)14 discusses the complex marketplace in which UK (and US) HE operates and the historical changes that have resulted in the system status quo. 
We might be tempted to begin with an attempt to understand why Universities exist but, for this project at least, it is sufficient that they exist and that the primary customer is the student. Undoubtedly, the current situation with regards the issues around students as customers/consumers, their part in moulding the product/services they are buying, the relatively fragile economics of UKHE and our aspiration to be more like our US cousins, reflects the economic climate of the era in which we exist (and is, therefore, being analysed accordingly). Swain (2008) and Palfreyman agree that “students are undoubtedly consumers”, furthermore Palfreyman suggests that the sovereignty (i.e. supreme control) of “professionals and administrators” should go and that “the consumers should be re-enthroned.” According to Rosenthal, Peccei and Hill (2001)15: “Customer sovereignty has been cast as the dominant discourse of the current socioeconomic-political sphere”. It is reasonable to conclude that we might be prone, therefore, to deploy this model of analysis in the context of this project’s research and too readily see students as potentially dominant forces in the delivery of the services/products that UKHE offer. That they have a significant role to play might seem obvious but, as Palfreyman puts it (in the role of consumers): “the student punter does not know what he/she is buying by way of HE” The role of the student, the concept of HE as a service provider and the complexities of the market are all investigated in the sections below. Page 2 of 21 Customer service: is it all nonsense anyway? There are a plethora of popular books, internet sites, articles and studies that are aimed at addressing the spectrum of subjects that collectively form the topic of “customer service”. The majority of even the most popular published books and articles, however, contain little or (most often) no reference to empirical or theoretical research to support their content. There are so-called “gurus” in this field as in many others (eg John Tschohl 16) and authors of mainstream “Customer Service theories” (eg McMillan, 200317), but finding definitive, robustly evidenced sources is difficult and there are few that deal specifically with the situation in education or HE in particular. It is, however, clear both anecdotally (eg by talking to any call-centre operator – see Datamonitor, 200418 for context) and in research literature (eg Lovelock & Wirtz 200119; Zeithaml et al, 200620; Sturdy, Grugulis, Willmott, 200121) that “customers” and “services” are not entirely artificial concepts but can be characterised, albeit with varying degrees of complexity, and explored objectively and empirically. In an attempt to pull all this together and to understand the development of the wide range of customer service practices and any theories underlying these, this project has researched a wide range of literature in both the academic and practitioner domains. This section draws together the most salient points for this project and attempts to relate this information to the data collected from interviews, email exchanges, group discussions and experiences about what is happening in our Universities with regards customers and service provision. Service logic theory in Higher Education Ng and Forbes (2009)22 discuss the application of “service logic” to the idea of education as a service (and, implicitly, students and their parents/sponsors as customers). 
Services are variously viewed, traditionally, as activities or as deed, processes and performance (Gonroos, 198823; Zeithaml et al, 200624). Lovelock and Wirtz (2003)25 further characterise services, in relation to their generalist qualities and distinctive attributes but these are not necessarily aligned with the usual focus deployed in HE, i.e. that what matters is the “learning experience”. Ng and Forbes present a “new” framework that synergises traditional services marketing research with educational theory and concepts to help analyse the “value to students” of the University experience. This framework employs a number of key concepts discussed below. The customer as co-creator of the service experience Ng and Forbes’ paper discusses the application of marketing theory and research in an educational context. Appreciating services marketing principles, as applied in HE, is central to this project’s attempts to understand the impact of customer service practices on the main customer (i.e. student). However, As Ramsden (2006) puts it, “Students are not customers in any conventional sense of the word. The product or service they acquire derives value from their striving to achieve it and their achievement of it is determined to a significant extent by those who supply it” The underlying principle here is that the “product or service” (i.e. the overall University experience) is cocreated by both the student and the service provider (i.e. the collective body of the University). The principle that the customer has a significant impact on the service they experience (as opposed to just the product they receive; this distinction being that the interaction affects the experience regardless of the product – see later) is discussed in other contexts by Bitner et al (1997 26) among others. The impact of the customer’s role in co-created services is significant (see Ng and Forbes’ paper for further discussion) and thus will determine the overall experience of the student (as customer). This concept is supported by other research (Sierra and McQuitty, 2005 27) applying social exchange theory to determine that: “social exchanges can create a sense of shared responsibility to service settings and predict that inseparability produces customer perceptions of shared responsibility for service outcomes”. This, in turn, can develop greater customer loyalty in situations where the experience is positive. The post-modern educational paradigm of “shared discovery” is hardly new but using it to build an argument against the growing “consumer mentality28” in HE is a relatively recent occurrence, at least in the UK. This argument also brings into question other aspects of the university’s relationship with its students. The Times Higher Education reported in March 200929 that a respondent to a recent survey stated that although one institution was “signed up to being responsive, there has been no fundamental debate about why student engagement is important. The language of the student as customer is very Page 3 of 21 strong, but the language of the student as a junior member of a learning community is less often heard.” In this case it seems that the concept of student-as-customer is not the problem, indicating that perhaps the balance of the experience, or at least the University’s understanding of that experience, had shifted too far in one direction. 
Clearly it is important to appreciate all the different points of impact a University can have on the student experience and attempt to understand all the components of the complex relationship between University and student. The concept of Core and Supplementary services In order to understand how to affect this co-created experience requires unravelling the relative contributions of both agents (i.e. University and student) and then isolating the significant “pinch-points” for each. Ng and Forbes (2009) talk in terms of elements of a “University Experience Framework” that indicates as objectively as possible the key points for students and how this is impacted upon by the University’s contributions. The core service is all about the learning experience which is (based on cognitive science research) “emergent, unstructured, interactive and uncertain” and the learning by the student cannot be disconnected from the teaching by the staff. Similarly, not only is it co-created by both student and teacher but it is “hedonic”, i.e. contains elements of pleasure and adventure. The hedonic aspect might seem even further out of the University’s control but can be a product of many service-oriented functions and features. For instance, through the creation of more pleasant social spaces (the “servicescapes” of Bitner,1992 30; see also Lovelock and Wirtz, 200631 for extensive discussion about service environments) resulting in improved customer-customer interactions. Johns (199932) states: “interacting with other customers may not be included in the provider’s service concept but nevertheless contributes to the customer’s experience”. In Universities the social mixing of students is important and this might include diverse interactions including, for example, those of different nationalities or gender 33. All this will impact on the overall student experience. At this point we should consider the composition of core services before we move on to look at the impact of other services. NG and Forbes’ principle assumption is that core services are all about the direct learning experience, i.e. the outcome of direct teaching and the student’s own learning. This however can be unpacked somewhat to expose an overlap with other service provision in modern libraries and online via University-hosted (usually) e-resources, the provision of which is not all managed in the academic domain. This is particularly prevalent since the inception of “e-learning” (where learning occurs through the use of, and interaction with, electronic resources) and “blended learning” (where different modes of learning are employed). Maltby and Mackie, 200934 give an excellent account regarding student engagement with virtual learning and its relationship with other learning modes. Such learning is the result of access to and use of services provided by information technology and library services. We should not ignore the fact that much of these services exist, as far as the customer is concerned, in the “core service” arena and will have a direct impact on the learning experience. Within, and also beyond, the core service element is the “Supplementary service”. Ng and Forbe’s concept of supplementary services includes student applications, finances, campus facilities, accommodation and so on and includes the combination of the people involved, forms, manuals and processes. Crucially, Ng and Forbes propose that: “…the efficient delivery of supplementary services does not denote a good university experience. 
These are commonly referred to as hygiene services…i.e. services that meet basic needs and that, when not met, can cause dissatisfaction amongst students. Yet, meeting these needs does not make students satisfied – it merely prevents them from becoming dissatisfied.” It follows that getting these “hygiene services” right is essential before we can even think about delivering excellence in any aspect of the core service. It is apparent, however, that the relationship between all the different services provided to customers is also critical. Ozment and Morash (1994)35 found in their empirical study that both core and “peripheral” service offerings must be tailored to customer needs to give satisfaction, i.e. one without the other will not give a positive result even if “peripheral” or “supplementary” services are considered to be largely “hygiene” in nature (as in Ng and Forbes). In a University context the faculty and professional services might thus be mutually culpable for bad customer experience. If we add-in the customer we get a triad of inputs into the equation: Own Contribution Student Experience = LT + SS + OC Experience of Learning and Teaching Experience of Supplementary Services Page 4 of 21 Here the good experience of learning and teaching is down largely to the experience in academic Schools or Departments; the good experience of supplementary services includes all other services beginning with Open Days and ending with Graduation; and the customer’s own contribution includes the “hedonic” aspects such as social interactions and adventure, pleasure, discovery. Of course, we should be aware in this model that each component has a respective weighting, but determining this would necessitate a relatively deep insight at an individual level, i.e. each individual Student Experience will have its own weighting factor for the “LT”, “SS” and “OC” components. This is in part because different individuals will have different reasons for entering HE which will flex and change as time progresses) and also because the “OC” component especially is impossible to objectively measure in any meaningful manner. In terms of marketing and evaluating services, this model thus begs the question of “what DO students want?” or even “WHY are students attending University”? This goes beyond the scope of his project but does shed light on how difficult it is to objectively evaluate the overall student experience (and therefore the impact of service quality) in a reliable, empirical fashion. The importance of the customer and service-provider interaction There are plenty of references to the quality of the customer experience being affected by the customerprovider interaction, indeed the vast majority of training interventions are aimed at improving and developing service-provider skills when dealing with customers. Sierra and McQuitty (2005) puts this in context: “When there is a close interaction between a service employee and a customer, the manner in which the service is performed is often more important than what is actually delivered.” In all service-oriented organisations (including Universities even if we remove the learning and teaching element) services, in Lovelock and Wirtz’s (2006) words, services: “…cover a spectrum from high-contact to low-contact operations reflecting the type of service involved and the nature of the processes used in service creation and delivery.” Hence not all interactions will be personal (i.e. 
face-to-face in this sense) and, therefore, it follows that not all interactions can be directly impacted by interpersonal skills development, although the service provided might be the same. For example, the University Library self-issue machines will issue a book for you to take from a Library in a similar way as talking to someone behind the issue desk. The customer experience, however, will be affected not by the human interaction but by the experience of using the machine. A slightly different example might be the use of a call-centre operation versus a face-to-face operation. The call-centre operation brings with it a host of other factors that further complicate the customer-provider interaction including the impact of “efficiency” goals on the interaction 36 (often through a negative impact on empathy building in an attempt to drive down handling times). If we do some very simple comparisons (see the Table 1 below) between interpersonal and automated services there are some superficial differences in service quality measures, the most obvious of which is the “added value” measure. This brings to the dynamics the idea that, by being proactive (and to some extent being more empathic with the customer) a service provider can interpret what other/associated needs the customer may have and/or help make the customer’s experience more positive. The customer-provider interaction can also be examined from the perspective of how the customer is perceived and actively managed by the service provider. It is interesting to see, that in academic discourses at least, the paradigms used to describe customers include “sovereign”, “spy”, “emotional vampire”, “object of control”, “enemy” and “source of uncertainty”37. Practical and business literature, however tend towards a vision of the customer needing “care” and “support” and sufficient attention to “delight” and exceed expectations. McMillan (2003)38 goes as far as employing a medical paradigm where the provider delivering excellent customer care can experience a “helper high”. Page 5 of 21 Table 1 Comparison of services and relative qualities Interpersonal service Service quality measure Automated/online service Service quality measure Issuing a book in a library Length of queue Efficiency of person Politeness of person Appearance and demeanour of person Added value Self-issue machine Availability (i.e. is it working?) Length of queue Ease of use Computer problem solving, clinic-style Length of queue/booking slot Efficiency of person Confidence in ability to solve problem Politeness of person Appearance and demeanour of person Added value Call centre Response time Efficiency of call response Confidence in ability to solve problem Politeness and demeanour of person The perceptions of customers by service-providers is important, in particular by service managers who will mould the service provided by their influence on their employees. The impact of employee performance on customer service Sierra and McQuitty’s39 work builds on empirical studies that highlight the role of employee performance in the experience of the customer. Their work relates the social exchange theory discussed above (i.e. that of the “shared responsibility for service outcomes”) to how the services are actually delivered. 
In their words: “Consumer’s perceptions of shared responsibility for service outcomes correspond with the extent to which consumers experience emotions in a service setting, which affects their willingness to purchase the service repeatedly.” In HE, the customers for the “core” services (i.e. learning and teaching) are, to some extent, captive and the service they co-experience is only “sold” – for want of a better term – once (although this excludes any other services related to this such as adult education, second degrees, higher degrees and so forth). However, once attending, the principle customers (students) do have a range of choices of the “Supplementary” services like bars, accommodation (to some extent), shops and social events. Furthermore we must remember that both “core” and “supplementary” services are part of the overall customer experience and that the “supplementary” services are usually hygiene-services in nature, i.e. do not necessarily contribute positively but can contribute negatively if they do not meet expectations. Hence, even in HEIs there needs to be a good understanding of the role of the customer. Again, to use Sierra and McQuitty’s words: “Thus, service employee training programmes should emphasise the customer’s role in the service experience to increase perceptions of shared responsibility and to create a positive emotional response for customers.” The key point is to increase customer involvement, to seek out key areas where the customer can have an impact on the service in a managed fashion, and to seek customer feedback. Such increased engagement will bring a range of benefits such as stronger emotional response, improve loyalty (to “brand”) and reduce complaining behaviour or at least give a better response. Most importantly it gives the customer a chance to understand what the expectations are on them as well as making it clear what are the responsibilities of the provider. Ng and Forbes (2009) go further and argue that: “…the problems encountered by the US higher education system (i.e. student consumerism and disengagement) are a result of institutions not communicating their expectations of students’ commitments. The problem therefore is not created by student-oriented marketing but by the failure of universities to see how value is co-created.” Page 6 of 21 It would be reasonable to make customer engagement a key aim of customer service initiatives in UKHEIs and to make sure that this is not just finding out what customers think of the services they have received but also to find out: what they wanted/needed in the first place; want they want/need in the future and; how they can best be involved in shaping the service(s) they want/need. How to go about achieving this might, effectively, involve aspects of the next few sections. The Service Quality Construct There is a widely employed model used to measure customer service called SERVQUAL. 
This has been adjusted over the years into a core, five “dimension” model (see Schneider and White, 200440 for a comprehensive overview –the table below is from this publication): Dimension Definition Reliability Delivering the promised performance dependably and accurately Tangibles Appearance of the organisation’s facilities, employees, equipment and communications materials Responsiveness Willingness of the organisation to provide prompt service and help customers Assurance (combination of items designed originally to assess competence, courtesy, credibility and security) Empathy (combination of items designed originally to assess access, communication and understanding the customer) Ability of the organisation’s employees to inspire trust and confidence in the organisation through their knowledge and courtesy Personalised attention given to a customer The dimensions have been used extensively in customer satisfaction surveys and in empirical research on the topic, largely to identify quality issues (and solutions). In UK Public Service there is a reference to similar dimensions (although not explicitly termed “dimensions”) that are the policy objective of “delivering customer-focused, efficient public services41”. This is all about making sure government departments are “relentlessly customer focused” and deliver a “joined-up service42”. The dimensions presented in these papers can be summarised thus: Dimension Definition Responsiveness Provision of services designed to meet the needs of customers not for the convenience of service providers. Seamlessness Provision of joined-up, multi-channel service delivery so that the customer experiences a co-ordinated and consistent series of interactions for a particular service or set of related services. Quality of service Improve customer satisfaction to engage customers with processes and make them more likely to take up services to which they are entitled. Also to generate less unnecessary contact therefore be less expensive to serve. Strategic transactional efficiency Getting customers to use the channel that best meets their needs and to be most cost-effective for the service provider In public services the underlying problems arise from the multitude of services being provided by a multitude of agencies, hence the drive to make the wide range of services more seamless and the service experience more consistent. Interestingly the last two dimensions include an emphasis on cost-effectiveness where the cost to the service provider is actually part of the boundaries within which the customer service quality dimension must operate, implying a provider-driven compromise at the expense of the customer. Whilst, in reality, all organisations need to attain cost-effectiveness it is not usually so explicit. Indeed looking at the case study of Yorkshire Page 7 of 21 Water43 it suggests that effectiveness was achieved by investment in service mechanisms not by restricting services to what was best for the provider, despite what the customer wanted 44. The Cabinet Office papers propose that the customer experience can be best improved through segmentation: targeting “different customer groups with the right combination of treatments that will influence their behaviour”. 
By “treatments” what is meant is offering a differentiated approach to dealing with customers who need different services, principally by adopting different options for: Service offering (features, attributes and benefits) Channel (face-to-face, telephone, post, electronic) Communications (branding, positioning) Pricing (Standard, promotional, differentiated) The aim is to identify the most appropriate form of service delivery to meet the segmented customer market. In UKHE this might be modelled into the provision of diverse services such as IT support, Student Finances, Counselling Services and Admissions, i.e. adopting different, but consistent service provision to ensure the most cost-effective and suitable customer interaction. Customer perceptions As we have seen above the perception of the service quality by the customer is critical in the overall experience of that service. Brady and Cronin’s (2001)45 study identifies a model of service delivery that is linked to service quality perceptions. In their model, empirically founded: “..service quality perceptions are tied to distinct, actionable dimensions: outcome, interaction, and environmental quality.” Their research develops the SERVQUAL dimensions and identifies “sub-dimensions” for each of their own that are controllable by the service provider; recognising that the customer will also have an impact and, indeed, responsibility46. Brady and Cronin’s research model is copied in part below: In Brady and Cronin’s work they also identify that each “sub-dimension” contains three items which act as descriptors for each dimensioni (excepting the dimension of “tangibles”): Reliability Responsiveness Empathy The omission of the tangibles dimension from the list is interesting as Brady and Cronin propose that “there is evidence that customers use tangibles as a proxy for evaluating service outcomes”. Thus we could presume that tangibles will “set the scene” for the customer experience (at least in some cases), regardless of real outcomes. Although taken to its extreme this might result in the development of style over substance, there is no doubt that first impressions of a range of tangibles will impact significantly on the overall customer experience. i Valence = the perception of overall service quality as a result of service interaction. Page 8 of 21 In analysing the components of service quality we are beginning to look more closely at the areas of the service experience on which we can impact. In particular, for this project, we find that there are specific components of Brady and Cronin’s work that might be a starting point for developmental intervention, whether that be of a strategic-procedural or cultural-behavioural nature. Fundamentally, what is on offer, in terms of the sum-total of the service dimensions and the customer’s own input, must match the overall performance. As Ng and Forbes (2009)47 propose, there are a number of areas where gaps between customer expectation and service reality can occur: keeping them all closed is the key to service quality. Service quality and Service gaps Ng and Forbes48 apply service logic to the “University Experience” via the Gap Model of Service Quality seen in Zeithaml et al (1990)49. This is applied in preference to the literal application of the service dimensions described in Brady and Cronin given that the University experience is the product of unknown expectations which can be influenced. 
Ng and Forbes suggest that the Gap Model might better reflect (as a broad approach) the service dimensions that need attention for the student experience. A summary of the model is presented here (as described in Ng and Forbes): Gap 1 The difference between what a student expects and what the institution thinks the student expects, often arising from the lack of research. Gap 2 The difference between the institution’s understanding of students’ expectations and the development of service designs and standards, a difference where resource constraints play a role. Gap 3 Gap 4 The difference between the development of service designs and standards and the actual delivery of the service, arising due to the complexity of the service encounter and the interaction between students and staff. Student expectations Service design and standards Institutional understanding of student expectations The difference between the delivery of the service and the institution’s external communications, i.e. promises made by the brand, advertising, sales force etc, arising where there is a difference between performance and promises. Service delivery Institutional promises The Gap Model can also be represented as feedback “loop” where the loop is not closed from the student point of view (i.e. the dotted line), thereby creating the dominance of the service provider. As we have seen, in all other effective models of customer service the customer is dominant 50; this model therefore is doomed to failure from the outset. Applied to the combined services (i.e. “core” and “supplementary”) this model could be used to isolate areas for attention although ultimately we are really presenting a model of organisational development rather than targeted troubleshooting. Students aren’t the only customers… Staff in UKHEIs have to deal with a range of customers other than just students. Parents and sponsors are often quoted alongside students as “stakeholders” (see references 3-7 for example) or customers at various points of the student’s own “customer journey”. Additionally HE Registry and Admissions services are expected to deliver accurate information to HESA and to funding bodies, who will be customers in their own right at various times of the year. Lastly of course there are the internal customers – colleagues. This is usually perceived to be in the form of non-academic staff providing services to academic staff (and also to each other). There are however increasingly grey areas where academic staff may become “managers” of departments and consequently adopt a more service-oriented approach to various aspects of their work 51. Interaction with these “secondary” customers will impact on employee performance, work efficiency, job satisfaction and ultimately, therefore, on behaviours towards “primary” customers, i.e. the students. Thus, in this sense, we are once again in the domain of Sierra and McQuitty and examining the end-result of (secondorder) performance management on employee interactions with customers. This is not to say that secondary customers are not important but that I am proposing that this project is mostly concerned with our primary Page 9 of 21 customer base (i.e. students) and therefore mostly concerned with interactions with them. Getting these interactions right may well involve performance management which might be considered “first-order” in this case. 
Where otherwise secondary customers become primary customers, ie when they are asking for the same services as primary customers, the same parameters for achieving a good customer experience will apply. Therefore it is not necessary to explore secondary customer interactions (i.e. staff-staff, staff- agency etc) in this project. Finding out what customers want Principles As seen above in the “Gap Model” of service logic, it is essential to find out what customers want, to adjust the service offerings accordingly, and to monitor customer expectations and actual service delivery constantly. All the studies quoted so far have employed a range of well-established techniques in an attempt to understand: 1. What the customer really wants or needs 2. How well services are being delivered Public Service model The Cabinet Office paper cited above52 includes a useful summary account of the wide range of interventions that can be employed to this end: 1. Front line staff 2. Surveys 3. Customer journey mapping 4. Usability testing and website analysis 5. Ethnography 6. Consultation 7. Formal and informal contact with representative bodies 8. Agents or intermediaries 9. Written correspondence 10. Media coverage (i.e. coverage concerning the service and/or service provider in the media) Explicit interventions such as customer forums, focus groups and “mystery shoppers” are not included but might be implicit in one or more of these methods. From the list there are some interesting comments and case studies with possible impacts on this project. Firstly, front line staff are considered: “a rich vein of customer insight which is often overlooked…typically they have an excellent idea of what is important to their customers, that customers would like to have more of, what frustrates them and what they would change.” Secondly, concerning customer surveys: “The data from quantitative surveys is often useful for providing robust evidence to support a business case for change, for example….However, to fully understand why customers behave or think in certain ways a blend of approaches is often needed.” The report goes on to comment that “surveys are expensive and time consuming” and imply that they may not always tell the provider what they want to know. Indeed survey design is critical (consider the SERQUAL dimensions53 above) and the purpose needs to be honest and very clear so that organisations do not use surveys as a shield to demonstrate a superficial commitment to customer feedback. Lastly, concerning ethnography (the scientific method of describing human behaviour and culture): “People can’t always articulate what they want or need…you can’t expect them to just give you answers.” Ethnography has been used in the private and public sector to identify behavioural patterns and habits and to help overcome issues and problems associated with these. What is certain, however, from all the literature and research seen is that regular and appropriate interventions must be in place, along with the willingness and processes to be able to act on outcomes, if the service organisation is to even start on the road to delivering a quality customer experience. Page 10 of 21 Gaining customer insight through such interventions is seen to be the first step in engaging the customer and to ensuring the customer’s part of the co-created service experience will be as positive as possible. 
NSS and other student surveys In the past decade there appears to have been a far greater use of surveys in HEIs and across the UKHE landscape in an attempt to see if, in essence, things work. The National Students Survey (NSS) is the most widely known survey tool along with the International Student Barometer (ISB). Both are employed to benchmark “student experience” nationally (i.e. inter-HEI) and to paint a broad picture of the experience at a local level (intra-HEI). The NSS is marketed as a chance for students’ voices to “be heard”. The role of the NSS is seen (by the NSS providers) as54: “It’s your chance to have your say about what you liked and didn’t like about your student learning experience during your time in higher education.” The data gathered is described as being: “1. published on Unistats.com where prospective students and their advisors can use the results to help make informed choices of where and what to study 2. useful to your university, students’ union or college to facilitate best practice and enhance the student learning experience.” Hence, the NSS is positioned as a method of getting feedback about (in effect) service quality although the quality of the data is questionable for all the reasons highlighted above concerning what makes the student experience and the limitations of anonymous surveys (see Ng and Forbes, 2009 for a fuller discussion). That the NSS at least provides a benchmark for UKHE is not in dispute, that the NSS cannot drive service improvements might be a conclusion from this analysis. The ISB, however, is marketed differently. Its providers, i-graduate, see the ISB as55: “An independent and confidential feedback process for education providers, tracking the decision-making, perceptions, expectations and experiences of international students • A risk management tool, identifying the key drivers of international student satisfaction and establishing the relative importance of each. • A comparative measure, tracking year on year how expectations and perceptions change - within your institution and against national and global benchmarks” Whilst more measured in its approach to the value of the survey tool there are tacit assumptions that HEIs making use of the ISB will expect service quality to be influenced by action arising from findings. Neither the NSS or ISB can provide the solutions to problems with student experience. Like all surveys that do not expressly probe clearly identified service dimensions 56, the information acquired will simply highlight an issue, not the dimension or attribute that needs to be adjusted. i-graduate offer a range of other barometers/surveys that do attempt to address service dimensions, or at least the outcomes directly related to these (eg price and scholarship which addresses market perceptions of fees and scholarships , employer which looks at what employers are seeking in graduates in an attempt to influence service offerings tailored to employer needs). There is no doubt that such high-profile customer feedback has a direct influence on service provision in HEIs. They certainly provide a starting point (i.e. “something has to happen”) and point a finger at the likely service area that needs to be addressed. Brady and Cronin’s model 57 and the comments regarding the organisational service gaps in Ng and Forbes could then be used to target the service dimensions that need attention although it is more likely in many cases that wider organisational development (OD) will be required in the long run. 
Holistic Evaluation Research in the US58 has proposed a move away from “accountability models” with an operational focus to a more holistic approach that considers impacts. Sheffield Student Services have piloted this approach with the aim of developing a more strategic approach to improving service quality and the overall student experience. The Director of Student Services at Sheffield summarises the reasons for this approach thus: “Arguably it enables us to be more proactive in influencing policy, rather than merely to demonstrate levels of usage or customer satisfaction in a somewhat reactive way. This organic model of evaluation encourages us to focus on learning and it is a balanced approach which is closely in tune with the Balanced Scorecard/Strategy Map concepts which underpin our departmental strategy. Holistic evaluation should help us move beyond one-dimensional satisfaction indicators. In short it could be seen to put the ‘values’ back into evaluation.” Page 11 of 21 The key to holistic evaluation is the information gleaned from the customer and how this is acquired. Rather than ask questions or focus on the service delivered (the “tangible” dimension) or indeed on the interaction itself, customers are asked to say how the service has impacted on aspects of their lives or other parts of their University experience. It could be argued that there is a possibility of mismatch between the impact and the service since customers may not be able to determine the boundaries of service delivery (effectively the dimensions being evaluated) and their holistic experience. However, in this form of evaluation we are seeing a move towards closing the gap between service delivery and customer expectation without disengaging the whole student experience from the complement of different services that make up the experience. A holistic approach, therefore, might be effective in measuring both elements of the co-created experience, through smart questioning and reflective techniques. Taking the student pulse Being able to collect, analyse and feedback quick snapshots of student perceptions of their experience (and by extrapolation the product of the associated services’ quality) might be a more effective way of moderating service delivery than large-scale surveys over longer periods of time. The LFHE project “Taking the Pulse” 59 demonstrates this methodology being deployed for staff surveys. The principle is to make the snapshot (i.e. data collection) very simple therefore allowing rapid collation, analysis and, in theory, action through feedback to relevant stakeholders. Although the principle is sound there is a risk of “survey burnout” when institutions also expect returns from NSS, ISB and so forth, and is increased should the pulse surveys be run as intended, i.e. throughout the student lifecycle. Towards service quality through Customer Service frameworks Principles and overview A recent workshop on customer service in HE concluded that: “it is easy to hide behind customer service awards and charters to avoid doing something really effective”. The same group of HE service providers also concluded that many HEIs struggle to find where to start and what to do to start making necessary improvements. Service frameworks are intended to help both with the overall quality of service provision and with the change agenda necessary to make the provision work in the first place and over the longer term. 
Unfortunately the world of customer service frameworks is complex and made worse by the overlap with “Quality Standards” which are often constrained by operational context with only a nod to the customer service element. The latter include procedurally driven frameworks like the IT Infrastructure Library (ITIL) which talks about service quality60: “IT resources are focused on service quality to satisfy customer requirements” However, most of the voluminous supporting materials are concerned solely with process and systems: the impact on customers is only implicit. Frameworks concerned largely with customer service and service quality tend to have an emphasis on the customer interaction. Whilst we have seen that this is important it is by no means the only dimension to service quality that needs attention. This is perhaps most extreme in the competency framework like the NVQ in Customer Service. The National Occupational Standards (NOS) 61 do make reference to process and systems but their job is largely to ensure that the individual concerned has the necessary competency, not the organisation or team working within it. Having said this, the higher levels (3 and 4) contain reference to strategic interventions along with establishing, adapting and monitoring the systems and processes, although this is dependent on the role of the individual undertaking the award and is not concerned with the organisational angle in the larger picture. In other words the NOS are somewhat dependent on both a cultural approach and necessary OD already being in place to be effective, although it is possible to achieve the standards as individuals, given the right opportunities. The other frameworks of interest to this study are to do (in essence) with service quality: establishing processes and systems, ensuring competency in staff, measuring the quality of interaction, engaging customers and selling the service appropriately. In other words, closing the gaps identified in Ng and Forbes’ model62 we saw earlier. Although this study does not include advanced frameworks like the ISO9000 series or EFQM business excellence model, a short discussion on two more basic frameworks is worthwhile. Customer First A standards-oriented framework aimed at “putting the customer first”63, this is used extensively in the hospitality industry, often as a starting point for progression to other frameworks. One of the attractants with Customer First (and with other frameworks) is the working towards an “award”, moderated externally to the organisation, thus endowing a real sense of achievement and success. Page 12 of 21 The framework consists of 32 statements divided into three sections under the headings of “customer relationships”, “market awareness” and “people”. Each statement contains detail grouped under “why”, “what” and “evidence examples”. Success is governed by the ability to consistently demonstrate the statement as required by the external assessor(s). There is room for subjective evaluation but, given the contexts, it would be reasonable to expect fair assessment. In UKHE many corporate services possess the standard although, anecdotally, this does not necessarily follow that these services are excellent, simply that they have at some point demonstrated processes, understanding of the issues and some measure of reasonable customer service. The simplicity of the standard, however, does lend itself to a starting point for areas where there is very little to work with otherwise. 
The key points, with respect to the discussions preceding this section, are that this standard identifies: 1. the importance of the involvement of the customer in the service, implicit to which is recognition of the co-created elements 2. the importance of the interaction between service provider and customer, including employee behaviours 3. the significance of understanding customer expectations and managing these through communication and related interactions. Customer Service Excellence (previously Charter Mark) The Charter Mark standard has been around for some years and has recently been revamped into the Customer Service Excellence (CSE) model. In the words of the relevant Government Office 64: “The Government wants public services for all that are efficient, effective, excellent, equitable and empowering – with the citizen always and everywhere at the heart of public service provision. With this in mind Customer Service Excellence was developed to offer public services a practical tool for driving customer-focused change within their organisation. The foundation of this tool is the Customer Service Excellence standard which tests in great depth those areas that research has indicated are a priority for customers, with particular focus on delivery, timeliness, information, professionalism and staff attitude. There is also emphasis placed on developing customer insight, understanding the user’s experience and robust measurement of service satisfaction” CSE is a more sophisticated model of service quality than Customer First, largely because it is aimed at a certain type of service delivery by a certain sector. CSE also contains statements, termed “elements” grouped into “standards” in turn grouped into one of five “criteria”. In all there are 57 elements allowing for a more detailed description of what “customer service excellence” means in this context. Key to the CSE is the concept (and practicalities) of customer segmentation. This is discussed above65 but in essence is about offering differential modes of services to different customer groups. In the complex service delivery that is HE this might be applicable on a microlevel but is harder to see how this can be scaled to bring about – or at least to herald the start of – cultural change for the student experience as a whole. Consequently, in HE we see CSE being awarded to individual service areas (eg the Libraries of Swansea and Nottingham Tent), whereas Customer First appears to cover wider service groups, albeit with quite different service remits (i.e. the “tangibles” of a student coffee shop and halls of residence are quite different to those of the Academic Registry and an IT Helpdesk for instance). A confident service or small group of similar services might be able to sustain achievement of the CSE but its more detailed probing of market awareness and tailoring of services might be more challenging to a wider grouping in an HE context. The status quo in HE Introduction Discussions with a range of individuals from UKHE Services has provided some insight into how customer service is perceived, and acted upon. The discussions were based on answers to a standard set of questions, also deployed in a survey format (using BOS). Conversations with individuals have been augmented in a number of instances with group discussions and sometimes from team inputs (eg HELOA, SDF and the Bath Workshop on Customer Service). Appendix 1 lists the questions asked and respondents (when identified) used in this section. 
Page 13 of 21 The voice of service providers General position Everyone encountered had some form of customer awareness with regards: an appreciation that students are customers at various points in their journey that students contribute to the overall University experience that learning is not wholly consumer-oriented that students are not the only customers (i.e. internal customers and those sponsoring students are also subject to the same experience of service provision) Customer Service skills training provision All but one of the respondents also had some from of customer service training available in their institution, or had undertaken some themselves or in their division. The value in this training was not universally seen with half of respondents saying it was “OK” or “some positive impact” but included with this were comments such as “teaching granny to suck eggs” and “didn’t address underlying problems with my department”. Most of the training was in the form of short sessions although some institutions also had longer “focused” skills sessions or rolling programmes. Five Institutions offered NVQs internally to staff. More than half of the respondents had skills sessions delivered by both internal staff developers and external training providers. Four of these employed external providers to instil a more strategic approach and to help with developing a “service culture”. Evaluation of customer satisfaction/service quality Over half (12) of respondents seek feedback from customers proactively although it was not identified if things like the NSS or ISB formed part of this. Of these, six of the institutions had feedback restricted to certain departments, two were unsure and four were adopting, or had adopted, an institutional-wide approach to evaluating service quality. This included Sheffield employing their “holistic” evaluation model. Use of frameworks Seven respondents knew of “Customer First” or “Charter Mark” in their institutions, exclusively in “hospitality” or “estates” related service areas. Four respondents were undertaking ITIL in their IT sections and three had “Customer Charters” of which they were aware. This data, of course, really only shows the awareness of the individual concerned and does not accurately reflect the bigger picture but it is worthwhile noting that, in terms of measuring and attempting to ensure service quality, HE services in the traditional consumeroriented areas of retail, residences and leisure are way ahead of their colleagues in the rest of Professional Services. This does reflect the changing culture of the student as customer which is driving the need to adopt more customer-centric practices in other areas of academic institutions and perhaps justifies the more “managerialist” approach to higher education, feared by many academics. Strategic approaches 13 institutions believe there is a need for customer service skills although it was not specified if this included a strategic approach but is implicit. This includes seven who feel these need to be developed for academic and administrative staff across the board. Bath, Middlesex and Imperial are all taking a strategic approach to service quality which includes qualitative and quantitative measures of the student experience and satisfaction across the institutions using different methodologies. These will be examined in more detail in the case studies phase of the project. 
It is worth noting that all the strategic initiatives (bar Sheffield's) have been driven by, and have arisen from, staff developers rather than service managers. That said, where present, such strategic approaches have all involved senior management at a very early stage and engage service managers and senior academics from the outset. This is particularly evident in the approaches at Imperial and Bath, and its importance to the success of these initiatives cannot be emphasised strongly enough. All respondents were quite clear that programmed approaches to service quality improvement are very long term and include strategy, process, management, evaluation and skills elements.

Views from training providers
This section contains highlights of the information kindly provided by two leading customer service training providers in HE, Paul Kent Associates and the Tim Russell Group. Both providers, and also Mentor with whom Exeter have worked, offer bespoke and standard workshop-style skills sessions which they have tailored to the UKHE market. All have slightly different approaches but agree that:
- customer service awareness in UKHE is on the increase
- until recently customer service skills development was restricted to student services
- greater emphasis is now given to building relationships, including with colleagues, thereby addressing the internal market
- academic staff are becoming more amenable to skills interventions and are becoming involved in wider initiatives delivered through external providers (although the "customer" word is not employed as such).

Paul Kent Associates have developed a substantial portfolio of customer service interventions aimed at all HE staff and have broad experience of working with many HEIs, tailoring provision to meet their needs; this tailoring is seen as crucial to effective delivery. Their approach reflects the constant demand to keep up with what students want, which in turn, in Paul Kent's opinion, is beginning to be driven by "fee paying students wanting more value for money". Again, the importance of all interactions is emphasised, although Paul Kent admit that: "In reality, so far Universities have concentrated their customer service initiatives almost exclusively on their support staff."
Tailoring and targeting skills sessions, and setting them within a wider programme of development, will drive change to some extent, although the impact of such sessions will be lessened where there are procedural gaps or poor management or supervision. This was evident in one institution where trainers delivered a whole programme of skills sessions which, upon evaluation, were found to have had "little or no impact". Indeed, such costly exercises can count heavily against the wider purpose.
Tim Russell's approach is less about skills interventions and has, in the HEIs mentioned during discussions, taken on a more strategic theme, with his team acting as consultants and advisers to promote cultural change through institution-wide development. Tim says: "To be really successful in achieving improvements in customer service, a holistic approach is essential." This approach relies less heavily on skills sessions, although the model provided does include a focus on communication, interpersonal relationships and performance management, clearly identifying employee behaviours as one of the keys to service quality.
The model is also inclusive of all staff and all customer segments:
"It should be emphasised that, although there has been some debate about the use of the word 'customer' within HEIs, the thrust of our work is about developing the effective relationships between all interested stakeholders within the university. These include academic and administrative staff, students at all levels of study, their parents, employers and local and national businesses, suppliers, research and funding agencies."
Whilst skills sessions will always be needed and are relatively simple to procure and deliver, higher-level development is far more time consuming and difficult, and requires considerable commitment from the institution's senior team.

Synthesis of the research and the status quo
Introduction
Whilst there are no obvious universal answers, there are some clear messages from the research literature, practical interventions and discussions with institutions and service providers. Ultimately each situation will probably require a different approach, but some starting points and milestones can be derived from the work above: these are presented below. Lastly, given the economics of the sector, it is inevitable that students will require HEIs to be more accountable for the service they experience, regardless of their own contribution to that experience.

What's important
Get students involved
The NSS, ISB etc are starting points and contributors to the mass indicators applied (with some variation and debatable value) as a benchmark for UKHE. Institutions need to take notice of them but not rely on them as drivers for change. The key message might be "think institutionally, act locally". Following the Cabinet Office model of segmentation, each service will need to ensure that it has the right channels of contact for its customer group and that customers really do take part in formulating service provision and have a voice. This is tied in with evaluation but should be more about aligning expectation with reality and closing any gaps.

Make sure staff know every interaction is important – with students and with colleagues
Concentrating on effective communication and internal relationships will ensure that all staff know their responsibilities and know how to escalate and refer enquiries. This is the first step to engendering a culture of service-quality ownership, and therefore to building confidence in service provision and trust between provider and customer.

Measure quality not just targets
Quality is about consistency, value and richness of experience. Although targets are important to ensure that volumes of work are managed appropriately, there is evidence that the quality of the employee-customer interaction can be more important than the service itself. This is linked to performance and behaviours (see below) and to interactions (discussed above). Service quality is a product of:
- getting the right policies and procedures (informed by the customer) in place and making sure they are applied appropriately, checked by observation of their application (management, mystery shoppers) and by consulting customers (focus groups, questionnaires etc)
- ensuring staff give consistent, reliable and empathic service through adaptation and development of appropriate attitude, behaviour and expertise. Some of this can be developed through skills sessions, but much of it is a product of the overarching culture and of the environment in which the service is delivered. This must be monitored and measured by observation (management, mystery shoppers) and by consulting customers (focus groups, questionnaires etc)
- ensuring the service environment is fit for purpose and encourages positive customer-customer and employee-customer interactions.

Measure impact
Sheffield's holistic evaluation of impacts on student lives might be considered an adjunct to the more functional, reactive assessment of service quality provided by focus groups and the like. This should not, however, rule out those service measures; rather it augments the quality of the data acquired, thereby enabling a much more strategic approach to service development. "Taking the student pulse" is a good example of spot-checking in a holistic fashion.

Manage behaviours and performance
Managers need to be heavily engaged with daily operations: talking to customers, watching service operations, being visible and acting on situations without resorting to disruptive and disempowering micromanagement. Teams need to be reviewed using customer service parameters or competencies and offered appropriate development.

Keep expectations married to reality – research, engage with customers
Managing this gap is crucial. There is evidence that not enough research is done into what students think they have registered for when they begin a course at a University. Differences between the reality of the student experience and the perceptions developed from Open Days, prospectuses, website information, league tables etc need to be identified and acted upon. Getting the "hygiene" services right will not necessarily win awards, but getting them wrong will lose customers:
- Ensure real listening happens
- Ensure there is action when needed
- Tell your customers what's going on
- Deal with PR and marketing spin

What can be done
1. Get the basics in place. If you don't have consistent customer service policies and procedures then ask your customers to help you develop them. Make sure your staff are all on board and fully appreciate the ramifications of every customer interaction and the role they have to play.
2. Use a framework. There is a lot to do across services and within teams. A framework like Customer First gives a good introduction to the basics where there are many inconsistencies and/or communication barriers. CSE offers a more sophisticated model for complex services, particularly where customer segmentation (i.e. developing different channels and service offerings) is important.
3. Employ someone to oversee it all. There are many key areas where things can go wrong and halt the necessary cultural change, which an individual driving force might be able to overcome. Shared responsibility is possible, although there is always the risk of "slow death by democracy" where no one person is able to act without a consensus.
4. Get senior staff engaged. Cultural/organisational change is impossible without strong leadership.
5. Monitor constantly. Effective monitoring and evaluation of services is essential and is one of the starting points for any initiative. This needs to be consistent, and the data gathered needs to be of value and to be communicated and acted upon appropriately. This function needs to be given to an individual or team. (A minimal sketch of one such measure, an expectation-versus-experience gap score, follows this list.)
6. Offer skills development. Targeted, tailored skills development will always be needed. Choose the provision carefully and make sure people know why they are receiving it and what impact it should have. Measure and evaluate this constantly.
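One way to make the expectation-versus-experience gap concrete is a simple per-dimension gap score, in the spirit of the service quality (gap) construct discussed earlier in this report. The sketch below is purely illustrative: the dimension names, ratings and alert threshold are hypothetical, and any real instrument would need to be designed around an institution's own services and customer groups.

```python
# Illustrative only: a minimal expectation-vs-experience gap calculation.
# Dimension names, scores and the alert threshold are hypothetical.

from statistics import mean

# Hypothetical survey responses: each dimension holds paired 1-5 ratings of
# what students expected and what they felt they experienced.
responses = {
    "timeliness":     {"expected": [5, 4, 5, 4], "experienced": [3, 3, 4, 3]},
    "staff attitude": {"expected": [4, 5, 4, 4], "experienced": [4, 4, 5, 4]},
    "information":    {"expected": [5, 5, 4, 5], "experienced": [3, 4, 3, 3]},
}

ALERT_GAP = 0.5  # hypothetical threshold for flagging a dimension for action

def gap_report(data):
    """Return (dimension, mean expectation, mean experience, gap) tuples,
    largest shortfall first. A positive gap means expectations exceed
    experience, i.e. the area most in need of attention."""
    rows = []
    for dimension, scores in data.items():
        expected = mean(scores["expected"])
        experienced = mean(scores["experienced"])
        rows.append((dimension, expected, experienced, expected - experienced))
    return sorted(rows, key=lambda row: row[3], reverse=True)

if __name__ == "__main__":
    for dimension, expected, experienced, gap in gap_report(responses):
        flag = "ACT" if gap >= ALERT_GAP else "ok"
        print(f"{dimension:15} expected {expected:.2f}  "
              f"experienced {experienced:.2f}  gap {gap:+.2f}  [{flag}]")
```

Even a crude measure of this kind shifts attention from whether a target volume was hit to where experience falls furthest short of expectation, which is the distinction being drawn above.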
Next steps for the Project
The Case studies
Three HEIs have been identified as suitable case studies, along with a cross-sector study with the Tim Russell Group. The HEIs and their approaches are:
1. Imperial College, London – The Customer Service Academy
2. University of Bath – The Service Quality project
3. Middlesex University – Developing an Institutional Service Culture
Further information will be provided for context from operations at Exeter and Bournemouth. Case study material will be collected during August and September by visits, interviews and focus groups.

Appendix 1
1a Questions asked – conversations
1. Do you have a customer services strategy, charter or statement?
   * If so, from where did this originate and who maintains/updates it?
   * If not, do you feel you should have one?
2. Please state which, if any, quality standards or frameworks you employ that focus on customer service and service delivery (such as Charter Mark, Customer First, EFQM or ITIL).
   * If you employ one or more of these, who is responsible for its implementation and monitoring?
3. Please state which, if any, customer feedback and evaluation systems you have in place (eg "mystery shoppers", focus groups, customer forums).
   * If you have such systems, who manages them?
4. Have you or your colleagues had any customer service skills training in the past five years? If so:
   * In general, how was the type of training determined (eg advice from Staff Development Unit or external provider)?
   * What need was the training intended to meet?
   * How was the training delivered (i.e. in-house or by an external provider)?
   * How was the success of the training measured?
   * Was the training considered to be effective?

1b Questions asked – survey
1. What is the name of your Institution?
2. When considering customer service skills development, what is your involvement?
3. Please select which role is the closest to your own.
4. In your Institution, do you believe there is a need for customer service skills?
   4.a. If you answered "Yes" please select the most appropriate reasons why from the list below.
   4.b. If you answered "No" please select the reason(s) why from the list below.
5. In your Institution, does customer service skills training happen (including the use of learning opportunities such as online resources)?
   5.a. If you answered "Yes" please select the type of training or learning intervention.
   5.b. If you answered "Yes" please select how the training or learning intervention was provided.
   5.c. If you answered "Yes" please choose how effective the training has been.
6. In your Institution please select which of the following underlying frameworks and/or procedures are in place.
   6.a. If you selected one or more of the options, please select where these operate.
7. Do you have any other comments?
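Counts of the kind quoted in "The voice of service providers" (how many institutions report training, frameworks and so on) can be produced by straightforward tallying of the survey export. The sketch below is a minimal illustration only: the file name and column names are hypothetical stand-ins for the questions in 1b and are not the actual BOS export format.

```python
# Illustrative only: tallying survey responses exported to CSV.
# The file name and column names are hypothetical stand-ins for the
# questions listed in Appendix 1b.

import csv
from collections import Counter

def tally(path="survey_export.csv"):
    need = Counter()        # Q4: is there a need for customer service skills?
    training = Counter()    # Q5: does training happen?
    frameworks = Counter()  # Q6: frameworks in place (semicolon-separated)
    institutions = set()

    with open(path, newline="") as handle:
        for row in csv.DictReader(handle):
            institutions.add(row["institution"])
            need[row["need_for_skills"].strip()] += 1
            training[row["training_happens"].strip()] += 1
            for framework in row["frameworks"].split(";"):
                if framework.strip():
                    frameworks[framework.strip()] += 1

    print(f"Responses from {len(institutions)} institutions")
    print("Need for customer service skills:", dict(need))
    print("Training happens:", dict(training))
    print("Frameworks in place:", dict(frameworks))

if __name__ == "__main__":
    tally()
```

Nothing more elaborate is needed at the scale of responses involved; the value of the survey lies in the questions asked rather than in the analysis.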
1c Contributors to this report (as of 1/7/09)
Organisation – Position
Aberystwyth University – Network Development Team Leader
City University – Infrastructure Services Manager
CSE – Key Account Manager
HEaTED – Executive Director
HELOA – Chair of Regional Group
Imperial College, London – Learning and Development Centre Manager
Loughborough University – Staff Development Adviser
Mentor – Training Account Manager
Middlesex University – Staff Development Manager
Paul Kent Associates – Chief Exec
Royal Holloway, University of London – Infrastructure Technical Architect
The Open University – IT Helpdesk Team Leader
Tim Russell Associates – Chief Exec
University of Bath – Head of Staff Development
University of Bath – Assistant Registrar
University of Bournemouth – Staff Development Adviser
University of Brighton – Principal Systems Officer
University of Cardiff – Counselling Services Administrator and Receptionist
University of Exeter – Counselling Services Administrator
University of Exeter – Library Customer Service Manager
University of Exeter – Planning and Resources Assistant Director
University of Exeter – Corporate Services Training and Development Manager
University of Exeter – Professor of Marketing Science
University of Greenwich – Technical Services Manager
University of Manchester – Shared Service Desk Manager
University of Manchester – Student Services Manager
University of Oxford – Chief Cashier
University of Plymouth – Student Services Manager
University of Portsmouth – Technical Support Officer
University of Reading – Helpdesk Team Leader
University of Reading – IT Support Manager
University of Sheffield – Director of Student Services
University of Sunderland – Acting Customer Support Manager
University of the West of England – Student Services Manager
University of the West of England – Staff Development Manager

References
1 Dearing, R. 1997. Higher Education in the Learning Society. The Report of the National Committee of Inquiry into Higher Education.
2 Whitchurch, C. 2006. Professional Managers in UK Higher Education: Preparing for complex futures. LFHE Research and Development Series.
3 Diacon, T. 2008. Customer Service, or Provider Responsibility. Inside Higher Ed, 13th March (www.insidehighered.com/views).
4 Newman, M. 2008. New order of service as 'customers' are ditched. Times Higher Education, 24th January.
5 You pays yer money... 2006. Times Higher Education, 29th September.
6 Ramsden, P. 2007. Editorial. Academy Exchange, Issue 7, Winter 2007 (www.heacademy.ac.uk/resources/publications/exchange).
7 Swain, H. 2007. Go the extra mile with a big smile. Times Higher Education, 16th February.
8 Ibid.
9 Humes, L.R. 2004. The role of customer service in Higher Education. Humes & Associates (www.humesassociates.com).
10 Ibid.
11 See http://www.mentorgroup.co.uk/profile/ourstory.html
12 See http://www.aua.ac.uk/events/courses/customerservice/
13 LFHE. 2006. A CPD Framework for Service Managers in HE. LFHE Small Development Projects SPDCPD06.
14 Palfreyman, D. 2007. Market, models and metrics in higher education. Perspectives: Policy and Practice in Higher Education 11(3): 78-87.
15 Rosenthal, P., Peccei, R. and Hill, S. 2001. Academic discourses of the customer: 'sovereign beings', 'management accomplices' or 'people like us'? In: Sturdy, A., Grugulis, I. and Willmott, H. (eds). 2001. Customer Service – Empowerment and Entrapment. Critical Perspectives on Work and Organisations. Palgrave.
16 See http://www.johntschohl.com, The Service Quality Institute (2009).
17 McMillan, B. 2003. The Customer Service Theory. See www.brianmcmillan.net.
18 Datamonitor. 2004. Improving customer service provision within the utilities industry – Yorkshire Water case study. See www.datamonitor.com, reference code BPCS60.
19 Lovelock, C.H. and Wirtz, J. 2003. Services marketing: people, technology, strategy. Prentice Hall: New Jersey.
20 Zeithaml, V.A., Bitner, M.J. and Gremler, D.D. 2006. Services marketing: Integrating customer focus across the firm. McGraw-Hill: Boston.
21 Sturdy, A., Grugulis, I. and Willmott, H. (eds). 2001. Customer Service – Empowerment and Entrapment. Critical Perspectives on Work and Organisations. Palgrave.
22 Ng, I.C.L. and Forbes, J. 2009. Education as Service: The understanding of University experience through service logic. Journal of Marketing for Higher Education, forthcoming.
23 Grönroos, C. 1988. Service quality: the six criteria of good perceived service quality. Review of Business 9(3): 10-13.
24 Zeithaml, V.A., Bitner, M.J. and Gremler, D.D. 2006. Services marketing: Integrating customer focus across the firm. McGraw-Hill: Boston.
25 Lovelock, C.H. and Wirtz, J. 2003. Services marketing: people, technology, strategy. Prentice Hall: New Jersey (Chapter 10).
26 Bitner, M.J., Faranda, W.T., Hubbert, A.R. and Zeithaml, V.A. 1997. Customer contributions and roles in service delivery. International Journal of Service Industry Management 8(3): 193-205.
27 Sierra, J. and McQuitty, S. 2005. Service providers and customers: social exchange theory and service loyalty. Journal of Services Marketing 19(6): 392-400.
28 You pays yer money... 2006. Times Higher Education, 29th September.
29 Institutions hear consumers when students speak. 2009. Times Higher Education, 5th March.
30 Bitner, M.J. 1992. Servicescapes: the impact of physical surroundings on customers and employees. Journal of Marketing 56: 57-71.
31 Lovelock, C.H. and Wirtz, J. 2004. Services marketing: people, technology, strategy. Prentice Hall: New Jersey.
32 Johns, N. 1999. What is this thing called service? European Journal of Marketing 33(9/10): 958-973.
33 Ng, I.C.L. and Forbes, J. 2009. Education as Service: The understanding of University experience through service logic. Journal of Marketing for Higher Education, forthcoming.
34 Maltby, A. and Mackie, S. 2009. Virtual learning environments – help or hindrance for the 'disengaged' student? ALT-J, Research in Learning Technology 17(1): 49-62.
35 Ozment, J. and Morash, E. 1994. The augmented service offering for perceived and actual service quality. Journal of the Academy of Marketing Science 22(4): 352-363.
36 Korczynski, M. 2001. The contradictions of service work: call centre as customer-oriented bureaucracy. In: Sturdy, A., Grugulis, I. and Willmott, H. (eds). 2001. Customer Service – Empowerment and Entrapment. Critical Perspectives on Work and Organisations. Palgrave, pp. 79-101.
37 Rosenthal, P., Peccei, R. and Hill, S. 2001. Academic Discourses of the Customer: 'Sovereign Beings', 'Management Accomplices' or 'People Like Us'? In: Sturdy, A., Grugulis, I. and Willmott, H. (eds). 2001. Customer Service – Empowerment and Entrapment. Critical Perspectives on Work and Organisations. Palgrave, pp. 18-37.
38 McMillan, B.M. 2003. The Customer Service Theory. See www.brianmcmillan.net.
39 Sierra, J. and McQuitty, S. 2005. Service providers and customers: social exchange theory and service loyalty. Journal of Services Marketing 19(6): 392-400.
40 Schneider, B. and White, S.S. 2004. Service Quality Research Perspectives. London: Sage.
41 Segmentation guidelines. Delivering a customer-focused, efficient public service. 2006. Cabinet Office (e-Government Unit).
42 Customer insight in public services – A Primer. 2006 (October). Cabinet Office, Delivery and Transformation Group.
43 Datamonitor. 2004. Improving customer service provision within the utilities industry – Yorkshire Water case study. See www.datamonitor.com, reference code BPCS60.
44 Ibid.
45 Brady, M.K. and Cronin, J.J. 2001. Some new thoughts on conceptualizing perceived service quality: A hierarchical approach. Journal of Marketing 65 (July 2001): 34-49.
46 Sierra, J. and McQuitty, S. 2005. Service providers and customers: social exchange theory and service loyalty. Journal of Services Marketing 19(6): 392-400.
47 Ng, I.C.L. and Forbes, J. 2009. Education as Service: The understanding of University experience through service logic. Journal of Marketing for Higher Education, forthcoming.
48 Ibid.
49 Zeithaml, V.A., Bitner, M.J. and Gremler, D.D. 2006. Services marketing: Integrating customer focus across the firm. McGraw-Hill: Boston.
50 See Sturdy, A., Grugulis, I. and Willmott, H. (eds). 2001. Customer Service – Empowerment and Entrapment. Critical Perspectives on Work and Organisations. Palgrave.
51 Whitchurch, C. 2006. Professional Managers in UK Higher Education: Preparing for complex futures. London: LFHE.
52 Customer insight in public services – A Primer. 2006 (October). Cabinet Office, Delivery and Transformation Group.
53 Brady, M.K. and Cronin, J.J. 2001. Some new thoughts on conceptualizing perceived service quality: A hierarchical approach. Journal of Marketing 65 (July 2001): 34-49.
54 See http://www.thestudentsurvey.com/
55 See http://www.i-graduate.org/services/student_barometer.html
56 Ozment, J. and Morash, E. 1994. The augmented service offering for perceived and actual service quality. Journal of the Academy of Marketing Science 22(4): 352-363.
57 Brady, M.K. and Cronin, J.J. 2001. Some new thoughts on conceptualizing perceived service quality: A hierarchical approach. Journal of Marketing 65 (July 2001): 34-49.
58 Keeling, R., Wall, A., Underhile, R. and Dungy, G. 2008. Assessment Reconsidered: Institutional Effectiveness for Student Success. NASPA: http://assessmentreconsidered.org/
59 Brown, M.E. 2006. Taking the pulse. LFHE Small Development Project. See http://www.lfhe.ac.uk/research/smallprojects/sdppulse06.html/
60 See http://www.itsmf.co.uk/BestPractice/Why_BP.aspx
61 Institute of Customer Service. 2006. Overviews of Customer Service National Occupational Standards and NVQ/SVQ Units at Levels 2, 3 and 4. http://www.instituteofcustomerservice.com/Gallery/docs/1/NOS%20Overviews.doc
62 Ng, I.C.L. and Forbes, J. 2009. Education as Service: The understanding of University experience through service logic. Journal of Marketing for Higher Education, forthcoming.
63 See http://www.customerfirst.org/PotentialCustomers.aspx
64 See http://www.cse.cabinetoffice.gov.uk/aboutTheStandardCSE.do
65 Segmentation guidelines. Delivering a customer-focused, efficient public service. 2006. Cabinet Office (e-Government Unit).