Measuring Quality, Cost, and Value of IT Services

Office of the Vice President for Information Technology
and
University Information Technology Services
Indiana University
Paper presented at the meeting of Indianapolis ASQ Section 0903
Tuesday, 14 January 2003
License terms
Please cite as: Peebles, C.S., C.A. Stewart, B.D. Voss and S.B. Workman.
Measuring Quality, Cost, and Value of IT Services. 2003. Presentation. Presented
at: Meeting of Indianapolis ASQ Section 0903 (Indianapolis, IN, 14 Jan 2003).
Available from: http://hdl.handle.net/2022/15213
Except where otherwise noted, by inclusion of a source url or some other note, the
contents of this presentation are © by the Trustees of Indiana University. This
content is released under the Creative Commons Attribution 3.0 Unported license
(http://creativecommons.org/licenses/by/3.0/). This license includes the following
terms: You are free to share – to copy, distribute and transmit the work and to
remix – to adapt the work under the following conditions: attribution – you must
attribute the work in the manner specified by the author or licensor (but not in any
way that suggests that they endorse you or your use of the work). For any reuse or
distribution, you must make clear to others the license terms of this work.
IU in a nutshell
• Founded in 1820
• $2B annual budget
• 8 campuses
• >90,000 students
• 3,900 faculty
• 878 degree programs; >1,000 majors; >100 programs ranked within top 20 of their type nationally
• University highly regarded as research and teaching institution
IT@IU in a nutshell
• Academic programs in IT through computer science,
library and information sciences, engineering and
technology, and most notably through new School of
Informatics
• CIO: Vice President Michael A. McRobbie
• ~$100M annual budget
• Technology services offered university-wide
• UITS comprises ~600 FTE staff, organized into
crosscutting units (e.g. finance and HR) and four
technology divisions (Teaching & Learning Information
Technology, Telecommunications, University Information
Systems, Research and Academic Computing)
Transformations
A Bit of Anthropology of Organizations, and How IT Is Organized
Culture, Strategy, Organization & Structure
• Mary Douglas. How Institutions Think. Syracuse
University Press, 1986. Dimensions of Grid and Group
[Figure: Douglas's grid/group diagram. Quadrants along the low/high grid and low/high group axes: anarchic isolates (marginal to society); individualistic (ego-focused network); hierarchies (nested/bounded groups); sect/enclave (bounded group).]
Culture, Strategy, Organization & Structure
Continued
• James Cortada. Best Practices in Information Technology.
Prentice Hall, NY, 1999
[Figure: grid/group diagram of IT cultures, after Cortada: systematic production; continuous improvement; hackers; systematic customization; craft culture, arranged along the low/high grid and low/high group axes.]
Value
• IT and Value Creation
– It’s all about time: powers of automation and
augmentation
• IT and Value Destruction
– It’s all about time: wasted time due to poor operating
systems, poorly crafted applications, and mysterious,
opaque user interfaces
• IT and Value Protection
– It’s all about time: time spent in support and education
Measures of Performance and Success
• University IT organizations do not have measures such as EVA or "profit" with which to gauge success
• Must draw exemplars from business and benchmarks from wherever they are available
• Organizational performance: IBM "Adaptive Organization" and "Customer Relationship Management"
• Measurement: "The Balanced Scorecard" and "Counting What Counts"
Performance Measures for All Organizations,
Including University IT Organizations
• Robert Kaplan and David Norton. The Balanced
Scorecard. HBS Press, Boston, MA, 1996.
• Four dimensions of retrospective and prospective measures
– Financial perspective: deployment (and growth) of revenue, ABC
against internal (historical) and external benchmarks
– Customer perspective: customer satisfaction measures, number of
partnerships with faculty in teaching and research, support of
university business processes, support of library processes
– Internal perspective: process measures, classic IT measures of
availability, cost-of-poor-quality, speed and depth of development
cycles
– Learning perspective: employee satisfaction, employee
development (MCSE, CCNE, etc.), personal alignment of
employee goals with position
Measuring Quality and Cost
Customer Surveys
Activity Based Costing
Importance of Quality
• Claiming that quality is important is easy. Doing
something useful is hard.
• Measurement of quality requires
– Leadership initiative
– Understanding your services
– Commitment to two-way, fact-based communication
with customers
• Motivations at IU
– Initiative
– Feeling of responsibility
– Responsibility center management
RCM
• RCM Implemented at IU in 1989 (Whalen 1991).
• IT organization(s) defined as a Responsibility Center; paid
for through a non-discretionary tax on other RCs
• IT customers are also captive users
• Why not decentralize?
User Satisfaction Survey
• Created out of leadership desire for quality and
accountability, and desire for fact-based response to
assaults on IT center’s budget
• Key features of Survey
– Administered by independent survey organization
– Stratified random sampling; survey center finalizes and
administers survey, performs randomization, tabulates
responses, ensures anonymity
– This assures credibility and quality of data
– Longitudinal comparisons assured by consistency of
questions from year to year
User Satisfaction Survey details
• Undergraduate students, graduate students, staff, and faculty sampled. N has varied; currently 1,000 undergraduates and 500 from each remaining category per campus
• Likert-scale and Y/N questions, demographic information, and an opportunity for comments at the end
• Data reported as average score (± 95% CI), satisfaction percentage (percent scoring 3-5), and percentage who use each service (a computation sketch follows below)
• All results, including every text comment ever written [identifying references deleted], available on the Web at http://www.indiana.edu/~uitssur
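Below is a minimal sketch of how the reported statistics could be derived from raw 1-5 Likert responses. The function name and sample scores are hypothetical, and the normal-approximation confidence interval is an assumption; the survey center's exact tabulation procedure is not described in this presentation.

import math

def summarize_likert(responses):
    # Mean score with a normal-approximation 95% CI, plus the
    # satisfaction percentage (share of respondents scoring 3-5).
    n = len(responses)
    mean = sum(responses) / n
    var = sum((x - mean) ** 2 for x in responses) / (n - 1)
    ci95 = 1.96 * math.sqrt(var / n)
    pct_satisfied = 100 * sum(1 for x in responses if x >= 3) / n
    return {"n": n, "mean": round(mean, 2),
            "ci95": round(ci95, 2), "pct_satisfied": round(pct_satisfied, 1)}

# Hypothetical responses for one service
scores = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]
print(summarize_likert(scores))
# {'n': 10, 'mean': 3.9, 'ci95': 0.62, 'pct_satisfied': 90.0}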
Some key results: Overall Satisfaction
[Figure: overall satisfaction percentage, 1991-2000; y-axis ranges from 87% to 99%]
Overall quality maintained by attention to
individual services
[Figure: Satisfaction with Stat/Math Center]
User demographics: ownership
[Figure: percentage of respondents owning a computer, 1991-2000; y-axis ranges from 40% to 100%]
User demographics: time spent using
computer per week
[Figure: hours per week spent using a computer, 1991-2000; y-axis ranges from 10 to 20 hours]
Survey Credibility and Utility, 1
• Response rates:
– IUB survey response rates in 1991 ranged from 23%
(undergraduates) to 50% (faculty).
– IUB response rates in 2002 ranged from 41%
(undergraduates) to 64% (staff)
– IUPUI survey initiated in 1997 with a 26% response
rate for undergraduates
– IUPUI response rates in 2000 ranged from 38% (undergraduates) to 55% (staff)
• High response rates are attributed to a small incentive ($5-$10 value) and to recognition of the survey's value
Survey Credibility and Utility, 2
• External
– Results disseminated rapidly and widely. Survey
conducted Feb-end of April; results on Web in July or
August
– Results, and actions taken as a result, publicized widely
throughout the year
• Internal
– Key component of annual internal quality
assessment/plan
– Often used in internal proposals as justification
Survey Credibility and Utility: VAX phaseout
as an example
• IU had the largest academic VAX center in the US and had depended upon VAXes for ~15 years
• In 1994, the IT organization announced a plan to eliminate use of VAXes within 3 years
• Cost and quality measures informed this decision
• Reaction: horror, humor, horror
VAX phaseout reaction
• Horror: initial reaction of user community. Project leader
set the following goal: Within two years, at least 95% of
those surveyed would respond “yes” to the question “Is
the improvement in your computing environment sufficient
to more than outweigh the cost to you of conversion”
• Humor: initial reaction of some colleagues
• Horror: reaction of project team when they realized the
project manager was serious
VAX phaseout: the results
• The goal was met.
• The setting of user satisfaction (rather than computer
center convenience) as a key goal resulted in openness to
the project that was essential to its success
• Full details on Web: Stewart, C.A., et al. 1998. Changing
(almost) everything and keeping (almost) everyone happy.
CAUSE/Effect 21:39-46.
http://www.educause.edu/ir/library/html/cem9837.html
• Key lesson: if you are doing the right thing and
communicating well, user opinion will not lag far behind
expert opinion
Activity Based Costing
• “There are no results inside an organization. There are
only costs.”
– Peter F Drucker, Managing the Nonprofit
Organization: Principles and Practices. Harper Collins,
NY, 1990. p. 120
• Activity Based Costing-Activity Based Management
– John Shank and Vijay Govindarajan. Strategic Cost
Management. Free Press, NY, 1993
– Robert Kaplan and Robin Cooper. Cost and Effect.
HBS Press, Boston, MA, 1998
Activity Based Costs and Management
[Figure: activity-based costing flow. Cost categories (wages and benefits; training; hardware and software, expensed or depreciated; maintenance and other contracts; circuit charges for data, video, and voice; expendable supplies; organization-sustaining activities; other costs) are assigned through the organization, its people, and its processes to products and services, yielding unit costs, quality measures, and knowledge.]
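As a rough illustration of the cost-assignment idea in the figure (not a description of UITS's actual model), the sketch below rolls a few cost categories up to a per-service cost using activity drivers. All dollar amounts, driver counts, and service names are hypothetical.

# Minimal activity-based costing sketch: each service receives a share of
# every cost pool in proportion to its use of that pool's activity driver.
cost_pools = {                       # annual cost per category (hypothetical $)
    "wages_and_benefits": 400_000,
    "hardware_and_software": 150_000,
    "maintenance_contracts": 50_000,
}

driver_use = {                       # driver units consumed per service (hypothetical)
    "e-mail":      {"wages_and_benefits": 6_000,   # staff hours
                    "hardware_and_software": 40,   # servers
                    "maintenance_contracts": 10},  # contracts
    "web_hosting": {"wages_and_benefits": 4_000,
                    "hardware_and_software": 60,
                    "maintenance_contracts": 15},
}

def service_cost(service):
    # Sum this service's proportional share of every cost pool.
    total = 0.0
    for pool, pool_cost in cost_pools.items():
        pool_use = sum(use[pool] for use in driver_use.values())
        total += pool_cost * driver_use[service][pool] / pool_use
    return total

for svc in driver_use:
    print(f"{svc}: ${service_cost(svc):,.0f} per year")

Dividing a service's total by a volume driver (accounts, messages, calls) gives the unit costs shown in the cost-and-quality tables later in the presentation.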
The Ferengi “First Rule of Acquisition”: Once you
have their money, never, ever give it back
UITS Services — Bloomington Campus
1998-1999 Fiscal Year
Report on Cost and Quality of Services
ABC data
• Full reporting of data on Web at
http://www.indiana.edu/~uits/business/scindex.html
A Case Study of Activity
Based Management
Reengineering E-Mail at Indiana
University Bloomington
Cost and Quality: an E-mail example
[Figure: satisfaction percentage with e-mail systems (Pine, Exchange, VAX Mail, and overall), 1992-2000; y-axis ranges from 65% to 100%]
Volume and Cost for e-mail services at Indiana University Bloomington, 1996-1998 Academic Years

                          1996-97        1997-98        1998-99
Number of Mail Accounts   66,000         48,000         46,000
Number of Messages        307,000,000    270,000,000    179,000,000
Mb Received               1,258,000      NA             391,825,000
Cost/Message              $0.001         $0.002         $0.002
Cost/Account/Year         $8.96          $9.80          $6.11
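A small worked check of the unit-cost rows: the volumes below come from the 1998-99 column of the table, while the total annual cost is an assumed figure chosen only to show how Cost/Message and Cost/Account/Year are derived.

accounts = 46_000
messages = 179_000_000
total_cost = 281_000        # assumed annual cost of the e-mail service ($)

print(f"Cost/Message:      ${total_cost / messages:.3f}")   # ~$0.002
print(f"Cost/Account/Year: ${total_cost / accounts:.2f}")   # ~$6.11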
User perceived quality measures for five mail
systems used at Indiana University Bloomington,
1996-1998 Academic Years
                   1996-97               1997-98               1998-99
                   % Satisfied   Score   % Satisfied   Score   % Satisfied   Score
Pine               95.1%         4.1     90.5%         3.9     85.8%         3.7
Unix Mail (Elm)    82.1%         3.5     82.2%         3.5     62.2%         3.0
Eudora             96.5%         4.2     87.8%         4.1     92.3%         4.1
GroupWise          89.0%         3.8     78.4%         3.8     76.9%         3.5
Exchange           96.3%         4.2     85.0%         3.8     92.5%         4.2

("Score" is the satisfaction score on the 1-5 scale.)
Comparative measures for e-mail support requests
in Indianapolis and Bloomington during the
academic year 1998-1999.
1998-99                   Total     Calls to         Calls         % of e-mail      % of total
                          users     Support Center   per user      calls to center  calls to center
Pine IUB                  44,000    1,537            1 per 28.6    23.8%            2.4%
Outlook/Exchange IUB      4,500     1,773            1 per 2.5     27.4%            2.7%
Pine IUPUI                45,000    1,943            1 per 22.7    22.7%            4.2%
Outlook/Exchange IUPUI    6,000     2,425            1 per 2.5     28.2%            5.3%
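The ratio columns can be recomputed from the raw counts. In the sketch below the user and call counts are from the table; the per-campus support-center call totals are assumed figures used only to illustrate the percentage-of-total-calls calculation.

# Hypothetical recomputation of the ratio columns.
assumed_total_calls = {"IUB": 64_000, "IUPUI": 46_000}   # assumed per-campus call totals

rows = [
    # (e-mail system, campus, users, e-mail calls to Support Center)
    ("Pine",             "IUB",   44_000, 1_537),
    ("Outlook/Exchange", "IUB",    4_500, 1_773),
    ("Pine",             "IUPUI", 45_000, 1_943),
    ("Outlook/Exchange", "IUPUI",  6_000, 2_425),
]

for system, campus, users, calls in rows:
    users_per_call = users / calls                        # reported as "1 per N"
    pct_total = 100 * calls / assumed_total_calls[campus]
    print(f"{system:17s} {campus:6s} 1 per {users_per_call:4.1f}   "
          f"{pct_total:.1f}% of all support calls")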
Balanced Scorecard: example from Research
& Academic Computing
• Service measures (for all, current levels and from last year)
– Quality – from user survey
– Utilization – from user survey
– Cost
• Staff
– Turnover
– % receiving some sort of external certification
• Results
– Number and $s of grants applied for or supported
– Number and $s received (not less than $1M in last 3 years)
– Publications
• Enabled by our services
• By IT organization staff
[Slides: Balanced Scorecard 2002 (four pages of scorecard measures)]
Quality Planning
• Internal quality plan written by each management unit,
focusing on assessment and plans for improvement
• User survey and ABC data key indicators
• Any service with a satisfaction rating of less than 90% is generally taken to be cause for action
• IT center budget office, which has customers only within
the IT center, does its own annual survey!
IT Support@IU
Support in Breadth
IT Support at Indiana University
• IT has invaded our lives and culture
• Demands for IT support increased exponentially
• Budgets remained steady (at best)
• How do we deliver a quality product?
Magnitude of General Purpose
Computing at IU
• Computer systems
  – 15,000 on life-cycle funding
  – 15,000 estimated other components or devices
  – 108,000 privately owned systems (94%)
• Telephones
  – 30,000 desk units
  – 1,144 cell phones
  – 1,045 pagers
• PDAs and new technologies
• Approaching a 200,000-device problem
Increasing Support Needs at IU
Computer ownership increased 48% in 10 years . . .
Halls connectivity increased from 53% to 100% in 5 years . . .
Usage increased 3.1 hours per week in two years . . .
[Figures: Percent of faculty, staff, and students with a computer at home, 1991-2002 (y-axis 0-100%); Hours of computing on a weekly basis, 2000 and 2002 (y-axis 17-22.5 hours)]
Some background
• Support free, unlimited to 115,000 IU Faculty, Staff,
Students.
• End-user, general-purpose support provided by central
computing organization (UITS)
– Support Center
– IU Knowledge Base
– Student Technology Center Consultants
– Residential IT Support
– IT Training and Education
– Call Center (Telephony)
End-User General Support
• 2.5 million personal contacts each year
  – Equates to 1 every 12 seconds, 24 hours a day (see the arithmetic sketch below)
• 8.5 million online front-line contacts each year
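A quick check of the cadence arithmetic behind the first bullet (the contact count is from the slide; the rest is plain arithmetic):

contacts_per_year = 2_500_000
seconds_per_year = 365 * 24 * 60 * 60            # 31,536,000 seconds
print(seconds_per_year / contacts_per_year)      # ~12.6 seconds between contacts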
Support Center
• Provided support for 217,760 contacts last year
• Variety of questions/platforms
• Various support delivery methods
– Call Center
– Walk-in Center
– Online
Call Center
• 24x7x365
• Multi-campus
• Calls distributed to ACD routed queues
– Computers/Applications w/ Windows OS
– Computers/Applications w/ Macintosh OS
– Computers/Applications w/ Unix OS
– Mission Critical Systems
– General IT questions
Walk-In Center
• Campus-centric locations
• Support requiring personal authentication
• Computer configuration
• General-purpose support of convenience
Online Support
• Support via e-mail
• Knowledge Base – 7.2M hits per year
• Notifications – 0.5M hits per year
• #1 Division Priority for 2003
Managing the quantity of contact
• Proactive versus reactive support
• Tools to reduce time of contact
• Increase quality of answers
Measuring the Quality of Contact
• Annual Department Survey
• Daily Support Center Survey
– Simple measures
– Non-intrusive to customer
– Indicator of quality of service
The 3 Key Indicators
1. Did the customer receive a solution,
2. in a timely manner,
3. and delivered with courtesy and respect?
What do we do with the info?
• Sent to 45 users from the day before
• Every response is read, and every "no" receives a response
• Used as an indicator of the need for training or resource allocation
• Positive feedback:

Date: Mon, 15 Jan 2001 11:15:49 -0500 (EST)
From: kdyn <kdyn@indiana.edu>
To: <scpc@indiana.edu>
Subject: Re: Dell Optiplex GX1..MS Natural Keyboard (#269.1445)

THANKS, Jim. Please forward my message on to your boss – this is the BEST service I've ever gotten on ANY computer-related problem anywhere. I'm glad my technology fee is paying your salary. THANKS.
Sample Results
                  Total Responses    % Satisfied Users
Call Center       1,067              92.7%
Walk-In Center    148                94.6%
Via e-mail        195                88.2%
Managing the Resources
• Utilize technology for what it does best
• Utilize humans for what we do best
Knowledge Base Background
• KB was born in 1988
• Internal tool to retain computing support information
• Became a self-serve tool in 1995
• For more info see "What is the history of the Knowledge Base" at http://kb.indiana.edu/data/acjq.html
What is the KB?
The KB is a system with three physical components:
1. It is a repository of problems and their resolutions (currently holding over 8,000 entries);
2. It is a search engine to sort through that repository (currently handling over 50,000 queries each week); and
3. It is a user interface that allows a user to engage the search engine and manage the data (currently used by hundreds of thousands of users worldwide).
KB Quality Control Process
• Extensive process control
• Review and address the “goose eggs”
• Measure the “no, I did not get an answer”
Connecting the People and the Technology
• Trouble Ticket System replaced with Work Flow System
• Enterprise-wide
• Eliminates the "Post-It-Note" syndrome
• Manages workload, communication, information, and escalation
• Eliminates duplication of work
• Manages KB process
Student Technology Centers
• 2,300 machines, 127 locations, 227 consultants
• 80 percent of the student body uses consultant services
• 175,000 personal contacts per year
• Training previously focused on technology
• Changed to service-oriented training, given technology resources
• Improved satisfaction from 75% to 93.9%
Residential IT Support
• In-room services – 11,000+ students
– 3000 in-room service calls/year
• Residential Technology Centers – 400 workstations
• Concentrating on proactive support
• Concentration on connection to network
– 7,400 computers successfully connected by start of classes 2002
– Increased successful in-room consulting visits by 37% in 1 year.
– Support Center first line of contact
• Satisfaction Metrics
– Increased from 65.4% in 2000 (before UITS) to 93.8% in 2002
IT Training and Education
• Better educated users support themselves
– Or ask harder questions….
• Free classroom training
• 29,434 students last year
• Free CBT training
• Satisfaction rating 98.1%
IT Support@IU
Support in Depth
Center for Statistical and Mathematical
Computing
• Formed in 1991; the prototype for the IT center's specialized support services.
• http://www.indiana.edu/~statmath/
• Formed as part of Interdisciplinary Consortium of
Statistical Applications, with Dept. of Mathematics and
Graduate School.
• Math and Grad School provide statistical consulting by a
faculty member and specialized classes
Good Support – A total team effort
• Not just the Support Organization
• Careful Implementation Plans
• Extensive Change Management
• Preparation of Support Resources
  – Consultant training
  – KB entry
  – Support Tools
• Careful communication to Organization and Customer base
IT Support@IU
Distributed Support
Distributed Support Services
• Classic set of support models (McClure, Smith, Lockard)
  – Centralized
  – Decentralized
  – Haphazard
  – Distributed
    • Planned/Proactive (IU Approach)
    • Reactive
    • Feral
Leveraged Support Model
• Roles (Voss, Workman, Alspaugh, et al)
– Support provided centrally
– Support provided by the users themselves
– Support provided by local IT staff
• Central IT organization responsible for ensuring all
elements
Quality Distributed Support
Evolution at Indiana University
• Distributed Support Assistants (DSA) Program
• Technical Information for Excellent Support (TIES)
Program
– Education/Certification (Ed/Cert)
• Partners in Computing Support (PICS) Program
DSA Program – Seeding Quality Local
Support
• Two-year “Try before you Buy” Effort
– Funding assistance from IT organization
– Staff mentoring and coordination
• Advantages
– Seeing the value of local support
– Consistency of local support with central IT
– Partnering – common purpose, non-compete
DSA Program – Seeding Quality Local
Support
• Unforeseen Consequences
– Development of local IT job market
– Close-knit Community
– Career paths – in and out of central IT
• Covered 26 campus entities
– Success led to ‘wild growing’ local support in other
units
TIES Program
• As support grew wild, it needed training and contact with
central IT – thus TIES was born
– 10-12 weekly sessions each semester
– Training and information sharing focus
– Integration with DSAs elsewhere
• As time passed, a need arose for detailed skills that were certifiable – thus Ed/Cert was developed
PICS Program
• More than just a user group – a set of services
– Tools acquired centrally, served to local staff
– Forums for discussion, sharing, and learning
– Special Focus on serving our ‘partners’
• Forum for getting feedback on services
• Forum for getting input from partners
Organizing for Supporting Local Support
Providers
• LSPs need a forward-looking focus
– Departmental Computing Advising and Support
– Departmental Support Lab
• LSPs need back-up support
– Local Support Provider Services
– LAN Lab
– 2nd and 3rd level expertise in key applications and
technologies
An example …
Date: Tue, 30 Jan 2001 10:17:46 -0500
From: LSP Services <lspserve@IUPUI.EDU>
To: ITSTAFF@LISTSERV.IUPUI.EDU
----------------------------------------
Windows 2000 and the Active Directory is almost upon us. As most of you have hopefully heard by now, the Active Directory is an upgrade to the universities current Windows NT 4 infrastructure. UITS is planning on a production deployment by late February. Please join us in an information session this coming Friday morning, February 2nd 9:00 am in BS 2007. We will be discussing the design and architecture of the new Active Directory as well as some new technologies that will be introduced in this upgrade. Representatives from LSP Services, Messaging and the Security office will be there. This will be followed up by a question and answer session. If you have any questions please don't hesitate to bring them with you and ask! Hope to see you there!
-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
LSP Services
University Information Technology Services
LSPSERVE@IUPUI.EDU
317.274.6577
Satisfaction
• User Survey results for UITS distributed support services

                                    2001     2000     1999     1998     1997
  Satisfaction with UITS
  Distributed Support Services      90.1%    93.3%    90.1%    90.8%    91.4%

• User Survey results for local support providers

                                    2001     2000     1999     1998     1997
  Satisfaction with LSPs
  in departments                    88.8%    89.1%    85.4%    85.7%    86.8%
Key Factors in Delivering Quality Local
Support Services
• Organizing Central IT to deliver support
• Emphasizing Local Support Providers
• Emphasis on building LSP skills
• Emphasis on building User skills
• Providing Specialized Support services
• Providing Tools that enable user self-support (Knowledge
Base)
Other Environmental Success Factors
• Institutional/Enterprise Software Licensing
– Standardization and consistency
– Latest versions – lessen support burden
• Departmental IT Equipment Modernization and life-cycle
funding
– Minimizing and eliminating problems and workload
caused by out-of-date equipment
– Standardization – volume purchases
Support is the Critical Element
• Adds value through standardization of hardware and
software
• Adds value through education of customers and IT
professionals
• Prevents some of the destruction of value through poorly designed software and poorly manufactured hardware
IT@IU
The Value Equation
Value for Teaching, Research, Learning and
Overall Satisfaction Measure
[Figure: Overall Satisfaction and Value Measures, IUPUI, 1998-2002 (5 = best, 1 = worst): value of IT in teaching, value of IT in research, value of IT in learning, and overall satisfaction with IT services; scores fall roughly between 3.2 and 4.2]
Value for Teaching, Research, Learning and
Overall Satisfaction Measure
[Figure: Overall Satisfaction and Value Measures, IUB, 1997-2002 (5 = best, 1 = worst): value of IT in teaching, value of IT in research, value of IT in learning, and overall satisfaction with IT services; scores fall roughly between 3.4 and 4.4]
Indiana University Bloomington