Center for Information Systems Research
Alfred P. Sloan School of Management
Massachusetts Institute of Technology
50 Memorial Drive
Cambridge, Massachusetts 02139
617 253-1000
DEFINING SUCCESS FOR DATA PROCESSING
A practical approach to strategic planning for the DP department

Robert M. Alloway
March 1980
CISR No. 52
Sloan WP No. 1112-80
Introduction
There is a plethora of complaints and admonitions concerning Data Processing; what it should do and will be. Unfortunately, most of these "discussions" focus on one DP function or service at a time, leaving each out of context. Consequently, explicit trade-offs between DP functions and relative priorities for resource allocation are not addressed and not established. Amidst this heated discussion, user and DP managers look to each other in vain for policies and priorities.
Consequently, the six companies studied are violating a fundamental rule of good management — allocate resources and management attention to ensure good performance on the most important activities. This study identifies which DP activities are most important, measures performance on each activity, and prioritizes activities for improvement. Moreover, the approach used in this research is adaptable by companies as a practical foundation for DP strategic planning.

In most other functional areas, finance or manufacturing for example, we have a good understanding of what constitutes success. Consequently we can measure performance in various ways, assess strengths and weaknesses, set revised objectives, and reallocate resources as necessary. I think equivocation on strategic planning for DP is primarily due to the lack of an integrated understanding of what constitutes success for DP.
I wish to thank the anonymous companies and managers who participated in this research, Christine Bullen for her management of the data gathering process, Jerry Nolte for his statistical analysis support, and the Center for Information Systems Research of M.I.T. for partial funding.
Defining success is the basic prerequisite to performance measurement, determining priorities for improvement, and strategic planning.
Unfortunately, the definition of success for DP is a moving target. DP is being driven by dramatic growth in demand for new application systems of more diverse types. Past successes have raised managers' aspirations to ever higher levels of information availability and analysis. Increasing corporate data resources, continuing improvements in technology, and growing DP literacy among user managers are all occurring simultaneously.
A continual process of defining and measuring DP success is necessary to meet this challenge. We must identify relevant, current and emerging, DP functions and prioritize them. And we must continually measure performance in order to modify policies and procedures, shift managerial emphasis, and allocate scarce resources. In short, industry needs a strategic planning process for DP which is strongly linked to users' needs and corporate planning.

Three major impediments are frequently encountered when using the currently available approaches:

1. the strategic planning process itself
2. lack of a corporate strategic plan
3. indeterminate implications of corporate plans for DP
Several strategic planning methodologies are available; however, all are predicated upon an integrated definition of success. Every strategic planning process is itself content-free. It is the responsibility of management to define success. As too many DP managers already know, without an integrated definition of success, the outcome of attacking an enormous complex planning problem with a content-free process is inevitably inadequate.

"User Managers' Systems Needs," Robert M. Alloway, CISR Working Paper, forthcoming.
Unfortunately, not all Chief Information Officers have access to explicit corporate strategic plans. In some companies, such plans do not exist. In others, this is simply because the CIO is not a regular member of the informal senior management discussions by which implicit strategies evolve. In many cases the CIO is not privy, by omission or commission, to existing explicit plans because of their critical, proprietary, competitive nature.
Even when the CIO has access to explicit corporate strategic plans it is seldom easy to divine a collateral supporting DP strategy. Translation into facilitative, enabling DP goals requires intimate knowledge of current user activities possessed only by user management. Qualitative opportunity recognition and user benefits dominate the interpolation. Heavy participation by senior user managers is necessary to resolve priority and resource allocation trade-offs. Total DP budgets for combined user and DP efforts must be balanced against goals, personnel availability, technological trends, and organizational feasibility.
Moreover, the entire DP planning process should be quick and efficient enough to be performed and revised annually without overburdening scarce management time. Few companies have been able to develop linked corporate and DP strategic planning procedures both effective and efficient enough to overcome these impediments.

The head of DP will be referred to as the Chief Information Officer (CIO) because so many different titles are used in practice.
Undaunted, DP and user management has achieved tremendous progress. The typical manager regularly uses 3 to 4 systems for access and manipulation of information, most of which was previously quite cumbersome, uneconomic, or simply unavailable. DP planning does result in projected budgets, manpower requirements, and planned hardware configurations. Decisions on specific projects are made in light of corporate ROI, user departmental priorities, and 3-5 year systems plans.

All of which is desirable but is not a strategic plan for DP. The potential for improved effectiveness of DP efforts through a practical approach to DP strategic planning is worth pursuing.
This article discusses the key results of a survey of user and DP managers and provides the specifics for DP planning — an integrated definition of success, the assessment of strengths and weaknesses, identification of problems, and, based on the results, recommended priorities for action.

The results are specific to the six firms studied and are probably applicable to your firm but, even if you feel they are not, this paper provides a concrete point of departure and a diagnostic methodology which works. The advantages of this approach to DP planning are its strong linkage and translation of user needs, its procedural independence from corporate strategic planning, its deceptive simplicity, and the minimal time/effort required. The key disadvantage is its potential for misuse — yielding misleading results for those who lack experience and expertise in quantifying subjective managerial opinions using questionnaire measurement techniques.
"User Managers' Systems Needs", Robert M. Alloway, CISR Working Paper, forthcoming.
Research Method
Identification of criteria on which to measure success began with extensive discussions and "blue-skying". The resulting list was impractically long and dominated by technical/operational criteria. Anthony's partitioning of strategic, managerial and operational activities was used to balance and shorten the list to 26 criteria. I make no claim that this list is exhaustive or optimal, but it is a good start. See Exhibit 1 for a list of the 26 criteria.
Next, I needed importance and performance ratings for each criterion from many managers. As part of a larger research project, 114 DP and user managers in 6 diverse manufacturing firms completed an extensive questionnaire. Respondents were selected to be a stratified sample of senior, middle and junior managers from the DP, Finance and Manufacturing departments. See Exhibits 2 and 3 for firm and respondent profiles.
Every manager rated each of the 26 criteria on two separate questions: one for importance and one for performance. Relative importance was used to weight performance when computing success for each of the 26 criteria and when calculating overall success. Because every manager is known by name, organizational role, and department, comparisons were easily made between managers and departments, and between DP and users, on individual criteria, groups of criteria, and overall success.
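The paper states the weighting scheme only at this level of detail, so the following Python sketch is just an assumed illustration of an importance-weighted success index of this kind; the function name, the sample numbers, and the exact formula are hypothetical.

    # Assumed formula: success as an importance-weighted average of
    # performance ratings, both on the 1-7 questionnaire scales.
    def weighted_success(ratings):
        """ratings: list of (importance, performance) pairs for one manager."""
        total_weight = sum(imp for imp, _ in ratings)
        if total_weight == 0:
            return None
        return sum(imp * perf for imp, perf in ratings) / total_weight

    # Example: one manager's ratings for three criteria.
    manager_ratings = [(6, 4), (3, 5), (7, 3)]
    print(round(weighted_success(manager_ratings), 2))

Averaging such indices over managers, departments, or firms then gives the relative comparisons described above.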
Planning and Control Systems, Robert N. Anthony, Harvard Business School, 1965.

Robert M. Alloway, et al., "User Needs Survey: Preliminary Results," Sloan School of Management Working Paper 1096-79, December 1979.
Is there a DP success problem?

Overall DP success is mediocre. Measured on the seven point scale in Exhibit 4, average success is only 4.1 for the 6 firms. To expect DP success ratings of 7 (excellent) is naive, but 5 (good) is realistic; instead, DP success is teetering, and most firms have a major DP shake-up before 3 (inadequate) is reached.

Yes, there is a real problem in DP success. These results are not a surprise but certainly a cause for concern.
DP managers, naturally, rated their departments higher across the board, 4.4, than user managers, 4.0, but nonetheless they basically agree. Things are definitely not "excellent", 7.0, and not "good", 5.0. There is considerable incentive to address the definition of DP success and its improvement.
More importantly, firm differences in overall success are large, ranging from 3.5 to 4.8. Firm differences in users' ratings of DP success are even larger, ranging from 3.2 to 4.8. Talking to user managers in these two firms is the difference between night and day. Firm 5 at 4.8 still has ample room for improvement, but improvements are possible, supported, and underway.
On the other hand, Firm 3 is in real trouble, not only because of low user success ratings (3.2), but also because of the large difference between DP (4.0) and user success ratings. A large difference between DP and user ratings makes problem recognition and cooperation for improvement significantly more difficult. Firm 3, for example, underwent a fundamental transfer of responsibility away from DP almost concurrent with our data gathering. The management of firm 3 decided it was necessary to completely restructure DP in order to overcome this barrier to improvement.
Firms 1 and 6 also have large differences between DP and user success ratings, 5.1 to 4.4 and 4.6 to 3.6 respectively. Without common problem recognition and cooperation, DP and users become entrenched in attack-defense efforts, thereby stagnating improvement until success becomes so low that a radical restructuring of DP is finally forced.
Common knowledge recognizes a widespread problem with DP success, user dissatisfaction, and important differences in DP success among firms. This research confirms this DP problem with empirically measured importance and performance on 26 criteria integrated into an overall definition of DP success. This alone is a significant contribution.

The real contribution, however, derives from what can be learned by dis-aggregating overall success. For the first time, within an integrated definition of DP success we can dig deeply into patterns of strengths and weaknesses, compare firms, deal with trade-offs between criteria, and search for the causes of low success.
What are DP's strengths and weaknesses?
In order to diagnose the causes of low DP success, we must begin with identification of major areas of relative strength and weakness. This was accomplished by focusing on users' average success ratings of the 26 criteria and grouping the criteria to differentiate between areas of strength and weakness.
Cynical opinions notwithstanding, user managers are quite capable of differentiating among various DP functions. Exhibit 1 lists average users' success ratings on each criterion. The range of success across the criteria is significant, from good success (5.0) on criterion H (technical competence of the DP staff) to inadequate success (2.7) on criterion D (training programs for users in DP capabilities). The range across criteria for users in a single firm is even greater. Users can and do recognize criteria where DP is successful and distinguish those from other criteria where it is not. All DP activities are not "tarred with the same brush" by the user managers studied.
The 26 criteria were clustered into the 8 groups in Exhibit 1 based on common patterns in user managers' performance ratings and on conceptually like DP activities which group together. Users distinguish between management/process problems, user relations, and ongoing daily activities.

This makes a significant point. Users understand DP activities well enough to differentiate among them. User managers' opinions are not naive or unrelated to the realities of DP. Users' opinions are a valid source of relevant importance and performance information for diagnosis and planning.

Obviously, the 26 criteria have been rearranged from the original questionnaire. A product-moment principal factor analysis was used for grouping.
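The grouping step can be approximated with standard tools. The sketch below is not the authors' analysis: it substitutes scikit-learn's maximum-likelihood FactorAnalysis for the product-moment principal factor analysis mentioned in the note, and it runs on randomly generated ratings purely to show the mechanics of assigning each criterion to the factor on which it loads most heavily.

    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    # Hypothetical ratings matrix: rows = 114 managers, columns = 26 criteria (1-7 scale).
    rng = np.random.default_rng(0)
    ratings = rng.integers(1, 8, size=(114, 26)).astype(float)

    # Extract 8 latent factors from the patterns in the ratings.
    fa = FactorAnalysis(n_components=8, random_state=0)
    fa.fit(ratings)

    # Assign each criterion to its dominant factor, yielding candidate groups
    # analogous to the 8 groups in Exhibit 1.
    loadings = fa.components_.T            # shape: (26 criteria, 8 factors)
    groups = np.abs(loadings).argmax(axis=1)
    for factor in range(8):
        members = [chr(ord("A") + i) for i in np.where(groups == factor)[0]]
        print(f"group {factor}: {members}")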
The managerial, effectiveness-oriented groups of criteria (e.g. Development and User Relations) are less successful from the users' point of view than the traditional, efficiency-oriented groups (e.g. Technical and Operations). According to these user managers the essence of effectiveness improvement in DP is the development of more managerially-oriented application systems. This necessitates better user communications, training, responsiveness, attitudes and proposal preparation, as well as improving the supporting systems development process itself and restructuring the systems development backlog.
Do strengths and weaknesses vary by firm?

I expected the patterns of strengths and weaknesses to be significantly different between high and low success firms. The hypothesis was that high success DP departments would do well on both efficiency and effectiveness criteria, whereas the pattern for low success DP departments would be good efficiency but poor effectiveness. This, however, is not what the data says.
Firms 3 and 5 were selected to contrast the differences between high and low success firms using the eight groups of criteria identified in Exhibit 1. Refer to Exhibit 5 for user success ratings by group for these two firms. Firms 3 and 5, with overall success of 3.2 and 4.8 respectively, have the same basic pattern of strengths and weaknesses — only the levels differ. Both firms have the same relative strengths in efficiency and weaknesses in effectiveness.
This pattern holds for all six firms studied. From the researcher's point-of-view this common pattern of strengths and weaknesses turns out to be fortuitous. One set of recommendations for improvement will apply to all six firms, and it lends credibility to generalizations to other firms.
Average user ratings of DP success by criteria show a relative strength in efficiency and weakness in effectiveness. All six DP departments studied have this same pattern of relative strengths and weaknesses. This is terribly helpful as a description of relative success patterns by criterion and firm. It is not, however, a diagnosis identifying the causes of low success. For a deeper search for the causes of low success we must dig further by dis-aggregating success.
The commonly acknowledged and often discussed communication problems between users and DP suggest three possibilities for search: disagreements between users and DP on relative success, importance, and/or performance on the criteria for success.
Do users and DP agree on success by criteria?
Surprisingly, users and DP agree on the success of each of the 26 criteria. Exhibit 6 plots DP versus user success ratings on each criterion. If either users or DP disagreed on any criteria, it would be plotted far from the diagonal. Notice how the user and DP success ratings for all of the 26 criteria are tightly grouped along the diagonal. Clearly, disagreement over success is not the cause we are searching for; however, this exhibit is packed with insightful information — both alarming and hopeful.
Alarming, because the top of the "ladder to success" is empty. DP and users agree nothing even approaches "excellent", and only technical competence (H) barely makes the "good" rung on the ladder. The bulk of the criteria fall in the caution zone between the scale's midpoint and "good"; they could go either way. Seven of the criteria fall below the midpoint in the clear danger zone, and one, D, is universally recognized as below "inadequate".
Exhibit 6 is also hopeful. The necessary prerequisite for improvement, co-joint clarity of problem identification between users and DP, is fulfilled. This excellent agreement between DP and users cuts through a lot of the heated debate and allows interested parties to progress to the next level of discussion — problem causes and how to fix them.
However, all DP managers are not equally convinced there is a severe problem, nor do all DP departments and users have such excellent agreement on success. Exhibit 4 shows some material differences between DP and user success ratings, firm 6 for example at 4.6 and 3.6 respectively. Moreover, the plots of DP versus user success ratings for each firm (not shown for lack of space) indicate that some convincing is still necessary, especially in low success firms.
The higher the overall success rating, the better the agreement between DP and users on the success of each criterion. The less successful firms face a preliminary task, refining co-joint agreement on problem identification. The gap between high and low success firms will continue to widen unless low success firms cease their entrenched attack-defense efforts and redirect their efforts to cooperative diagnosis and improvement.
Is it importance or performance?

Ratings of importance on each of the 26 criteria for DP versus users are plotted in Exhibit 7. Again all 26 criteria are tightly grouped along the diagonal. This dispels the possibility that low success results from user-DP disagreements over importance.
There is excellent agreement on the relative importance of all 26 criteria! This does not, of course, imply that these importance ratings are "right" or do not evolve. It does imply that user managers' opinions of the relative importance of the macro level DP criteria studied are just as "right" as DP's.
The plot of DP versus user ratings of performance on each criterion (not shown for lack of space) reveals similarly excellent agreement. With each succeeding agreement between DP and users on success, importance, and performance, revealed in Exhibits 6 and 7, the credibility of user opinions has been enhanced. The quality of user understanding of DP managerial issues which has been revealed is impressive and potentially beneficial to both DP and the whole firm. User management is capable and clearly should participate in the managerial process of diagnosis and planning for the 26 criteria studied.
So far, we have identified 26 criteria, measured their importance and performance, developed a simple but reliable integrated definition of DP success, revealed the efficiency/effectiveness pattern of strengths and weaknesses by criteria for all six firms, and searched for the cause of low success.
Three possible causes of low success have been checked and rejected. DP and users agree on importance, performance, and success for all 26 criteria. The consistency and quality of this agreement is surprising given all the "discussions" about unknowledgeable users, unresponsive DP, and DP-user communication problems. The question remains: what is the cause of low success?
Found: the cause of low success

The cause of low success is fundamental and straightforward — user managers see absolutely no relationship between importance and performance! Exhibit 8 plots users' ratings of importance and performance for each criterion. These criteria should all lie on the diagonal. Instead, there is no managerially justifiable pattern, or any pattern for that matter; it resembles only a random shotgun pattern.
The most basic managerial rule, allocate resources to perform well on the most important tasks, is clearly being violated. Equally important criteria, B and D, have performances ranging from good to inadequate. Equal performance results are achieved on criteria Z and W with importances ranging from indifferent to high.
Exhibit 9 plots DP's ratings of importance versus performance for all 26 criteria. Again, these criteria should all lie on the diagonal. This preempts the argument that the users' perceptions displayed in Exhibit 8 are, to put it politely, inaccurate. DP managers also fail to detect any managerially justifiable relationship between importance and performance. The imbalance between importance and performance is definitely the fundamental cause of low success.
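Any firm can run the same check on its own ratings. The following minimal sketch (not a procedure from the paper) simply computes the correlation between average importance and average performance across criteria; a healthy allocation of effort would show up as a clearly positive coefficient, while the data here are hypothetical numbers chosen to resemble the imbalance in Exhibits 8 and 9.

    import numpy as np

    # Hypothetical per-criterion averages of users' ratings (1-7 scales).
    importance  = np.array([6.1, 5.8, 4.2, 6.4, 3.9, 5.5, 4.8, 6.0])
    performance = np.array([3.4, 3.8, 5.9, 3.1, 5.5, 4.0, 4.6, 3.3])

    # Resources allocated to the most important criteria should produce
    # a positive correlation; with these numbers the coefficient is negative.
    r = np.corrcoef(importance, performance)[0, 1]
    print(f"importance-performance correlation: {r:.2f}")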
Consider for example the development of monitor, exception, inquiry and analysis type systems (criteria J, K, L and M). The more traditional/efficiency-oriented the system (J and K), the lower the importance but the higher the performance. Conversely, the more managerial/effectiveness-oriented the system (L and M), the lower the performance but the higher the importance. These four points nearly form a straight line sloping downward to the right. The greater the importance, the lower the performance! Exactly the opposite of the relationship we should see.

For a complete discussion of system types, their growth rates, managerial appropriateness, and levels of demand see "User Managers' Systems Needs", Robert M. Alloway, CISR Working Paper, forthcoming.
Obviously it is time to shift management's attention and reallocate scarce resources to close the gap between importance and performance. But where should we begin?
What hurts the most?

DP and user managers in these six firms could significantly improve success if they conducted an indepth diagnosis of their importance-performance gaps and prioritized their reallocation of resources to improve performance on highly important criteria. This can be done with the approach to defining DP success used in this research.
Exhibit 10 returns to firm 5 for an example indepth diagnostic of the importance-performance gap at the firm level. The location of the axes at 5 (good performance) and 5 (important) is somewhat arbitrary, but I have tried to choose achievable standards. The upper-right corner is the "success" quadrant. This implies no change is necessary in the relative importance-performance for these criteria. The lower-left quadrant is also "OK" in the sense that lower performance on these comparatively unimportant criteria is acceptable.
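Once a firm has its own average ratings, the quadrant screen in Exhibit 10 is easy to automate. The sketch below is an assumed implementation rather than the paper's: it labels each criterion "success", "waste", "OK", or "killer" using the same arbitrary cut-offs of 5 for importance and 5 for performance, and the sample ratings are hypothetical.

    def quadrant(importance, performance, imp_cut=5.0, perf_cut=5.0):
        """Classify one criterion on the Exhibit 10 style grid."""
        if importance >= imp_cut:
            return "success" if performance >= perf_cut else "killer"
        return "waste" if performance >= perf_cut else "OK"

    # Hypothetical firm-level averages: criterion letter -> (importance, performance).
    criteria = {"H": (4.2, 5.6), "D": (6.3, 2.7), "W": (4.0, 5.1), "M": (6.1, 3.2)}
    for letter, (imp, perf) in criteria.items():
        print(letter, quadrant(imp, perf))

Lowering the performance cut-off to a firm's own average performance reproduces the customized standard discussed for firm 3 later in the paper.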
My recommendation for these two quadrants, leave them be, might appear much too benign until one contemplates the level of managerial attention and effort required to improve the lower-right quadrant. The lower-right quadrant is the real "killer". These criteria have high importance but low performance. These are the criteria which ruin a DP department's reputation, drive users up the wall, seriously impair DP's ability to deliver, and prevent user managers from receiving their relevant information. The upper-left quadrant should receive declining management attention.
Performance on the criteria in this "waste" quadrant is good, but they are relatively less important to users; DP is "wasting" scarce resources here and should use them elsewhere. Given firm 5's good success in technical competence (J, F and H) and in the traditional operations group (X, W, B, C and Z), DP management definitely should not steal resources from the "success" quadrant; rather, it should reallocate from "waste" and "OK" to "killer."

The "killers" for firm 5's users are the development of inquiry and analysis systems (L and M), training programs for users in DP capabilities (D), users' attitudes to DP (G), responsiveness to user needs (O), and DP support for users in preparing proposals for new systems (U). Increased improvement efforts on these criteria achieve leverage on each of them and on overall success; in fact, they are managerially interrelated. This necessitates recognition of users' needs and improved project and proposal practicability.
Moreover, senior user managers must be more involved (N), the new systems
development process itself needs improvement (S), and systems analysts need
more knowledge of users'
operations (T) for prioritization and
successful
implementation.
In short, a complete reassessment of the systems development activities (policies, portfolio balance by system type, systems development procedures, user training programs, and systems analysts' skills) is needed to make them more user management oriented.
Each firm is, of course, unique with its own special needs implying importances and, equally critical, its own profile of performance. Firm 3 (see Exhibit 11) is both similar to and differs from Firm 5. The most noticeable difference is lower success on nearly all criteria (3.2 average success vs 4.8 for firm 5), reflected in a virtually empty "success" quadrant and an overwhelming "killer" quadrant. The magnitude of change required is simply impractical if all "killer" criteria are addressed simultaneously.
As mentioned previously, the location of the axes (solid lines) at 5 and 5 is arbitrary — so we can easily customize for firm 3's average performance (dashed line). This enables firm 3 to focus on a smaller, more practical subset of improvement efforts.

When firm 3 has improved its current "killers", they will be replaced by others. Their relative performance standards (dashed line) will move. Defining firm success, assessing importance-performance, and mounting efforts to improve is a basic, recurring managerial task.
Priorities for Action
Given the excellent agreement between DP and users we have seen and the consistency of the patterns across all six firms, generalizations of action priorities for these six firms are reasonable. Improvements in DP should begin with those criteria having the largest differences between importance and performance. Moreover, it will be significantly easier to achieve improvements on a particular criterion if both DP and users agree on the relative size of its importance-performance difference.
Exhibit 12 depicts these importance-performance differences between DP and user managers in all six firms. The greater the difference, the greater the priority for action. Fortuitously, DP and user managers have excellent agreement on these differences for all 26 criteria.
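To make the prioritization mechanical, here is a small assumed sketch (not code from the study): it computes the importance-performance gap separately from user and DP ratings and then ranks criteria by the smaller of the two gaps, so that only criteria on which both sides see a large gap rise to the top. All numbers are hypothetical.

    # Hypothetical per-criterion averages:
    # (user_importance, user_performance, dp_importance, dp_performance)
    ratings = {
        "D": (6.2, 2.7, 6.0, 3.0),
        "M": (6.0, 3.2, 5.8, 3.4),
        "H": (4.3, 5.0, 4.5, 5.2),
        "Y": (5.6, 4.2, 5.4, 4.3),
    }

    def agreed_gap(user_imp, user_perf, dp_imp, dp_perf):
        # Priority is driven by the gap both groups can see.
        return min(user_imp - user_perf, dp_imp - dp_perf)

    priorities = sorted(ratings, key=lambda c: agreed_gap(*ratings[c]), reverse=True)
    print(priorities)   # ['D', 'M', 'Y', 'H']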
Beginning with the upper-right corner, the top three priority clusters can be identified. Ironically, the top priority improvement for DP is to develop a general education program in DP for user managers (D). It would seem logical for this program to focus heavily on the criteria in the other two top priority clusters.

Three criteria in the second priority cluster are more like outcomes than inputs — user attitudes (G), perceived responsiveness (O), and communication with user managers (A). Improvements in the other top priority criteria can favorably affect these criteria. They should also be directly addressed by explicitly allocating more DP management time to communicating with user managers. The fourth criterion in this cluster (M) is the development of more analysis systems. This type of application includes planning/what-if, statistics/forecasting, simulation/optimization, and Decision Support Systems.
The third priority cluster includes the other managerial system type, flexible inquiry systems (L). These systems allow user managers access, either personally or via an intermediary, to ad-hoc and flexible reports on an as-needed basis. The potentials of currently available database management systems and user query languages should be aggressively pursued. This has a synergistic effect with cleaning-up current report contents (Y).
There should be minimal problems with report contents from flexible inquiry systems; as users' needs change, users modify their report contents accordingly. Moreover, the maintenance workload on output reports should decline as traditional monitor (J) and exception (K) systems are converted to flexible inquiry (L). Cleaning up outdated report contents on these unconverted applications is a DP "budget-eater", but the results should not be downplayed. Rather, management attention to the maintenance process should be invested to improve it.
Improving the systems development process itself (S) is an enabling requirement for all applications and especially for creating more inquiry (L) and analysis (M) systems. To date, considerable time and money has already been expended on improving the systems development process with modest and sporadic results. A fundamental flaw in these efforts has been to design the one project procedure appropriate for all systems.

See Decision Support Systems, Keen and Scott Morton, Addison-Wesley, 1978, and "Decision Support Systems and Information Flows in the 1980's", Robert M. Alloway, Teleinformatics 79, North-Holland Publishing, 1979.
This has resulted in project procedures appropriate for monitor and exception systems (90% of the installed base) and inappropriate for new inquiry and analysis systems (40% of demand). The key here is to recognize the fundamental differences in system types and develop project procedures appropriate for each — maintenance, transaction processors (monitor and exception), inquiry, and analysis.
Note the cross-functional skill development implied by user education programs in DP (D) and DP education in user functions (T). For both sides, this is supplemental to skill development priorities in their own function. Identification and prioritization of these skills, effective communication of priorities, and corresponding modification of personnel evaluation/reward systems is necessary for success in these efforts.
Defining success for DP is an ongoing process, necessitating re-assessments of the relative importance of alternative criteria to user management. This requires the involvement of senior user managers in policy formulation and evaluation (N). Many managers have mixed feelings about the effectiveness of DP steering committees. Perhaps, with some of the meaty issues raised by defining success for DP, steering committees would have more impact than serving as project progress reviewers.
These top ten priority criteria are mutually complementary except that they compete with each other for management attention and resources. They can be categorized as in Exhibit 13 and perhaps addressed by different sub-groups within DP.

"User Managers' Systems Needs", Robert M. Alloway, CISR Working Paper, forthcoming.

"Temporary Management Systems", Robert M. Alloway, CISR, 1978.

"Grow Your Own: Planning, Skill Development, and Reward for Systems Analysts", Robert M. Alloway, Datamation, April 1980.
Exhibit 13
Categorization of Improvement Efforts by Type

necessary policy changes
education programs: D, T
increased directed managerial attention: A
collateral action on "outcome"-like criteria: O, G
series of projects: M, L, Y
There is much more information contained in these exhibits than space permits discussing in detail. The reader is invited to select his/her favorite criteria from Exhibit 1 and trace them through the other exhibits. The fundamental conclusions from these exhibits, however, have been discussed.
Summary and Recommendations
The recommendations presented in this paper are for the sample of six firms studied. This is a piece of research with clear, emphatic results; sound generalizations to your firm or any others should, however, be approached with caution. Only six firms were studied — all in the industrial sector. Only a stratified sample of managers in finance, manufacturing and DP were included. Respondents were required to quantify their subjective opinions in a research questionnaire. The 26 criteria used to define and measure success are not exhaustive.
Strategic planning for DP must be predicated upon some definition of overall, integrated success. Setting objectives, determining priorities, and re-allocating resources is fundamentally based upon a diagnosis of differences between actual and desired. In this research, performance and importance represent actual and desired for each criterion. Combined into an index of success, the severity of the problem can be gauged by firm, department, hierarchical level, or criterion. Disaggregated, diagnoses of cause can be made and priorities for improvement assigned. The flexibility to perform resource trade-off analysis at the aggregate level or causal diagnosis at the dis-aggregated level is inherent in this approach to defining and measuring integrated success.
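As an assumed illustration of that flexibility (not code from the study), the pandas sketch below computes the same kind of importance-weighted index at two levels of aggregation from a single table of individual responses; the column names and numbers are hypothetical.

    import pandas as pd

    # Hypothetical individual responses: one row per manager per criterion.
    df = pd.DataFrame({
        "firm":        [1, 1, 1, 2, 2, 2],
        "department":  ["DP", "Finance", "Mfg", "DP", "Finance", "Mfg"],
        "criterion":   ["D", "D", "M", "D", "M", "M"],
        "importance":  [6, 7, 6, 5, 6, 7],
        "performance": [3, 2, 3, 4, 3, 3],
    })

    def success(group):
        # Importance-weighted performance for one slice of the data.
        return (group["importance"] * group["performance"]).sum() / group["importance"].sum()

    print(df.groupby("firm").apply(success))                  # severity gauged by firm
    print(df.groupby(["firm", "criterion"]).apply(success))   # dis-aggregated diagnosis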
The results of this study are clear, dramatic and material, but require shifts in managerial attention and resource allocation. Considerable cooperation from the right people is necessary to address performance improvements on the top ten priority criteria identified.
you or "necessary others"
are not convinced of the
define success, measure it,
necessary improvements.
the
planning
specifics,
process
measurements.
approach used
It
in
is
The
is
easily
be
applicability of these
then you must customize and repeat this exercise
results to your firm,
identify problem causes,
process chosen to
very
important.
understood,
quite practical
this research
and
to
It
for diagnosis,
a
and
—
prioritize
accomplish these steps in
yield
should
practical
use
If
enough
measureable
for
version
annual
of
the
DP strategic planning,
and
customized
prioritization of improvement efforts.
The next steps in
the
planning process vary somevyhat based upon the
results of the preceeding steps.
For these six companies,
for example, it
.
-23-
is
not
necessary to convince DP there is a problem,
between DP
and
differences
in
users on
criteria
importance
with
ratings
of
to develop
low performance,
various
criteria
or
to
agreement
reconcile
between
DP
and
and
the
users*
Achievable objectives must be determined
overall improvement effort.
for
each criteria
This is an iterative process of assessing the
levels of improvement possible given various levels of available resources.
It is also important to consider the interrelationships between criteria for possible precedence relationships and workload impact on the personnel involved, as in Exhibit 13.
Strategic planning for DP is clearly necessary and definitely the responsibility of both user and DP management. In the abstract a strategic planning process is easy to specify. In the specific, most firms do not follow this process because it is essentially content-free and dependent upon the specificity of corporate strategic plans and their indeterminate linkage to DP activities.
The process for defining DP success used in this research suggests a practical, concrete, understandable method to perform the necessary first steps in strategic planning for DP. Based upon the results of this sample, it is undoubtedly time to define DP success at the firm level, diagnose causes, and assign priorities for improvement.

*This is truer for the six collectively than for some firms in particular. The lower the users' ratings of success, the greater the need for improving co-joint problem definition.
Summary of Results
There is a problem in DP success; it is widespread and severe.

DP and user managers have excellent agreement on the relative success of the 26 criteria studied.

The pattern of relative strengths and weaknesses is the same for all firms. Moreover, there is excellent agreement between DP and users on this pattern.

Relative strength exists in technical, operational, efficiency-oriented criteria. Development of new managerial systems, user relations, and the managerial effectiveness-oriented criteria are comparatively weak.

The heart of the problem lies not in importance or performance disagreements between users and DP, but rather in the more basic imbalance between importance and performance. Both DP and users fail to detect any desirable or justifiable relationship between importance and performance on the criteria for success.

There are too many areas of desired improvement for the scarce resources available, especially management attention, to sufficiently address them all simultaneously. Priorities for action can be determined by the basic cause of low success — gaps between importance and performance. Large importance-performance gaps on ten criteria have been identified with excellent agreement between DP and user management.
Summary of Recommendations
The process chosen for DP strategic planning is critical. It must strongly link corporate needs with DP functions or criteria, provide an integrated definition of DP success, and be practical enough for annual planning.

Based upon an integrated definition and measurement of success, areas of needed improvement should be identified, agreed upon, and prioritized. Resource availability and co-joint responsibilities in improvements should be carefully considered.

Objectives must be set for each prioritized criterion and the overall improvement effort. Plans should include reaching out for fresh inputs, possible precedence relationships between criteria, available resources, impacts on involved personnel, and the organization's capacity for change.

The progress of improvement efforts should be monitored, resources reallocated as necessary, and a practical process for annual re-definition and measurement of success instituted.

DP is both too expensive and too critical for mis-allocated improvement efforts. Defining success for DP is a necessary first step in strategic planning for effective improvement of DP.
Exhibit 1
26 Criteria of DP Success

Technical
J. developing more monitor systems
F. quality of DP systems analysts
H. technical competence of the DP staff

Security
E. data security and privacy

Operations
W. running current systems (costs, ease of use, maintenance)
X. availability and timeliness of report delivery to users
B. efficiency of hardware utilization
C. hardware and systems downtime
Z. DP profitability from user billbacks and billable time ratios

Proportion
R. sophistication of new systems
Q. increasing the proportion of DP effort expended in creating new systems

Cost/Direction
P. overall cost-effectiveness
N. involvement of senior users in DP policy formulation and evaluation

User Relations
T. user oriented systems analysts who know user operations
V. appropriate DP budget size or growth rate
A. communication with managerial users
O. responsiveness to user needs
U. DP support for users in preparing proposals for new systems

Development
Y. report contents (relevance, currentness, flexibility, accuracy)
D. training programs for users in DP capabilities
G. attitudes of users to DP
K. developing more exception systems
L. developing more inquiry systems
M. developing more analysis systems

Backlog/Process
I. the new system request backlog
S. improving new system development (time, cost, quality, disruptions)
Exhibit 2
Profile of Manufacturing Firms Surveyed
Exhibit 5
Comparison of Success by Group of Criteria: Firm 5 and Firm 3
(Bar chart on the seven-point scale: 7 = excellent, 5 = good, 3 = inadequate, 1 = very poor. Firm 5, white bars: users' rating of DP success = 4.8. Firm 3, hatched bars: users' rating of DP success = 3.2.)
Exhibit 6
Comparison of DP and Users' Ratings of Success
(Scatter plot of the 26 criteria. Horizontal axis: user managers' ratings of DP success; vertical axis: DP's ratings of DP success; both scales run from very poor through inadequate and good to excellent.)
Exhibit 7
Comparison of DP and Users' Ratings of Importance
(Scatter plot of the 26 criteria. Horizontal axis: user managers' ratings of importance; vertical axis: DP's ratings of importance; both scales run from not very important through very important to critical.)
Exhibit 8
Comparison of Users' Ratings of Importance and Performance
(Scatter plot of the 26 criteria. Horizontal axis: users' ratings of importance, from not very important to critical; vertical axis: users' ratings of performance.)
Exhibit 9
Comparison of DP's Ratings of Importance and Performance
(Scatter plot of the 26 criteria. Horizontal axis: DP's ratings of importance, from not very important to critical; vertical axis: DP's ratings of performance.)
Exhibit 10
Comparison of Firm 5 Users' Importance and Performance
(Importance-performance plot of the 26 criteria divided into "waste", "success", "OK", and "killer" quadrants. Horizontal axis: Firm 5 users' importance ratings; vertical axis: Firm 5 users' performance ratings.)
Exhibit 11
Comparison of Firm 3 Users' Importance and Performance
(Importance-performance plot of the 26 criteria divided into "waste", "success", "OK", and "killer" quadrants. Horizontal axis: Firm 3 users' importance ratings; vertical axis: Firm 3 users' performance ratings.)
Exhibit 12
Importance-Performance Gaps for DP and Users
(Scatter plot of the 26 criteria. Horizontal axis: users' importance minus performance; vertical axis: DP's importance minus performance.)