Internal Performance Measurement System Characteristics: A Typology
By Eric Olsen
OSU Fisher College of Business
Working Paper, Center for Excellence in Manufacturing Management,
Fisher College of Business, The Ohio State University, 2003.
[Not for reproduction without permission of the author or the CEMM.]
Mar03 Rev M
Short Abstract: Measures are important to organizational effectiveness; however, there is
little research on what constitutes an effective performance measurement (PM) system.
A typology of PM system characteristics is presented based on a review of existing
operations management literature. The typology’s ability to explain observed
characteristics of operational performance is illustrated.
Abstract
Measures are important to organizational effectiveness; however, there is little research on
what makes an effective performance measurement system. Research on this topic was
conducted by developing a typology of internal PM system characteristics. Internal PM
is used by an organization for strategy formulation, execution, and operations control. To
develop this typology a review of existing literature on PM in operations was performed.
Throughout the literature many characteristics of good PM systems were mentioned on a
consistent basis. These characteristics were grouped and classified into a framework of
broad categories or attributes. The typology’s ability to explain observed characteristics
of operational performance is illustrated. Also, a research agenda to validate the
typology is proposed.
Introduction
Every manager has heard the platitude, “What gets measured, gets done!” Yet, there is
certainly a large chorus of managers, consultants, and academicians who point out that
American businesses’ over-reliance on traditional financial measures is not getting the
job done (Kaplan, 1983). Managers generally agree that measures are vitally important
to an organization's effectiveness, but there is little hard research on what constitutes an
effective performance measurement (PM) system in operations (Leong, Snyder, & Ward,
1990). Millions are being spent on reengineering efforts and enterprise resource planning
systems. What is the best measurement system to support those huge investments? The
speed and sophistication of today's computer systems make adding additional measures
seem trivial; however, managers are in danger of suffering from “data asphyxiation”.
There are two key questions that need to be addressed concerning the design of an
effective PM system. The first is what to measure. Specific individual items need to be
selected. The second, and perhaps more interesting, question is how to measure. The
“how” is a system issue. Measures do not act in isolation from each other or from the
environment. This paper presents an outline of a typology of internal PM system
characteristics as a first step in understanding how measures work together as a system to
make a business successful. Ultimately, matching assessable internal PM system
characteristics to business performance will permit the design of more effective
performance measurement systems. Also, existing systems can be critically compared to
the criteria developed in this research. Wasteful and ineffective characteristics could be
improved or eliminated. To shamelessly rip off another more current management
platitude, this research is about "measuring the right things, and measuring things right!"
Fit in Operations Management Literature
Performance measurement can be addressed at two different levels in operations
management. The first is “external” performance measurement. The basic concept
presented is that operations should have a valid, objective, independent means to assess
changes in performance. For business units, public record financial data such as stock
price, earnings per share, return on assets, and return on sales generally are available. It
can be argued that these are independent market based assessments of business
performance. Internal operations (typically assessed by evaluating cost, quality, delivery,
flexibility, and innovativeness) generally do not have requirements for systematic,
independent reporting of performance measures; and, if they did, they would be subject to
situational interpretation. Also, internal operations data are generally treated as
proprietary.
Researchers often are relegated to using perceptual measures of operational
performance. This is a controversial topic in OM literature (Boyd, Dess, & Rasheed,
1993; Starbuck & Mezias, 1996). Although external PM is a viable area for research and
contribution, it is outside the scope of this paper.
The level of performance measurement that is of interest here is “internal”. Internal PM
may be used by an organization for strategy formulation, implementation, and operations
control. According to the literature, operations strategy can be viewed from either a
content or process perspective (Leong et al., 1990). The strategy content perspective
focuses on strategy as a set of competitive priorities and the structural and infrastructure
decisions necessary to support those priorities. Internal PM is most clearly associated
with the process of strategy. Leong, Snyder, and Ward (1990) summarized the research
in this area with the model depicted in figure 1. Their process of strategy is described as
a translation of corporate strategy to a service enhanced product.
[Figure 1 shows a flow model enclosed by an environment boundary: Corporate Strategy
flows to Strategic Business Unit Strategies, then to Functional Strategy Formulation
(Manufacturing and Other Functional Areas), then to Implementation (Manufacturing and
Other Functional Areas), then to Capabilities (Manufacturing and Other Functional
Areas), and finally to the Service-Enhanced Product and Market Place (external)
Performance. Internal Performance Measures feed back to functional strategy
formulation.]
Figure 1: A Process Model of Manufacturing Strategy (Adapted from Leong et al., 1990)
Measurement of internal performance feeds back current position, rate, and direction
information to the formulation of functional strategy. Hayes, Wheelwright, and Clark
(1988) describe the purpose of performance measurement as providing a direction
and rate of improvement over time. Internal performance measurement performs a
critical role in the process of strategy.
Methodology
As critical as internal measurement is to the process of strategy, there is little research on
its effective use. There are many recommendations from practitioners and consultants,
but little empirical research to validate claims. An important first step in the building of
an internal performance measurement theory is the creation of an effective classification
system as a fundamental cognitive aid. This can be accomplished by establishing a
typology for performance measurement system characteristics. Typologies are mainly
conceptually derived and should serve to make sense out of non-quantified observations
(Hambrick, 1984).
A review of existing literature on performance measurement in operations management
was performed in order to develop a typology. Characteristics of a good performance
measurement system from a variety of sources were obtained (the sources used are listed
in the appendix of this paper). Although not a complete bibliography, the list represents
most of the explicitly titled performance measurement texts. To be included in the
typology, sources needed to describe characteristics of a PM system and not just list
recommended measures, although some sources did both. Also, sources related to
world-class manufacturing, time-based competition, and lean manufacturing were
considered relevant
sources for modern performance measurement system criteria. A total of 15 sources were
used to generate this typology.
Multiple non-unique characteristics of performance measurement systems were identified
in the terminology used by the sources. These were sorted into 12 broader unique
categories of attributes using a technique analogous to an “affinity diagram”. Various
research associates were shown the grouping of characteristics into attributes, and
suggestions for changes were accommodated. These attributes were grouped under four
dimensions (Table 1).
Table 1: Internal performance measurement attributes and dimensions

Dimension       Attributes
Elegance        Simplicity (complex to simple); Comprehensiveness (inadequate to adequate)
Alignment       Internal Alignment (divergent to compatible); Strategic Alignment (internal to customer focus); Adaptability
Actionability   Consequentiality (low to high – intrinsic or extrinsic); Applicability (indirect to direct)
Learning        Causality (result – effect to cause); Comparability (isolation to trend); Frequency
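To make the typology concrete, it can be encoded as a small data structure. The sketch below is illustrative only; Python and the nesting scheme are my choices, while the names and continuum endpoints are taken from Table 1.

```python
# Illustrative encoding of the Table 1 typology (the dict layout is an
# assumption for illustration; names and endpoints come from Table 1).
typology = {
    "Elegance": {
        "Simplicity": ("complex", "simple"),
        "Comprehensiveness": ("inadequate", "adequate"),
    },
    "Alignment": {
        "Internal Alignment": ("divergent", "compatible"),
        "Strategic Alignment": ("internal focus", "customer focus"),
        "Adaptability": None,  # moderator: no continuum endpoints listed
    },
    "Actionability": {
        "Consequentiality": ("low", "high"),
        "Applicability": ("indirect", "direct"),
    },
    "Learning": {
        "Causality": ("result (effect)", "cause"),
        "Comparability": ("isolation", "trend"),
        "Frequency": None,  # moderator: no continuum endpoints listed
    },
}

# Example query: the attributes that make up a given dimension.
print(sorted(typology["Elegance"]))  # ['Comprehensiveness', 'Simplicity']
```

An encoding like this makes the typology queryable, for instance when scoring an operation's PM system attribute by attribute.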
Table 2 summarizes the sources, with a frequency count, of the attributes ascribable to
an effective PM system.
Table 2: Performance Measurement System Dimensions and Attributes Frequency Count Summary

Dimension       Attribute                               Count
Elegance        1. Simplicity                           10
Elegance        2. Comprehensiveness                    7
Alignment       3. Strategic Focus                      11
Alignment       4. Compatibility                        5
Alignment       5. Adaptability                         5
Actionability   6. Applicability                        8
Actionability   7. Consequences                         5
Learning        8. Learning (General)                   8
Learning        9. Causality (Focus on Direct Causes)   5
Learning        10. Causality (Focus on Process)        3
Learning        11. Comparison (Trends)                 9
Learning        12. Frequency                           2

Sources tallied: Brown; Dixon et al.; Ford; Globerson; Hall, ed.; Hayes et al.;
Kaplan & Norton; Kaydos; Lynch & Cross; Maskell; McDougall; Meyer; Stalk & Hout;
Tesoro & Tootson; Thomas & Martin.
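The counts in Table 2 are simple tallies of which sources mention each attribute. The sketch below reproduces that kind of tally with hypothetical source-to-attribute codings (invented for illustration, not the paper's actual data):

```python
from collections import Counter

# Hypothetical coding of three sources (invented for illustration): each
# source maps to the set of attributes it recommends.
source_attributes = {
    "Kaydos (1999)": {"Simplicity", "Comprehensiveness", "Applicability"},
    "Maskell (1991)": {"Simplicity", "Strategic Focus"},
    "Brown (1996)": {"Simplicity", "Strategic Focus", "Adaptability"},
}

# Tally how many sources mention each attribute, as in the Count column.
counts = Counter(a for attrs in source_attributes.values() for a in attrs)
print(counts["Simplicity"], counts["Strategic Focus"])  # 3 2
```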
Discussion of Attributes and Dimensions
The attributes and dimensions proposed are discussed below to provide practitioners and
researchers with a sense of the utility and importance of performance measurement system
research. Two key dimensions are used to set up a four-quadrant matrix for each of the
attributes. Within each matrix, the ability of the typology to explain observed characteristics of
operational performance is demonstrated. This is a simplified technique used to demonstrate the
utility of what may be a very complex set of relationships and interactions. A useful conceptual
typology must be empirically validated. Therefore, in the conclusion to this paper a directed
agenda for additional research is proposed.
Elegance: The elegance of a performance measurement system can be viewed as a combination
of simplicity and comprehensiveness. Ten sources recommended one or more concepts that can
be categorized as having simplicity as a key dimension. Representative concepts include "simple
enough to be widely understandable” (Hayes et al., 1988), using “few versus many” measures
(Dow, Samson, & Ford, 1999), “easily understood terms related to everyday work” (Kaydos,
1999), and “complex measurements have no tangible meaning and trends within the
measurement can be hidden” (Maskell, 1991). Simplicity also is exemplified in the ability to
communicate the measures. For example, being able to represent the measures visually and
graphically (Meyer, Tsui, & Hinings, 1993) would be an indicator of its simplicity.
The second dimension that describes the elegance of a performance measurement system is
comprehensiveness. Comprehensiveness asks whether the "right", or most important, factors
are being measured. The “balanced scorecard” proposed by Kaplan and Norton (1992) is
indicative of this dimension. Both financial and non-financial measures are important to a firm’s
success (Kaplan, 1983; Kaplan & Norton, 1992; Lynch & Cross, 1995). Kaydos (1999)
comments on the importance of a PM system having the characteristic of “completeness” or
“wholeness”. To be comprehensive the PM system must cover both the “right things and wrong
things”. As an example, a PM system should address trade-offs that might be encountered in
emphasizing throughput to the detriment of quality. The concept of “sufficient detail” or the
“Law of Requisite Variety” - controls should have as much variety as the system to be
controlled - also is addressed by the comprehensiveness attribute (Buckley, 1968; Kaydos,
1999; Weick, 1987).
A useful four-quadrant conceptualization of the PM system attribute of Elegance and its two
dimensions, simplicity and comprehensiveness, is presented in Figure 2.
[Figure 2 shows a 2x2 matrix with Simplicity on the horizontal axis (complex to simple) and
Comprehensiveness on the vertical axis (inadequate to adequate). Quadrant 1 (complex,
inadequate) is Traditional; quadrant 2 (simple, inadequate) is Risky; quadrant 3 (complex,
adequate) is Inefficient; quadrant 4 (simple, adequate) is Elegant.]
Figure 2: Four-quadrant representation of Elegance attribute.
Operations that have many complex, largely financial measures would fall into quadrant 1. This
is indicative of many “traditional” PM systems. A traditional PM system is accounting/financial
intensive and complex, with many measures. However, it does not adequately cover all the
functions of an operation. If an operation reduces its system measurements down to only a few,
but these do not cover all the important aspects of the operation, then that operation can be
characterized as “risky” (quadrant 2). There is an opportunity for the operation to be blindsided
by an issue it was not adequately monitoring. An operation that uses many redundant or complex
measures to more than adequately address its important issues is at best
“inefficient” (quadrant 3). The operation is wasting resources on its PM system. Operations that
invest in large information systems, where the perceived cost of additional measures is low, or that do
not continuously eliminate obsolete measures from their systems are subject to this characteristic.
The place to be is in quadrant 4! A system that has adequate functionality with a minimum
number of measures is elegant. In this quadrant, the important operational issues are covered
with a few, simple, key measures. The PM system is cost effective and low risk.
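The quadrant assignments discussed above amount to a simple decision rule over the two dimensions. A minimal sketch follows; the 0-to-1 scoring and the 0.5 cut-off are assumptions for illustration, not part of the typology.

```python
def elegance_quadrant(simplicity: float, comprehensiveness: float,
                      threshold: float = 0.5) -> str:
    """Classify a PM system into a Figure 2 quadrant.

    Scores are assumed normalized to [0, 1]; the 0.5 cut-off is an
    illustrative assumption, not part of the typology.
    """
    simple = simplicity >= threshold
    adequate = comprehensiveness >= threshold
    if not simple and not adequate:
        return "1: Traditional"
    if simple and not adequate:
        return "2: Risky"
    if not simple and adequate:
        return "3: Inefficient"
    return "4: Elegant"

print(elegance_quadrant(0.8, 0.9))  # 4: Elegant
```

The same rule structure would apply, with different axis names and quadrant labels, to the Alignment, Actionability, and Learning matrices that follow.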
Alignment: The performance measurement system can achieve alignment along two
dimensions. It should be aligned strategically with the ultimate focus being on the customer and
it should be aligned internally across functions and locations.
One of the prime objectives of a PM system is to provide direction to the organization. That
direction is a function of the competitive priorities of the operation. The PM system should
focus all business activities on internal and external customer requirements (Lynch et al., 1995).
It should focus on measures to which the customer can relate (Kaplan & Norton, 1992; McDougall, 1988).
Measures should be “directly related to manufacturing strategy (and provide the ability to) assess
progress to goals constantly” (Maskell, 1991). Dixon, Nanni, and Vollmann (1990) point out
that a measurement system should:
“Be mutually supportive and consistent with business operation goals, objectives, critical
success factors, and programs (and) provide a set of measures for each organization
component that allows all members of the organization to understand how their decisions
and activities affect the entire business.”
Internally, a PM system should be aligned across functions and locations within the operation.
For example one of the criteria that Ford (1999) recommends in creating a PM system to support
lean manufacturing is that it be designed for both plant floor and management use. It should also
be “common for all operations”. Commonality (everyone using the same measures) may be
important, but perhaps a better general characteristic is compatibility. It may be necessary, and
desirable, for measures to vary between locations due to different strategic charters or different
strategic evolution (Maskell, 1991). It is to be hoped that different functions (i.e., production,
purchasing, quality, and logistics) would have different, yet compatible measures.
Figure 3 depicts a four-quadrant representation of the Alignment attribute.
[Figure 3 shows a 2x2 matrix with Internal Alignment on the horizontal axis (divergent to
compatible) and Strategic Alignment on the vertical axis (internal focus to customer focus).
Quadrant 1 (divergent, internal focus) is Hostile; quadrant 2 (compatible, internal focus) is
Cooperative; quadrant 3 (divergent, customer focus) is Sloppy; quadrant 4 (compatible,
customer focus) is Aligned. Adaptability acts as a moderator.]
Figure 3: Four-quadrant representation of Alignment attribute.
Operations whose measures are internally focused (lack customer focus) and have divergent
measures between levels, functions, and locations are subject to conflict (quadrant 1). Different
parts of the organization are motivated by the PM system to accomplish different things. A
degree of hostility may even ensue. Traditional functional “silos” are a good analogy for PM
systems with this combination of dimensions. Operations that have internally focused, yet
compatible measures are termed “cooperative” (quadrant 2). PM systems with these
characteristics would tend to be low in conflict (everyone gets along), but also low in achieving
strategic objectives. PM systems in quadrant 3 demonstrate high enthusiasm for satisfying the
customer, but because everyone is measured by different criteria, they tend to be sloppy and
inefficient in achieving results. Finally, the combination of high customer focus on strategic
objectives and compatible measures across the organization leads to a well-aligned PM system.
Another aspect of Alignment is that the dynamic nature of strategic focus implies adaptability to
change. The system needs not only to be “tailored to the satisfaction of a specific,
important need”, but it also has to be able to do so in “a timely fashion” (Hayes et al., 1988).
The underlying assumption here is that customer and business priorities are going to change with
time and, therefore, the ability to align must be a dynamic capability. “Measurements should be
changed or at least adjusted as the environment and your strategy changes” (Brown, 1996). The
characteristic of adaptability moderates the PM system's ability to maintain alignment.
Actionability: “Actionability” addresses how directly the performance measurement system ties
measures to outcomes and consequences. It is important that a PM system identify when and where action is
needed in an operation (Kaydos, 1999). An actionable performance measurement system is a
function of the applicability of the measures and consequences of compliance. Applicability can
be either direct or indirect. Changes in output, defect level, or cycle time on the production line
are examples of direct measures. This information is useful in daily operations management.
Indirect measures are normally required by organizations outside the operation and do not affect
day-to-day operations. Labor variance, overhead absorption and measures required for external
financial reporting fall into this category. Another way to distinguish between direct and indirect
measures is to view the former as those related to causes and indirect measures related to results
(Meyer et al., 1993).
The consequences of either compliance or non-compliance to a particular measure are a function
of the impact on the operation (intrinsic) and the accountability and incentive structure in place
(extrinsic). Whenever the target for a particular measure is not met, there should be a significant
and discernable effect on the operation and/or its workers. For example, if shipment deadlines
are not met, revenue is lost. This is an example of an intrinsic consequence in which the cause
and the consequence cannot be separated. Extrinsic consequences, such as reward and
compensations systems, may be arbitrarily established to motivate desirable behavior.
Accordingly, the ideal PM system will hold people accountable for their behavior (Lynch et al.,
1995). Kaydos (1999) proposes that no more than one person in the operation be responsible for
any given measurement. Conversely, Stalk and Hout (1990) advocate that team, not individual
measurement, is a powerful tool for fostering joint ownership of results. In addition, the
perceived level of trust and credibility in the measurement system is a key element in the
workplace. If information is filtered or distorted, trust will be low and will impact how
actionable the PM system is (Kaydos, 1999).
A four-quadrant conceptualization of the PM system attribute of actionability and its two
dimensions of applicability and consequentiality is shown in Figure 4.
[Figure 4 shows a 2x2 matrix with Consequentiality on the horizontal axis (low to high) and
Applicability on the vertical axis (indirect to direct). Quadrant 1 (low, indirect) is Impotent;
quadrant 2 (high, indirect) is Rationalizable; quadrant 3 (low, direct) is Removable;
quadrant 4 (high, direct) is Actionable.]
Figure 4: Four-quadrant representation of Actionability attribute.
PM that provides excess routine data to accounting for the purposes of financial reporting, with
little real impact either on the production line or on the people who work there, is impotent (quadrant
1). Measures in this category should be minimized or eliminated. Measurements that have little
direct impact, but have a high (usually extrinsic) consequence associated with them, should be
rationalized (quadrant 2). Practitioners will recognize this category as those measures that
management values for reward and recognition purposes (high extrinsic consequences), but are
not utilized, or are misapplied, in daily management. These measurements are taken, but efforts
need to be made to reduce their potentially negative impact on the operation. The objective
is rationalization or balancing of the level of consequences with the level of direct impact on the
operation. Measurements that are used only directly within the production operation and that
have few significant consequences should be eliminated or removed (quadrant 3). Finally, those
measurements in quadrant 4 that directly affect the operation and have either high positive or
negative consequences should be actively managed.
Learning: Eight of the research sources used placed a heavy emphasis on the performance
measurement system’s ability to promote learning both at the individual and organizational
levels. The PM system plays a key role in promoting continuous improvement (Hayes et al.,
1988; Lynch et al., 1995; Hall, 1983; Maskell, 1991; Dixon, Nanni, & Vollmann, 1990;
Dow et al., 1999). Causality and comparability are two key dimensions that make up learning
as the fourth and final attribute of an effective system. Causality is the degree to which the PM
focuses on the production process and direct causes of performance as opposed to secondary
effects. For example, scrap rate would be a direct cause of higher material and labor costs.
Causality is closely associated with understanding the process of production (Kaydos, 1999;
Meyer et al., 1993; Stalk & Hout, 1990). An increase in the level of process knowledge is an
indicator that learning is taking place.
Comparability is the degree to which the PM system uses trends over either time or point
differences to assess changes in performance. Plotting specific trends over time allows operators
and managers to associate experiences with performance measurement levels. If those measures
are causal then the learning is more effective. Trends juxtapose multiple data points. Point
differences compare only two data points and therefore can be less effective. Point differences
can be between current performance and performance in a reference period (such as last quarter
average), between current performance and a standard or target as in variance analysis, or
between current performance and other internal or external benchmarks (as in percent of industry
average).
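The difference between a point comparison and a trend can be made concrete with a few numbers; the monthly scrap rates below are invented for illustration:

```python
# Hypothetical monthly scrap rates (%); invented data for illustration.
scrap = [5.2, 4.8, 4.9, 4.1, 3.9, 3.5]

# Point difference: compares only two observations (current vs. reference).
point_difference = scrap[-1] - scrap[0]

# Trend: a least-squares slope juxtaposes all the data points, so a single
# unusual month distorts the assessment far less than a point difference.
n = len(scrap)
xs = range(n)
x_mean = sum(xs) / n
y_mean = sum(scrap) / n
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, scrap))
         / sum((x - x_mean) ** 2 for x in xs))

print(round(point_difference, 2))  # -1.7
print(slope < 0)                   # True: scrap rate is trending down
```

If the two endpoints happened to be unusual months, the point difference could even carry the wrong sign while the trend remained informative.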
A four-quadrant conceptualization of the performance measurement system attribute of learning
and its two key dimensions of causality and comparability is shown in Figure 5.
[Figure 5 shows a 2x2 matrix with Causality on the horizontal axis (result/effect to cause) and
Comparability on the vertical axis (isolation/point-to-point to trend). Quadrant 1 (effect,
isolation) is Aimless; quadrant 2 (cause, isolation) is Episodic; quadrant 3 (effect, trend) is
Backward Looking; quadrant 4 (cause, trend) is Learning. Frequency acts as a moderator.]
Figure 5: Four-quadrant representation of Learning attribute.
Management decisions based on measures that are analyzed in isolation and based on secondary
results are akin to “shooting in the dark”. Quadrant 1 PM systems can be said to have the
characteristic of aimlessness. Neither history nor causality informs the decision process. If
individual causal measures are acted upon, then the emphasis is on the right element of the
system, but the direction for improvement is not informed (quadrant 2). A “knee jerk” or
episodic characteristic is a good way to describe measurement systems that fall into this
quadrant. Quadrant 3 might be best represented by traditional trend analysis of financial data.
PM systems in this category are backward looking. For example, fully loaded cost variance is
plotted over time rather than defect or scrap rates. A typical analogy used to describe this type of
PM system is that it is like “driving by looking in your rearview mirror”. A measurement system
that promotes learning uses a trend analysis of direct causes (quadrant 4).
Similar to alignment’s dynamic function of adaptability, the learning dimension is moderated by
the frequency of measurement. Few sources other than Kaydos (1999) and Stalk and Hout
(1990) address frequency of measurement explicitly. It is generally assumed that measurement
frequency will be appropriate to the measure of interest. To a point, the more frequent the
measurement, the more likely the participants are to learn. As frequency decreases, the learners’
ability to associate causality with the resultant measure also decreases. In highly dynamic
environments, the frequency of measurement should be commensurate with the variability of the
measure. However, there does need to be an actionable consequence to variations in the
measure. For example, allocated labor overhead may vary significantly week-to-week, but
action is only rational over a longer period.
Summary and Research Agenda
Even though internal performance measurement has been touted to play a critical role in the
process of operational strategy, there is little theory building or empirical research in this area. A
typology is proposed as a first step in the process of theory building. A logical next step is to use
case study analysis to test the viability and comprehensiveness of the proposed typology. New
theory can benefit significantly from the richness of a case study methodology (Eisenhardt,
1989). Once the proper variables of a performance measurement system are identified as valid
constructs, scales to measure those variables can be developed. The initial focus of this research
should be in factories and production lines where the relationship between activities and
performance has been extensively studied. Large-scale “coarse-grained” (Harrigan, 1983)
empirical studies across manufacturing industries can be undertaken with valid constructs for PM
system attributes and dimensions established. Once the dimensions of an internal performance
measurement system are validated in a known environment, an effort can be expanded to include
service operations, not-for-profit, and evolving business areas like e-commerce. The opportunity
for expansion of this research stream, and to increase the understanding of performance
measurement systems for managers, is significant.
APPENDIX
PM Characteristic Sources:
(Brown, 1996; Dixon et al., 1990; Dow et al., 1999; Globerson, 1985; Hall, 1983; Hayes et al.,
1988; Kaplan et al., 1992; Kaydos, 1999; Lynch et al., 1995; Maskell, 1991; McDougall, 1988;
Meyer et al., 1993; Stalk et al., 1990; Tesoro & Tootson, 2000; Thomas & Martin, 1990)
Table A: Performance Measurement System Dimensions and Attributes with Source Comments
Dimension
Attribute
Source Comments
"... simple enough to be
Design a simple, consistent
widely understandable,
format. (Lynch & Cross,
credible, and useable within 1995)
the organization." (Hayes,
Wheelwright, & Clark, 1988)
Elegance
“Few vs many”. (Ford,
1999)
Simple and tangible. Clear
and graphical information.
(Meyer, 1993)
Show a balanced profile of
performance. Integrate
financial and non-financial
information. Measures
should not be tracked in
isolation. (Lynch et al.,
1995)
Measurement all the "right"
variables (most important
factors) and assign
consistent measures to
departments.
Completeness/wholeness covers right things and
wrong things. Sufficient
detail - controls should have
as much "variety" as the
system to be controlled.
(Kaydos, 1999)
Simplicity
Comprehensiven
ess
Page 22
8/26/03
Easily understood terms
related to everyday work.
Systematic. Easy to use.
Communicating strategy
and clarifying values.
Making accomplishments
visible. (Kaydos, 1999)
Simple and easy to use.
Measurement issues
directly. People it find
easier to use, and therefore,
more effective. Complex
measurements have no
tangible meaning and trends
within the measurement can
be hidden. Allow for
immediate and direct
assessment. (Maskell,
1991)
Fewer are better:
"...for practical purposes we
Concentrate on measuring find it hard to generate a list
the vital few key variables
of more than a few dozen
rather than the trivial many. relevant criteria."
(Brown, 1996)
"…criterion should be
replaced by others which
are more specific and
simple to measure. The
purpose of the PC
(performance criteria) must
be clear. A performance
criterion has to be defined
explicitly and clearly in order
to enable its
implementation." "A
measurable definition..."
would specify criteria..."
(Globerson, 1985)
Serves to improve the
Tell a story as a group - not
health of the whole business as individual numbers.
(e.g. Goldratt criteria).
(Ford, 1999)
(McDougall, 1988)
Internal PM Typology5clean-CEMM6.doc
Convey information thru as
few and as simple a set of
measures as possible.
(Dixon, Nanni, & Vollmann,
1990)
"Your system has to be
simple, scaleable, and
replicable so that users will
trust it. Leveraging existing
tools and technology, as well
as using a systematic
diagnostic process, can
enhance its readability".
"The data has to be
presented so that the user
can understand it and do
something about it." "...
meet their needs in terms of
format, structure, fields and
other data presentation
requirements." Standard
formats. (Tesoro, 2000)
Multiple indices can be
combined into a single index
to give a better overall
assessment of performance.
(Brown, 1996)
Table A: Performance Measurement System Dimensions and Attributes with Source Comments
- Balanced scorecard. (Kaplan & Norton, 1992)
- "Your performance measure system should provide your users the capability to drill down to details when needed. It should present a complete picture …and cover your bases." (Tesoro, 2000)
- "Tailored to the satisfaction of a specific, important need, in a timely fashion." (Hayes et al., 1988)
- Show relationship between department and business operating system. Link operations to strategic goals. Focus all business activities on customer requirements. Internal and external. (Lynch et al., 1995)
Alignment
Strategic Focus
- Identifying problems and opportunities. Strategic alignment of objectives. Changing company culture. (Kaydos, 1999)
- Demands improvement on only the strategically chosen variable, while the others are not allowed to backslide. Focuses on competitive variables: those the customer sees. (McDougall, 1988)
- Measurements should be based around the needs of customers, shareholders, and other key stakeholders. Measurements should start at the top and flow down to all levels of employees in the organization. (Brown, 1996)
- "PC must be derived from the company's objectives." (Globerson, 1985)
- Customer Perspective: how do customers see us? (Kaplan & Norton, 1992)
- Directly related to manufacturing strategy. Assess progress to goals constantly. People concentrate on what is measured. (Maskell, 1991)
- Be mutually supportive and consistent with business operation goals, objectives, critical success factors, and programs. Provide a set of measures for each organization component that allows all members of the organization to understand how their decisions and activities affect the entire business. Reveal how effectively a customer's needs and expectations are satisfied. Focus on measurement that the customer can see. (Dixon et al., 1990)
- Select performance measures that your business users value and use for guiding and supporting decisions and/or those that directly align with your organization's goals. (Tesoro, 2000)
- Customer service and returns. (Thomas & Martin, 1990)
Compatibility
Adaptability
- Internal Business Perspective: what do we excel at? (Kaplan & Norton, 1992)
- Resource and work inputs. Knowing what a process can do. (Kaydos, 1999)
- Yardsticks should be tuned to your game plan. (Lynch et al., 1995)
- "...based on most relevant and objective information available." (Hayes et al., 1988)
Actionability
Applicability
- Designed for plant floor and management use. Common for all manufacturing operations. (Ford, 1999)
- Vary between locations. Difference in importance to strategy or implementation stage. Measure the same thing in different ways. Diversity inherent in world-class manufacturing. (Maskell, 1991)
- Provide flexibility to your users by giving them reporting options. You may have standard reports that meet common needs of your users, but provide them with options so they can customize reports to fit their needs. (Tesoro, 2000)
- Change over time as needs change: continuous improvement. (Maskell, 1991)
- Measurements should be changed or at least adjusted as the environment and your strategy changes. (Brown, 1996)
- "...scalable so that it can adapt to changes in priorities, business initiatives, and performance measures." (Tesoro, 2000)
- "...since objectives are periodically revised, the PC system has to be evaluated and changed accordingly." (Globerson, 1985)
- Validity: what counts and is accepted and understood by users. Quality, resources, work, and inputs. Environmental factors. Diagnosing problems. Performance, behavior, and diagnostic measurement. Improving control and planning. Better planning and forecasting: identifying when and where action is needed. Constraints. Making delegation easy and effective. (Kaydos, 1999)
- Primarily use non-financial measurement, equal to or more important than financial. Financial measures are not valid or relevant to daily ops; they are generally confusing, misleading, and potentially harmful. Provide direct information on success or failure relevant to work. Lead in the right direction (i.e., time, not labor). Provide fast feedback to operators and managers. Detect and resolve problems as they occur. Use visual signals and root cause identification. (Maskell, 1991)
- Emphasize physical measurements. (Ford, 1999)
- Distinguish between cause and results. Simple and tangible. (Meyer, 1993)
- Look first to physical results (not financial). Decision-making: decision cycle time and time lost waiting for decisions. (Stalk et al., 1990)
- Measurements should be linked to the factors needed for success: key business drivers. Measurements need to have targets or goals established that are based on research rather than arbitrary numbers. (Brown, 1996)
- Add control limits. (Lynch et al., 1995)
Consequences
- Hold people accountable. (Lynch et al., 1995)
- Innovation and Learning Perspective: can we continue to improve and create value? (Kaplan & Norton, 1992)
Learning
- Accountability: only one person responsible for each measurement. Defining clear responsibility and objectives. Allocating resources efficiently. "CYA" and defending your position. Being evaluated objectively. More empowerment. Freedom to delegate. Getting everyone involved. Trust and credibility: if information is filtered or distorted, trust will be low. Recognizing and rewarding performance. (Kaydos, 1999)
- Learning cycle. Guiding and changing behavior. Absence of fear. (Kaydos, 1999)
- The PC must be under the control of the evaluated organizational unit. In other words, the organizational unit must have a significant impact on the value of the PC. Even if the performance criterion is a relevant one, there is no guarantee that it is absolutely under the control of one organizational unit. (Globerson, 1985)
- Make sure that you identify the content owners for the measures in your system. (Tesoro, 2000)
- Team measurement (not individual or department). (Stalk et al., 1990)
- Development of work force: increased skill evidenced in their work; versatility in different jobs; participation in changes (participation in training, number of suggestions made and implemented); morale (absenteeism and complaints as negative indicators, enthusiasm as a subjective one). (Hall, 1983)
- Foster improvement rather than just monitor: a motivation factor. Focus on the positive. (Maskell, 1991)
- Support organization learning. (Dixon et al., 1990)
- When and how to change behavior with respect to cycle time and defects. (Meyer, 1993)
- Training level of work force. (Thomas & Martin, 1990)
Learning (General)
- Promotes learning: improvements yield favorable reports. (McDougall, 1988)
Causality (Focus on Direct Causes)
Causality (Focus on Process)
- "...based on most relevant and objective information available." (Hayes et al., 1988)
- Validity: what counts and is accepted and understood by users. Quality, resources, work, and inputs. Environmental factors. Diagnosing problems. (Kaydos, 1999)
- Shows direction and rate of improvement over time. Compare to competitors or the best. (Hayes et al., 1988)
- Measure and manage performance over time. Focus on continuous improvement. (Lynch et al., 1995)
- Primarily use non-financial measurement, equal to or more important than financial. Financial measures are not valid or relevant to daily ops; they are generally confusing, misleading, and potentially harmful. Provide direct information on success or failure relevant to work. Lead in the right direction (i.e., time, not labor). Provide fast feedback to operators and managers. Detect and resolve problems as they occur. Use visual signals and root cause identification. (Maskell, 1991)
- Emphasize physical measurements. (Ford, 1999)
- Understanding the process. (Kaydos, 1999)
- Measure the value delivery process. Illuminate process drivers (complete, rather than comprehensive with too many measures). (Meyer, 1993)
- Processing and production. Measure value added as % of total elapsed time, uptime x yield, inventory turnover, cycle time (per major phase of main sequence). (Stalk et al., 1990)
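The Stalk et al. process metrics listed above (value added as a percentage of total elapsed time, uptime x yield, inventory turnover) are simple ratios. The sketch below uses hypothetical figures, and the function names are mine, not the authors':

```python
def value_added_pct(value_added_hours, total_elapsed_hours):
    """Value-added time as a percentage of total elapsed (cycle) time."""
    return 100.0 * value_added_hours / total_elapsed_hours

def uptime_times_yield(uptime_fraction, first_pass_yield):
    """Fraction of scheduled time that produces good units."""
    return uptime_fraction * first_pass_yield

def inventory_turns(annual_cogs, average_inventory):
    """Inventory turnover: cost of goods sold over average inventory."""
    return annual_cogs / average_inventory

# Hypothetical order: 6 hours of actual processing inside a 120-hour lead time.
print(value_added_pct(6, 120))                   # 5.0
print(round(uptime_times_yield(0.90, 0.95), 3))  # 0.855
print(inventory_turns(12_000_000, 1_500_000))    # 8.0
```

Low value-added percentages are typical in such analyses, which is what makes the ratio a useful diagnostic for time-based improvement.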
Comparison (Trends)
- Useful accuracy: in most cases the trend is what matters, and error is consistent if the measurement technique is the same. Long-term consistency: standard units account for changes in mix, volume, etc. over time (ratios). Accounting for the performance gap: explain >80%. Operational variance inputs and outputs. (Kaydos, 1999)
- Distinguish between cause and results. Simple and tangible. (Meyer, 1993)
- Improvement trends in total improvement projects and total implemented, and total improvement projects and total implemented per capita. Trends in costs or toward target-cost goals and productivity: output/(total IL + DL). Trends in departmental inventory levels: speed of flow, for example. Quality trends: reduction in defect rates, improvement in process capability, improvement in quality procedures. (Hall, 1983)
- Support … continuous improvement. (Dixon et al., 1990)
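Hall's productivity ratio above, output/(total IL + DL), divides output by combined indirect and direct labor. A minimal sketch with hypothetical numbers:

```python
def labor_productivity(output_units, indirect_labor_hours, direct_labor_hours):
    """Hall-style productivity: output over total labor input (IL + DL)."""
    return output_units / (indirect_labor_hours + direct_labor_hours)

# Hypothetical month: 9,000 units from 500 indirect and 2,500 direct labor hours.
print(labor_productivity(9000, 500, 2500))  # 3.0 units per labor hour
```

Including indirect labor in the denominator keeps the ratio from rewarding shifts of work from direct to indirect staff, which aligns with Hall's emphasis on whole-plant trends.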
Frequency
- Focused on trends and forecasts. (Ford, 1999)
- Sufficient measurement frequency: measurement as frequent as changes. Timeliness: minimal gap between stimulus and response. (Kaydos, 1999)
- Decision-making: decision cycle time and time lost waiting for decisions. (Stalk et al., 1990)
- Measurements should be a mix of past, present, and future to ensure that the organization is concerned with all three perspectives. (Brown, 1996)
- Purchase price variance. (Thomas & Martin, 1990)
- A performance criterion makes possible the comparison of organizational units in the same business. Using this approach enables management to identify the most effective organizational unit. Once differences in performance levels are identified, corrective actions may be initiated on the units with lower performance. (Globerson, 1985)
REFERENCES
Boyd, B. K., Dess, G. G., & Rasheed, A. M. A. 1993. Divergence between archival and perceptual measures of the environment: causes and consequences. Academy of Management Review, 18(2): 204-226.
Brown, M. G. 1996. Keeping score: using the right metrics to drive world-class performance. New York: Quality Resources; distributed by AMACOM Books.
Buckley, W. F. 1968. Modern systems research for the behavioral scientist: a sourcebook. Edited by Walter Buckley. Foreword by Anatol Rapoport. Chicago: Aldine Pub. Co.
Dixon, J. R., Nanni, A. J., & Vollmann, T. E. 1990. The new performance challenge: measuring operations for world-class competition. Homewood, Ill.: Dow Jones-Irwin.
Dow, D., Samson, D., & Ford, S. 1999. Exploding the myth: do all quality management practices
contribute to superior quality performance? Production and Operations Management,
8(1): 1-27.
Eisenhardt, K. M. 1989. Building Theories From Case Study Research. Academy of Management Review, 14(4): 532-550.
Globerson, S. 1985. Performance criteria and incentive systems. Amsterdam; New York: Elsevier.
Hall, R. W. 1983. Zero inventories. Homewood, Ill.: Business One Irwin.
Hambrick, D. C. 1984. Taxonomic approaches to studying strategy: some conceptual and
methodological issues. Journal of Management, 10(1): 27-41.
Harrigan, K. R. 1983. Research Methodologies for Contingency Approaches to Business Strategy. Academy of Management Review, 8(3): 398-405.
Hayes, R. H., Wheelwright, S. C., & Clark, K. B. 1988. Dynamic manufacturing: creating the learning organization. New York: Free Press; London: Collier Macmillan.
Kaplan, R. S. 1983. Measuring Manufacturing Performance: A New Challenge for Managerial Accounting Research. The Accounting Review, 58(4): 686-705.
Kaplan, R. S., & Norton, D. P. 1992. The Balanced Scorecard - Measures That Drive
Performance. Harvard Business Review, 70(1): 71-79.
Kaydos, W. J. 1999. Operational performance measurement: increasing total productivity. Boca Raton, Fla.: St. Lucie Press.
Leong, G. K., Snyder, D., & Ward, P. 1990. Research in the process and content of
manufacturing strategy. Omega, Int. J. of Mgmt Sci., 18(2): 109-122.
Lynch, R. L., & Cross, K. F. 1995. Measure up!: yardsticks for continuous improvement (2nd ed.). Cambridge, Mass.: Blackwell Business.
Maskell, B. H. 1991. Performance measurement for world class manufacturing: a model for American companies. Cambridge, Mass.: Productivity Press.
McDougall, D. C. 1988. Learning the ropes: how to tell when you've found an effective performance measurement system for manufacturing. Operations Management Review (Fall 1987 & Winter 1988): 38-48.
Meyer, A. D., Tsui, A. S., & Hinings, C. R. 1993. Configurational approaches to organizational
analysis. Academy of Management Journal, 36(6): 1175-1195.
Stalk, G., & Hout, T. M. 1990. Competing against time: how time-based competition is reshaping global markets. New York: Free Press; London: Collier Macmillan.
Starbuck, W. H., & Mezias, J. M. 1996. Opening Pandora's box: Studying the accuracy of
managers' perceptions. Journal of Organizational Behavior, 17(2): 99-117.
Tesoro, F., & Tootson, J. 2000. Implementing global performance measurement systems: a cookbook approach. San Francisco: Jossey-Bass/Pfeiffer.
Thomas, P. R., & Martin, K. R. 1990. Competitiveness through total cycle time: an overview for CEOs. New York: McGraw-Hill.
Weick, K. E. 1987. Organizational Culture as a Source of High Reliability. California Management Review, 29(2): 112-127.