New Zealand Applied Business Journal
Volume 1, Number 1, 2002
DATA ENVELOPMENT ANALYSIS AS A PERFORMANCE MEASURE FOR
TERTIARY EDUCATION INSTITUTIONS
Noel Yahanpath and Hong Tong Wang
Eastern Institute of Technology
Napier, New Zealand
This draft paper represents work in progress. Please do not cite, quote or otherwise use
without written permission of the author. Comments are welcome.
Abstract: This paper examines the development and application of Data
Envelopment Analysis (DEA), a linear programming-based technique that focuses on
productive efficiency, as an alternative method to measure, evaluate and benchmark
the performance of Tertiary Education Institutions (TEIs). The performance of TEIs
in different regions of New Zealand is evaluated, with the relative efficiencies of the University
and Polytechnic sectors analysed separately and in combination through the application of the
DEA technique. The capabilities of DEA are explored using a two-dimensional approach at three
different levels: single input and single output measures; one input and two output measures;
and two input and two output measures. The relative efficiencies of New Zealand Universities
and Polytechnics are evaluated separately, and two distinct efficiency frontiers emerge. This
study is not a strict application of multi-dimensional DEA, which uses computer-generated
'frontier analysis'. Rather, for the purposes of this study, the research data were plotted using
the Excel programme, taking a two-dimensional approach. Graeme Treasure (UNITEC) and I are
currently analysing the same data set, together with non-financial indicators, using frontier
analysis software, with the intention of publishing the findings shortly.
Key words: data envelopment analysis, tertiary education, performance evaluation
INTRODUCTION
There are often conflicting opinions regarding the goals of education and the relative
importance placed on those goals by the stakeholders of education. Questions such as how to
measure multiple outputs and outcomes in relation to multiple inputs, how to assess the
efficiency and effectiveness of the tertiary education sector, how to benchmark the tertiary
education sector and finally, how to improve the performance of an inefficient TEI, are
becoming more crucial for education policymakers, evaluators and education providers.
Research confirms that traditional benchmarking techniques, such as regression analysis,
accounting ratios and weighted average ratios, either lack the capability to measure multiple
inputs and outputs, provide an incomplete picture of overall performance, or rely heavily on
human judgement. De Young (1998) reported that the use of one-dimensional accounting ratios to
analyse efficiency in the banking sector provided an incomplete picture of performance.
Over-reliance on accounting ratios can reduce efficiency
through the cutting back of expenditure essential to a well-run institution. The weighted
average ratio technique analyses multiple inputs and outputs, but requires the judgment of
management to agree on the weights that will be assigned to various variables. This can be
difficult, controversial and problematic (see Appendix One). However, DEA overcomes
these limitations by applying weights that maximise the outputs of each Decision Making
Unit (DMU).
DEA is a linear programming-based technique that uses multiple inputs and outputs to assess
an organisation’s performance efficiency and is presented as an alternative method to address
the issues raised by education stakeholders. The DEA concept was originally developed by
Charnes, Cooper and Rhodes (1978) with the intention of creating a performance measure that
business managers could use to evaluate the relative performance of various decision making
units (DMUs) with the methodology recognizing that DMUs use multiple inputs to achieve
multiple outputs. Subsequently the DEA technique has been used in studies to benchmark
the efficiency of real-world situations encompassing a range of industries. The DEA
technique focuses on productive efficiency, which relates to the level of inputs relative to the
level of outputs. An important fact to note is that DEA does not provide a method by which
to measure economic efficiency. Rather it provides a method by which an organisation can
gain efficiency through either maximising its outputs for given inputs or minimising its inputs
for given outputs. Economic efficiency, on the other hand, is determined using inputs, outputs
and market prices, providing a method by which an organisation can achieve economic efficiency
through minimising costs or maximising profits. Integral to the DEA measure is
the supportive linear programming software which is used to assign weights that maximise
the output of each DMU, subject to the constraint that no other DMU would have an efficiency
score greater than one if it used the same set of weights. DEA produces an efficient or
best-practice frontier; DMUs that lie within the frontier are relatively inefficient compared to
those that lie on the frontier.
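To make this weight-selection step concrete, the Charnes, Cooper and Rhodes (CCR) multiplier model can be written as a small linear programme and solved with general-purpose tools. The following sketch, in Python using NumPy and SciPy, is offered purely as an illustration of the standard CCR formulation; it is not the software used in this study, the function name ccr_efficiency is simply a label chosen here, and the figures at the end are hypothetical and show only the calling convention.

import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    # Efficiency of DMU o under the CCR multiplier model.
    # X is an (n_dmus, n_inputs) array of inputs; Y is an (n_dmus, n_outputs) array of outputs.
    # Returns a score in (0, 1]; a score of 1 places the DMU on the efficient frontier.
    n, m = X.shape
    _, s = Y.shape
    # Decision variables: the s output weights u followed by the m input weights v.
    # Maximise u . y_o, i.e. minimise -u . y_o.
    c = np.concatenate([-Y[o], np.zeros(m)])
    # Normalisation constraint: v . x_o = 1.
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)
    b_eq = [1.0]
    # For every DMU j: u . y_j - v . x_j <= 0, so no DMU can score above one with these weights.
    A_ub = np.hstack([Y, -X])
    b_ub = np.zeros(n)
    # Weights are only required to be non-negative here; a stricter DEA implementation
    # would impose a small positive lower bound to rule out weakly efficient units.
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun

# Hypothetical figures for three DMUs, each with two inputs and two outputs,
# included only to show how the function is called.
X = np.array([[100.0, 50.0], [120.0, 40.0], [90.0, 60.0]])
Y = np.array([[800.0, 30.0], [900.0, 25.0], [700.0, 35.0]])
print([round(ccr_efficiency(X, Y, o), 2) for o in range(len(X))])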
Since the introduction of the DEA technique, much theoretical and empirical research has
been done. Many studies have been published dealing with the use of DEA to benchmark
the efficiency in real-world situations, particularly in the public sector. For example, DEA
has been used to make provider comparisons of schools (Chalos & Cherian, 1995), human
service agencies (Ozcan & Cotter, 1994), real estate (Anderson, Lewis & Springer, 2000), and
software developers (Chatzoglou & Soteriou, 1999). More recently there has been some
research interest in this field in New Zealand. Graeme Treasure, a lecturer at Unitec, undertook
a study using DEA to analyse data on banking sector performance in New Zealand for his master's
degree and intends to publish the findings. Julie Harrison, a lecturer at The University of
Auckland, is currently studying data relating to secondary schools in New Zealand using DEA.
The application of DEA is demonstrated using five case studies, and the analysis of these cases
is based upon a two-dimensional approach. The first three cases represent three different levels
of measurement: one input and one output; one input and two outputs; and two inputs and two
outputs. The fourth case compares the relative efficiency of the university and polytechnic
sectors, and the fifth shows how DEA can be used to trace the relative efficiency of an
institution over several time periods.
DISCUSSION
In the case of the education sector, decisions regarding the pricing and selection of courses
(products) are limited and are decided by the Ministry of Education (MoE). Additionally, course
revenue is derived mainly from two sources: government funding of fees via the MoE and direct
course fees. Therefore, DEA is a relevant method of evaluating efficiency, as
it focuses on inputs and outputs, rather than market prices.
In 1998 the Ministerial Consultative Group identified the following objectives for TEIs:
• To provide a wide variety of high-quality programs that are relevant and responsive to
identified and expressed needs.
• To achieve target outputs for student enrolments and performance.
• To obtain, develop and manage resources effectively, efficiently and responsibly, and to
maintain financial viability through the efficient use of resources.
• To provide equal opportunities for all people, regardless of gender, ethnic origin or special
needs, and in keeping with the spirit and principles of the Treaty of Waitangi.
• To provide programs and services that meet international standards and establish a worldwide
reputation.
Therefore, it is clear that TEIs have multiple objectives and multiple outputs and outcomes.
Inputs such as total assets, total expenses and total full-time equivalent staff (FTES) are used
to achieve outputs (operating income, number of EFTS) and outcomes (pass rates, student
satisfaction rates, student employment rates). There are often conflicting opinions regarding
the goals of education and the relative importance of these goals to the stakeholders of
education. Generally, when one objective is achieved, another is sacrificed. It is difficult,
and sometimes impossible, to maximise several objectives at the same time. Therefore,
inherent difficulties exist in measuring overall educational efficiency and effectiveness using
traditional analysis methods, such as simple ratio analysis.
In order to benchmark TEIs, it is necessary to use input and output measures which are
related to their objectives. These measures should include both financial and non-financial
indicators. Appropriate ratios include:
• Total EFTS to total operating expenses
• Total income to fixed assets
• Total EFTS to FTES
• Total EFTS to total assets
• Total EFTS to net teaching area.
A research project conducted by Eastern Institute of Technology (EIT) business studies
student Hong Tong Wang (under the supervision of Noel Yahanpath) used DEA to assess the
relative efficiency of the University and Polytechnic sectors. The ratios identified were
applied through analysing raw data obtained from published annual reports of thirty-eight
TEIs and financial information provided by the MoE. In order to demonstrate DEA in its
simplest form, and because there is some noise in the information relating to the polytechnic
sector, data relating to the university sector was used in Cases 1, 2 and 3. In Case 4 a total sectoral
analysis was undertaken to illustrate the isolation of two different efficiency frontiers. The
project methodology and findings are presented taking an incremental approach to illustrate
the DEA technique.
CASE 1: SINGLE INPUT AND SINGLE OUTPUT MEASURES APPROACH
The following graph demonstrates the single input (Total Assets)/single output (EFTS)
relationship of seven universities. In this case, two assumptions are made. Firstly, that all
seven universities provide a similar range of educational products and services and that there
is one output measure of performance: total equivalent full-time students (EFTS). Secondly, that
this output is generated by one input, measured by the total assets held by a university.
The performance of each university in terms of generating EFTS from its total assets can be
plotted as a two dimensional graph (see Figure 1).
[Figure 1: Relative Efficiency of Universities Using a Single Input and Single Output Measure. x-axis: Input, Total Assets ($000s); y-axis: Output, EFTS.]
As shown in Figure 1, a least squares regression line can be constructed. Although there is a
clear relationship shown in this case between the independent and dependent variables,
DEA does not require this. Instead it calculates a performance measure for each institution
relative to all other institutions. Providers that produce more output, quality, or outcome than
predicted have positive residuals, whereas providers that produce less output, quality, or
outcome than predicted have negative residuals. The best-practice provider for a given
performance category is the one with the highest positive residual (see Appendix Two).
CASE 2: ONE INPUT AND TWO OUTPUT MEASURES APPROACH
Now assume all seven universities provide a similar range of educational products and
services but that there are two output measures for performance: government grants and non-government funds (student fees and other income). Also assume that these are generated by
the one input, measured by the total assets held by a university. Given this information two
relative measures of performance can be constructed:
• Government Grants/Total Assets (GG/TA)
• Non-government Grants/Total Assets (NGG/TA)
Table 1 shows the data for seven universities for the 1999 financial year.
                  U1       U2       U3       U4       U5       U6       U7
GG/TA (x-axis)    22.35%   16.02%   21.76%   22.58%   22.04%   18.93%   25.56%
NGG/TA (y-axis)   27.46%   12.89%   47.66%   20.70%   30.30%   16.75%   29.04%
Table 1
Figure 2 illustrates the performance of each university in generating government funding and
income from students from its total assets with results plotted as a two dimensional graph.
[Figure 2: Relative Efficiency of Universities Using One Input and Two Output Measures. x-axis: Government Grants/Total Assets; y-axis: Non-government Funds/Total Assets. Point S marks where the radial line from the origin through U2 meets the efficient frontier.]
From this graph, clearly U3 has the best result with respect to NGG/TA and U7 has the best
result with respect to GG/TA. Both may be said to be most efficient in one particular area of
revenue generation, but at the expense of the other area. The line running horizontally from the
Y-axis to U3, from U3 to U7, and vertically from U7 down to the X-axis might then represent an
efficient frontier of performance. All other universities are less efficient than these two in
generating total revenue from their total assets. The degree of inefficiency may be measured by
the radial distance by which a university falls short of the efficient frontier. For example,
the degree to which U2's performance is inefficient may be measured as the ratio of the distance
U2-S to the distance O-S along the radial line through U2. The degree of relative inefficiency
in this case is about
40%. It is important to note that this inefficiency of 40% is relative to the efficiencies
achieved by the other institutions and is not a 'stand-alone' measure. Further, given current
resources, the boundary represents the maximum outputs that can currently be achieved;
operations inside this frontier therefore represent some form of inefficiency. Through more
efficient use of their assets, inefficient units could move towards the frontier.
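For readers who wish to check such a radial score numerically, the ccr_efficiency sketch given earlier can be applied to the Table 1 ratios by treating the two ratios as outputs produced from a unit input (the division by total assets having already been done). This remains a two-dimensional check of the kind illustrated above, not the multi-dimensional frontier analysis referred to in the abstract.

import numpy as np
# ccr_efficiency as defined in the earlier sketch.

# Table 1 ratios expressed as decimals; the unit input stands in for total assets
# because each measure has already been divided by assets.
gg_ta = [0.2235, 0.1602, 0.2176, 0.2258, 0.2204, 0.1893, 0.2556]
ngg_ta = [0.2746, 0.1289, 0.4766, 0.2070, 0.3030, 0.1675, 0.2904]
X = np.ones((7, 1))
Y = np.column_stack([gg_ta, ngg_ta])
for name, score in zip(["U1", "U2", "U3", "U4", "U5", "U6", "U7"],
                       (ccr_efficiency(X, Y, o) for o in range(7))):
    print(name, round(score, 2))
# U3 and U7 should come out on the frontier with scores of 1.0, and U2 at roughly 0.6,
# consistent with the relative inefficiency of about 40% noted above.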
CASE 3: TWO INPUT AND TWO OUTPUT MEASURES APPROACH
Case 1 ranked performance using a single input and a single output measure, and Case 2
introduced two output measures for a single input. Now let us see how DEA can incorporate
multiple inputs and multiple outputs to construct an efficient frontier of performance for the
seven universities.
Again assume all seven universities offer the same range of educational products and
services. In addition to total operating income, there is another output measure: EFTS. Now
assume that the output measure EFTS is generated by total operating expenses. Thus, two
new comparative measures of performance might be constructed:
• Total operating income/Total Assets (OI/TA)
• EFTS/Total expenses (EFTS/TE)
The data from the 1999 financial year for the seven universities is shown in Table 2.
                  U1       U2       U3       U4       U5       U6       U7
EFTS/OE (x-axis)  6.56%    8.76%    5.44%    8.46%    5.83%    8.52%    8.01%
OI/TA (y-axis)    49.81%   28.91%   69.42%   43.28%   52.34%   35.68%   54.60%
Table 2
Figure 3 illustrates each university’s performance in producing total income from its total
assets and serving total EFTS from its total expenses.
[Figure 3: Relative Efficiency of Universities With Two Input and Two Output Measures. x-axis: Total EFTS/Operating Expenses; y-axis: Total Income/Total Assets.]
From this graph, it can be seen that:
• U3 has the best result with respect to OI/TA,
• U2 has the best result with respect to EFTS/OE.
Both U3 and U2 may be said to be most efficient in one particular area of output generation, but
at the expense of the other area. However, other universities may have decided to adopt a more
balanced approach to income and EFTS generation. For example, while U7 does not exceed U3's
performance in OI/TA or U2's performance in EFTS/OE, it exceeds the performance of U5 and U1 in
both EFTS/OE and OI/TA.
Comparing the results of Case 2 and Case 3, it can be seen that, in terms of performance
efficiency, different conclusions are reached for U2, U6 and U4. In Case 2, these three
universities lie under the efficient frontier: each has some degree of inefficiency. In Case 3,
by adding one pair of measurements (operating expenses to EFTS), the three universities are on,
or nearly on, the efficient frontier. That is to say, when the relative efficiency of the
universities' performance is evaluated by how much total income is generated from total assets,
the three universities are inefficient. However, when how efficiently the institutions manage
total expenditure to serve total EFTS is also considered, the three universities are considered
to be, or nearly to be, 100% efficient in relation to the other surveyed institutions.
It may be concluded that as the number of efficiency measures increases, the number of
universities that fall on the frontier may increase as well, since the approach by which they
achieve their efficiency is recognized. It is therefore necessary to use a large number of
tertiary educational institutions that are relatively homogeneous in order to obtain meaningful
results when a larger number of efficiency measures are considered. It should also be noted
that even institutions on the efficient frontier have room for improvement. In Figure 3, U2 is
on the efficient frontier, and therefore considered to be 100% efficient in relation to the other
universities. However, this institution produces less total income from its asset base than any
other university in the sample. Conversely, U3 is the most efficient institution in terms
of return on assets, but has the lowest ratio of total EFTS/operating expenses.
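The same sketch also accepts two inputs and two outputs directly, which is how a full DEA run would treat the data rather than working from pre-formed ratios. The annual-report figures used in the study are not reproduced in this paper, so the numbers below are hypothetical and serve only to show the shape of the call (inputs: total assets and total operating expenses; outputs: total operating income and EFTS).

import numpy as np
# ccr_efficiency as defined in the earlier sketch.

# Hypothetical raw figures for three illustrative institutions ($000s and EFTS).
X = np.array([[650_000.0, 210_000.0],   # total assets, total operating expenses
              [420_000.0, 150_000.0],
              [300_000.0, 120_000.0]])
Y = np.array([[310_000.0, 14_000.0],    # total operating income, EFTS
              [200_000.0, 12_500.0],
              [160_000.0,  9_000.0]])
print([round(ccr_efficiency(X, Y, o), 2) for o in range(len(X))])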
CASE 4: AN EVALUATION OF THE RELATIVE EFFICIENCY OF POLYTECHNICS AND
UNIVERSITIES
In the first three cases analysis is based only on the university sub-sector. However, in order
to obtain a meaningful result of relative efficiency, we add all the polytechnics' performance
measures to the universities' performance measures. We still use the same two performance
measures:
• Total operating income/Total assets (OI/TA)
• EFTS/Total expenses (EFTS/TE)
Graphing the result (Figure 4), it is found that the efficiency score of The Open Polytechnic is
significantly better than that of other TEIs in terms of EFTS/total expenditure and operating
income/total assets. This is reasonable given that The Open Polytechnic provides distance
learning, so does not have to provide and maintain classrooms, lecture halls etc. Therefore,
the performance indicator of The Open Polytechnic should be viewed as an ‘outlier’ and
should be excluded when constructing a relative efficiency frontier.
[Figure 4: Relative efficiency of universities and polytechnics in New Zealand (1999). x-axis: EFTS/Total Expenses; y-axis: Total Income/Total Assets. The Open Polytechnic appears as an outlier, and separate efficient frontiers are marked for the polytechnics and the universities.]
A further distinction can be made between universities and polytechnics. The performance
scores of the seven universities are separated, and form a lower efficient frontier than the
polytechnics. All of the universities' EFTS/total expenditure measures are lower than those of
the polytechnics.
The 1999 Tertiary Ownership Monitoring Unit (TOMU) reported that the average total
expenditure per EFTS for each sector is significantly different. Total expenditure per EFTS
is:
• $13,701 for the University Sector,
• $9,152 for the Polytechnic Sector,
• $11,557 for the College of Education Sector, and
• $8,285 for the Wananga Sector.
This means that the different sectors are not comparable in terms of expenditure per EFTS
because of the variation in the nature of programs offered by each. In order to obtain
meaningful results when using DEA, it is important to be able to assume that all DMUs are
relatively homogeneous in the range of products and services they offer and in the range of
inputs they utilise (total assets, costs, total number of services, etc.). It is clear from the graph that two efficient
frontiers can be drawn, one for universities and another for polytechnics (excluding The Open
Polytechnic) in New Zealand. It is also possible from the graph to measure the relative
efficiency of each polytechnic and university by the radial distance from the origin.
CASE 5: RELATIVE EFFICIENCY OF AN INSTITUTION OVER SEVERAL TIME
PERIODS
DEA can also be used to assess how the relative efficiency of a single institution has changed
over several time periods. We have selected an arbitrary institution (XYZ) for this purpose. It
was found that XYZ was 100% efficient (in relation to other polytechnics) in 1999 in terms of
generating total income from its total assets and managing total expenditure to serve its total
EFTS. Now let us use the available data to track the relative efficiency of XYZ in 1995, 1996
and 1997.
In Figure 5 the radial distance from the origin indicates that XYZ’s efficiency score is about
85% for 1995 based on the two dimensions selected. Similarly, according to Figure 6, the
efficiency score of XYZ is around 90% for 1996, and about 95% for 1997 (see Figure 7).
[Figure 5: Relative Efficiency of XYZ in 1995. x-axis: EFTS/Total expenses; y-axis: Total income/Total assets.]
[Figure 6: Relative Efficiency of XYZ in 1996. x-axis: EFTS/Total expenses; y-axis: Total income/Total assets.]
[Figure 7: Relative Efficiency of XYZ in 1997. x-axis: EFTS/Total expenses; y-axis: Total income/Total assets.]
It can be concluded that XYZ has been improving its performance year by year in terms of
generating income from its total assets and managing total expenditure to serve its total EFTS.
However, we must emphasise that multi-dimensional DEA provides a more accurate relative
efficiency score; therefore, the results from this two-dimensional approach have to be
interpreted with caution.
CONCLUSION
In this study we have briefly discussed the application of DEA as an alternative technique
and more effective measure by which to evaluate and benchmark the performance of New
Zealand TEIs. The inadequacies of current methods of performance evaluation such as
regression analysis and one dimensional accounting ratios to address the issues raised by
education stakeholders, initiated the opportunity to explore the capabilities of DEA to
evaluate the performance of New Zealand TEIs. The study also illustrates the efficiencies of
the University and Polytechnic sectors.
On the basis of the limited measures that we have selected for this study, it can be seen from
Figure 4 that the polytechnic sector is relatively more efficient than the university sector.
Further, Figures 5, 6 and 7 illustrate how DEA can be used to track the relative efficiency of a
given TEI over several time periods.
The DEA technique was applied by taking a two-dimensional approach at three different levels:
single input and single output measures; one input and two output measures; and two input and
two output measures. However, the findings should be interpreted with caution, as the
illustration of the DEA technique was based on a two-dimensional approach rather than on the
strict application of DEA using computer-generated 'frontier analysis'. Thus, if a
multi-dimensional approach were adopted using the relevant software, the results would be more
realistic and complete. The use of a frontier analysis computer
programme to evaluate the tertiary sector, based on multi-dimensional DEA is a possible
direction for future research.
It is also noted that some attempts have recently been made in New Zealand to use this method of
analysis as a performance measure. The complexity of the objective function of TEIs, and thereby
the difficulty of selecting appropriate performance measures, has been discussed in this paper.
The study highlights DEA as an alternative performance measure for
future decision makers in the tertiary sector.
APPENDIX ONE – WEIGHTED AVERAGE RATIO ANALYSIS
One recommended method of attempting to bring greater clarity to the use of simple ratio
analysis is through the use of weighted averages (Gupta, 1994). In this approach, each of the
performance ratios could be assigned a weight reflective of its relative importance based on
the preferences of policymakers and policy evaluators. The commonly used formula is

\[ \text{Efficiency} = \frac{\text{weighted sum of outputs}}{\text{weighted sum of inputs}} \]

which, introducing the usual notation, can be written as

\[ \text{Efficiency of unit } j = \frac{u_1 y_{1j} + u_2 y_{2j} + \cdots}{v_1 x_{1j} + v_2 x_{2j} + \cdots} \]

where
u_1 = the weight given to output 1
y_{1j} = the amount of output 1 from unit j
v_1 = the weight given to input 1
x_{1j} = the amount of input 1 to unit j.
This approach assumes, of course, that policymakers and policy evaluators can agree on a
deterministic value to be assigned to each of the ratios. This immediately raises the problem
of how such an agreed common set of weights can be obtained. The task of assigning weights
to ratios is one of the most difficult and controversial tasks with which policymakers and
policy evaluators are confronted.
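A small worked example may make the difficulty concrete. The weights and figures below are entirely hypothetical; the point is simply that the resulting score moves with whatever weights the policymakers and evaluators happen to agree on.

# Hypothetical weights and figures for a single unit j; changing u or v changes the score.
u = [0.6, 0.4]           # weights agreed for outputs 1 and 2
v = [0.7, 0.3]           # weights agreed for inputs 1 and 2
y = [15_000, 48_000]     # outputs of unit j, e.g. EFTS and operating income ($000s), made up
x = [520_000, 180_000]   # inputs of unit j, e.g. total assets and expenses ($000s), made up
efficiency_j = sum(ui * yi for ui, yi in zip(u, y)) / sum(vi * xi for vi, xi in zip(v, x))
print(round(efficiency_j, 3))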
APPENDIX TWO – REGRESSION ANALYSIS
Regression analysis also is frequently used to make service provider comparisons using
performance-related data (Hatry & Fisk, 1992). Regression analysis involves using data on all
providers in the analysis to compute a production function in which one or more inputs serve as
independent variables and one performance measurement variable (output, quality, or outcome)
serves as the dependent variable.
In the linear regression model, the dependent variable is assumed to be a linear function of
one or more independent variables plus an error introduced to account for all other factors:
\[ y_i = \beta_1 x_{1i} + \cdots + \beta_k x_{ki} + \varepsilon_i \qquad (i = 1, \ldots, n) \]
In the above regression equation, y_i is the dependent performance measurement variable and
x_{1i}, ..., x_{ki} are the inputs serving as independent variables. Each regression equation essentially
becomes a forecast: for a given amount of input (e.g., resources), what amount of
performance (output, quality, or outcome) should be expected? The forecast represents an
average based on the performance of all providers for a given performance category. The
“gap” between the actual performance of an individual provider and the average performance
of all providers is determined by an examination of the regression residuals (a residual is the
difference between an observed value of a variable and the value predicted by the model.
That is, residual = observed value − predicted value). Figure 1 in the main text illustrates
such a regression line for one dependent variable (y) and one independent variable (x).
The regression analysis integrates the multiple inputs (x_{1i}, ..., x_{ki}) to measure each
category's best average performance (y_i). However, regression analysis suffers from the same
major limitation as simple ratio analysis: the inability to analyse multiple performance
measurement (dependent) variables and arrive at some measure of best overall practice. It
measures only one output y_i at a time.
Regression analysis is based on an “average performance standard”. Instead of determining
best overall practice based on all the performance categories, regression analysis identifies
only the best average performance (y_i) within each individual performance category. In this
sense, regression analysis is not much of an improvement over simple ratio analysis.
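As an illustration of the residual comparison described above, the following sketch fits an ordinary least squares line with NumPy and ranks providers by their residuals. The single-input setting and the figures are hypothetical and are not data from this study.

import numpy as np

# Hypothetical input (e.g. total assets, $m) and performance (e.g. EFTS) for six providers.
x = np.array([150.0, 300.0, 420.0, 520.0, 610.0, 700.0])
y = np.array([4_000.0, 9_500.0, 11_000.0, 16_500.0, 17_000.0, 19_000.0])

# Fit y = b0 + b1 * x by ordinary least squares.
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# residual = observed value - predicted value; the largest positive residual
# marks the best-practice provider for this single performance category.
residuals = y - X @ beta
best = int(np.argmax(residuals))
print("coefficients:", beta, "best-practice provider index:", best)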
REFERENCES
Anderson, R. I., Lewis, D., & Springer, T. M. (2000). Operating efficiencies in real estate: a critical review of
the literature. Journal of Real Estate Literature, 8, pp. 3-18.
Chalos, P., & Cherian, J. (1995). An application of data envelopment analysis to public sector performance
measurement and accountability. Journal of Accounting and Public Policy, 14, pp. 143-160.
Charnes, A., Cooper, W., & Rhodes, E. (1978). Measuring the efficiency of decision making units. European
Journal of Operations Research, 2, pp.429-444.
Chatzoglou, P. D., & Soteriou, A. C. (1999). A DEA framework to assess the efficiency of the software
requirements capture and analysis process. Decision Sciences, 30, pp. 503-631.
Chen, T., & Yeh, T. (2000). A measurement of bank efficiency, ownership and productivity changes in Taiwan.
Service Industries Journal, January, pp. 95-109.
De Young, R. (1998). Management quality and X-inefficiency in national banks. Journal of Financial Services
Research. February, pp. 5-22.
Ozcan, Y. A., & Cotter, M. A. (1994). An assessment of efficiency of area agencies on aging in Virginia
through data envelopment analysis. The Gerontologist, 34, pp. 363-370.
Bendheim, C. L., Waddock, S. A., & Graves, S. B. (1998). Determining best practice in corporate-stakeholder
relations using data envelopment analysis. Business & Society, 37, pp. 305-338.
Education Forum (1998). Policy directions for tertiary education: a submission on the government green paper;
a future tertiary education policy for New Zealand: tertiary education review. Auckland: Education
Forum.
Engert, F. (1996). The reporting of school district efficiency: the adequacy of ratio measures. Public Budgeting
& Financial Management, 8(2), pp.247-271.
Gupta, D. K. (1994). Decisions by the numbers: an introduction to quantitative techniques for public policy
analysis and management. Englewood Cliffs, NJ: Prentice Hall.
Harrison, J. A. (2002). Comparative performance measure: an examination of different methods applied to
New Zealand secondary school data. A proposal for the degree of Doctor of Philosophy (PhD),
University of Auckland.
Hatry, H., & Fisk, D. (1992). Measuring productivity in the public sector. Public productivity handbook. New
York: Marcel Dekker.
Nyhan, R. C., & Martin, L. L. (1999). Comparative performance measurement: a primer on data envelopment
analysis. Public Productivity & Management Review, 22, pp.348-363.
Sammon, W. L., Kurland, M. A., & Spitalnic, R. (1984). Business competitor intelligence: methods for
collecting, organizing, and using information. New York: Wiley.
Siems, T., & Barr, R. (1998, December). Benchmarking the productive efficiency of U. S. banks. Financial
Industry Studies, p.11.
Treasure, G. (2002). Relative efficiency of the New Zealand and Australian banking and finance service
industry: an application of data envelopment analysis. Unpublished master's thesis, University of
Auckland.
TAMU (2000). Tertiary Advisory Monitoring Unit. [Online]. Available: http://www.minedu.govt.nz
[20/02/2001].
Yin, R. (1998 ). DEA: a new methodology for evaluating the performance of forest products producers. Forest
Products Journal, 48, pp. 29-34.
ABOUT THE AUTHORS
Noel Yahanpath is a Finance Lecturer at the Eastern Institute of Technology, Napier, New
Zealand, and Hong Tong Wang is a past student of EIT.