Submission to the Regulatory Impact
Statement on the Local Government
(Planning and Reporting) Regulations
Municipal Association of Victoria
March 2014
© Copyright Municipal Association of Victoria, 2014.
The Municipal Association of Victoria (MAV) is the owner of the copyright in the
publication Submission to the Regulatory Impact Statement on the Local Government
(Planning and Reporting) Regulations.
No part of this publication may be reproduced, stored or transmitted in any form or by
any means without the prior permission in writing from the Municipal Association of
Victoria.
All requests to reproduce, store or transmit material contained in the publication
should be addressed to Owen Harvey-Beavis on 9667 5584.
The MAV does not guarantee the accuracy of this document's contents if retrieved
from sources other than its official websites or directly from a MAV employee.
The MAV can provide this publication in an alternative format upon request, including
large print, Braille and audio.
Table of contents

1 Introduction
2 Performance reporting
2.1 Overview
2.2 Performance framework and data accuracy
3 Specific responses about performance measures
3.1 Planning
3.2 Service cost denominators
3.3 Maternal and child health, HACC, food safety and public pools
3.4 Road indicators
3.5 Rating issues
3.6 Financial performance indicators
4 Other reporting requirements
5 Conclusion and recommendations
1 Introduction
The Municipal Association of Victoria (MAV) welcomes the opportunity to provide
feedback on the Regulatory Impact Statement (RIS) on the Local Government
(Planning and Reporting) Regulations 2014 (‘the regulations’).
The regulations effectively comprise two components: the re-making of the 2004
regulations, which stipulate the reporting requirements for councils' annual reports
and budgets, and the new performance reporting framework. These two broad
components are considered separately, and our comments are primarily directed at
the latter.
The MAV has significant concerns about the ambitious scope of the performance
reporting framework, which we believe will cost significantly more to implement than
the RIS recognises. We also have serious reservations about the meaningfulness of
the data likely to be produced by the framework, and recommend that the department
limit the framework's scope to ensure data are high quality, consistent and
meaningful. Further, we believe that contextual information will be required to
support interpretation of the performance data.
2 Performance reporting
2.1 Overview
The MAV supports the adoption of a performance reporting framework for the local
government sector. We believe such a framework can both improve the transparency of
the sector and provide important data to enable councils to review and enhance their
services. The MAV has supported a performance reporting framework since the matter
was the subject of a performance audit by the Victorian Auditor-General's Office
(VAGO), and we were actively involved in the VAGO working group and its
performance project.
However, we have reservations about the proposal as it currently stands. The key to
meeting the objectives of the framework will be to ensure that indicators are
meaningful and data are accurate. We do not believe the proposal, as it is currently
structured, will achieve these goals.
2.2 Performance framework and data accuracy
The proposed performance framework draws heavily on the work of the Review of
Government Service Provision, housed in the Productivity Commission. This body
reviews the performance of states and territories through the Report on Government
Services (RoGS). The MAV is supportive of the broad conceptual model that has
been adopted from the RoGS framework as a means of identifying appropriate
performance measures.
This performance framework is a mature conceptual model that has specifically been
developed for assessing the performance of government entities. As such, we
believe that this model is fundamentally sound and is a good basis for developing a
set of performance indicators for local councils.
However, we have concerns about the breadth and ambition of the framework. It
comprises 71 performance measures and a ‘governance and management checklist’.
Each of these performance measures needs to be conceptually sound against the
measured service's objective and will require accurate data to populate it.
The MAV is concerned that there are serious risks in the breadth of the proposed
performance reporting framework. We do not believe the proposed performance
measures are sound in all cases, and we strongly endorse concentrating on a small
number of measures and their quality before broadening the framework progressively
over time. It should be noted that gaps remain within the RoGS performance
framework, which indicates that selecting meaningful measures is not a simple task.
Given the potential damage that could accrue to councils through misleading
performance data arising from imprecise or inadequate measures, we strongly endorse
a simplification of the framework.
It should also be noted that one of the major purported benefits of the framework,
as described in the RIS, is the improvement of services through 'competition by
comparison' across councils. This is at odds with Local Government Victoria's
expressed view that the major comparative purpose of the performance framework is
measuring the performance of individual entities over time. The MAV believes that
time series comparisons are fundamentally fairer and seeks changes to the RIS to
better reflect
the potential benefits that would be achieved by measuring performance over time
only.
The RIS makes an interesting comment that ‘the services of local councils are rarely
subject to competitive pressures, which make yardstick competition and the use of
performance indicators more important’ (p.5), as a means of arguing that the
performance framework will improve competition and performance in the sector. The
MAV submits that many council services are already provided within a competitive
context (whether aged care, childcare, immunisation, leisure centres and sporting
facilities, or others). This is likely to become problematic if costing information
is published for services subject to tendering (such as HACC) or otherwise operating
in a competitive environment (such as leisure centres or childcare). Far from
supporting improved services, the performance measures would impart a significant
competitive disadvantage to councils in their future provision of these services.
In any case, work is needed to develop explanatory notes for the public about what
the data mean and the context in which they should be read. This should be completed
by the state in consultation with the MAV, local councils and the line department
with responsibility for each particular service.
2.2.1 Data accuracy
The MAV is highly experienced in collecting and analysing council data across a
range of service areas. In particular, we have collected and reported on a significant
body of costing data to assess the reasonable cost of councils’ services as well as
comprehensive financial and asset management data. This work has indicated that
considerable resources are required to address data quality concerns and ensure
consistency between respondents. Under the proposed model, the obligation to
undertake this standardisation would rest with councils, which is likely to result
in inconsistencies between councils in their methodologies (and hence data that are
not strictly comparable). Although LGV has provided a workbook to assist councils to
complete the performance framework, it is unreasonable to expect consistency in data
without external oversight or more comprehensive direction.
The question of the quality of data collection is central to the overall cost-benefit
analysis of the regulatory change contained in the RIS. Given the RIS is
unambiguous about the benefits deriving from the framework as ‘competition by
comparison’, it is essential that the data are high quality and provide a reasonable
basis to compare various councils.
As an example, under the RoGS framework, data are collected and provided by
specialist organisations which have considerable experience and expertise in this
task, such as the Australian Bureau of Statistics and the Australian Institute of Health
and Welfare. Even with data provided by these specialist entities, the RoGS
publishes information about the data quality issues applicable to each performance
indicator so that any comparisons acknowledge the relevant limitations. The MAV
strongly recommends that the performance framework acknowledge the critical
importance of accurate data in achieving its primary objective: allowing the public
to make assessments of council performance and introducing 'competition by
comparison', however that is defined within this process.
The MAV is concerned that the proposed regulations will not include appropriate
descriptions of data quality to guide the reader in interpreting the data.
In the absence of data quality statements, it is highly likely that the fundamental
premise of the performance framework — comparisons between organisations — will
be flawed. Even time-series analysis of a single organisation will be undermined by
ambiguous data quality.
2.2.2 Data appropriateness
The MAV argues that the framework should produce measures that are appropriate
given the conceptual model. This means that the measures should link to the
service’s objectives.
The MAV has raised several specific concerns about indicators relating to the
Maternal and Child Health program, the HACC program, pool safety and food safety
monitoring directly with LGV. On each of these occasions, the MAV’s concerns have
related to the use of indicators that we believe are not appropriate against the
service’s objectives. These specific concerns are raised later in the submission.
The MAV believes, as a matter of good project governance, that ongoing work should
be undertaken to iteratively review performance indicators to achieve improvements
in the framework over time. There is a need to work more comprehensively with line
departments that have existing databases and a good understanding of the key
indicators required to report on services under their responsibility. The MAV has
been in dialogue with several departments and has identified shared concerns with
specific performance indicators adopted for the report. These concerns have been
raised directly with LGV and are further discussed in section 3.
2.2.3 Cost estimates in the RIS
The MAV has serious doubts about the accuracy of the estimated costs contained
within the RIS. In our experience, collecting cost data requires considerable oversight
and revision to ensure that data are collected on a consistent basis. The MAV
undertakes a range of costing analyses for advocacy purposes, and the data typically
require clarification and frequent revision to ensure accuracy and comparability
between responding councils. Issues such as the consistent application of staff
(management) overheads and the accounting methodologies for IT systems,
accommodation, travel and the like are frequently problematic and require careful
oversight.
The notion that councils could simply complete multiple costing analyses across a
range of service areas, while ensuring consistency in methodology, within the 23.75
hours estimated for the completion of all service performance indicators is, from
the perspective of the MAV, completely unrealistic. Either the data produced under
this scenario would be of poor quality and not fit for purpose, or the effort
required by councils would far exceed the estimates provided in the RIS, or both.
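To put this estimate in perspective, a back-of-envelope calculation (a minimal sketch in Python; the indicator counts are illustrative assumptions only, since the 23.75-hour figure is not itemised per indicator here):

    # Back-of-envelope scale check. The 23.75-hour figure is from the RIS;
    # the indicator counts below are assumptions for illustration only.
    TOTAL_HOURS = 23.75

    for assumed_count in (30, 50, 71):
        minutes_each = TOTAL_HOURS * 60 / assumed_count
        print(f"{assumed_count} indicators -> {minutes_each:.0f} minutes each")
    # 30 -> 48, 50 -> 28, 71 -> 20 minutes per costing analysis

Even on the most generous of these assumptions, well under an hour is allowed for each costing analysis, including any cross-department data gathering.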
It should be noted that few councils have systems in place to accurately compute a
unit cost. In the absence of such reporting systems, a costing analysis requires
contributions from multiple areas of council (such as payroll, facilities
management, IT and senior officers) to accurately calculate the cost of a service;
without these cost centres, the resulting unit costing will be close to meaningless.
If the State progresses with the concept of 'direct cost' rather than total service
cost, the MAV submits that this will result in data without significant meaning. The
term is used across the board without further elaboration or definition, and the
limitations of this approach are clear. For example, statutory planning may include
the costs of internal and external referrals: under a direct cost model, should
referrals to the engineering or environment department be incorporated within the
direct cost? Does direct cost mean all costs excluding corporate overheads and
governance? It will not be clear what cost data are excluded from the calculation,
and different councils are likely to include different costs depending on how the
organisation is structured (i.e. whether costs are contracted out or provided
in-house). Consequently, it is unclear what costs will be captured by a 'direct
cost'.
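The ambiguity can be made concrete with a minimal sketch (in Python; all figures and cost categories below are hypothetical): two councils with identical underlying costs and outputs report different unit costs purely because they classify overheads differently.

    # Hypothetical illustration of how the undefined 'direct cost' term
    # produces non-comparable unit costs. All figures are invented.

    def unit_cost(direct, overheads, included, units):
        """Unit cost when only some overhead categories are treated as direct."""
        total = direct + sum(overheads[k] for k in included)
        return total / units

    overheads = {"IT": 40_000, "accommodation": 25_000, "corporate": 60_000}

    # Both councils deliver 1,000 units at the same underlying cost; they
    # differ only in which overheads they count as 'direct'.
    cost_a = unit_cost(200_000, overheads, ["IT", "accommodation"], 1_000)
    cost_b = unit_cost(200_000, overheads, [], 1_000)

    print(f"Council A: ${cost_a:,.2f} per unit")  # $265.00
    print(f"Council B: ${cost_b:,.2f} per unit")  # $200.00 -- ~25% 'cheaper' on paper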
Once the data are reported and released, councils will press to understand why their
figures are substantially higher (or lower) than those of other councils. For
example, councils' accounting systems for HACC are generally poor: costs such as
training, indirect hours, travel time and travel allowances are not cleanly
attributed to HACC services. Addressing this will impose much higher costs, although
over the longer term it may lead to improvements in accounting. Resources will be
consumed in justifying and explaining differences that may be illusory and
misleading rather than real, arising from what councils include or leave out. These
costs are real and should be included within the RIS.
The RoGS methodology for data collection — which uses professional data collection
agencies to provide data — would be a more appropriate starting point for the
provision of data. This would have the benefit that the work could be completed by
parties that are experts in data collection and analysis. An initial source of data would
be administrative minimum data sets that are established under national data
management frameworks. Council services that operate in partnership with the State
Government (such as Maternal and Child Health, food safety and aged care) would be
well suited to providing these data while minimising the cost to councils.
However, should the framework be progressed as proposed, we argue strongly in
favour of significant increases in the cost estimates associated with data
collection. We simply do not believe the RIS estimates for collecting data of this
scope are reasonable or that they incorporate all likely costs for councils.
2.2.4 Implementation issues
It has always been the MAV’s view that the best approach is to pick a small number
of indicators and ensure they are operating effectively before building up a
comprehensive suite of indicators. This approach would progressively expand the
scope and scale of the performance monitoring framework while simultaneously
ensuring the appropriateness and accuracy of the performance measures.
The mechanism of implementing the indicators and measures — regulations — is
cumbersome and will present impediments to iterative and evolutionary improvement
of performance measures as LGV and the sector learn from successive reporting
cycles. The RoGS, for example, frequently adjusts indicators as the conceptual model
advances, new data become available, or limitations in current indicators are
identified. Including indicators in regulation will consequently
make further refinement of indicators more difficult, to the long-term detriment of the
framework.
For example, the governance and management checklist makes reference to
‘municipal emergency management plans’ being prepared and maintained in
accordance with the Emergency Management Act 1986. With the progressive
implementation of the Victorian Emergency Management Reform White Paper, it is
expected that this requirement will be amended, which will necessitate changes to
this checklist item.
The MAV believes that an alternative statutory instrument which would not require
amendment to regulations — such as gazettal — would be a more appropriate
mechanism to define the performance indicators.
3 Specific responses about performance measures
3.1 Planning
Councils currently report significant data to the state through the Planning Permit
Activity Reporting System (PPARS), and the MAV urges that the performance framework
not duplicate existing reporting requirements. We believe that where information is
already submitted to a department, the consolidated data should be delivered by that
line agency to LGV.
For example, the statutory planning indicators (a) and (b) are already reported as
part of the annual PPARS collection, and councils should not have to re-report them
to another agency within the same department.
Specific issues relating to cost data are apparent when considering the cost of
statutory planning activities captured in indicator (c). The interpretation of these
figures could be undermined by the application of a denominator that makes no
allowance for variation in the complexity of applications. In previous analysis of
the costs of planning applications, the MAV concluded that the most significant cost
factors are the application's quality and complexity. For example, a metropolitan
council that receives more complex applications will naturally have higher
per-application costs than a small rural council. The cost associated with an
application will depend on its type, including whether it is for a single dwelling
or multiple lots, involves different uses, the existence of easements or
restrictions, or is an extension of time or an amendment to an existing permit. The
unit costs for a council dealing predominantly with standard applications will
differ significantly from those of another with a series of complex applications,
despite the application numbers being the same.
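A hypothetical illustration (all figures and application types invented) of how an undifferentiated per-application denominator conceals the complexity mix:

    # Hypothetical illustration: identical application counts, different
    # complexity mixes, materially different per-application costs.
    COST_BY_TYPE = {"single_dwelling": 800, "multi_lot": 3_500, "complex_use": 7_000}

    metro = {"single_dwelling": 200, "multi_lot": 200, "complex_use": 100}
    rural = {"single_dwelling": 450, "multi_lot": 40, "complex_use": 10}

    for name, mix in (("metro", metro), ("rural", rural)):
        total_cost = sum(COST_BY_TYPE[t] * n for t, n in mix.items())
        total_apps = sum(mix.values())
        print(f"{name}: {total_apps} applications at "
              f"${total_cost / total_apps:,.0f} each")
    # metro: 500 applications at $3,120 each
    # rural: 500 applications at $1,140 each

On these invented figures, the two councils process the same number of applications yet report unit costs nearly three times apart, with no inefficiency involved.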
Indicator (d) will be used as a mechanism to judge the quality of decisions being
made by councils: if the majority of decisions that are appealed are upheld by VCAT,
this suggests that the council is making the right decisions. The measure, however,
assumes that the decisions made by VCAT are always right.
In relation to data collection, the State already collects PPARS data on indicators
(a) and (b). To reduce the reporting burden, the MAV suggests that LGV seek this
information from its peers in DTPLI rather than request it again from councils.
3.2 Service cost denominators
The MAV has serious concerns about some of the selected denominators in the
performance framework. We believe the starting point for costs should be based on
population, since this is more likely to drive costs than a denominator based on an
element of the service provision. Where there is a direct relationship between cost
and output, the MAV is generally supportive of using the number of units delivered for
the service as a denominator. This requires the service to be relatively homogenous
to deliver meaningful data.
For example, it is unclear why the number of councillors is an important variable in
the cost of governance. This indicator would be more appropriate if defined as
'governance cost per head of population' (alternatively, governance cost as a
proportion of total costs may also provide useful data), since the overall size of
the organisation and its community is more likely to drive this cost. While the
number of councillors will bear some relationship to the size of the population, it
is difficult to understand what costs (with the exception of councillor allowances)
will be directly related to councillor numbers. Without having seen data produced by
this measure, the MAV expects that average governance costs will be skewed
substantially where similar sized councils have different councillor numbers. A
starting point for this indicator would be a population-based denominator, since
population is likely to drive governance complexity and cost.
We have similar concerns about economic development service costs. It is not clear
(a) what constitutes economic development services (and 'business development') and
(b) why service cost should be driven by the number of businesses within the
municipality. Many services could be considered as broadly constituting economic
development, such as stakeholder engagement, planning system assistance, other
regulatory assistance (such as food safety), marketing and communications, and
rating relief. Given the diversity of services likely to be captured by this service
area, costs will surely depend on the scope of delivered services, the size of
businesses within the municipality and the nature of those businesses (e.g. retail
versus primary production versus professional services), not just the number of
businesses. Again, a cost averaged across the population will be more meaningful,
since the nature of economic activity is likely to depend on the size of the
population within the municipality.
Public library costs are reported on the basis of the number of visitors. Given the
scope of services provided by a library, the number of visits may not be the most
appropriate denominator. Costs will depend significantly on the nature of the visit
and the services consumed, whether borrowing a book, using computers or accessing
the internet. Libraries are not just in the business of books; visits can be
physical or virtual, and can be made to static, mobile or virtual libraries.
3.3 Maternal and child health, HACC, food safety and public pools
The MAV has been in direct contact with LGV and the relevant departments to raise
a number of direct concerns with indicators for the above services. These concerns
have related to the following matters:
• The existing indicators can be replaced by more appropriate and meaningful
measures using data already collected through state administrative mechanisms.
• The relevant departments have undertaken considerable work on performance
reporting for these services, and there are inconsistencies between the proposed
framework and currently reported performance measures.
• MCH costing data is subject to the memorandum of understanding (MOU) between the
Victorian Government and the MAV, under which the Association establishes a
reasonable cost of services through a comprehensive costing analysis of the sector.
Reporting only the direct costs of the MCH service is likely to result in
inconsistencies with the unit price negotiated through the MOU. Consistency with the
existing pricing analysis is critically important.
• HACC is currently being transitioned from a state-managed responsibility (under a
tripartite agreement) to the Commonwealth. The experience of other states, and of
aged care provided by the Department of Veterans' Affairs, indicates a likelihood
that the service will be subjected to a competitive tender process in the future.
Reporting on costs for each service provided under HACC is likely to compromise the
sector's future provision of this program by publicly releasing data on councils'
costs. Additionally, the HACC measures do not report on delivered meals, which is a
core service provided by local government.
• Issues exist in reporting on public pool inspections, since not all inspections
are carried out by an environmental health officer and a distinction needs to be
made between council-operated pools and externally managed or owned pools.
Furthermore, some rationale should be provided for the exclusion of income data from
the analysis of cost structures (i.e. the framework reports on net costs rather than
total costs).
• Food safety cost issues are problematic from the perspective of identifying an
accurate cost of the service and of reporting the 'direct' rather than the complete
unit cost. Developing a comparable costing allocation for one aspect of an
environmental health unit's activities will take time and effort to achieve, and
could end up contributing to higher operating costs, which could in turn lead to
increased registration fees for food businesses. Two research studies in the last
ten years have demonstrated the difficulty of obtaining comparable data on food
safety regulatory costs: one undertaken by the MAV in 2002, the other commissioned
as part of the DH/MAV Food Safety Coordination Project's consideration of costs in
2008. Both were completed at significant cost.
• The MAV is concerned that the close connection between costs and the fees charged
by councils for registration under the Food Act means that accounting only for
direct costs may significantly understate the costs that should legitimately be
covered by registration fees to food businesses. This is despite VCEC being clear in
its report 'Simplifying the Menu: Food Regulation in Victoria' that there are
legitimate regulatory costs that should be paid by businesses selling food rather
than borne through the general rate base, for example capital items bought
specifically for food safety regulatory activities (such as IT systems, and cars
used by officers to travel to inspections). Legal and insurance costs may also not
be covered by a definition of 'direct cost' if the service comes from the corporate
area of council. While acknowledging that service costs are an appropriate measure
to aspire to, the MAV suggests that the service cost indicator for food safety be
deferred until more robust costing work is undertaken.
These services are operated under state legislation or in partnership with another
level of government. As a consequence, minimum data sets are already being
collected. In addition, the services have existing governance structures
that provide ample opportunities to assess and produce alternative performance
indicators.
Given the sensitivity of data in these areas, the MAV welcomes the negotiation that
has already occurred with the state about revising a number of the indicators. In
addition, we seek an ongoing commitment to working closely with councils, the line
departments and the MAV to revise the performance framework over time.
3.4 Road indicators
The road indicators only report on sealed roads despite unsealed roads making up a
considerable proportion of the road network in some councils. Furthermore, terms in
the service indicators, such as 'sealed local road reconstruction' and 'resealing',
are used without definition. Further information is required on whether these relate
to road renewal or maintenance, and it is essential that the two are reported on
separately.
3.5 Rating issues
Rating issues arise within a number of indicators. The following matters are germane
to several indicators or governance checklist items. Firstly, the term 'rating
strategy' is used when it is not clear whether the indicator refers only to the
rating system in place. A definition of 'rating strategy' is required to provide
guidance to councils.
Secondly, average residential rates per residential property is a problematic
indicator of 'revenue level'. It is unclear why this is an accurate or useful
efficiency measure given the exclusion of farm, commercial and industrial
properties, which can be significant. An alternative approach is to assess the
affordability of rates by assessing rates per income quartile range for vacant,
residential, farm and non-farm business properties.
Similarly, rates effort (rates as a proportion of capital improved value) should be
based on a measure of income rather than aggregate CIV. The income of the
community is a much better measure of its capacity to pay.
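A hypothetical sketch of the point (all figures invented): two municipalities with identical rates and property values but different community incomes show the same rates effort against CIV, while an income denominator reveals the difference in capacity to pay.

    # Hypothetical: identical rates and property values, different incomes.
    municipalities = [
        # (name, rates revenue $, aggregate CIV $, aggregate household income $)
        ("Area A (higher income)", 50_000_000, 12_000_000_000, 4_000_000_000),
        ("Area B (lower income)", 50_000_000, 12_000_000_000, 1_500_000_000),
    ]

    for name, rates, civ, income in municipalities:
        print(f"{name}: rates/CIV = {rates / civ:.2%}, "
              f"rates/income = {rates / income:.2%}")
    # Both show 0.42% of CIV, but 1.25% vs 3.33% of income.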
3.6 Financial performance indicators
The MAV raises the following matters in relation to this suite of indicators:
• Differentiating between the terms 'expenditure' and 'expenses' might be
appropriate in the sustainable capacity indicators, as expenses may include asset
write-offs that are not true costs for councils. This measure should exclude such
one-off accounting expenses to obtain a more accurate identification of council
expenditure.
• Average expenditure per property assessment should be changed to average
expenditure per head of population, as council service costs are more likely to be
driven by the size of the population than by the number of assessments. The current
measure will be particularly problematic in growth areas with higher average
household sizes (where a cost per assessment will overestimate the average costs of
council) and in areas with significant agricultural activity, where a single farm
enterprise will frequently comprise multiple assessments.
• The MAV requires further clarification on why the level of unexpended specific
purpose grants reflects on efficiency. Further, 'expended specific purpose grants'
should be clearly defined: the measure as currently defined could be volatile, since
it does not necessarily compare the expected or budgeted use of specific purpose
grants with actual expenditure. Volatility will be caused by timing issues related
to the receipt of grants for major projects that span multiple years.
• Infrastructure per capita should be changed to value of infrastructure per head of
municipal population to ensure consistency of phrasing with other indicators in the
'sustainable capacity indicators' section.
• Sustainable capacity measure (4)(d), relating to the SEIFA index of relative
socio-economic disadvantage, may be strengthened by also reporting on relative
socio-economic advantage within the municipality to provide more comprehensive
contextual information about the community's attributes, since areas with high
relative socio-economic disadvantage may also contain high levels of advantage.
4 Other reporting requirements
A majority of the MAV’s commentary is on the performance reporting aspects of the
proposed regulations. However, provision 14(2)(e)(ii) will result in the CEO’s
administrative staff to be named in the Annual Report. It is suggested the word
‘officers’ should be replaced with the words ‘Senior Officers’, and the definition
updated to include a definition of ‘Senior Officer’ (mirroring the definition in the Act) to
remove this seemingly unintended requirement.
5 Conclusion and recommendations
As previously articulated, the MAV is supportive of a performance framework for local
government.
However, we have concerns about the proposed model. We believe that it is overly
ambitious and that the data produced will be limited in accuracy and
appropriateness. This is likely to cause significant issues for councils as public
data with substantial limitations is used to measure the efficiency and
effectiveness of the sector.
We believe that the RIS grossly underestimates the costs to councils of collecting
and reporting the data. In our experience as an organisation that collects and
analyses council data, the cost of reporting even a single unit cost is far greater
than the aggregate level of costs reported in the RIS.
The MAV believes that the following steps should be taken to ensure that the
performance framework delivers on its objectives of fairly and accurately reporting on
performance information of councils:
1. Reduce the number of performance measures significantly to ensure that
effort is concentrated on a small number of indicators. Progressive expansion
of the scale of the framework can occur over a number of years as the sector
and LGV build their expertise in reporting on performance.
2. Shift the legislative instrument describing the detailed indicators from
regulation to an alternative statutory instrument that is simpler to amend as
further iterations of performance measurements are developed.
3. Establish governance processes for those indicators reported on that include
councils, the line department and the MAV to assess and refine indicators.
The mechanism must also incorporate data expertise to ensure that any
indicators are appropriate and accurate.