ADRI PowerPoint presentation

ADRI
A model for critical analysis
A training resource to assist with CDU’s preparation for audit
Prepared by A/Prof Martin Carroll
PVC Learning, Teaching & Community Engagement
September 2010
Overarching Principle
CDU will be proactive and transparent in
its approach to audit. This is an
opportunity to support our ongoing quality
improvement efforts.
What is a Quality Audit?
A public and systematic determination of whether:
• CDU’s goals and objectives are based on
appropriate regulations, standards (note Appendix
E) and benchmarks;
• CDU’s planned arrangements are suitable to
achieve those goals (i.e. the overall approach);
• CDU’s actual practice conforms to the planned
arrangements (i.e. the deployment);
• The arrangements achieve the desired results;
• CDU is learning from a self-evaluation of its
approach, deployment and results, and can
demonstrate improvements.
Topic: Graduate Destinations
[Diagram: the ADRI Cycle of Quality Assurance applied to the topic of Graduate Destinations, showing a 'double loop' in which an internal reviewing process and an external quality audit both examine the university's approach, deployment, results and improvement.]
Approach – evidence examples:
• Uni graduate destination targets
• Industry needs analysis
• Careers Service Plans
• Curriculum demonstrating links to industry needs
Deployment – evidence examples:
• Interviews with academic staff, careers service staff, recent graduates and employers
Results – evidence examples:
• Graduate destination statistics
• Evidence of employer satisfaction (e.g. survey results)
Improvement – evidence examples:
• Internal program review reports
• Input obtained from industry & professional bodies
• Resulting action plans
• Reports on progress against action plans
Internal Review of the System – results in a Portfolio section with Strengths & OFI.
External Review of the System (Audit) – results in an Audit Report with Commendations, Affirmations & Recommendations.
ADRI is a model of quality assurance used by agencies in many countries around the world (e.g. AUQA). It can be applied to any topic – in this example, the analysis of a university's graduate destinations.
Key Point
Each aspect of ADRI involves
consideration of different types of
evidence. Only by considering the totality of
that evidence can the big picture be analysed.
Approach
What CDU proposes to achieve
• Statements of intent (goals, objective etc.) will be
found in multiple sources:
– applicable Legislation
– Strategic Plan 2010-2014 and internal plans
– Bylaws, policies, procedures, guidelines
– Cycle 1 Audit Report
– MCEETYA National Protocols
– An unpacking of the Themes
– External standards
Approach
What CDU proposes to achieve
• Internal and external statements of intent
must align.
• The processes for setting statements of intent
require evidence of appropriate
benchmarking and stakeholder involvement
(this is AUQA’s proxy for “Fitness of Purpose”
auditing).
Approach
How CDU proposes to achieve its purpose
• Operational Plans – detailing what should be
done by when, by whom, to what standard and
with what resources.
• Manuals – detailing how processes should be
implemented.
• Professional development and training aligned
to the University’s operational needs.
• Alignment of resource allocation to plans.
Approach
Broad Questions
• What external requirements apply to CDU? How
are they incorporated into the Strategic Plan?
• Does CDU have a set of goals, objectives, strategies
and targets that are clearly understood by the
governing bodies and staff?
• Was there appropriate consultation,
benchmarking and analysis in developing the
Strategic Plan?
• What standards and external benchmarks have
been applied?
Approach
Broad Questions (cont.)
• Does the planning process incorporate
appropriate risk management?
• Does everyone know what they are supposed
to be doing, how and why?
• Are goals well supported with plans, manuals
and training?
• Are there clear means for monitoring progress
against the goals?
Approach
Cautions
• An assessment of Approach, on its own, does
not tell the whole story – only what is intended.
• Need to look at a wide range of sources to get
the full Approach.
• Alignment of various goals and strategies ought
to be by design, not accident.
• Equally, alignment of plans with resource
allocation models ought to be by design.
Deployment
Deployment Dimensions
• Also known as ‘implementation’ or ‘process’.
• Looks at how CDU is implementing its approach.
• In other words, do the plans and bylaws happen
in reality?
• This is best tested through interviews; tap into
people’s ‘lived experiences’.
• Also includes consideration of input factors such
as the quality of resources.
Deployment
Broad Questions
• What do people do?
• How do they know if they are doing a good job?
• Do all staff have the necessary authority and
resources to deliver what is expected of them?
• Do they have the necessary skills and
knowledge?
• Is the organisational structure a help or a
hindrance to deployment?
Deployment
Broad Questions (cont.)
• Are there appropriate indicators for monitoring
the effectiveness and efficiency of processes?
How are these reported and used?
• Are there appropriate means for intervening if
necessary? How well do they work?
• Where the approach is deliberately not being
followed, why not? How are changes to the
planned processes managed?
• Are people allowed to contribute ideas?
Deployment
Cautions
• It is insufficient to only focus on deployment.
It must relate to an approach and lead to
results.
• It is essential to ‘triangulate’ anecdotal
evidence about deployment (e.g. from
interviews) with other sources of information,
until the issue is ‘saturated’.
TRIANGULATE!
Corroborate information by obtaining original evidence
from more than one type of source
[Diagram: Information Sources A, B and C each providing evidence about the same topic.]
SATURATE!
Keep collecting evidence until nothing new
of significance emerges
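To make these two ideas concrete, here is a minimal Python sketch using entirely hypothetical source names and findings; it illustrates the logic of triangulation and saturation, not any prescribed audit tool.

# Illustrative sketch only: toy model of triangulation and saturation.
# Source names and findings below are hypothetical.
sources = {
    "interviews":     {"plans not widely known", "training valued"},
    "survey results": {"training valued", "targets unclear"},
    "review reports": {"targets unclear", "plans not widely known"},
}

seen = set()
support = {}  # finding -> number of source types reporting it
for name, evidence in sources.items():
    for finding in evidence:
        support[finding] = support.get(finding, 0) + 1
    new = evidence - seen
    seen |= evidence
    if not new:  # a whole source added nothing new of significance: saturated
        print(f"Saturated after consulting '{name}'.")
        break

# Triangulated findings are corroborated by more than one type of source.
print("Triangulated:", [f for f, n in support.items() if n >= 2])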
Results
Results Dimensions
• Quality cannot be assured by only focusing on
the goals, plans, inputs and processes.
• There must be an emphasis on what is
actually achieved – the results!
• Every intent must have at least one reported
result.
• Every result should link back to a statement of
intent (a minimal cross-check is sketched below).
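A minimal Python sketch of that two-way check, using hypothetical intent and result identifiers; it shows the kind of cross-referencing implied, not any actual CDU system.

# Illustrative sketch only: hypothetical identifiers for statements of intent
# and reported results, checked in both directions.
intents = {"G1.1", "G1.2", "G2.1"}                          # statements of intent
results = {"R-07": "G1.1", "R-08": "G1.1", "R-09": "G2.1"}  # result -> intent it reports on

intents_without_results = intents - set(results.values())
orphan_results = {r for r, i in results.items() if i not in intents}

print("Intents with no reported result:", intents_without_results)  # {'G1.2'}
print("Results not linked to any intent:", orphan_results)          # set()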
Results
Results Dimensions
• It is essential that a causal relationship can be
shown between the approach, the
deployment and the eventual result.
Otherwise the result may be just chance.
• If you know that A+B+C = 19 you still do not
know what B is (i.e. whether each step in the
process is adding value or not).
For example: A + B + C = 7 + (−6) + 18 = 19.
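A minimal Python sketch of this point, with hypothetical numbers: two different decompositions produce the same total of 19, so the total alone cannot show whether each step added value.

# Illustrative sketch only: the aggregate result hides each component's contribution.
decompositions = [
    {"A": 7, "B": -6, "C": 18},  # B actually subtracted value
    {"A": 5, "B": 6, "C": 8},    # every step contributed positively
]
for d in decompositions:
    print(d, "-> total =", sum(d.values()))  # both totals are 19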
Results
Results Dimensions
• Results will be both quantitative and qualitative.
• In order to be meaningfully interpreted, results
ought to be expressed as trends over 3-5 years,
with targets and benchmarks.
• Results for many goals will be aggregated from
the results of its component objectives.
• Examples of good practices in reporting results
follow...
PERFORMANCE MEASURES
Performance Measure – with result, trend, targets, benchmark averages & best practices
[Chart: Annual Result for Goal #1, 2006-2011, plotting Actual Results against Targeted Results, the Peer Average and the Upper and Lower Quartiles. Ensure that the measurement scale is fully understood!]
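As a minimal sketch of the kind of reporting described above, the Python below tabulates one measure as a multi-year trend against targets and a peer-average benchmark; all figures are hypothetical.

# Illustrative sketch only: hypothetical figures for one performance measure,
# reported as a trend against targets and a peer-average benchmark.
years    = [2006, 2007, 2008, 2009, 2010, 2011]
actual   = [61, 63, 66, 64, 68, 71]   # hypothetical annual results for Goal #1
target   = [60, 62, 64, 66, 68, 70]   # hypothetical targets
peer_avg = [62, 62, 63, 64, 65, 66]   # hypothetical peer average

print("Year  Actual  Target  Peer avg  Met target?")
for y, a, t, p in zip(years, actual, target, peer_avg):
    print(f"{y}  {a:6}  {t:6}  {p:8}  {'yes' if a >= t else 'no'}")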
PERFORMANCE MEASURES
One qualitative + quantitative approach
Annual Result for Goal #1 – a four-level rubric for each ADRI dimension, read from least to most developed:
• A: Low awareness of issue. Ad hoc plans. Random training. → Commitment attained. Planning framework. Training available. → Systematic approach. Full set of plans. Training linked to plans. → Leading-edge vision. Plans aligned & integrated. Training comprehensive.
• D: Ad hoc practices. Not linked to plans. Not monitored. → Numerous good practices. Practice aligns with plans. Processes are analysed. → Good practice systemic. Plans inform practice. Monitoring in place. → Good practices promoted. Practice informs plans. Processes benchmarked.
• R: Results not linked to plans. Results not measured. Where measured, variable. → Most plans have results. Most results reported. Most targets achieved. → All plans have results. All results reported. Targets achieved. → Stretch targets established. Results are analysed. Targets exceeded.
• I: Staff input limited. Review processes limited. Improvements are random. → Staff input allowed. Review framework. OFI and GP identified. → Staff input encouraged. Reviews systematic. Benchmarking undertaken. → Staff directly empowered. Self-reviews effective. Benchmark host.
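One possible way to turn a qualitative rubric like the one above into a quantitative annual result is to score each ADRI dimension by the column reached (1 for the left-most, 4 for the right-most) and aggregate; the levels in this Python sketch are hypothetical.

# Illustrative sketch only: hypothetical maturity levels (1-4) assigned per ADRI
# dimension from the rubric, aggregated into a single annual result.
levels = {"Approach": 3, "Deployment": 2, "Results": 2, "Improvement": 3}

for dimension, level in levels.items():
    print(f"{dimension}: level {level} of 4")
print(f"Annual result for the goal: {sum(levels.values()) / len(levels):.1f} of 4")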
Results
Broad Questions
• For each statement of intent, what are the
results?
• Does CDU know exactly how and why those
results were achieved?
• Have these results been appropriately
contextualised (e.g. using targets, trends and
benchmarks)?
• What meaning/interpretation does CDU
derive from the results?
Results
Cautions
• It is insufficient to only consider results.
• Results only make sense in the context of the
approach and deployment.
• The manner in which a result is presented can
influence how it is interpreted, so adopt an
attitude of ‘healthy skepticism’. That is what
the Auditors are trained to do.
Improvement
Improvement Dimensions
• This dimension looks at what CDU knows
about itself in order to get better and better.
• Intentions should be regularly recalibrated.
• Processes should get more efficient and more
effective over time.
• Results should indicate increasing success.
• This requires a comprehensive system of
review – not just consideration of results.
Improvement
Broad Questions
• What data about CDU performance is routinely
collected and reported? How is the validity of
the data ensured? What happens to the data?
• How is the Strategic Plan (and other plans)
reviewed and revised?
• What review processes are in place for CDU’s
major activities? How does CDU know that the
review processes are effective?
Improvement
Broad Questions (cont.)
• Is the process of self review, learning and
improvement endemic throughout the
organisation?
• Are all staff empowered and encouraged to
contribute to ongoing improvement efforts?
• What has changed/improved as a result of the
review processes?
• What does CDU intend to change/ improve as a
result of the review processes?
Improvement
Cautions
• If not driven by sound values, this aspect of
ADRI can be very threatening and
disempowering for staff (who may feel that
their jobs are in jeopardy) and students (who
may feel that their grades are in jeopardy).
• Findings from surveys and reviews are often
not used effectively.
• ‘Wet Paint’ syndrome. This is fine – if it leads
to improvements.
Key Point
It is ok to use the audit to catalyse
improvements. In fact, it is desirable as
evidence of quality improvement in action.
But new improvements must be reported
as such, and set in an ongoing context.
ADRI Tips
• This is the overarching method of self review
used for AUQA audits – by auditees and by
auditors.
• For any given topic, at any point in time, CDU
will be strong in some aspects of ADRI and less
strong in others. This is normal. A quality audit
looks for evidence of the quality improvement
cycle in action.
• ADRI can be applied at macro and micro levels,
but for audit purposes it is best applied at the
subsection (specific topic) level.
Analysis vs. Problem Solving
• ADRI is a method for analysing a total QA
system. It can be used internally or externally. It
can identify strengths as well as opportunities
for improvement (OFI).
• ADRI is not a planning framework, although it
provides useful information for planning.
• ADRI does not identify the best solution to an
OFI. To do so would require problem-solving
methods (such as benchmarking).
• Therefore, AUQA recommendations arising from
ADRI ought to focus on what needs to be
improved, not how it needs to be improved.
[Diagram: the ADRI cycle in summary]
• Approach – Mission. Planning, designing processes and training staff.
• Deployment – Implementing plans and monitoring processes.
• Results – Analysing measures of outcomes against goals.
• Improvement – Reviewing & improving Approach, Deployment & Results.
For further information:
www.cdu.edu.au/audit2011
Martin Carroll, PVC Learning, Teaching & Community Engagement
Tel: 8946 6333 Fax: 8946 6199
Email: martin.carroll@cdu.edu.au
Sinead Vincent, Project Support Officer
Tel: 8946 7206 Fax: 8946 6928
Email: audit2010@cdu.edu.au