KSU – QMS (Quality Management System)

Quality and Accreditation (1/3)
Certification of Kingdom Tower for "Fit for Purpose"
Series of tests of "fitness" of sub-systems:
• Foundation Sub-Systems
  – Piling Sub-Systems
  – Materials Sub-Systems, etc.
• Electrical Sub-Systems
  – Wiring Sub-Systems
  – Voltage and Surge Sub-Systems, etc.
• Water and Waste Sub-Systems
  – Pipes Sub-Systems
  – Water Flow Sub-Systems, etc.
• Other support Sub-Systems
Quality and Accreditation (2/3)
Certification of KSU for "Fit for Purpose"
Series of tests of "fitness" of sub-systems:
• Facilities Foundations Sub-Systems
  – Buildings Sub-Systems
  – Education Facilities Sub-Systems, etc.
• Academic Foundation Sub-Systems
  – Colleges and Programs Sub-Systems
  – Faculty Sub-Systems, etc.
• Learning Resources Sub-Systems
  – Library Sub-Systems
  – ICT Sub-Systems, etc.
• Other Teaching – Learning – Research support Sub-Systems
Quality and Accreditation (3/3)
• Building or creating strong and sustainable foundations of the system and sub-systems
• Auditing and assessment of the quality of the foundations of the system and sub-systems, affecting excellence and ensuring they are "fit for purpose"
• Accreditation, or certifying that the system and sub-systems are "fit for purpose"

ACCREDITATION is built on QUALITY
KSU – QMS (Quality Management System) 1/2
The "chicken and egg" issue: which comes first, "Accreditation or Quality"?
KSU is addressing the quality issue through its KSU – QMS (Quality Management System), in which IQA (internal quality assurance) = EQA (external quality assurance) in terms of "Standards and Criteria".
KSU – QMS (Quality Management System) 2/2
SIMPLE and SOPHISTICATED:
• Uses the NCAAA Standards and Criteria as the blueprint
• Same Standards and Criteria applicable to the institution, colleges and programs
• Developmental philosophy of planning and improvement
• Quality Assessment methodology
• Strategic Performance Management
STRONG and SUSTAINABLE:
• Management through measurement
• Systemic and systematic
• Performance based
• Non-prescriptive
• Process and results oriented
• Aimed at improvements and innovations
• Long-term orientation
• Holistic and integrative
Characteristics of KSU – QMS (1/2)
• Comprised of a set of sub-systems interacting together to achieve a specified set of goals
• An identified set of IPOO (Input – Process – Output – Outcome) steps leading to the achievement of its outputs and outcomes
• Summation of the total > summation of the individual parts (the whole is greater than the sum of its parts)
• Applicable to the institution and to colleges, programs or administrative units
Characteristics of KSU – QMS (2/2)
• Aligned top to bottom, horizontally and vertically
• Non-prescriptive – it does not tell you what systems, tools, techniques or mechanisms to use to achieve your goals and objectives
• Measurements bring about better management
• PMS, IMS and QMS: KSU – QMS is part of the SPMS (Strategic Performance Management System)
Key Similarities and Differences between NCAAA and KSU – QMS Systems (1/6)
Standards, Criteria and Items:

NCAAA:
• 2 sets of Standards, Criteria and Items: 1 for the institution and 1 for the program
• The Standards, Criteria and Items of the institution and the program are similar, with the institution set being more comprehensive
• There are 11 Standards and 58 Criteria

KSU – QMS:
• 1 comprehensive set of Standards, Criteria and Items applicable to both the institution and the program, as the performance of the programs aggregates and summates into the institution performance
• There are 11 Standards and 58 Criteria, based on the NCAAA institution set, which are classified as the Process-Oriented Values
• The KPI and Benchmarks are classified as the Results-Oriented Values
Key Similarities and Differences between NCAAA and KSU – QMS Systems (2/6)
KPI and Benchmarks:

NCAAA:
• Has an open-approach set of KPI and Benchmarks to be defined by the institution or program
• The KPI and Benchmarks are not computed into the overall performance of the institution or program

KSU – QMS:
• Has both an open and a closed approach to KPI and Benchmarks:
  – a closed generic set defined by the institution for all programs and the institution as a whole
  – an open set to be defined by the institution and program
• The generic set of KPI and Benchmarks is applicable across the board to all programs and is aggregated and summated into the overall institution performance
• 2 sets of KPI are used: Qualitative and Quantitative KPI
• The Qualitative set uses PDCA and ADLI criteria to determine the performance level
• The Quantitative set uses the normal percentages, ratios or numeric values to determine the performance ranges
Key Similarities and Differences between NCAAA and KSU – QMS Systems (3/6)
Audit and Assessment:

NCAAA:
• The institution and program carry out a self-assessment, prepare an assessment report and are assessed by an external team appointed by NCAAA or recognized accreditation agencies
• Audit and assessment is done once every five years
• Accreditation and certification is based on the outcome of the audit and assessment

KSU – QMS:
• The institution and program carry out a self-assessment, prepare an assessment report and are assessed by an external team appointed by KSU
• Audit and assessment is done once every year
• Opportunities for improvement are based on the IQA Audit and Assessment Performance Report (IQAAPR)
Key Similarities and Differences between NCAAA and KSU – QMS Systems (4/6)
Management:

NCAAA:
• The audit and assessment report will be used as the basis of a development plan by the institution or program

KSU – QMS:
• The IQAAPR will be used as the basis of an annual operation plan for continuous improvement and innovation by the institution or program
• The annual operation plan is linked to the roll-over of the institution or program strategic plan
Key Similarities and Differences between NCAAA and KSU – QMS Systems (5/6)
Assessment Approach:

NCAAA:
• A 6-level Star System is used to determine the performance of each Standard, Criterion and Item
• A "relevant" or "not relevant" system is used to screen out items that are not applicable to the program
• The assessment is not based on comparison with past performance
• The Items and Criteria are summated and aggregated into the determination of performance for each Standard

KSU – QMS:
• A 6-level Scaled Performance Scoring System using a weighted score approach is used to determine the performance of each Standard, Criterion and Item, the Process-Oriented Values contributing 20% of the overall performance achievement score
• A 6-level Scaled Performance Scoring System using a weighted score approach is used to determine the performance of each KPI and Benchmark, the Results-Oriented Values contributing 20% of the overall performance achievement score
• The performance of each criterion also takes into account the "goals set" and "goals achieved", leading to "development" and "effectiveness" being measured, contributing the remaining 20% of the performance achievement score
Key Similarities and Differences between NCAAA and KSU – QMS Systems (6/6)
Assessment Approach (continued):

NCAAA:
• The KPI and Benchmarks are not computed into the overall performance of the institution or program
• The overall performance of the institution and program is based on the aggregation and averaging of the Stars into a 5.0-point scaled performance system

KSU – QMS:
• The Items and Criteria are summated and aggregated into the determination of performance for each Standard, which forms the Process-Oriented Values
• The KPI and Benchmarks form the Results-Oriented Values
• The overall performance of the institution or program is the summation of both the Process-Oriented Standards, Criteria and Items Values and the Results-Oriented Values
• The overall performance is based on the weighted scoring of both the Process-Oriented and Results-Oriented Values, leading to a 1000-point scale system
KSU Standard, Criteria, Item and KPI
Process-Based Values Criteria:
• KSU Standards – 11 Standards
• KSU Criteria – 80 Criteria:
  – 58 Main Criteria
  – 11 KPI Criteria (generic to all colleges and programs)
  – 11 Additional KPI Criteria (defined by colleges and programs)
Results-Based Values Criteria:
• 64 KPI (29 Quantitative types, 35 Qualitative types)
KSU Standard, Criteria and Item requirements
KSU – QMS Standards, Criteria and Items
Standard 1: Mission and Objectives

Explanations of the hierarchy:
• "Standard 1" is the STANDARD requirement
• "1.1" is the CRITERIA requirement
• "1.1.1", "1.1.2", "1.1.3" and "1.1.4" are the ITEM details requirements

1.1 Appropriateness of the Mission
1.1.1 The mission for the school and program should be consistent with the mission of the institution, and the institution's mission with the establishment charter of the institution.
1.1.2 The mission should establish directions for the development of the institution, schools or programs that are appropriate for the institution, schools or programs of its type, and be relevant to and serve the needs of students and communities in Saudi Arabia.
1.1.3 The mission should be consistent with Islamic beliefs and values and the economic and cultural requirements of the Kingdom of Saudi Arabia.
1.1.4 The mission should be explained to its stakeholders in ways that demonstrate its appropriateness.
Process – Based Assessment using ADLI
"Process" refers to the methods used and improved when addressing the standards, criteria, items and key performance indicator requirements in the KSU – QMS:
• A "APPROACH" refers to the methods used to accomplish the process, their appropriateness and effectiveness, and the degree to which the approach is repeatable and based on reliable data and information
• D "DEPLOYMENT" refers to the extent to which the approach is applied in addressing Item requirements relevant and important to the HEI, and its consistency and coherence across all appropriate work units
• L "LEARNING" refers to refining the approach through cycles of evaluation and improvement, encouraging breakthrough change through innovation, and sharing refinements and innovations across work units
• I "INTEGRATION" refers to the extent to which the approach is aligned with organizational needs, with measures, information and improvement systems being complementary across processes and work units, and with plans, processes, results, analyses, learning and actions harmonized across processes and work units to support organization-wide goals
Results – Based Assessment using LeTCI
"Results" refers to the organization's outputs and outcomes in achieving the requirements of the processes above. The four factors used to evaluate results are:
• Le "LEVEL" – the current level of performance and its performance trend over a time period
• T "TREND" – the rate (i.e., the slope of trend data) and breadth (i.e., the extent of deployment) of performance improvements
• C "COMPARISON" – the performance relative to appropriate comparisons and/or benchmarks
• I "INTEGRATION" – the linkage of the results measures (often through segmentation) to important student and stakeholder, program, offering, and service requirements, and to the Process Items
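Assessment teams often keep the ADLI and LeTCI factors as an explicit checklist while gathering evidence. The sketch below is illustrative only (Python; the class and field names are assumptions, not part of the KSU – QMS specification). It shows one way to record factor-by-factor evidence for a process item or a results KPI and to flag factors that still lack evidence.

```python
# Illustrative only: recording ADLI (process) and LeTCI (results) evidence notes
# during a self-assessment. Names and fields are assumptions, not KSU - QMS terms.
from dataclasses import dataclass, field
from typing import Dict, List

ADLI_FACTORS = ("Approach", "Deployment", "Learning", "Integration")
LETCI_FACTORS = ("Level", "Trend", "Comparison", "Integration")

@dataclass
class FactorAssessment:
    """Evidence notes and an overall percentage judgement for one item or KPI."""
    item: str                                              # e.g. "1.1.1" or KPI "1.6.1"
    factors: Dict[str, str] = field(default_factory=dict)  # factor -> evidence note
    score_pct: int = 0                                     # assessor's overall % score

    def missing_factors(self, expected) -> List[str]:
        """Factors with no recorded evidence; useful when reviewing a draft report."""
        return [f for f in expected if not self.factors.get(f)]

# Usage sketch: a process item assessed against the ADLI factors
process_item = FactorAssessment(
    item="1.1.1",
    factors={"Approach": "Documented mission review procedure",
             "Deployment": "Applied in 3 of 4 colleges"},
    score_pct=55,
)
print(process_item.missing_factors(ADLI_FACTORS))   # ['Learning', 'Integration']
```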
Overall Scaled Performance Scoring of Process-Based Values (Standard level)

Worked example – Standard 1: Mission, Goals and Objectives (Institutional, School and Program Context). The scoring worksheet lists each Item of Criterion 1.1 "Appropriateness of the Mission" (1.1.1 to 1.1.4, with the wording given above) in the 1st column, followed by columns for Weights, Score (%), Goals Set, Goals Achieved, Development, Effectiveness, Weighted Score and Overall Performance. In this example Item 1.1.1 carries a weight of 4 and a score of 55%, giving a weighted score of 2.2; the remaining Items and the Overall Assessment row carry a weight of 1, with scores of 50%, 60%, 80% and 30% (weighted scores 0.5, 0.6, 0.8 and 0.3). Development and Effectiveness are 0, and the Overall Performance works out to 1.76.
1. The weighted score for each item is derived from SCORE * WEIGHTS.
2. The overall weighted score (2.2) is an averaged summation of the weighted scores of the items and contributes 80% to the overall performance.
3. As there is no "development" and "effectiveness", 20% is lost, and the final overall performance is 1.76 (which is 0.8 * 2.2).
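The arithmetic described in the notes above can be sketched in a few lines. This is illustrative only; the averaging rule behind the "overall weighted score" is not fully spelled out here, so the sketch takes that value as given and reproduces only the stated steps (item weighted score = SCORE * WEIGHTS, the 80% weighting, and the 1.76 result).

```python
# Minimal sketch of the process-based scoring steps stated above (illustrative only).
from typing import Optional

def item_weighted_score(score_pct: float, weight: float) -> float:
    """Weighted score of one item: SCORE * WEIGHT, with the score given in percent."""
    return (score_pct / 100.0) * weight

def overall_performance(overall_weighted: float,
                        development_effectiveness: Optional[float]) -> float:
    """The overall weighted score contributes 80%; 'development' and 'effectiveness'
    contribute the remaining 20% and are lost when not assessed."""
    if development_effectiveness is None:
        return 0.8 * overall_weighted          # the 20% share is lost
    return 0.8 * overall_weighted + 0.2 * development_effectiveness

# Item 1.1.1 of the worked example: weight 4, score 55%
print(round(item_weighted_score(55, 4), 2))       # 2.2
# Standard 1 of the worked example: overall weighted score 2.2, no development/effectiveness
print(round(overall_performance(2.2, None), 2))   # 1.76 = 0.8 * 2.2
```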
Scaled Performance Scoring of Process-Based Values of each Item

Example – Institutional, School and Program Context, Standard 1: Mission, Goals and Objectives, Criterion 1.1 Appropriateness of the Mission:
• Item 1.1.1 "The mission for the school and program should be consistent with the mission of the institution, and the institution's mission with the establishment charter of the institution."
• Weight: 4, Score: 55% – Weighted Score: 2.2
• (For comparison, an item with a weight of 1 and a score of 50% would carry a weighted score of 0.5.)
Process – Based Values Criterion Scoring Guidelines
PROCESS-based Performance Scoring Guidelines

SCORE: 0% or 5% (No Star)
The practice, though relevant, is not followed at all, based on the following:
• No SYSTEMATIC APPROACH (methodical, orderly, regular and organized) to Standards requirements is evident; information lacks specific methods, measures, deployment mechanisms, and evaluation, improvement, and learning factors. (A)
• Little or no DEPLOYMENT of any SYSTEMATIC APPROACH (methodical, orderly, regular and organized) is evident. (D)
• An improvement orientation is not evident; improvement is achieved through reacting to problems. (L)
• No organizational ALIGNMENT is evident; individual standards, areas or work units operate independently. (I)

SCORE: 10%, 15%, 20% or 25% (1 Star)
The practice is followed occasionally but the quality is poor or not evaluated, based on the following:
• The beginning of a SYSTEMATIC APPROACH (methodical, orderly, regular and organized) to the BASIC REQUIREMENTS of the Standards is evident. (A)
• The APPROACH (methodical, orderly, regular and organized) is in the early stages of DEPLOYMENT in most standards or work units, inhibiting progress in achieving the basic requirements of the Standards. (D)
• Early stages of a transition from reacting to problems to a general improvement orientation are evident. (L)
• The APPROACH is ALIGNED with other standards, areas or work units largely through joint problem solving. (I)

SCORE: 30%, 35%, 40% or 45% (2 Stars)
The practice is usually followed but the quality is less than satisfactory, based on the following:
• An EFFECTIVE, SYSTEMATIC APPROACH (methodical, orderly, regular and organized), responsive to the BASIC REQUIREMENTS of the Standards, is evident. (A)
• The APPROACH is DEPLOYED, although some standards, areas or work units are in early stages of DEPLOYMENT. (D)
• The beginning of a SYSTEMATIC APPROACH (methodical, orderly, regular and organized) to evaluation and improvement of KEY PROCESSES is evident. (L)
• The APPROACH is in the early stages of ALIGNMENT with the basic Institution, College or Program or Administrative Unit needs identified in response to the Institution, College or Program or Administrative Unit Profile and other Process Standards. (I)

SCORE: 50%, 55%, 60% or 65% (3 Stars)
The practice is followed most of the time. Evidence of the effectiveness of the activity is usually obtained and indicates that satisfactory standards of performance are normally achieved, although there is some room for improvement. Plans for improvement in quality are made and progress in implementation is monitored.
• An EFFECTIVE, SYSTEMATIC APPROACH (methodical, orderly, regular and organized), responsive to the OVERALL REQUIREMENTS of the Standards, Criteria and Items, is evident. (A)
• The APPROACH is well DEPLOYED, although DEPLOYMENT may vary in some standards, areas or work units. (D)
• A fact-based, SYSTEMATIC (methodical, orderly, regular and organized) evaluation and improvement PROCESS and some organizational LEARNING are in place for improving the efficiency and EFFECTIVENESS of KEY PROCESSES. (L)
• The APPROACH is ALIGNED with the Institution, College or Program or Administrative Unit needs identified in response to the Institution, College or Program or Administrative Unit Profile and other Process Standards. (I)

SCORE: 70%, 75%, 80% or 85% (4 Stars)
The practice is followed consistently. Indicators of quality of performance are established and suggest high quality, but with still some room for improvement. Plans for this improvement have been developed and are being implemented, and progress is regularly monitored and reported on.
• An EFFECTIVE, SYSTEMATIC APPROACH (methodical, orderly, regular and organized), responsive to the MULTIPLE REQUIREMENTS of the Standards, Criteria and Items, is evident. (A)
• The APPROACH is well DEPLOYED, with no significant gaps. (D)
• Fact-based, SYSTEMATIC (methodical, orderly, regular and organized) evaluation and improvement and organizational LEARNING are KEY management tools; there is clear evidence of refinement and INNOVATION as a result of organizational-level ANALYSIS and sharing. (L)
• The APPROACH is INTEGRATED with the Institution, College or Program or Administrative Unit needs identified in response to the Institution, College or Program or Administrative Unit Profile and other Process Standards. (I)

SCORE: 90%, 95% or 100% (5 Stars)
The practice is followed consistently and at a very high standard, with direct evidence or independent assessments indicating superior quality in relation to other comparable institutions. Despite clear evidence of high standards of performance, plans for further improvement exist with realistic strategies and timelines established.
• An EFFECTIVE, SYSTEMATIC APPROACH (methodical, orderly, regular and organized), fully responsive to the MULTIPLE REQUIREMENTS of the Standards, Criteria and Items, is evident. (A)
• The APPROACH is fully DEPLOYED without significant weaknesses or gaps in any areas or work units. (D)
• Fact-based, SYSTEMATIC (methodical, orderly, regular and organized) evaluation and improvement and organizational LEARNING are KEY organization-wide tools; refinement and INNOVATION, backed by ANALYSIS and sharing, are evident throughout the organization. (L)
• The APPROACH is well INTEGRATED with the Institution, College or Program or Administrative Unit needs identified in response to the Institution, College or Program or Administrative Unit Profile and other Process Standards. (I)
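The six score bands used in these guidelines (and in the results-based guidelines further below) can be captured in a small helper. This is an illustrative sketch that assumes scores are reported in 5% steps, with the star labels taken from the process-based table above.

```python
# Illustrative mapping from a score in 5% steps to its band (0 = No Star ... 5 = 5 Stars).
def score_band(score_pct: int) -> int:
    """Band for a process- or results-based score:
    0-5% -> 0, 10-25% -> 1, 30-45% -> 2, 50-65% -> 3, 70-85% -> 4, 90-100% -> 5."""
    if not (0 <= score_pct <= 100) or score_pct % 5 != 0:
        raise ValueError("score must be a multiple of 5 between 0 and 100")
    if score_pct <= 5:
        return 0
    return min((score_pct + 10) // 20, 5)

# Spot checks against the guideline table
assert [score_band(s) for s in (5, 25, 45, 65, 85, 100)] == [0, 1, 2, 3, 4, 5]
```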
Overall Scaled Performance Scoring of Results-Based Values KPI (Key Performance Indicators)

Worked example – Standard 1: Mission, Goals and Objectives (Institutional, School and Program Context), Criterion 1.6 Key Performance Indicators or Benchmarks:

1.6.1 Level of the stated institution's, schools' or programs' philosophy or commitments; processes to formulate strategy and plans, and plans are implemented; development of KPI achievement to measure the plans, implementation and achievements in all missions. (Levels)
  – Weight 5, Score 40%, Weighted Score 2.0
1.6.2 Level of the institution's, schools' or programs' strategy map alignment achievement with the national HE strategies. (Levels)
  – Weight 5, Score 0%, Weighted Score 0.0
1.6.3 Percentage of the institution's, schools' or programs' goal achievements according to the operational indicators that are set. (%)
  – Weight 5, Score 50%, Weighted Score 2.5

The worksheet also carries the Goals Set, Goals Achieved, Development, Effectiveness and Overall Performance columns; the overall weighted score for the criterion is 4.5.
1. The weighted score for each item is derived from SCORE * WEIGHTS.
2. The overall weighted score (4.5) is an averaged summation of the weighted scores of the items and contributes 80% to the overall performance.
3. As there is both "development" and "effectiveness", representing 20%, the final overall performance is 4.5 (which is 0.8 * 4.5 + 0.2 * 4.5).
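The same arithmetic for the results-based example can be sketched as follows (illustrative only; the per-KPI weights and scores are taken from the worked example as reconstructed above).

```python
# Sketch of the results-based KPI scoring for Criterion 1.6 (illustrative only).
def weighted(score_pct: float, weight: float) -> float:
    """Weighted score of one KPI: SCORE * WEIGHT, with the score given in percent."""
    return (score_pct / 100.0) * weight

kpis = {"1.6.1": (40, 5), "1.6.2": (0, 5), "1.6.3": (50, 5)}   # score %, weight
overall_weighted = sum(weighted(s, w) for s, w in kpis.values())
print(overall_weighted)                      # 4.5 = 2.0 + 0.0 + 2.5

# With "development" and "effectiveness" both in place (the remaining 20%),
# the final overall performance stays at 4.5: 0.8 * 4.5 + 0.2 * 4.5.
print(round(0.8 * overall_weighted + 0.2 * overall_weighted, 2))   # 4.5
```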
Performance Assessment of a Qualitative KPI

Level 1
• No systematic approach in terms of the (A) Approach for the Strategic Planning Process as per the (P), (D), (C) and (A) of PDCA (Plan, Do, Check, Act) is evident; information is subjective, unreliable and vague.

Level 2
• The beginning of a systematic approach in terms of the (A) Approach for the Strategic Planning Process, with (P) according to the basic requirements of the Standard, as supported by documented evidence.
• Major gaps exist in (D) Deployment that would inhibit progress in the Strategic Planning Process in achieving the basic requirements of the (L) Levels of performance as evidenced by the documented Goals and KPI achievements.
• Early stages of a transition from reacting to problems to a general improvement orientation are evident.

Level 3
• An effective, systematic approach in terms of the (A) Approach and (D) Deployment for the Strategic Planning Process, with (P), (D) and (C) responsive to the basic requirements of the Standard in the (L) Levels of performance as evidenced by the documented Goals and KPI achievements.
• The Strategic Planning Process approach is deployed, although some areas or work units are in early stages of deployment based on its (P) and (D).
• The beginning of a systematic approach to evaluation and improvement of key processes in the Strategic Planning Process based on (L) Levels of performance or outcome indicators is evident.

Level 4
• An effective, systematic approach in terms of the (A) Approach, (D) Deployment and (L) Learning in the Strategic Planning Process, with (P), (D) and (C) according and responsive to the overall requirements of the Standard in the (L) Levels and (T) Trend performance as evidenced by documented Goals and KPI achievements.
• The approach is well deployed, although deployment and learning may vary in some areas or work units, and is aligned with basic organizational needs identified in the other (L) Levels and (T) Trends of Goals and KPI achievement.
• A fact-based, systematic evaluation and improvement process based on performance or outcome indicators is in place for improving the (L) Levels and (T) Trend of efficiency and effectiveness of key processes and outcomes or outputs.

Level 5
• An effective, systematic approach in terms of the (A) Approach, (D) Deployment, (L) Learning and (I) Integration in the Strategic Planning Process, with (P), (D), (C) and (A) responsive to the overall requirements of the Standard in the (L) Levels, (T) Trend and (C) Comparison performance as evidenced by the documented Goals and KPI achievements supporting current and changing educational services.
• The approach is well deployed, with no significant gaps, and is well integrated with the institution, school or program organizational needs as identified in the other (L) Levels, (T) Trends and (C) Comparison of Goals and KPI achievement.
• A fact-based, systematic evaluation and improvement process and organizational learning/sharing are key management tools; there is clear evidence of refinement, innovation, and improved integration as a result of organizational-level analysis and sharing based on performance or outcome indicators.

Level 6
• An effective, systematic approach in terms of the (A) Approach, (D) Deployment, (L) Learning and (I) Integration in the Strategic Planning Process, with (P), (D), (C) and (A) fully responsive to all the requirements of the (L) Levels, (T) Trend, (C) Comparison and (I) Integration performance as evidenced by the documented Goals and KPI achievements supporting current and changing educational services.
• The approach is fully deployed without significant weaknesses or gaps in any areas or work units and is fully integrated with the institution, school or program organizational needs as identified in the other (L) Levels, (T) Trends, (C) Comparison and (I) Integration of Goals and KPI achievement.
• A very strong, fact-based, systematic evaluation and improvement process and extensive organizational learning/sharing are key management processes and tools; strong refinement, innovation, and integration, backed by excellent organizational-level analysis and sharing based on performance or outcome indicators, are evident.
Results – Based Values Criterion Scoring Guidelines
RESULTS-based Performance Scoring Guidelines

SCORE: 0% or 5%
• There are no organizational PERFORMANCE RESULTS, or poor RESULTS are reported in the standards and areas.
• TREND data are either not reported or show mainly adverse TRENDS.
• Comparative information is not reported.
• RESULTS are not reported for any standards, criteria or items or areas of importance to the Institution, College or Program or Administrative Unit KEY MISSION or Institution, College or Program or Administrative Unit requirements.

SCORE: 10%, 15%, 20% or 25%
• A few organizational PERFORMANCE RESULTS are reported; there are some improvements and/or early good PERFORMANCE LEVELS in a few standards, criteria or items or areas.
• Little or no TREND data are reported, or many of the TRENDS shown are adverse.
• Little or no comparative information is reported.
• RESULTS are reported for a few standards, criteria or items or areas of importance to the Institution, College or Program or Administrative Unit KEY MISSION or Institution, College or Program or Administrative Unit requirements.

SCORE: 30%, 35%, 40% or 45%
• Improvements and/or good PERFORMANCE LEVELS are reported in many standards or areas addressed in the Standards requirements.
• Early stages of developing TRENDS are evident.
• Early stages of obtaining comparative information are evident.
• RESULTS are reported for many standards, criteria or items or areas of importance to the Institution, College or Program or Administrative Unit KEY MISSION or Institution, College or Program or Administrative Unit requirements.

SCORE: 50%, 55%, 60% or 65%
• Improvement TRENDS and/or good PERFORMANCE LEVELS are reported for most standards, criteria or items or areas addressed in the Standards requirements.
• No pattern of adverse TRENDS and no poor PERFORMANCE LEVELS are evident in standards, criteria or items or areas of importance to the Institution, College or Program or Administrative Unit KEY MISSION or Institution, College or Program or Administrative Unit requirements.
• Some TRENDS and/or current PERFORMANCE LEVELS – evaluated against relevant comparisons and/or BENCHMARKS – show standards or areas of good to very good relative PERFORMANCE.
• Institution, College or Program or Administrative Unit PERFORMANCE RESULTS address most KEY student, STAKEHOLDER, and PROCESS requirements.

SCORE: 70%, 75%, 80% or 85%
• Current PERFORMANCE LEVELS are good to excellent in most standards, criteria or items or areas of importance to the Standards requirements.
• Most improvement TRENDS and/or current PERFORMANCE LEVELS have been sustained over time.
• Many to most reported TRENDS and/or current PERFORMANCE LEVELS – evaluated against relevant comparisons and/or BENCHMARKS – show areas of leadership and very good relative PERFORMANCE.
• Institution, College or Program or Administrative Unit PERFORMANCE RESULTS address most KEY student, STAKEHOLDER, PROCESS, and ACTION PLAN requirements.

SCORE: 90%, 95% or 100%
• Current PERFORMANCE LEVELS are excellent in most standards, criteria or items or areas of importance to the Standards requirements.
• Excellent improvement TRENDS and/or consistently excellent PERFORMANCE LEVELS are reported in most standards, criteria or items or areas.
• Evidence of education sector and BENCHMARK leadership is demonstrated in many standards, criteria or items or areas.
• Institution, College or Program or Administrative Unit PERFORMANCE RESULTS fully address KEY student, STAKEHOLDER, PROCESS, and ACTION PLAN requirements.
Thank you