
Lean Six Sigma Project
Presentation Template
[PROJECT TITLE]
[Project Period from mmm-yy to mmm-yy]
[Company Name
Division / Dept. / Team]
Introduction
• About the company
• Products, Location
• Wh
DEFINE
A. Project Charter should contain
a. Project Title:
1. It should clearly state the name of the process to be
improved
b. Business Case
1. What is the “Pain” issue? (Problem symptoms)
2. How long has it been there? (Historical trends)
3. Current financial and non-financial impacts of the issue
4. Future impacts if improvement is not made now
5. How is the issue related to strategy?
DEFINE
A. Project Charter should contain (contd.)
c. Goals and Objectives
1. SMART objective for the improvement
2. Stage targets (if long-term, stage-by-stage improvement is required)
d. Expected Benefits from the project
1. Involve Finance in quantifying the expected benefits (quantify the reduction in Cost of Poor Quality)
2. For stage-by-stage improvement, what are the short-, medium- and long-term objectives
e. Team Members / Leader, Project Sponsor
f. Project Timeframe (initial)
1. Project plan in Gantt chart / network diagram form
DEFINE
B. “SIPOC” or “COPIS” Diagram
a. Correctly identify the SIPOC/COPIS elements (Supplier, Input, Process, Output, Customer)
b. Make a high-level block diagram of the process relating to the issue
C. “VOC”
a. Understand the "Voice of the Customer" based on available data (customer SLA, specifications, feedback), or
b. Generate a VOC data collection plan, including MSA
D. CTQs and related Big Ys
a. Identify the "Critical to Quality" parameters of the deliverables to the customer
b. Define the Big Y(s) of the process
c. Classify the Ys as per the Kano Model – Hygiene, Higher-the-better, and Delight factors
d. Make a CTQ Tree to cascade the Big Ys into smaller deliverables (Y1, Y2, …)
e. Prepare a CTQ Specification Table to clearly define all Ys
f. Defect Definition – define what is not acceptable
DEFINE
E. Stratification & Prioritization of Ys
a. Collect historical / current data regarding the Ys
b. Stratify the data for the smaller Ys
c. Analyse the historical / collected data, e.g. by applying Regression Analysis to identify the significant smaller Ys (e.g. Y = f(Y1, Y2, …, Yn)); see the sketch after item F below
F. Project Scope / Boundary
a. Make a Pareto Chart and identify the major Ys to focus upon
b. Define the Project Scope / Boundary using SIPOC
c. Define the boundary of the project in terms of which Ys will be included in the project and what will be done with the remaining Ys (e.g. plan another project, or consider them in the next "6σ projects wave")
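As an illustration of item E.c, here is a minimal sketch of how the smaller Ys could be screened for significance with multiple regression, assuming the stratified data sits in a pandas DataFrame. All column names and values below are hypothetical placeholders, not part of the template.

```python
# Minimal sketch: screen smaller Ys for significance via multiple regression,
# i.e. Big Y = f(Y1, Y2, Y3). Column names and data are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 60
df = pd.DataFrame({
    "y1": rng.normal(10, 2, n),   # candidate smaller Ys (synthetic data for illustration)
    "y2": rng.normal(5, 1, n),
    "y3": rng.normal(20, 4, n),
})
df["big_y"] = 2.0 * df["y1"] + 0.1 * df["y2"] + rng.normal(0, 1, n)  # synthetic Big Y

X = sm.add_constant(df[["y1", "y2", "y3"]])
fit = sm.OLS(df["big_y"], X).fit()
print(fit.pvalues)                # p-values < 0.05 flag the significant smaller Ys
```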
G. Set Baseline
a. Define the baseline performance for all Ys in the project scope, based on historical / collected data
b. Work out the process capability (Sigma Level) for the selected Ys, based on historical / collected data
c. Summarize the "as is" process performance, with comments
DEFINE
H. Set Targets
a. Obtain benchmark data for selected Y based on VOC / SLA and/or
competitive / best-in-class performance benchmarks
b. Define SMART targets for selected Ys and target process capabilities
(Sigma Levels) for them
c. Test the targets statistically to verify whether they are significantly different from
the current performance level (hypothesis test; a sketch follows this list)
d. Review and re-define “defects” and their project targets
e. Review and re-define the project timeframe and milestones with due
dates
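A minimal sketch of the hypothesis test mentioned in item c, using a one-sample t-test from SciPy; the sample values and target below are hypothetical and stand in for the project's own baseline data.

```python
# Minimal sketch: one-sample t-test comparing current performance with the proposed target.
# The sample values and the target are hypothetical.
import numpy as np
from scipy import stats

current_y = np.array([12.1, 11.8, 12.6, 12.4, 11.9, 12.2, 12.7, 12.0])  # current baseline data
target = 10.0                                                            # proposed SMART target

t_stat, p_value = stats.ttest_1samp(current_y, popmean=target)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value (e.g. < 0.05) indicates the target differs significantly from
# current performance, i.e. the project is aiming at a real improvement.
```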
I. Approval of the Charter
a. Document the Project Charter
b. Champion and Sponsor review and approve the Charter
c. The Charter is registered with the Program Management Office for follow-up.
DEFINE – Key Points
• The Define phase is all about Ys; do not bring in
Xs (causal factors) at this stage
• Don’t propose solutions in “Define” phase.
• The purpose of data analysis is only to
clearly define the project scope and set
SMART targets for Ys based on VOC (and
other considerations).
MEASURE
A. Data Collection Plan
a. It is preferable to collect current data from the process; however, in some cases historical data may be used (if its reliability is well established)
b. Use a Prioritisation Matrix or FMEA (if one exists) to identify the critical data to be collected
c. Prepare a Data Collection Plan specifying:
i. Parameter to be measured
ii. Whether discrete or continuous
iii. Which location
iv. Measurement method (procedure/code/standard)
v. Sample size and sampling method: how was it determined? (a sketch follows this list)
vi. How the data will be summarized
vii. Confirmation of calibration of measuring devices, and a calibration plan if necessary
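A minimal sketch of one common way to justify the sample size in item v, assuming the goal is to estimate a process mean within a chosen margin of error; the sigma and margin values are hypothetical.

```python
# Minimal sketch: sample size for estimating a process mean within margin of error E,
# given an estimated standard deviation sigma. All values are hypothetical.
import math
from scipy import stats

confidence = 0.95
sigma = 4.0        # estimated standard deviation of the Y to be measured
margin = 1.0       # acceptable margin of error, in the same units as the Y

z = stats.norm.ppf(1 - (1 - confidence) / 2)   # two-sided z value (about 1.96 for 95%)
n = math.ceil((z * sigma / margin) ** 2)
print(f"Required sample size: n = {n}")        # about 62 for these illustrative numbers
```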
MEASURE
B. Measurement System Analysis (MSA)
a. Verify calibration status and accuracy of measuring
devices used in the process. Calibration error should be
zero.
b. Demonstrate MSA (if applicable):
1. Gauge R&R for continuous variables measurement
2. Statistical test for attribute Ys (e.g. an attribute agreement study; a sketch follows)
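One common statistical check for attribute Ys is Cohen's kappa, used here as an agreement test between an appraiser and a reference standard; this is a minimal sketch with hypothetical pass/fail judgements, not the only acceptable MSA method.

```python
# Minimal sketch: attribute agreement via Cohen's kappa (appraiser vs. reference standard).
# The pass/fail data are hypothetical.
from sklearn.metrics import cohen_kappa_score

reference = ["pass", "pass", "fail", "pass", "fail", "fail", "pass", "pass", "fail", "pass"]
appraiser = ["pass", "pass", "fail", "fail", "fail", "fail", "pass", "pass", "pass", "pass"]

kappa = cohen_kappa_score(reference, appraiser)
print(f"kappa = {kappa:.2f}")   # as a rule of thumb, kappa of roughly 0.75 or higher suggests acceptable agreement
```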
MEASURE
C. Actual Measurement
a. Show sample of raw data in tabulated form
b. Summarise data
c. Give comments on the process performance
based on data
MEASURE – Assess Process Capability
D. Process Capability Assessment
a. Carry out Process Capability Analysis and determine:
1. Cp and Cpk
2. Pp and Ppk
3. Current defect ppm and the Sigma Level of the process
E. Estimate the Process Sigma Level (a sketch follows this section)
Note: Assessment of process capability and measurement of the process Sigma Level may be carried out in the Define phase, if reliable historical data is available.
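A minimal sketch of the capability and Sigma Level calculations, assuming approximately normal data and the conventional 1.5-sigma shift for reporting; the spec limits and measurements below are hypothetical.

```python
# Minimal sketch: Cp/Cpk, estimated defect ppm and Sigma Level, assuming normal data.
# Spec limits and measurements are hypothetical.
import numpy as np
from scipy import stats

data = np.random.default_rng(7).normal(loc=50.3, scale=1.2, size=100)  # measured Y values
lsl, usl = 46.0, 54.0                                                  # spec limits from the CTQ table

mu, sigma = data.mean(), data.std(ddof=1)
cp  = (usl - lsl) / (6 * sigma)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)

p_defect = stats.norm.cdf(lsl, mu, sigma) + stats.norm.sf(usl, mu, sigma)  # fraction out of spec
dpmo = p_defect * 1e6

z_long_term = stats.norm.isf(p_defect)      # long-term Z value
sigma_level = z_long_term + 1.5             # conventional 1.5-sigma shift for reporting
print(f"Cp={cp:.2f}  Cpk={cpk:.2f}  DPMO={dpmo:.0f}  Sigma level ~ {sigma_level:.2f}")
```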
ANALYZE
ANALYZE – Process
1. Detailed Process Map in appropriate ways
a. Flowchart
b. Spaghetti Diagram
c. Function diagram of the process system (TRIZ)
d. Any other
2. Identify (for lean)
a. Value Adding activities,
b. Non-value Adding but necessary activities
c. Wasteful activities
3. Identify constraints / bottlenecks in the flow
4. Make Value Stream Maps
a. As is VSM
b. “Should be” or target VSM
ANALYZE – Data
1. Use PFMEA or Cause and Effect diagram to identify
all “Potential” causes
2. Prepare a Cause Validation Table and apply the
appropriate validation methods
3. Classify each potential cause into “Strong”, “Weak” and
“Insignificant/Irrelevant” Categories
4. List the “possible” causes separately for further
validation.
5. Prepare validation plan for the possible causes.
ANALYZE – Data
6. Validate the "possible" causes by using
appropriate tests/tools as needed
7. Apply deeper analysis to further drill down the
validated causes to reach the "root causes"
8. Conclude the Analysis by clearly identifying the
validated possible causes and their root causes
9. If DOE / Regression is carried out, predict the
optimum settings and validate them (a sketch follows this list)
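A minimal sketch of item 9: fitting a small 2x2 full-factorial DOE with ordinary least squares and picking the predicted best setting for confirmation runs. The factor names (temp, speed) and the response values are hypothetical.

```python
# Minimal sketch: fit a 2^2 full-factorial DOE (coded -1/+1 levels) and predict the
# best setting. Factor names and response values are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

doe = pd.DataFrame({
    "temp":     [-1, +1, -1, +1, -1, +1, -1, +1],   # coded factor levels, two replicates
    "speed":    [-1, -1, +1, +1, -1, -1, +1, +1],
    "response": [72, 78, 75, 88, 71, 79, 74, 87],   # measured Y
})

model = smf.ols("response ~ temp * speed", data=doe).fit()
print(model.params)                                  # main effects and interaction

grid = pd.DataFrame({"temp": [-1, -1, +1, +1], "speed": [-1, +1, -1, +1]})
grid["predicted"] = model.predict(grid)
print(grid.sort_values("predicted", ascending=False).head(1))  # candidate optimum for a confirmation run
```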
IMPROVE
A. Based on the identified and validated root
causes, generate possible solutions.
B. Use creative / innovative thinking to
explore solutions
C. Try to incorporate mistake proofing
measures.
D. Prioritize solutions using appropriate
methods/criteria
IMPROVE
E. Plan for implementation
a. Trial runs / pilot implementation
i. Trial / pilot implementation plan
ii. Risk analysis of the pilot / trial plan
iii. Plan for MSA and data collection
iv. Method to evidence the improvement
b. Finalize full scale implementation plan, with phases as
appropriate
c. Carry out thorough risk analysis of the agreed solution for full
scale implementation
d. Define stage gates including handing over the process to the
process owner.
CONTROL
A. Document / amend Procedures,
standards, work instructions incorporating
the improvements in the process
B. Define control plan for Xs
C. Install appropriate control charts (SPC)
for Ys and critical Xs (a sketch follows this list).
D. Document “Out of Control Guidelines”
for the operators
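A minimal sketch of one common SPC choice for item C: control limits for an Individuals (I-MR) chart. The measurements below are hypothetical.

```python
# Minimal sketch: Individuals (I-MR) control chart limits for a continuous Y or X.
# The measurements are hypothetical.
import numpy as np

x = np.array([25.1, 24.8, 25.4, 25.0, 24.7, 25.3, 25.6, 24.9, 25.2, 25.0])  # individual readings
mr = np.abs(np.diff(x))                  # moving ranges between consecutive points
x_bar, mr_bar = x.mean(), mr.mean()

ucl = x_bar + 2.66 * mr_bar              # 2.66 = 3 / d2, with d2 = 1.128 for subgroups of 2
lcl = x_bar - 2.66 * mr_bar
print(f"Centre line = {x_bar:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
# Points beyond the limits (or non-random patterns) trigger the "Out of Control" guidelines.
```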
CONTROL
E. Evaluate the Process Sigma Level and compare it with the pre-improvement Sigma Level
F. Update the Process FMEA and recalculate the RPNs, which should
go down (a sketch follows this list)
G. Carry out a cost-benefit analysis, including non-financial
benefits
H. Identify opportunities for further improvement
a. In the same process and
b. Horizontal deployment opportunities (other similar processes)
I. Give details of Reward and Recognition within the company.
J. Share knowledge by publishing / presenting the case study
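A minimal sketch of the RPN recalculation in item F, using RPN = Severity x Occurrence x Detection; the before/after ratings are hypothetical.

```python
# Minimal sketch: FMEA Risk Priority Number before and after the improvement.
# RPN = Severity x Occurrence x Detection; the ratings below are hypothetical.
before = {"severity": 8, "occurrence": 6, "detection": 5}
after  = {"severity": 8, "occurrence": 2, "detection": 3}   # severity typically stays the same

rpn_before = before["severity"] * before["occurrence"] * before["detection"]   # 240
rpn_after  = after["severity"]  * after["occurrence"]  * after["detection"]    # 48
print(f"RPN before = {rpn_before}, RPN after = {rpn_after}")   # the RPN should go down
```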
Lessons Learnt
• Identify what lessons were learnt
regarding the problem-solving method, project
management, teamwork, etc.