
Quality Control Final Review Notes

Brian Campbell






Chapter 1 – Quality Basics

Quality is defined by the consumer's individual and reasonable needs, requirements, and expectations.

Processes perform value-added activities on inputs to create outputs.

Variation is present in any natural process. No two products or occurrences are exactly alike.

Specifications are used to help define a customer's needs, requirements, and expectations (they are the voice of the customer).

Productivity is doing something efficiently; quality focuses on effectiveness: doing the right things right.

The monitoring and control of quality has evolved over time. Inspection, quality control, statistical quality control, statistical process control, and total quality management are all aspects of the evolution of quality.
Chapter 2 – Quality Advocates

Dr. Shewhart developed statistical process control charts as well as the concepts of controlled and uncontrolled variation.

Dr. Deming is known for encouraging companies to manage for quality by defining quality in terms of customer satisfaction.

Dr. Deming created his 14 points as a guide to management. They are:
1. Create a constancy of purpose toward improvement of product and service, with the aim to become competitive and to stay in business and provide jobs.
2. Adopt the new philosophy.
3. Cease dependence on inspection to achieve quality.
4. End the practice of awarding business on the basis of price alone. Instead, minimize total cost.
5. Constantly and forever improve the system of production and service.
6. Institute training on the job.
7. Institute leadership.
8. Drive out fear.
9. Break down barriers between departments.
10. Eliminate slogans, exhortations, and targets for the workforce.
11. Eliminate arbitrary work standards and numerical quotas. Substitute leadership.
12. Remove barriers that rob people of their right to pride of workmanship.
13. Institute a vigorous program of education and self-improvement.
14. Put everybody in the company to work to accomplish the transformation.

Dr. Juran's process for managing quality includes three phases: quality planning, quality control, and quality improvement.

Dr. Feigenbaum defined quality as "a customer determination which is based on the customer's actual experience with the product or service, measured against his or her requirements – stated or unstated, conscious or merely sensed, technically operational or entirely subjective – always representing a moving target in a competitive market."

Crosby describes four absolutes of quality management:
- Quality definition: conformance to requirements
- Quality system: prevention of defects
- Quality performance standard: zero defects
- Quality measurement: costs of quality

Crosby also defined five incorrect assumptions about quality:
- Quality means goodness, luxury, shininess, weight, etc.
- Quality is intangible, and therefore not measurable.
- There exists an economics of quality, that quality means building luxuries into a product.
- Workers are to blame for poor quality.
- Quality originates from the quality department.

To Crosby, there is a difference between a successful customer and one who is merely satisfied.

Crosby on costs of quality (the costs associated with providing customers with a product or service that conforms to their expectations):
- Prevention costs
- Detection costs
- Costs associated with dissatisfied customers (external)
- Rework
- Scrap
- Downtime
- Material costs
- Waste of any resource in the production of a product or provision of a service

Dr. Ishikawa encouraged the use of the seven tools of quality, including the one he developed: the cause-and-effect diagram.

Dr. Taguchi is known for his loss function describing quality and his work in the design of experiments.

Chapter 3 – Quality Management Systems

ISO 9000 is a requirements-based assessment that supports the development of a quality management system.

ISO 9000 standards focus on functional requirements and recordkeeping to support international cooperation in business-to-business dealings.

Quality systems like TS 16949 combine requirements from ISO 9000 with sector- and customer-specific requirements.

Many quality systems require that suppliers create, document, and implement a quality management system.

ISO 14000 is primarily focused on the efforts made by an organization to minimize any harmful impact on the environment that its business activities may cause.

Six Sigma is a methodology that seeks to improve profits through improved quality and efficiency.

Companies competing for the Malcolm Baldrige National Quality Award must perform well in the following categories: leadership; strategy; customers; measurement, analysis, and knowledge management; workforce; operations; results.

The Malcolm Baldrige Award uses results-based performance assessments to improve competitiveness.

The Baldrige Award criteria, with their standards for leadership and customer-driven quality, most closely reflect the total quality system.

The Japanese equivalent award for quality is the Deming Prize.

Chapter 4 – Quality Improvement: Problem Solving

Problem solving is the isolation and analysis of a problem and the development of a permanent solution.

The following steps should be taken during the problem-solving process:
1. Recognize a problem exists
2. Form an improvement team
3. Develop performance measures
4. Clearly define the problem
5. Document and analyze the problem/process
6. Determine possible causes
7. Identify, select, and implement the solution
8. Evaluate the solution
9. Ensure permanence
10. Continuous improvement

The following are techniques used in problem solving: brainstorming, Pareto charts, WHY-WHY diagrams, flowcharts, force-field analysis, cause-and-effect diagrams, check sheets, histograms, scatter diagrams, control charts, and run charts.

Problem solvers are tempted to propose solutions before identifying the root cause of the problem and performing an in-depth study of the situation. Adhering to a problem-solving method avoids this tendency.

Brainstorming is designed for idea generation. Ideas should not be discussed or criticized during a brainstorming session.

Flowcharts are powerful tools that allow problem solvers to gain in-depth knowledge of a process.

Cause-and-effect diagrams enable problem solvers to identify the root causes of succinctly stated problems.

Steps must be taken to ensure that the new methods or changes to the process are permanent.
Chapter 5 – Statistics
Quality assurance relies primarily on inductive statistics in analyzing problems.

Accuracy and precision are of paramount importance in quality assurance. Measurements should be both precise and accurate.

Histograms and frequency diagrams are similar. Unlike a frequency diagram, a histogram will group the data into cells.

Histograms are constructed using cell intervals, cell midpoints, and cell boundaries.
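The cell construction above takes only a few lines of code; a minimal Python sketch with hypothetical measurement data, where the cell boundaries are chosen to fall between possible readings so no observation lands exactly on a boundary:

```python
from bisect import bisect_right

# Hypothetical measurements, recorded to one decimal place.
data = [4.8, 5.0, 5.1, 4.9, 5.2, 5.0, 5.1, 4.7, 5.3, 5.0, 4.9, 5.1]

# Cell boundaries sit between possible readings; midpoints label each cell.
boundaries = [4.65, 4.85, 5.05, 5.25, 5.45]
midpoints = [4.75, 4.95, 5.15, 5.35]

# Tally each observation into its cell.
counts = [0] * (len(boundaries) - 1)
for x in data:
    counts[bisect_right(boundaries, x) - 1] += 1

# A simple text histogram: one asterisk per observation in the cell.
for m, c in zip(midpoints, counts):
    print(f"cell at {m}: {'*' * c}")
```

The choice of boundaries (multiples of 0.2 offset by half the measurement resolution) is the common construction rule the notes refer to.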
The analysis of histograms and frequency diagrams is based on shape, location, and spread.

Shape refers to symmetry, skewness, and kurtosis, the peakedness of the data (leptokurtic distributions have a high peak, while platykurtic distributions have a flatter curve).

The location, or central tendency, refers to the relationship between the mean, mode, and median.

The spread, or dispersion, of data is described by the range and standard deviation.

Skewness describes the tendency of data to be gathered either to the left or right side of a distribution. When a distribution is symmetrical, skewness is zero.

A normal curve can be identified by the following five features:
1. It is symmetrical about a central value
2. The mean, mode, and median are all equal
3. It is unimodal and bell-shaped
4. Data cluster around the mean value of the distribution and fall away toward the horizontal axis
5. The area under the normal curve equals 1 (100%)

In a normal distribution, 99.73% of the measured values fall within ±3 standard deviations of the mean; 95.5% fall within ±2 standard deviations of the mean; and 68.3% of the data fall within ±1 standard deviation of the mean.

The area under a normal curve can be calculated using the Z table and its associated formula, Z = (X - μ)/σ.
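The areas a Z table provides can also be computed directly from the standard normal cumulative distribution function; a quick check of the ±1σ/±2σ/±3σ percentages above using only the Python standard library:

```python
from math import erf, sqrt

def phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + erf(z / sqrt(2)))

def within(k):
    """Fraction of a normal population within +/- k standard deviations."""
    return phi(k) - phi(-k)

for k in (1, 2, 3):
    print(f"within +/-{k} sigma: {within(k) * 100:.2f}%")
# prints 68.27%, 95.45%, 99.73%, matching the values quoted above
```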
The central limit theorem shows that the averages of samples drawn even from nonnormal distributions are approximately normally distributed.
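A small simulation with hypothetical data illustrates this: averages of subgroups drawn from a distinctly non-normal (uniform) population cluster tightly and symmetrically around the population mean:

```python
import random
import statistics

random.seed(1)

# A clearly non-normal parent population: uniform on [0, 10).
population = [random.uniform(0, 10) for _ in range(100_000)]

# Distribution of subgroup averages, subgroup size n = 30.
n = 30
averages = [statistics.mean(random.sample(population, n)) for _ in range(2_000)]

# The averages center near the population mean (about 5) with a much
# smaller spread, close to sigma / sqrt(n).
print(statistics.mean(averages))
print(statistics.stdev(averages))
```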
Chapter 6 – Variables Control Charts

Control charts enhance the analysis of a process by showing how
that process performs over time. Control charts allow for early
detection of process changes.

Control charts serve two basic functions: they provide an
economic basis for making a decision as to whether to investigate
for potential problems, adjust the process, or leave the process
alone; and they assist in the identification of problems in the
process.

Variation, differences between items, exists in all processes.
Variation can be within-piece, piece-to-piece, and time-to-time.

The X-bar chart is used to monitor the variation in the average
values of the measurements of groups and samples. Averages
rather than individual observations are used on control charts
because average values will indicate a change in the amount of
variation much faster than individual values will.

The X-bar chart, showing the central tendency of the data, is
always used in conjunction with either a range or a standard
deviation chart.

The R and s charts show the spread or dispersion of the data.

The centerline of a control chart shows where the process is
centered. The upper and lower control limits describe the spread
of the process.
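The centerlines and control limits above are typically computed from the subgroup averages and ranges using the published A2, D3, and D4 factors; a minimal sketch with hypothetical measurements (the constants shown are the standard tabulated values for subgroup size n = 5):

```python
# Hypothetical subgroups of 5 measurements each.
subgroups = [
    [5.02, 4.98, 5.01, 5.00, 4.99],
    [5.03, 5.01, 4.97, 5.00, 5.02],
    [4.99, 5.00, 5.02, 4.98, 5.01],
]
A2, D3, D4 = 0.577, 0.0, 2.114  # standard factors for subgroup size n = 5

xbars = [sum(g) / len(g) for g in subgroups]       # subgroup averages
ranges = [max(g) - min(g) for g in subgroups]      # subgroup ranges

xbarbar = sum(xbars) / len(xbars)   # grand average: centerline of X-bar chart
rbar = sum(ranges) / len(ranges)    # average range: centerline of R chart

# Control limits for the X-bar chart and the R chart.
ucl_x = xbarbar + A2 * rbar
lcl_x = xbarbar - A2 * rbar
ucl_r = D4 * rbar
lcl_r = D3 * rbar
print(ucl_x, lcl_x, ucl_r, lcl_r)
```

In practice many more subgroups (20-25 is a common rule of thumb) are collected before the limits are trusted; three are shown here only to keep the sketch short.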

A homogeneous subgroup is essential to the proper study of a
process. Certain guidelines can be applied in choosing a rational
subgroup.

Common, or chance, causes are small random changes in the
process that cannot be avoided. Assignable causes are large
variations in the process that can be identified as having a specific
cause.

A process is considered to be in a state of control, or under
control, when the performance of the process falls within the
statistically calculated control limits and exhibits only common, or
chance, causes. Certain guidelines can be applied for determining
by control chart when a process is under control.
The X-bar chart is useful for determining process centering.
The Range chart is useful for determining the amount of variation
present in a process.
Patterns in a control chart indicate a lack of statistical control. Patterns may take the form of changes or jumps in level, runs, trends, or cycles, or may reflect the existence of two populations, or mistakes.
The steps for revising a control chart are:
1. Examine the chart for out-of-control conditions
2. Isolate the causes of the out-of-control condition
3. Eliminate the cause of the out-of-control condition
4. Revise the chart
Chapter 7 – Process Capability

Process capability refers to the ability of a process to meet the specifications set by the customer or designer.

Individuals in a process spread more widely around a center value than do the averages.

Specification limits are set by the designer or customer. Control
limits are determined by the current process.

6σ, six standard deviations, is the spread of the process, or process capability.

Cp, the capability index, is the ratio of the tolerance (USL – LSL) to the process capability (6σ).

Cr, the capability ratio, is the ratio of the process capability (6σ) to the tolerance (USL – LSL).

Cpk is the ratio that reflects how the process is performing in relation to a nominal, center, or target value.
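All three indices follow directly from their definitions; a short sketch with hypothetical specification limits and process parameters:

```python
# Hypothetical specification limits and measured process parameters.
usl, lsl = 10.5, 9.5          # specification limits; tolerance = USL - LSL
mean, sigma = 10.1, 0.12      # current process center and standard deviation

cp = (usl - lsl) / (6 * sigma)   # capability index: tolerance vs. 6-sigma spread
cr = (6 * sigma) / (usl - lsl)   # capability ratio: the reciprocal of Cp

# Cpk penalizes an off-center process: distance from the mean to the
# nearer specification limit, in units of 3 sigma.
cpk = min(usl - mean, mean - lsl) / (3 * sigma)

print(f"Cp = {cp:.2f}, Cr = {cr:.2f}, Cpk = {cpk:.2f}")
```

Here Cp exceeds 1 (the spread fits inside the tolerance) but Cpk is noticeably lower because the process runs off center, which is exactly the distinction the notes draw between the two indices.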
Chapter 8 – Other Variable Control Charts

Individuals and moving-range charts are used to monitor
processes that do not produce enough data to construct
traditional variables control charts.

Moving-average and moving-range charts are also used when
individual readings are taken. Once a subgroup size is chosen, the
oldest measurement is removed from calculations as each
successive measurement is taken. These charts are less sensitive
to changes in the process.

Charts that plot all the individual subgroup values are useful when
explaining the concept of variation within a subgroup.

Multi-vari analysis studies the spread of individual measurements within a sample.

Median and range charts, though less sensitive than variables control charts, can be used to study the variation of a process. Two different methods exist to create median and range charts.

Run charts can be constructed using either attribute or variables
data. These charts show the performance of a particular
characteristic over time.

Charts for variable subgroup size require a greater number of
calculations because the values of A2, D4, and D3 in the formulas
for control limits change according to the subgroup sample size.

Precontrol charts compare the item being produced with
specification limits. Precontrol charts are useful during machine
setups and short production runs to check process centering.

The nominal X-bar and R chart uses coded measurements to
monitor process centering and spread on short production runs.
Chapter 9 – Probability

Seven theorems exist to explain probability:
1. Probability is expressed as a number between 0 and 1
2. The sum of the probabilities of the events in a situation
is equal to 1.0
3. If P(A) is the probability that an Event A will occur, then
the probability that A will not occur is 1- P(A).
4. For mutually exclusive events, the probability that
either event A or event B will occur is the sum of their
respective probabilities.
5. When events A and B are not mutually exclusive
events, the probability that either event A or event B
will occur is the sum of their individual probabilities
minus the probability that both events will occur. P(A
or B) = P(A) +P(B) – P(both)
6. If A and B are dependent events, the probability that both events will occur is P(A) × P(B|A), where P(B|A) is the probability that B will occur given that A has occurred.
7. If A and B are independent events, the probability that
they both occur is the product of their individual
probabilities P(A and B) = P(A) x P(B)
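Several of the theorems can be checked with exact fractions; a hypothetical single-card draw from a standard 52-card deck:

```python
from fractions import Fraction

# One card drawn from a standard 52-card deck (hypothetical illustration).
p_heart = Fraction(13, 52)
p_face = Fraction(12, 52)   # J, Q, K in any suit
p_both = Fraction(3, 52)    # face cards that are also hearts

# Theorem 5: hearts and face cards are not mutually exclusive,
# so P(A or B) = P(A) + P(B) - P(both).
p_heart_or_face = p_heart + p_face - p_both
assert p_heart_or_face == Fraction(22, 52)

# Theorem 7: two independent draws (with replacement) multiply.
assert p_heart * p_heart == Fraction(1, 16)

# Theorem 3: the complement of an event.
assert 1 - p_heart == Fraction(3, 4)

print(p_heart_or_face)
```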

Discrete and continuous probability distributions exist to describe
the probability that an event will occur.

The hypergeometric, binomial, and Poisson distributions are all
discrete probability distributions.

The normal distribution is a continuous probability distribution.

The binomial and Poisson distributions can be used to approximate the hypergeometric distribution.

The Poisson and normal distributions can be used to approximate the binomial distribution.
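The Poisson approximation to the binomial is easy to check numerically; a sketch with hypothetical n and p (the approximation works well when n is large and p is small, using λ = np):

```python
from math import comb, exp, factorial

# Hypothetical parameters: large n, small p.
n, p = 200, 0.01
lam = n * p  # Poisson parameter lambda = n * p

def binom_pmf(k):
    """Exact binomial probability of k nonconforming units in n."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k):
    """Poisson approximation with lambda = n * p."""
    return lam**k * exp(-lam) / factorial(k)

# The two columns agree to about two decimal places.
for k in range(4):
    print(k, round(binom_pmf(k), 4), round(poisson_pmf(k), 4))
```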
Chapter 10 – Quality Control Charts for Attributes

Fraction nonconforming (p) charts can be constructed for both constant and variable sample sizes because the sample size can be isolated in the formula.

For the fraction nonconforming chart, the product or service being provided must be inspected and classified as either conforming or nonconforming.

Number nonconforming (np) and percent nonconforming charts
may be easier to interpret than fraction nonconforming (p) charts.

It is possible to construct either a p or a u chart for variable sample size. Individual control limits for the different sample sizes may be calculated, or the average sample size (n̄) may be used. When n̄ is used, care must be taken to correctly interpret points approaching or exceeding the upper control limit.
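Using the average sample size for a single set of p-chart limits looks like this in practice; a sketch with hypothetical inspection data:

```python
from math import sqrt

# Hypothetical samples: (units inspected, units nonconforming).
samples = [(100, 4), (120, 6), (90, 3), (110, 5)]

total_inspected = sum(n for n, _ in samples)
total_nonconforming = sum(d for _, d in samples)

pbar = total_nonconforming / total_inspected   # centerline: average fraction nonconforming
nbar = total_inspected / len(samples)          # average sample size

# Single set of 3-sigma limits based on the average sample size.
spread = 3 * sqrt(pbar * (1 - pbar) / nbar)
ucl = pbar + spread
lcl = max(0.0, pbar - spread)  # a negative lower limit is reported as zero

print(f"pbar = {pbar:.4f}, UCL = {ucl:.4f}, LCL = {lcl:.4f}")
```

As the notes warn, points near these average-n limits should be rechecked against limits computed from their own sample size before declaring the process in or out of control.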

Charts for counts of nonconformities (c) and nonconformities per unit (u) are used when the nonconformities on the product or service being inspected can be counted. For c charts, n = 1; for u charts, n > 1.

Charts for nonconformities per unit (u charts) can be constructed
for both constant and variable sample sizes.

When interpreting p, np, c, or u charts, it is important to
remember that values closer to zero are desirable.
Chapter 11 – Reliability

Reliability refers to quality over time. The system’s intended
function, expected life, and environmental conditions all play a
role in determining system reliability.

The three phases of a product’s life cycle are early failure, chance
failure, and wear-out.

Reliability tests aid in determining if distinct patterns of failure
exist.

Failure rates can be determined by dividing the number of failures observed by the sum of their test times.

θ (theta), or the average life, is the inverse of the failure rate.

A system's availability can be calculated by determining the mean time to failure and dividing that value by the sum of the mean time to failure and the mean time to repair.

Reliability is the probability that failure will not occur during a particular time period.
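The failure rate, average life θ, and availability calculations above, sketched with hypothetical test and repair data:

```python
# Hypothetical life test: hours accumulated by each unit on test.
test_times = [1000, 1000, 850, 1000, 920]
failures = 2  # failures observed during the test

failure_rate = failures / sum(test_times)  # failures per hour (lambda)
theta = 1 / failure_rate                   # average life: inverse of the failure rate

# Availability = MTTF / (MTTF + MTTR), with hypothetical repair data in hours.
mttf, mttr = 2385.0, 15.0
availability = mttf / (mttf + mttr)

print(f"lambda = {failure_rate:.6f}/h, theta = {theta:.0f} h, "
      f"availability = {availability:.4f}")
```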
For a system in series, failure of any one component will cause
system failure. The reliability of a series system will never be
greater than that of its least-reliable component.
For a system in parallel, all the components in parallel must fail to
have system failure. The reliability in a parallel system will be no
less than the reliability of the most reliable component.
Overall system reliability can be increased through the use of
parallel, backup, or redundant components.
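The series and parallel rules translate directly into code; a sketch with hypothetical component reliabilities:

```python
def series(reliabilities):
    """Series system: every component must survive, so reliabilities multiply."""
    r = 1.0
    for ri in reliabilities:
        r *= ri
    return r

def parallel(reliabilities):
    """Parallel system: it fails only if every component fails."""
    q = 1.0
    for ri in reliabilities:
        q *= (1 - ri)  # probability this component fails
    return 1 - q

components = [0.95, 0.90, 0.99]
print(series(components))    # below the least reliable component (0.90)
print(parallel(components))  # above the most reliable component (0.99)
```

The printed values bear out both statements in the notes: the series result drops below the worst component, while the parallel (redundant) result rises above the best one.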
FMEA is a method that seeks to identify failures and keep them
from occurring.
FMEA divides the system, process, service, or part into
manageable segments and records the ways the segment may
fail.
FMEA uses the risk priority code to rate the degree of hazard
associated with failure. When complete, recommendations are
made to eliminate the risk priority codes with a value of 1.
Chapter 12 – Advanced Topics in Quality

Quality function deployment is a method that allows users to
integrate the voice of the customer into the design or redesign of
a product, service, or process.

Quality function deployment first captures information from the
customer and then determines the technical requirements
necessary to fulfill the customer requirements.

An experimental design is the plan or layout of an experiment. It
shows the treatments to be included and the replication of the
treatments.

Experimenters can study all the combinations of factors with a full factorial design; fractional factorial designs, such as Plackett-Burman or Taguchi designs, study a subset of the combinations to reduce the number of runs.

Design of experiments seeks to investigate the interactions of
factors and their effects on the response variable.
Chapter 13 – Quality Costs

Collecting quality costs provides a method to evaluate system effectiveness.

Quality costs can serve as a basis or benchmark for measuring the
success of internal improvements.

Prevention costs are the costs associated with preventing the
creation of errors or nonconformities in the first place.

Appraisal costs are the costs incurred when a product or service is
studied to determine if it conforms to specifications.

Internal failure costs are the costs associated with errors found
before the product or services reach the customer.

External failure costs are the costs associated with errors found by
the customer.

Intangible failure costs are the costs associated with the loss of
customer goodwill and respect.

Once quantified, quality costs enhance decision making if they are
used to determine which projects will allow for the greatest
return on investment and which projects are most effective at
lowering failure and appraisal costs.

Quality-cost information should be used to guide improvement.
Chapter 14 – Product Liability

Civil suits seek money damages for injuries to a person, property, or both.

When a product or service has been provided in a careless or
unreasonable manner, negligence occurs.

If a manufacturer makes and sells a defective product or provides
a defective service, they can be held strictly liable if the defect
renders the product or service unreasonably dangerous and the
defect causes an injury.

Under contributory negligence, the plaintiff is found to have acted
in such a way as to contribute to the injury.

An expressed warranty can either be written or oral, and it is part
of the conditions of sale.

An implied warranty is implied by law.


Product liability exposure can be reduced through thoughtful
design and development of products or services. Designers should
design to remove unsafe aspects, guard against unsafe use,
provide product warnings, design to standards, and advertise and
market the product or service appropriately.
A program to reduce the risks associated with product liability
must embrace design, marketing, manufacturing, service
provision, packaging, shipping, and service.
From review:

Two types of data – variable and attribute. Variable data are measurable (temperature, power, etc.); attribute data are binary (pass/fail).

Specification = customer’s requirements

Control limit = voice of the process

Probability: the chance something might happen

Causes of variation: common & special. Common causes are natural fluctuations (body temperature, blood pressure, etc.)

X-bar and R charts are used in process control. Not necessarily the
best for all applications.

Process that’s in control is predictable & repeatable.

Pareto analysis: 80/20 rule (80% of profits come from 20% of
customers)

Cost of quality – can put a company out of business. – Chapter 2

What is brainstorming?

Quality awards – Japan issues the Deming award.

Customers are the best resource for developing new products

Most important aspect of choosing a supplier: capability

What’s a graphic representation of a frequency distribution: A
Histogram

Variation – no two items are ever exactly the same

3 types of averages – Mean, Median, Mode.

Normal distribution

Process out-of-control: points outside the control limit

Total Quality Management – quality in every aspect of
production/business

Importance of continuous improvement – pushed out of business
by competition

Deming – father of modern quality and continuous
improvement…work in post-WWII Japan (clean slate, receptive).

Name one type of proactive QMS: use of statistical process control, formalized procedures

One type of reactive QMS: inspection

Value-stream mapping: attempts to reduce non-value-added
activities in production

Reliability = quality over time

ATI devices: 3D scanner was neat

Costs of quality details

Most expensive costs are internal

Product liability: expressed or implied warranty

Expressed: clear, in writing

Implied: assumed

Internal audits catch issues before they affect customers (assuming people are honest). External audits are costlier and more difficult, but can provide better data

Customer’s role in quality system: provide specifications (what’s
truly important)

Invention: creating something truly new

Innovation: iterating on existing work

FMEA: Failure Mode and Effects Analysis – useful in improving quality; eliminates root causes of failures

MTBF: Mean time between failures – specific to devices that can be repaired. Total operating time across devices / number of failures

MTTF: Mean time to failure – specific to non-repairable devices. Total lifespan across devices / number of devices
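The two definitions, sketched with hypothetical operating data:

```python
# MTBF: repairable devices; each failure is followed by a repair.
total_hours_repairable = 12_000   # combined operating hours across devices
failures = 8                      # failures observed (and repaired)
mtbf = total_hours_repairable / failures

# MTTF: non-repairable devices; divide by the number of devices, not failures.
total_hours_nonrepairable = 9_000  # combined lifespans of the failed units
devices = 5
mttf = total_hours_nonrepairable / devices

print(f"MTBF = {mtbf:.0f} h, MTTF = {mttf:.0f} h")
```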