Statistical Process Control
Statistical process control (SPC) is the use of statistical techniques to control a process or production method. SPC tools and procedures can help you monitor process behavior, discover issues in
internal systems, and find solutions for production issues. The term is often used
interchangeably with statistical quality control (SQC).
SPC is a statistical method used in manufacturing and quality control to ensure all processes operate
within acceptable limits.
SPC uses control charts to provide a graphical representation of data over time, detecting deviations from
the norm.
SPC data can also be used to drive continuous improvement toward higher quality and efficiency.
In the manufacturing industry, a poor product, defined as one that does not meet specification, often results from a poor
process.
SPC collects and analyzes data from product and process measurements.
The goal is to determine what falls within versus outside the process capability, prompting preventive or
corrective action when needed.
SPC tools
There are 14 tools used in SPC: seven quality control tools and seven supplemental tools.
Quality Control Tools:
1. Cause-and-effect diagram
2. Check sheet
3. Control chart
4. Histograms
5. Pareto chart
6. Scatter diagram
7. Stratification
Supplemental Tools:
1. Data stratification
2. Defect map
3. Event logs
4. Process flowchart
5. Progress center
6. Randomization
7. Sample size determination
Control chart
A control chart visually represents the gathered data with pre-set control limits. Think of the control
chart as a bell curve laid sideways to see the time sequence of data points.
The control limits are set at ±3 standard deviations from the
mean.
Anything that falls within that range between the upper control limit (UCL) and the lower control limit
(LCL) is acceptable.
Anything outside that range is unacceptable and is classified as special cause variation.
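The limit calculation can be sketched in Python. In practice, limits are computed from baseline data gathered while the process is known to be stable and then held fixed; the measurement values below are illustrative assumptions:

```python
from statistics import mean, stdev

# Baseline data collected while the process was known to be in control
# (values are illustrative assumptions).
baseline = [10.1, 9.8, 10.0, 10.3, 9.9, 10.2, 10.0, 9.7, 10.1, 10.0]

center = mean(baseline)
sigma = stdev(baseline)
ucl = center + 3 * sigma  # upper control limit
lcl = center - 3 * sigma  # lower control limit

# New measurements are judged against the fixed limits; anything
# outside them is flagged as special cause variation.
new_points = [10.2, 9.9, 11.4, 10.0]
special_causes = [x for x in new_points if not (lcl <= x <= ucl)]
```

Note that this sketch estimates sigma from the overall baseline spread; formal SPC charts usually estimate it from within-subgroup variation instead, as shown later for the Xbar-R chart.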
The seven quality control tools are described in more detail below:
Cause-and-effect diagrams
Also known as the Ishikawa diagram or the fishbone diagram, a cause-and-effect diagram is used to
identify the possible causes of a problem. When drawn, the diagram looks like a fishbone, with each main
bone branching into smaller bones that probe deeper into each cause.
Check sheets
These are simple, ready-to-use forms on which data can be recorded and then analyzed. Check sheets are
especially good for data that is repeatedly observed and collected by the same person or in the
same location.
Histograms
Histograms look like bar charts and are graphs that represent frequency distributions. They are ideal for
numerical data.
Pareto charts
These are bar graphs that plot problem categories against frequency or cost. Pareto charts are
particularly useful for measuring problem frequency. They illustrate the 80/20 Pareto principle: roughly 80
percent of the problems stem from 20 percent of the causes, so addressing that 20 percent resolves most of the problems.
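A Pareto analysis can be sketched in Python; the defect categories and counts below are made up for illustration:

```python
from collections import Counter

# Hypothetical defect log; category names and counts are assumptions.
defects = (["scratch"] * 42 + ["dent"] * 31 + ["misalignment"] * 12 +
           ["discoloration"] * 9 + ["crack"] * 6)

counts = Counter(defects).most_common()  # sorted by frequency, descending
total = sum(n for _, n in counts)

# Walk the sorted categories until ~80% of all defects are covered:
# these are the "vital few" the Pareto principle says to address first.
cumulative = 0
vital_few = []
for category, n in counts:
    cumulative += n
    vital_few.append(category)
    if cumulative / total >= 0.8:
        break
```

Here the three most frequent categories account for 85 of the 100 logged defects, so the Pareto analysis directs attention to them first.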
Scatter diagrams
Also known as an X-Y graph, a scatter diagram plots pairs of numerical data to reveal the relationship between two variables.
Stratification
Stratification sorts objects, people, and related data into layers or specific groups, which simplifies
pattern identification. It is especially useful for data gathered from different
sources.
Control charts
Selection of the right control chart depends on the type of data: variable (continuous) or attribute
(discrete). Variable data is preferred when available.
Variable (continuous) data charts:
1. Xbar-R chart
2. Run chart
3. XmR chart
4. Xbar-S chart
5. EWMA chart
6. Median-R chart
Attribute (discrete) data charts:
1. P chart
2. NP chart
3. C chart
4. U chart
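As a sketch of the variable-data case, the following Python computes Xbar-R chart limits for subgroups of size five, using the standard control-chart constants for n = 5 (A2 = 0.577, D3 = 0, D4 = 2.114); the subgroup measurements are illustrative assumptions:

```python
# Illustrative subgroups of five consecutive measurements each.
subgroups = [
    [10.2, 9.9, 10.1, 10.0, 10.3],
    [10.0, 10.1, 9.8, 10.2, 10.0],
    [9.9, 10.0, 10.2, 10.1, 9.8],
]

A2, D3, D4 = 0.577, 0.0, 2.114  # tabulated constants for subgroup size n = 5

xbars = [sum(g) / len(g) for g in subgroups]     # subgroup averages
ranges = [max(g) - min(g) for g in subgroups]    # subgroup ranges

grand_mean = sum(xbars) / len(xbars)  # centerline of the Xbar chart
rbar = sum(ranges) / len(ranges)      # centerline of the R chart

ucl_x = grand_mean + A2 * rbar  # Xbar chart limits
lcl_x = grand_mean - A2 * rbar
ucl_r = D4 * rbar               # R chart limits
lcl_r = D3 * rbar
```

Estimating process spread from the average within-subgroup range (R-bar) rather than the overall standard deviation keeps the limits from being inflated by between-subgroup shifts.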
After all data is plotted on the control chart, it should fall between the control limits.
Anything outside the control limits is a special cause and must be addressed.
1. Failed controller.
2. Process shift.
3. Change in measurement system.
4. Operator mistake.
Central Tendency
Central tendency is a statistical measure that represents an entire distribution or dataset with a single
value. It aims to provide an accurate description of the data as a whole.
Mean
The mean represents the average value of the dataset. It is calculated as the sum of all the values in
the dataset divided by the number of values. Unqualified, "the mean" usually refers to the arithmetic mean. Other
measures of mean used to find the central tendency are as follows:
Geometric Mean
Harmonic Mean
Weighted Mean
If all the values in the dataset are the same, then the arithmetic, geometric, and
harmonic means are all equal; if there is variability in the data, the mean values differ.
Calculating the mean is straightforward. The formula is:
Mean = (x1 + x2 + ... + xn) / n
For symmetric continuous data, the mean lies exactly at the centre of the distribution. For skewed
continuous data, the extreme values in the extended tail pull the mean away from the
centre, so the mean is best reserved for symmetric distributions.
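The three kinds of mean can be compared with Python's statistics module; the sample values below are arbitrary:

```python
from statistics import mean, geometric_mean, harmonic_mean

data = [2.0, 4.0, 8.0]  # arbitrary values with some variability

arithmetic = mean(data)            # (2 + 4 + 8) / 3
geometric = geometric_mean(data)   # cube root of 2 * 4 * 8
harmonic = harmonic_mean(data)     # 3 / (1/2 + 1/4 + 1/8)

# With variability in the data the three means differ, and always in the
# same order: arithmetic >= geometric >= harmonic.
```

With identical values the three means coincide; with variability the arithmetic mean is the largest and the harmonic mean the smallest.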
Median
The median is the middle value of a dataset arranged in ascending or
descending order. When the dataset contains an even number of values, the median is
found by taking the mean of the two middle values.
Consider a dataset with an odd number of observations arranged in descending order: 23, 21,
18, 16, 15, 13, 12, 10, 9, 7, 6, 5, 2.
Here 12 is the median: it has six values above it and six values below it.
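The same result can be checked with Python's statistics module, using the dataset from the text:

```python
from statistics import median

# The dataset from the text (odd number of observations).
data = [23, 21, 18, 16, 15, 13, 12, 10, 9, 7, 6, 5, 2]
middle = median(data)  # the single middle value after sorting

# With an even number of values, the median is the mean of the
# two middle values.
even_middle = median([1, 3, 5, 7])  # (3 + 5) / 2
```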
Mode
The mode is the most frequently occurring value in the dataset. A dataset may contain
multiple modes, or in some cases no mode at all.
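All three cases can be sketched with Python's statistics module; the sample values are arbitrary:

```python
from statistics import mode, multimode

data = [4, 7, 7, 2, 9, 7, 4]
most_common = mode(data)  # 7 occurs three times, more than any other value

# A dataset can have multiple modes; multimode returns all of them.
bimodal_modes = multimode([1, 1, 2, 2, 3])

# When every value occurs equally often, no single value stands out,
# and multimode returns them all.
no_mode = multimode([5, 6, 7])
```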
Central tendency is a statistic that represents an entire population or dataset with a single value.
Important measures of central tendency include the mode, median, arithmetic mean, and
geometric mean.
In most cases, central tendency is found using the mean, median, or mode: the mean is the average of a
given dataset, the median is the middle value, and the mode is the most frequently occurring value in the
set.
The purpose of central tendency is to provide a single value that accurately represents the entire
collected dataset.
Quality Management
Quality management is the act of overseeing all activities and tasks needed to maintain a desired level of
excellence. This includes the determination of a quality policy, creating and implementing quality
planning and assurance, and quality control and quality improvement. It is also
referred to as total quality management (TQM).
In general, quality management focuses on long-term goals through the implementation of short-term
initiatives.
A Quality Management System (QMS) is a formalized system that documents policies, processes, and
procedures for achieving quality policies and objectives. A QMS helps businesses coordinate their
activities to meet customer expectations and regulatory and compliance needs, and to improve the
efficiency of their processes.
ISO 9001 is the most prominent approach to quality management systems and helps standardize
how a QMS is designed.
Since quality is a key competitive differentiator in today's global markets, implementing a QMS helps
ensure that your processes run effectively and efficiently, lowers costs, and reduces waste. Among other
benefits, a QMS also improves compliance and documentation.
There are four main components of Quality Management: quality planning, quality assurance, quality
control and quality improvement. The process of implementing all four components in an organization is
referred to as Total Quality Management (TQM).
Quality management (and TQM) focuses not only on the quality of the outputs (products and services) but
also on the inputs: the tasks and processes by which the outputs are created. Ideally, not only is the
quality of a product or service improving, but the process by which it is created is also getting better,
yielding more consistent, higher-quality products and services.