
Assignment: Statistical quality control

Ankit Agarwal, 29th May 2024

Following the DMAIC methodology, the team completed the Define phase during the first exercise (Day 1) by analyzing the business process map, the SIPOC, and the voice of the customer. This information formed the background for defining the relevant CTQs. Since the exercise is performed by Safety Engineers, CTQs with a direct safety impact are rated higher.

Before moving further into the details, it is important to mention that the DMAIC methodology is used to analyze risk using various tools in each step. DMAIC is a systematic way of performing a Six Sigma analysis in complex processes.

As a first step in performing the risk assessment, a fishbone (Ishikawa) diagram is created for the relevant CTQ to define the potential causes (X's). It helps to brainstorm and visually organize all the potential causes that might be contributing to the CTQ. By mapping out these causes and their sub-categories, the fishbone helps to move towards root cause analysis.

Figure 1: The Risk Assessment process

Figure 2 provides an example of the fishbone analysis performed for one of the CTQs. This provides a first look into the analysis step and helps to identify possibilities, but not certainties. Given limited resources, the improvement needs to focus on those X's that can lead to significant improvements in the CTQ. Hence, the fishbone is followed up with a Cause-and-Effect Matrix. In this tool, all the X's are ranked against each relevant CTQ, i.e. the strength (or likelihood) of each cause is assessed together with the severity of its effect on the CTQ. This allows the team to prioritize the most critical causes, those most likely to be contributing significantly to the issue.

Figure 2: An example of fishbone analysis

After the Cause-and-Effect Matrix is completed together with the team, the resulting X's are plotted on a simple graph ordered by priority number (high to low). Several X's are found to be close to each other, indicating that multiple factors contribute to the CTQ. These X's range from human error (inconsistent sampling, analysis, and data recording) to equipment issues (calibration, breakdowns, outdated equipment) to personnel training and knowledge requirements.

Figure 3: Result of C&E matrix analysis


One year after implementing the corrective actions identified through the fishbone and Cause-and-Effect (C&E) analysis, a review is conducted to gauge their effectiveness. This evaluation aims to assess whether the implemented measures have successfully addressed the identified issues and resulted in the desired improvements. Both automated and manual data have been collected over a period of one year and are available for risk evaluation.
To improve efficiency within the analysis phase, a two-pronged approach can be beneficial: Process
Analysis (process door) and Data Analysis (data door).

A. Process Analysis: This approach involves directly examining the process itself to identify potential
bottlenecks and opportunities for improvement. Key tools and techniques under this approach
include:
o Process Mapping: Visualizing the steps in a process using flowcharts can help identify areas of inefficiency and delay.
o Value Stream Mapping: This technique (referred to as Makigami) highlights the value-adding
steps within a process and identifies areas of "waste" (non-value-adding activities). Reducing
such waste can minimize errors and improve overall process flow. The rationale behind this
is that complex processes create more opportunities for mistakes. Streamlining the process
simplifies tasks and reduces error risk.
o Focus on Streamlining: The goal of process analysis is to identify and eliminate redundant and
risk-inducing steps, ultimately leading to a more efficient and reliable process.
Process Optimization Through Redundancy Review: This section proposes a review of a specific process step to identify potential areas for streamlining. The process involves document checks by both the security guard and the laboratory. This step was performed during the lecture.
Current Process Analysis:
1. Input (I): Security Guard receives CMR (transport document) and WIP (waste identity passport).
2. Output (O): Security Guard verifies documents and grants entry or initiates a "non-conformity
chain" if discrepancies are found.
3. Input (I): Laboratory receives the same documents (CMR and WIP).
4. Output (O): Laboratory verifies documents and initiates a "non-conformity chain" for
discrepancies.

Proposed Optimization:
• Security Guard: Focus on core security functions.
o Rationale: Security guards lack training in chemical components, making WIP verification
less reliable.
• Laboratory: Maintain document verification due to expertise.
o Rationale: Laboratory personnel are trained to understand both documents and to verify them.
Eliminating the Customer Reaction Check:
Safety considerations do not require knowledge of the customer's response to discrepancies. The proposal is therefore to exclude this information from the document checks.
B. Data-Driven Analysis: This section outlines a structured approach to process analysis using data. The
approach, known as "Data-Driven Analysis or Data door," emphasizes a sequential execution of three
key stages (if necessary) for rigorous and actionable insights. It's important to follow these stages in
order, as each builds upon the findings of the previous one. If a stage provides a clear picture of the
root cause, it might be unnecessary to proceed further.
Stage 1: Visual Exploration
The first stage focuses on visualizing data to identify patterns, trends, and potential issues. This involves using various graphical tools, such as:
• Histograms: These reveal the distribution of data, highlighting skewness, outliers, or unexpected
patterns.
• Time Series Plots: These charts show data points over time, helping detect trends, shifts, or
cyclical patterns.
• Pareto Charts: These highlight the most significant factors contributing to an issue, following the
80/20 rule (80% of the effect comes from 20% of the causes).
• Control Charts: These monitor process stability and identify deviations from desired control limits.
• Box Plots: These provide a quick overview of data distribution by summarizing quartiles and
identifying outliers.
These techniques give a comprehensive picture of the data and potential areas for improvement.
Depending on the insights gained here, it may or may not be necessary to proceed to further stages.
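As an illustration of this stage, here is a minimal Python sketch (assuming the truck data sits in a hypothetical truck_data.csv with columns such as arrival and residence_min; these names are placeholders, not the actual dataset):

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical input: one row per truck, with an arrival timestamp
# ("arrival") and a numeric residence time in minutes ("residence_min").
df = pd.read_csv("truck_data.csv", parse_dates=["arrival"])

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Histogram: reveals skewness, outliers, or unexpected clusters.
df["residence_min"].plot.hist(bins=50, ax=ax1)
ax1.set_xlabel("Residence time (min)")

# Control-chart style view: daily mean with +/- 3 sigma limits.
daily = df.set_index("arrival")["residence_min"].resample("D").mean()
mu, sigma = daily.mean(), daily.std()
daily.plot(ax=ax2, marker="o")
ax2.axhline(mu, linestyle="--")
ax2.axhline(mu + 3 * sigma, color="red")
ax2.axhline(mu - 3 * sigma, color="red")
ax2.set_ylabel("Daily mean residence time (min)")

plt.tight_layout()
plt.show()
```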

Stage 2: Statistical Inferences
This stage utilizes statistical methods to draw more robust conclusions about the data. Here are some key tools used:
• Confidence Intervals: These estimate the range within which a population parameter is likely to
lie, based on a sample.
• Hypothesis Testing: This involves testing assumptions about a population parameter (e.g.,
average) to determine if there's evidence to support a specific claim.
• Correlation and Regression Analysis: These assess the strength and direction of relationships
between variables. Regression analysis can even predict the impact of one variable on another.
• Normality Testing: This verifies if a data set follows a normal distribution, which is a prerequisite
for many statistical tests.
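To make this concrete, here is a minimal sketch of a confidence interval and a one-proportion hypothesis test, using the CTQ1 figures reported later in this document (3531 correct out of 4000) and the statsmodels library:

```python
from statsmodels.stats.proportion import proportion_confint, proportions_ztest

# CTQ1 figures from this report: 3531 correct CMR+WIP out of 4000 trucks.
correct, total = 3531, 4000

# 95% confidence interval for the true conformity proportion.
low, high = proportion_confint(correct, total, alpha=0.05, method="wilson")
print(f"Conformity rate: {correct / total:.1%}, 95% CI: [{low:.1%}, {high:.1%}]")

# Hypothesis test: is the true rate below the ~95% target?
# H0: p >= 0.95 versus H1: p < 0.95.
stat, pvalue = proportions_ztest(correct, total, value=0.95, alternative="smaller")
print(f"z = {stat:.2f}, p-value = {pvalue:.4f}")
```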

Stage 3: Design of Experiments (DOE)


If the initial stages don't provide a definitive answer, this final stage involves planning and conducting
controlled experiments. DOE allows us to test the influence of multiple variables on the process and
identify the optimal settings for achieving desired outcomes. It's particularly valuable for understanding
how variables interact with each other.
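As a sketch of what a DOE setup could look like, the snippet below generates a 2^3 full-factorial design; the factors and levels are hypothetical examples, not factors identified in this exercise:

```python
from itertools import product

# Hypothetical factors that might influence NC rate or residence time.
factors = {
    "sampling_method": ["manual", "automated"],
    "arrival_block":   ["before_noon", "after_noon"],
    "doc_precheck":    ["off", "on"],
}

# Full-factorial design: every combination of factor levels is run once,
# which allows main effects and interactions to be estimated.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
for i, run in enumerate(runs, start=1):
    print(f"Run {i}: {run}")
```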

In essence, the Data Door and Process Door work together during DMAIC analyses:
• The Process Door helps define the "what" and "how" of the process being analyzed.
• The Data Door helps analyze the "why" behind process performance and identify areas for
improvement.

The following approach is used for the Data door:

Figure 4: Data door: Data preparation

a. Prepare: Part of the data is collected by an automated system and part is filled in manually. The data was checked for logic and completeness. Some findings (explored in the sketch after this list):
i. A significant number of trucks have a very low residence time (<5 min), which does not seem reasonable given the practicalities of sampling and unloading.
ii. Many trucks share identical residence times (e.g. 0:00:13, 0:05:43, 0:02:50), which points to some manipulation of the recorded residence time.
iii. All country codes follow the format "Cxx", but one of them is "P-L".
iv. The manual part of the data is filled in, but typing errors are hard to identify.
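The findings above could be reproduced with checks along the following lines (a sketch only; the file and column names are assumptions):

```python
import pandas as pd

df = pd.read_csv("truck_data.csv", parse_dates=["time_in", "time_out"])

# Findings i and ii: implausibly short or suspiciously repeated residence times.
df["residence_min"] = (df["time_out"] - df["time_in"]).dt.total_seconds() / 60
print(f"{(df['residence_min'] < 5).sum()} trucks with residence time < 5 min")
repeats = df["residence_min"].value_counts()
print(repeats[repeats > 10].head())  # identical values shared by many trucks

# Finding iii: country codes that do not match the "Cxx" pattern.
bad_codes = df[~df["country"].str.fullmatch(r"C\d{2}", na=False)]
print(bad_codes["country"].unique())
```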

b. Grouping: To get more insight from the data table, the following grouping columns are added:
i. Day of the week
ii. Residence time
iii. Country of origin (haulier)
iv. Standard/non-standard truck
v. Waste type
vi. Kind of truck
Next, Excel-based pivot tables are used to calculate the five CTQs. The whole exercise is performed as a group exercise (with Femke, James, and Ankit participating).
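A pandas equivalent of such a pivot table might look like this (a sketch under the same hypothetical column names as above; the actual exercise used Excel):

```python
import pandas as pd

df = pd.read_csv("truck_data.csv", parse_dates=["time_in"])

# Derive a grouping column, mirroring the Excel exercise.
df["weekday"] = df["time_in"].dt.day_name()

# Pivot: share of correct CMRs per weekday and waste type,
# assuming "cmr_ok" is a 0/1 conformity flag.
pivot = pd.pivot_table(
    df,
    values="cmr_ok",
    index="weekday",
    columns="waste_type",
    aggfunc="mean",
)
print(pivot.round(3))
```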

CTQ1: Waste documentation is correct (E-CMR and E-ID/WIP)

The observations and conclusions are:
➢ Correct CMR: 3603 out of 4000 (90.1%)
➢ Correct CMR and WIP: 3531 out of 4000 (88.3%)
Improvement target: CTQ1 is at ~88%, which means the first logical Six Sigma-based target is to reach the 2σ level, i.e. ~95%. This improvement goal is critical for ensuring high-quality documentation and minimizing errors in the waste management process.
Basic Interpretation
➢ Positive correlation between correct CMR and correct WIP
➢ 50% of incorrect CMRs have an unavailable WIP, but this is not correlated with the correctness of the WIP (checked in the sketch below)
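The correlation claim for two binary flags can be checked with a contingency table and the phi coefficient, sketched below (the flags cmr_ok and wip_ok are assumed column names):

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency

df = pd.read_csv("truck_data.csv")  # hypothetical file and columns

# 2x2 contingency table of the two conformity flags.
table = pd.crosstab(df["cmr_ok"], df["wip_ok"])
print(table)

# Phi coefficient: a correlation measure for two binary variables.
chi2, p, dof, _ = chi2_contingency(table, correction=False)
phi = np.sqrt(chi2 / len(df))
print(f"phi = {phi:.2f}, p-value = {p:.4f}")
```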

CTQ2: Waste identification is correct (E-ID = Waste)


The observations and conclusions are:
➢ Out of the 3873 available WIPs, 3796 are in line with the laboratory analysis of the waste, accounting for 97.3% of the available WIPs.

Improvement target: CTQ2 is at ~97%, which means the next logical Six Sigma-based target is to reach the 3σ level, i.e. ~99.7%.

There is a positive correlation between the WIP document and the lab confirmation of the WIP.

CTQ 3: PPE determination is correct


The observations and conclusions are:
➢ There are 85 NCs out of 4000 instances (~2.1%) where a problem (NC) was observed by production. It is assumed that in these instances the operator was not wearing the correct PPE. NCs detected by the lab are not an issue, as production is informed about them and wears the correct PPE. The NCs detected by production are different from those of the lab, as production looks more at physical appearance (large blocks, solid particles in liquid waste, liquid in solid waste, ...) rather than chemical properties.

Improvement target: CTQ3 is at ~97.9%, which means the next logical Six Sigma-based target is to reach the 3σ level, i.e. ~99.7%.

CTQ4: No non-conformities detected in waste (both lab and production)


The observations and conclusions are:
➢ In total, 3740 out of 4000 cases with no NC were detected by both the lab and production, accounting for 93.5% of the total cases.

Improvement target: CTQ4 is at ~93.5%, which means the target is to reach the 2σ level, i.e. ~95%.

CTQ5: Sampling and unloading is as fast as possible

While faster sampling and unloading times are desirable from an efficiency standpoint, safety considerations require a balanced approach. Using the in and out times of the trucks, a simple residence time (in minutes) is calculated and plotted on a histogram to understand the underlying trend.
The histogram shows a significant number of trucks (~350) with a residence time of less than 11 minutes. The majority of them show conformity, which means they go to the unloading area, get a conformity check from production, unload, and then leave. Completing all of this in less than 11 minutes raises questions about the integrity of the data recording.
The histogram does not show normally distributed behavior, indicating that multiple factors play a role in the outcome. A normality check with a QQ plot confirms that the residence time data is not normally distributed, as it shows a heavy-tailed distribution at both ends.
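The QQ plot and a complementary formal test can be reproduced with scipy; a minimal sketch, again under the hypothetical column names used earlier:

```python
import matplotlib.pyplot as plt
import pandas as pd
from scipy import stats

df = pd.read_csv("truck_data.csv")
residence = df["residence_min"].dropna()

# QQ plot: curvature away from the reference line at both ends
# indicates a heavy-tailed, non-normal distribution.
fig, ax = plt.subplots()
stats.probplot(residence, dist="norm", plot=ax)
ax.set_title("QQ plot of truck residence time")
plt.show()

# Shapiro-Wilk as a formal complement (a subsample is used because
# the test becomes overly sensitive for large n).
sample = residence.sample(min(len(residence), 500), random_state=0)
stat, pvalue = stats.shapiro(sample)
print(f"Shapiro-Wilk: W = {stat:.3f}, p = {pvalue:.4f}")
```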

Graphical Analysis (first step of the Data door): The goal is to identify potential root causes of these issues by applying graphical analysis.

Pareto Analysis: Given the discrete nature of the data (categorized as conforming or non-conforming),
Pareto charts were determined to be a suitable tool for further analysis. Following the Pareto Principle
(80/20 rule), Pareto charts can help identify the most frequent non-conformance (NC) categories, which
are likely to represent the root causes for most of the issues.
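A Pareto chart of NC categories can be built as follows (a sketch; nc_flag and nc_category are assumed column names):

```python
import matplotlib.pyplot as plt
import pandas as pd

df = pd.read_csv("truck_data.csv")
nc = df[df["nc_flag"] == 1]  # non-conforming trucks only

# Count NCs per category, descending, plus the cumulative share.
counts = nc["nc_category"].value_counts()
cum_pct = counts.cumsum() / counts.sum() * 100

fig, ax1 = plt.subplots()
counts.plot.bar(ax=ax1)
ax1.set_ylabel("NC count")

ax2 = ax1.twinx()
ax2.plot(range(len(cum_pct)), cum_pct.values, color="red", marker="o")
ax2.set_ylabel("Cumulative %")
ax2.axhline(80, linestyle="--")  # 80/20 reference line
plt.show()
```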

An initial Pareto chart was created encompassing all NCs identified by both the laboratory and production. This served as a starting point for pinpointing the most common defects.

To delve deeper, the data was segmented into several subsets for more granular Pareto analysis:
• Day of the Week: This analysis explored potential variations in NC frequency across different days.
• Time Block of Arrival (Before/After Noon): This subset investigated if arrival times influenced NC
occurrences.
• Truck Type: This analysis examined if specific truck types were more prone to NCs.
• Waste Type: This analysis explored associations between specific waste types and higher NC rates.
• Trailer Presence: This subset assessed if the presence of a trailer impacted NC frequency.
• Country of Origin: This analysis investigated if NCs were more prevalent from specific countries.

While most subsets yielded limited insights, the country analysis revealed a potential area for further
investigation. The Pareto chart for this subset highlighted Country C03 and Country C99 as significant
contributors to NCs, together accounting for a substantial portion of the total.
Drilling Down by Country:

To further explore the issues with these two countries, a Pareto chart of NC frequency per country was created. This confirmed that C03 and C99 together account for 64% of all NCs.

Customer-Level Analysis within High-NC Countries:

To delve deeper into the reasons behind the NCs in C03 and C99, the analysis focused on individual customer contributions within each country. In Country C03, Customers 152, 150, 058, 038, 023, and 030 were identified as significant contributors, accounting for 34% of the total NCs in this country. In Country C99, Customers 152, 150, 023, 102, and 151 were identified as top contributors, accounting for 39% of the total NCs in this country. Interestingly, Customers 152, 150, and 023 appeared on both lists, suggesting potential systemic issues affecting their NC rates.
Expanding the Investigation:
To ensure a comprehensive assessment, the entire dataset was re-examined to analyze NCs across all countries. The aim was to determine whether the observed pattern with Customers 152, 150, and 023 held true across the entire dataset, not just in C03 and C99. A Pareto chart was created to visualize this analysis and identify the major customer contributors to NCs across all countries.

Consistent Patterns and Customer Focus:

Analyzing the complete dataset reinforces the initial observations. Customers 152, 150, and 023 consistently appear as major contributors to NCs across all countries. This finding strengthens the hypothesis that these customers may have systemic issues impacting their compliance rates. Customer 102 also emerges as another significant contributor. Collectively, these four customers account for ~30% of all NCs identified in the entire dataset.

Considering the absolute number of trucks per customer:

It is important to acknowledge that the likelihood of encountering NCs increases proportionally with the volume of waste delivered by a customer. To account for this, a new visual analysis was created (using the same pivot table) that examines the relative number of NCs compared to the total number of trucks delivered by each individual customer, i.e. the percentage of a customer's trucks with NCs relative to that customer's own total (see the sketch below).
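A sketch of this relative view, under the same hypothetical column names:

```python
import pandas as pd

df = pd.read_csv("truck_data.csv")

# Per customer: total trucks, NC count, and the NC rate relative to
# that customer's own delivery volume.
per_customer = df.groupby("customer").agg(
    trucks=("nc_flag", "size"),
    ncs=("nc_flag", "sum"),
)
per_customer["nc_rate"] = per_customer["ncs"] / per_customer["trucks"]

# Sort by rate, but keep volume visible so that high-volume customers
# with moderate rates (e.g. 152) are not overlooked.
print(per_customer.sort_values("nc_rate", ascending=False).head(10))
```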

Customer Analysis and Prioritization

The analysis reveals a clear pattern: certain customers deliver waste with higher NC rates relative to the total volume they contribute. This necessitates a closer look at both high-volume and high-percentage NC contributors.
Key Customers and NC Rates:
• Customer 152: This customer contributes a significant volume (259 units) with a moderate NC
rate (14.3%). While the NC rate isn't the highest, the large volume warrants further investigation.
• Customer 150: Despite a high volume (218 units), this customer has a relatively low NC rate (6.9%)
compared to others.
• Customer 102: This customer stands out with a very high NC rate (53.6%) despite delivering a
lower volume (28 units). This suggests potentially severe quality or compliance issues.
• Customer 023: Like Customer 150, this customer delivers a significant volume (147 units) with a
relatively low NC rate (6.8%).
Additional High-NC Customers:
Several other customers also exhibit high NC rates (>15%), although their overall volumes are lower:
• Customer 007: 56% NC rate (25 deliveries with 14 NCs)
• Customer 086: 21.4% NC rate (14 deliveries with 3 NCs)
• Customer 139: 19% NC rate (21 deliveries with 4 NCs)
• Customer 010: 15.8% NC rate (19 deliveries with 3 NCs)

Prioritization for Improvement:

Based on the findings, our immediate focus should be:
• Customer 102: This customer has the highest NC rate, demanding a targeted intervention
strategy.
• Customer 152: While the NC rate is moderate, the high volume necessitates investigation to
identify and address potential root causes.
Customers 007, 086, 139, 010, and 030 warrant attention as well due to their high NC rates, even with
lower overall volumes.

Focus on Targeted Actions:

Customers 150 and 023, although delivering significant volumes, have comparatively lower NC percentages. However, their contribution to overall NCs due to volume cannot be disregarded.

Conclusion:
The analysis clearly shows that specific customers are the primary contributors to NCs. Further statistical
analysis like hypothesis testing or regression might not be necessary at this stage. Instead, the priority
should be to develop targeted actions to address the identified root causes and improve compliance.

Process Improvement Strategy: This section outlines a multi-pronged approach to address the identified
issues and improve overall waste management compliance.
1. Customer engagement and auditing, followed up with a corrective action plan.
2. Collaborative learning via training, workshops, and role play.
3. Continuous monitoring and real-time feedback.
4. Establish a Six Sigma benchmark and define measurable KPIs: set specific, measurable targets for reducing NC rates. For example, the goal might be to achieve less than 5% NCs for high-volume customers and less than 10% for all others within a defined timeframe (e.g., the next quarter); see the sketch after this list.
5. Streamline documentation and ensure integrity in data collection.
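To illustrate point 4, here is a sketch that flags customers against the proposed targets (the high-volume cut-off of 100 trucks is an assumption for illustration, reusing the per-customer table from the earlier sketch):

```python
import pandas as pd

df = pd.read_csv("truck_data.csv")  # hypothetical file and columns
per_customer = df.groupby("customer").agg(
    trucks=("nc_flag", "size"), ncs=("nc_flag", "sum")
)
per_customer["nc_rate"] = per_customer["ncs"] / per_customer["trucks"]

# Proposed KPI: <5% NC rate for high-volume customers, <10% for all others.
HIGH_VOLUME = 100  # assumed cut-off for "high volume"
per_customer["target"] = [
    0.05 if n >= HIGH_VOLUME else 0.10 for n in per_customer["trucks"]
]
flagged = per_customer[per_customer["nc_rate"] > per_customer["target"]]
print(flagged.sort_values("nc_rate", ascending=False))
```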
Enhancing the fishbone and C&E matrix:
The previous fishbone and Cause-and-Effect (C&E) matrix analysis focused heavily on internal factors contributing to NCs, such as inadequate training, maintenance issues, and detection by guards, labs, and operators. While valuable, this approach primarily examined day-to-day internal operations and neglected potential process inefficiencies within the company itself, such as redundant guard tasks or the lack of source tracing for NCs. Addressing the root cause of an NC directly translates to a more efficient downstream process. In addition, the data analysis provided insights on external factors, e.g. certain countries and customers.

Going forward, it's crucial to incorporate the customer as a potential "X" factor in future fishbone
diagrams. By doing so, the customer is more likely to be identified as a high-ranking factor in the C&E
matrix, enabling quicker and more targeted interventions to improve the overall process. This reinforces
the importance of a comprehensive understanding of the entire process and a willingness to think
creatively when constructing an Ishikawa diagram. This approach ultimately increases the chances of
pinpointing root causes during the C&E matrix or Failure Mode and Effects Analysis (FMEA) stages.
