Quality Midsem

This document provides an introduction to quality engineering and design of experiments. It discusses control charts for online quality control and offline control methods used in the design phase. Design of experiments can be used to identify key variables, maximize yield and minimize cost. Examples are provided on experimental design for injection molding and plasma etching processes. Analysis of variance (ANOVA) is introduced as a tool for determining which factors have a statistically significant effect on a response.


Introduction to

Quality Engineering
Prof. Sayak Roychowdhury
What this course is about
• Design of Experiments
• Control charts deal with online quality control
• Off-line control methods used in design phase of the
product/process/service
• Never-ending process of quality improvement
Example

• Fisher’s (optimal) experimental design & ANOVA in USA
https://www.agry.purdue.edu/ext/corn/news/timeless/images/US_Corn_Yld_Trend.png
Nobel Prize and what not
• https://theconversation.com/how-randomised-trials-
became-big-in-development-economics-128398
Engineering Example
• Improvement of Injection Molding Process
(Park, K., & Ahn, J. H. (2004). Design of experiment considering two-way interactions and its application to
injection molding processes with numerical analysis. Journal of Materials Processing Technology, 146(2), 221-
227.)

• For mold design, four input factors:


• Positions of subgate, positions of main gate, diameters of sub-
runners, diameters of main runners
• Output (response):
• Injection pressure (minimize)
• Flow balance (filling time difference)
• Cooling efficiency
Engineering Example
Engineering Example
• For process conditions, 8 input factors

• Response: Mean part weight


Why DoE?
• Identify key decision variables (screening experiments)
• Identify important factors and associated levels to
maximize yield (profit) and minimize cost
• Choice of appropriate parameters and levels for
manufacturing phase or delivery of services
• Robust design (Taguchi Methods) to incorporate
robustness against uncontrollable/noise factors
Food for Thought
• What process/product/service you want to improve?
• What are the input parameters (factors)?
• What are the measurable outputs?
ANOVA
Prof. S. Roychowdhury
The Statapult
Example: More than 2 Levels

• More than 2 samples


• Distance travelled in setting 1 (inches)
11,13,12,10,11
• Distance travelled in setting 2 (inches)
17,14,13,15,15
• Distance travelled in setting 3 (inches)
19,17,21,23,18
Question: Does setting significantly
affect travelling distance?
Plasma Etching
Plasma Etching

Etch Rate
Power (W) 1 2 3 4 5
160 575 542 530 539 570
180 565 593 590 579 610
200 600 651 610 637 629
220 725 700 715 685 710
1 Factor: More than 2 Levels

• Concept of ANOVA
• ANOVA Table
• Formulas
• Conclusion
Central Limit Theorem (CLT)
• Definition: If $x_1, \dots, x_n$ are independent random variables with mean $\mu_i$ and variance $\sigma_i^2$, and if $y = x_1 + \dots + x_n$, then the distribution of
$$ \frac{y - \sum_{i=1}^{n}\mu_i}{\sqrt{\sum_{i=1}^{n}\sigma_i^2}} $$
approaches the $N(0,1)$ distribution as $n$ approaches infinity.
(Montgomery D.C., Introduction to Statistical Quality Control)

• It implies that the sum of $n$ independently distributed random variables is approximately normal, regardless of the distribution of the individual variables.

• If 𝑥𝑖 are independent and identically distributed (IID), and distribution of each 𝑥𝑖 does not
depart radically from normal distribution, then CLT works quite well for 𝑛 ≥ 3 𝑜𝑟 4.
(common in SQC problems)
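As an illustrative sketch (not from the slides), a short simulation shows the theorem at work: standardized sums of non-normal uniform random variables look standard normal even for a modest $n$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 12, 100_000

# Each x_i ~ Uniform(0,1): mu_i = 1/2, sigma_i^2 = 1/12
x = rng.uniform(0.0, 1.0, size=(reps, n))
y = x.sum(axis=1)                         # y = x_1 + ... + x_n
z = (y - n * 0.5) / np.sqrt(n / 12.0)     # standardize as in the CLT statement

print(f"mean(z) = {z.mean():.3f}, std(z) = {z.std():.3f}")
# The fraction inside +/-1.96 should be close to the normal value 0.95
print(f"P(|z| < 1.96) ~ {(np.abs(z) < 1.96).mean():.3f}")
```

The empirical mean, standard deviation, and tail coverage match the $N(0,1)$ values closely, consistent with the claim that the CLT works well even for small $n$ when the parent distribution is not too far from normal.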
Important Sampling Distributions Derived
from Normal Distribution
1. $\chi^2$ distribution: If $x_1, \dots, x_n$ are standard normally and independently distributed, then $y = x_1^2 + x_2^2 + \dots + x_n^2$ follows a chi-squared distribution with $n$ degrees of freedom.
2. $t$-distribution: If $x$ is a standard normal variable and $y$ is a chi-squared random variable with $k$ degrees of freedom, and if $x$ and $y$ are independent, then the random variable $t = x/\sqrt{y/k}$ is distributed as $t$ with $k$ degrees of freedom.
3. $F$-distribution: If $w$ and $y$ are two independent chi-squared random variables with $u$ and $v$ degrees of freedom, then the ratio $F = (w/u)/(y/v)$ follows the $F$ distribution with $(u, v)$ degrees of freedom.
ANOVA
Minitab: Stat → ANOVA → One-Way ANOVA;
Graphs: select Boxplot, Normal Probability Plot of Residuals

• Example 3: The null hypothesis is rejected at $\alpha = 0.05$. The average distance travelled differs significantly with the settings.
ANOVA
• How do I explain (model) the variation?
ANOVA
Means Model
𝒚𝒊𝒋 = 𝝁𝒊 + 𝝐𝒊𝒋
𝑖 = 1, . . 𝑎 levels
𝑗 = 1, . . 𝑛 observations in each level
Null hypothesis 𝐻0 : 𝜇1 = 𝜇2 … . = 𝜇𝑎
𝐻1 : 𝜇𝑖 ≠ 𝜇𝑗 for at least one pair of 𝑖, 𝑗

Effects Model
𝒚𝒊𝒋 = 𝝁 + 𝝉𝒊 + 𝝐𝒊𝒋
𝑖 = 1, . . 𝑎 levels
𝑗 = 1, . . 𝑛 observations in each level
Null hypothesis 𝐻0 : 𝜏1 = 𝜏2 … . = 𝜏𝑎 = 0
𝐻1 : 𝜏𝑖 ≠ 0 for at least one 𝑖
ANOVA
Linear Statistical Model
$$ y_{ij} = \mu + \tau_i + \epsilon_{ij}, \quad i = 1, \dots, a \text{ levels}; \; j = 1, \dots, n \text{ observations in each level} $$
where $\mu$ is the grand mean, $\tau_i$ is the treatment effect (effect due to level $i$), and $\epsilon_{ij}$ is the random error component, $\epsilon_{ij} \sim N(0, \sigma^2)$.
ANOVA Calculations

$$ \sum_{i=1}^{a}\sum_{j=1}^{n}\left(y_{ij} - \bar{y}_{..}\right)^2 = n\sum_{i=1}^{a}\left(\bar{y}_{i.} - \bar{y}_{..}\right)^2 + \sum_{i=1}^{a}\sum_{j=1}^{n}\left(y_{ij} - \bar{y}_{i.}\right)^2 $$

$SS_T = SS_{Treatment} + SS_E$

Assumption: each of the $a$ populations comes from a normal distribution.
If $H_0$ is true, then $F_0$ (see ANOVA table) follows the $F$ distribution with $a-1$ and $a(n-1)$ degrees of freedom.
Procedure: Reject $H_0$ if $F_0 > F_{\alpha,\, a-1,\, a(n-1)}$ or the P-value $< 0.05$, meaning the differences among the average responses at different levels are significant.
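For the statapult data introduced earlier, this procedure can be run directly with SciPy (a minimal sketch; `scipy.stats.f_oneway` computes the same $F_0$ as the hand calculation):

```python
from scipy import stats

# Distance travelled (inches) at three statapult settings
setting1 = [11, 13, 12, 10, 11]
setting2 = [17, 14, 13, 15, 15]
setting3 = [19, 17, 21, 23, 18]

F0, p = stats.f_oneway(setting1, setting2, setting3)

a, n = 3, 5
F_crit = stats.f.ppf(0.95, a - 1, a * (n - 1))   # F_{0.05, 2, 12}

print(f"F0 = {F0:.2f}, F_crit = {F_crit:.2f}, p = {p:.4f}")
if F0 > F_crit:
    print("Reject H0: the setting significantly affects travel distance")
```

Here $F_0 \approx 27.4$ far exceeds $F_{0.05,2,12} \approx 3.89$, matching the slide's conclusion that the settings differ.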
Assumptions for ANOVA
• The model error is assumed to be independent and
normally distributed random variables, with mean 0 and
variance 𝜎 2
• 𝑦𝑖𝑗 ~𝑁(𝜇 + 𝜏𝑖 , 𝜎 2 )
• 𝜖𝑖𝑗 ~𝑁(0, 𝜎 2 )
• Observations are mutually independent
• The variance 𝜎 2 is assumed to be same for all levels of the
factor.
Statistical Analysis
• $SS_T$ is a sum of squares of normally distributed random variables.
• It can be shown that $SS_T/\sigma^2 \sim \chi^2(N-1)$
• It can be shown that $SS_E/\sigma^2 \sim \chi^2(N-a)$, and
• $SS_{Treatments}/\sigma^2 \sim \chi^2(a-1)$ if $H_0: \tau_i = 0 \;\forall i$ is true
• But the three sums of squares are not independent.
Cochran’s Theorem
• Let $Z_i$ be $NID(0,1)$ for $i = 1, 2, \dots, \nu$ and
$$ \sum_{i=1}^{\nu} Z_i^2 = Q_1 + Q_2 + \dots + Q_s $$
where $s \le \nu$ and $Q_i$ has $\nu_i$ degrees of freedom. Then $Q_1, \dots, Q_s$ are independent chi-square random variables with $\nu_1, \dots, \nu_s$ degrees of freedom respectively, if and only if
$$ \nu = \nu_1 + \dots + \nu_s $$
Cochran’s Theorem
Cochran’s theorem implies that $SS_{Treatments}/\sigma^2$ and $SS_E/\sigma^2$ are independently distributed chi-square random variables.

So if $H_0$ of no difference in treatments is true, the ratio
$$ F_0 = \frac{SS_{Treatments}/(a-1)}{SS_E/(N-a)} = MS_{Treatments}/MS_E $$
is distributed as $F$ with $(a-1)$ and $(N-a)$ degrees of freedom.
ANOVA Calculations
ANOVA F-crit: $F_{0.05,\,2,\,12}$
Key Considerations for ANOVA
• Experiments have to be performed in random order so
that the environment in which the treatments are applied
is as uniform as possible. The experimental design should
be completely randomized design.
• When the $a$ treatment levels are specifically chosen by the experimenter, the conclusions cannot be extended to similar treatments that were not considered. This is called the Fixed Effects Model.
• When the $a$ treatment levels are chosen randomly out of a larger population of treatments, the $\tau_i$ (treatment effects) are random variables and we try to estimate the variability in $\tau_i$. This is called the Random Effects Model.
Key Considerations for ANOVA
• In Fixed Effects Model we test hypotheses about the
treatment means.
• The conclusions for FEM are only applied to the factor
levels considered.
• The model parameters (𝜇, 𝜏𝑖 , 𝜎 2 ) can be estimated from
FEM.
• In the Random Effects Model, conclusions can be extended to all treatments.
• In the REM, the $\tau_i$ are random variables, and the variability of the $\tau_i$ is estimated.
Which Factors are Significant

no
F-test finds
Terminate
Significance?

Perform Pair-wise
“LSD” Comparisons Terminate
Or Tukey’s test
Least Significant Difference
▪ $\text{LSD} = t_{1-\alpha/2,\; \text{dof of residuals}} \cdot \sqrt{MS_E\left(\frac{1}{n_1} + \frac{1}{n_2}\right)}$

▪ Check if $|\bar{y}_i - \bar{y}_j| > \text{LSD}$ for all $i, j \in 1 \dots a$, $i \ne j$

▪ You can make plots to show which levels (1, 2, 3) are different

Final conclusion: Setting 3 provides significantly more travel than settings 1 and 2. There is no significant difference among the average distances travelled as a result of settings 1 and 2.
Tukey’s Test
• When the null hypothesis is rejected, all pairwise means are to be compared: $H_0: \mu_i = \mu_j$; $H_1: \mu_i \ne \mu_j$
• Tukey’s procedure has significance level $\alpha$ when sample sizes are equal, and at most $\alpha$ when sample sizes are not equal. It keeps the family error rate at the selected level $\alpha$.
• Tukey’s procedure uses the Studentized Range Distribution
• Studentized range statistic: $q = \dfrac{\bar{y}_{max} - \bar{y}_{min}}{\sqrt{MS_E/n}}$
• If $|\bar{y}_{i.} - \bar{y}_{j.}| > T_\alpha = q_{1-\alpha}(a, f)\sqrt{MS_E/n}$,
• where $a$ is the number of treatment levels and $f$ is the dof associated with $MS_E$, then the two means are significantly different
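The Tukey critical value $T_\alpha$ for the statapult example ($a = 3$ levels, $n = 5$, error dof $f = 12$, with $MS_E = 3.1$ computed from the distances given earlier) can be sketched with SciPy's studentized-range distribution (available in SciPy ≥ 1.7):

```python
import math
from scipy.stats import studentized_range

a, n, f = 3, 5, 12        # treatment levels, replicates, error dof
MSE = 3.1                 # MS_E from the one-way ANOVA of the statapult data
alpha = 0.05

q = studentized_range.ppf(1 - alpha, a, f)   # q_{0.95}(3, 12), ~3.77 from tables
T_alpha = q * math.sqrt(MSE / n)

print(f"q = {q:.3f}, T_alpha = {T_alpha:.3f}")
# Pairs of level means differing by more than T_alpha are declared different
```

The computed $q$ agrees with the tabled value $q_{0.05}(3, 12) \approx 3.77$, so table lookups can be replaced by this call.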
Confidence Intervals
• $100(1-\alpha)\%$ confidence intervals for all pairs of means may be constructed:
$$ \bar{y}_{i.} - \bar{y}_{j.} - q_{1-\alpha}(a, f)\sqrt{\frac{MS_E}{n}} \;\le\; \mu_i - \mu_j \;\le\; \bar{y}_{i.} - \bar{y}_{j.} + q_{1-\alpha}(a, f)\sqrt{\frac{MS_E}{n}} $$
For unequal sample sizes, check if the absolute difference of sample means is greater than
$$ T_\alpha = \frac{q_{1-\alpha}(a, f)}{\sqrt{2}}\sqrt{MS_E\left(\frac{1}{n_i} + \frac{1}{n_j}\right)} $$
Estimating Model Parameters
• $y_{ij} = \mu + \tau_i + \epsilon_{ij}$ for $i = 1, \dots, a$; $j = 1, \dots, n$
• The model parameters can be estimated as below:
• $\hat{\mu} = \bar{y}_{..}$
• $\hat{\tau}_i = \bar{y}_{i.} - \bar{y}_{..}$ for $i = 1, \dots, a$
• $i$-th treatment mean: $\hat{\mu}_i = \bar{y}_{i.}$
Confidence interval of the $i$-th treatment mean:
$$ \bar{y}_{i.} - t_{1-\alpha/2,\, N-a}\sqrt{\frac{MS_E}{n}} \le \mu_i \le \bar{y}_{i.} + t_{1-\alpha/2,\, N-a}\sqrt{\frac{MS_E}{n}} $$
Model Adequacy Checking
• To check the assumption of normality of the error term
find the residuals: 𝑒𝑖𝑗 = 𝑦𝑖𝑗 − 𝑦ො𝑖𝑗 = 𝑦𝑖𝑗 − 𝑦ത𝑖 .
• The residuals should be structureless.
• Normal probability plot of residuals is an effective way to
check the assumption of normality
• Check outliers with standardized residuals: $d_{ij} = e_{ij}/\sqrt{MS_E}$
About 68% of the $d_{ij}$ should be within $\pm 1$, 95% within $\pm 2$, and almost all (99.73%) should be within $\pm 3$.
Variance Stabilizing Transformation
Observations → Transformation
• Poisson distribution → Square root: $y'_{ij} = \sqrt{y_{ij}}$ or $y'_{ij} = \sqrt{1 + y_{ij}}$
• Lognormal distribution → Logarithmic: $y'_{ij} = \log y_{ij}$
• Binomial distribution → Arcsine: $y'_{ij} = \arcsin\sqrt{y_{ij}}$
More on ANOVA and
Intro to Contrast
Prof. Sayak Roychowdhury
Model Parameter Estimation
• Model for 1-way ANOVA
𝑦𝑖𝑗 = 𝜇 + 𝜏𝑖 + 𝜖𝑖𝑗
Estimates of $\mu$ and $\tau_i$ may be derived by
$\hat{\mu} = \bar{y}_{\cdot\cdot}$
$\hat{\tau}_i = \bar{y}_{i\cdot} - \bar{y}_{\cdot\cdot}$, $i = 1, 2, \dots, a$
• The mean $\mu_i = \mu + \tau_i$ for level $i$ is estimated by $\hat{\mu}_i = \hat{\mu} + \hat{\tau}_i = \bar{y}_{i\cdot}$
• $100(1-\alpha)\%$ confidence interval for $\mu_i$:
$$ \bar{y}_{i\cdot} - t_{1-\alpha/2,\, N-a}\sqrt{\frac{MS_E}{n}} \le \mu_i \le \bar{y}_{i\cdot} + t_{1-\alpha/2,\, N-a}\sqrt{\frac{MS_E}{n}} $$
Model Parameter Estimation
• The difference in treatment means may be estimated by
$$ \bar{y}_{i\cdot} - \bar{y}_{j\cdot} - t_{1-\alpha/2,\, N-a}\sqrt{\frac{2MS_E}{n}} \le \mu_i - \mu_j \le \bar{y}_{i\cdot} - \bar{y}_{j\cdot} + t_{1-\alpha/2,\, N-a}\sqrt{\frac{2MS_E}{n}} $$

• This way of determining confidence intervals may increase the experimentwise error rate. To mitigate that, the Bonferroni method is used, where $t_{1-\alpha/2r,\, N-a}$ is considered instead of $t_{1-\alpha/2,\, N-a}$, where $r$ is the number of comparisons.
Contrast
• From the outcome of ANOVA, if significance is found, the next step
is to analyze how the effects are influencing the response
• We may test the null hypothesis that the two higher RF power levels produce the same etch rate:
$H_0: \mu_3 = \mu_4$ or $\mu_3 - \mu_4 = 0$ (1)
$H_1: \mu_3 \ne \mu_4$ or $\mu_3 - \mu_4 \ne 0$
• We may test the null hypothesis that the average etch rate produced by low RF power is equal to that at high RF power:
$H_0: \mu_1 + \mu_2 = \mu_3 + \mu_4$ or $\mu_1 + \mu_2 - \mu_3 - \mu_4 = 0$ (2)
$H_1: \mu_1 + \mu_2 \ne \mu_3 + \mu_4$ or $\mu_1 + \mu_2 - \mu_3 - \mu_4 \ne 0$
Contrast
• A contrast is a linear combination of the parameters of the form $\Gamma = \sum_{i=1}^{a} c_i \mu_i$, where $\sum_{i=1}^{a} c_i = 0$
• The example hypotheses can be expressed in the form
$H_0: \sum_{i=1}^{a} c_i \mu_i = 0$
$H_1: \sum_{i=1}^{a} c_i \mu_i \ne 0$
• The $c_i$ are called contrast constants
• For (1): the contrast constants are $c_1 = c_2 = 0$, $c_3 = 1$, $c_4 = -1$
• For (2): the contrast constants are $c_1 = c_2 = +1$, $c_3 = c_4 = -1$
Hypothesis Testing with Contrast
• Null hypothesis $H_0: \sum_{i=1}^{a} c_i \mu_i = 0$
• Contrast in treatment averages: $C = \sum_{i=1}^{a} c_i \bar{y}_{i\cdot}$
• Variance of $C$: $V(C) = \frac{\sigma^2}{n}\sum_{i=1}^{a} c_i^2$
• If $H_0$ is true, then $C/\sqrt{V(C)}$ follows $N(0,1)$. Since $\sigma^2$ is estimated by $MS_E$, the new statistic is:
$$ t_0 = \frac{\sum_{i=1}^{a} c_i \bar{y}_{i\cdot}}{\sqrt{\frac{MS_E}{n}\sum_{i=1}^{a} c_i^2}} $$
• The null hypothesis is rejected if $|t_0| > t_{1-\alpha/2,\, N-a}$
F-test
• $F_0 = t_0^2 = \dfrac{\left(\sum_{i=1}^{a} c_i \bar{y}_{i\cdot}\right)^2}{\frac{MS_E}{n}\sum_{i=1}^{a} c_i^2}$ is an $F$-statistic
• The null hypothesis is rejected if $F_0 > F_{1-\alpha,\,1,\,N-a}$
• $F_0 = \dfrac{MS_c}{MS_E} = \dfrac{SS_c/1}{MS_E}$,
where $SS_c = \dfrac{\left(\sum_{i=1}^{a} c_i \bar{y}_{i\cdot}\right)^2}{\frac{1}{n}\sum_{i=1}^{a} c_i^2}$ is the contrast sum of squares with 1 degree of freedom
Confidence Interval for Contrast
• The $100(1-\alpha)\%$ confidence interval is given by
$$ \sum_{i=1}^{a} c_i \bar{y}_{i\cdot} - t_{1-\alpha/2,\, N-a}\sqrt{\frac{MS_E}{n}\sum_{i=1}^{a} c_i^2} \;\le\; \sum_{i=1}^{a} c_i \mu_i \;\le\; \sum_{i=1}^{a} c_i \bar{y}_{i\cdot} + t_{1-\alpha/2,\, N-a}\sqrt{\frac{MS_E}{n}\sum_{i=1}^{a} c_i^2} $$
Example
Factor: RF power setting at 4 levels. 5 etch rates were
collected at each level. From the data
$\bar{y}_{1\cdot} = 551.2$; $\bar{y}_{2\cdot} = 587.4$; $\bar{y}_{3\cdot} = 625.4$; $\bar{y}_{4\cdot} = 707.0$
$y_{..} = 12355$; $MS_E = 333.70$
Test the hypothesis that
𝐻0 : 𝜇1 + 𝜇2 = 𝜇3 + 𝜇4
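The requested test can be worked through numerically (a sketch using $c = (1, 1, -1, -1)$ and the treatment averages and $MS_E$ given above):

```python
import math
from scipy import stats

ybar = [551.2, 587.4, 625.4, 707.0]   # treatment averages at the 4 power levels
c = [1, 1, -1, -1]                    # contrast: mu1 + mu2 - mu3 - mu4
MSE, n, N, a = 333.70, 5, 20, 4

C = sum(ci * yi for ci, yi in zip(c, ybar))            # contrast estimate
t0 = C / math.sqrt(MSE / n * sum(ci**2 for ci in c))   # t statistic
t_crit = stats.t.ppf(0.975, N - a)                     # t_{0.975, 16}

SS_c = C**2 / (sum(ci**2 for ci in c) / n)             # contrast SS, 1 dof
F0 = SS_c / MSE                                        # equals t0^2

print(f"C = {C:.1f}, t0 = {t0:.2f}, t_crit = {t_crit:.3f}, F0 = {F0:.2f}")
```

Here $|t_0| \approx 11.9$ far exceeds $t_{0.975,16} \approx 2.12$, so the low- and high-power average etch rates differ significantly; note also that $F_0 = t_0^2$, as stated on the F-test slide.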
Scheffe’s Method for Comparison
• Scheffe (1953) proposed a method for any and all possible
contrasts
• Suppose there are $m$ contrasts given by
$\Gamma_u = c_{1u}\mu_1 + \dots + c_{au}\mu_a$, where $u = 1, \dots, m$
• The corresponding contrast in treatment averages is
$C_u = \sum_{i=1}^{a} c_{iu}\bar{y}_{i\cdot}$, where $u = 1, \dots, m$
• The standard error is given by
$S_{C_u} = \sqrt{MS_E \sum_{i=1}^{a} c_{iu}^2/n_i}$
• The critical value to be compared with $|C_u|$ is
$S_{\alpha,u} = S_{C_u}\sqrt{(a-1)\,F_{1-\alpha,\,a-1,\,N-a}}$
• If $|C_u| > S_{\alpha,u}$, then the contrast is significant
Scheffe’s Method
• $C_u - S_{\alpha,u} \le \Gamma_u \le C_u + S_{\alpha,u}$ are simultaneous confidence intervals.
• In Scheffé’s Method, the Type-I error is at most $\alpha$ for any possible comparison.
Randomized Complete
Block Design and Latin
Square Design
Prof. Sayak Roychowdhury
Motivation
• Blocking is a technique to understand the extent of influence of nuisance factors.
• A nuisance factor is a factor that may have an effect on the response variable but is not of primary importance to the experimenter, e.g. variability due to raw material, operator, shift, etc.

Nuisance factor → How to handle it:
• Unknown and uncontrollable → Randomize
• Known and uncontrollable → ANCOVA
• Known and controllable → Blocking
Randomized Complete Block Design
(RCBD)
• RCBD has 3 components
• Block: Experimental runs are conducted in blocks for different
levels of nuisance factors e.g. 2 blocks for Operator 1 and
Operator 2 or 3 blocks for Shift A, Shift B and Shift C
• Complete: Each block should contain all the different treatment
levels. E.g. There are 5 treatment levels of a factor, and 2
operators are involved. Each operator should conduct
experiments at all 5 treatment levels.
• Randomized: Within each block, order of experimental runs
should be randomized, to minimize the effect of other
factors(such as environmental, fatigue) not considered.
Example 1
• Think about an engineering application for which
i. There is a response variable (Y) e.g Yield%, Viscosity, Purity(%)
etc.
ii. There is a factor that you are interested (X), which can be
varied at 3 levels, a1, a2, a3 e.g Concentration of a particular
component, temperature, pressure
iii. A blocking (nuisance) factor (B) with 2 levels: b1, b2, e.g.
experimenter, machines
• The purpose is to study the variability due to the levels of factor X
only, and to isolate the error component present due to the
nuisance factor
• Assume there are 30 experimental runs to be conducted, 10 for each level.
• There are 2 ways to design this: (a) completely randomized design; (b) randomized block design.
Completely Randomized Design
• All 30 runs in random order (assign b1 and b2 randomly): Level a1, 10 runs; Level a2, 10 runs; Level a3, 10 runs

Randomized Block Design
• Block at level b1: Level a1, 5 runs; Level a2, 5 runs; Level a3, 5 runs (in random order)
• Block at level b2: Level a1, 5 runs; Level a2, 5 runs; Level a3, 5 runs (in random order)
Example 2
• We want to study the effect of different tips on the measurement of hardness
• 1 factor: tip type; number of levels: 4
• You may take 4 replications for each level (total 4 × 4 = 16 runs) and apply ANOVA to determine the effect due to the levels
• Nuisance factor: test coupons, as they may be produced from different batches of raw materials
• An example of RCBD in this case:
Test Coupon
1 2 3 4
Tip3 Tip3 Tip2 Tip1
Tip1 Tip4 Tip1 Tip4
Tip4 Tip2 Tip3 Tip2
Tip2 Tip1 Tip4 Tip3
Data Collection for RCBD

Each of the a treatment levels is repeated in each of the b


blocks, in random order
Each data point is 𝑦𝑖𝑗 , where 𝑖 = 1, . . , 𝑎; 𝑗 = 1, . . 𝑏
Model for RCBD
• Model for RCBD
𝑦𝑖𝑗 = 𝜇 + 𝜏𝑖 + 𝛽𝑗 + 𝜖𝑖𝑗
∀ 𝑖 = 1 , … , 𝑎; 𝑗 = 1, … , 𝑏
𝜇: Grand mean
𝜏𝑖 : Effect due to 𝑖 𝑡ℎ treatment
𝛽𝑗 : Effect of 𝑗𝑡ℎ block
𝜖𝑖𝑗 : Error ~𝑁(0, 𝜎 2 )
We consider treatment and block levels as deviation from
mean; hence σ𝑎𝑖=1 𝜏𝑖 = 0 ; σ𝑏𝑗=1 𝛽𝑗 = 0
Hypothesis Statement
• The same model may be stated as
$y_{ij} = \mu_{ij} + \epsilon_{ij}$ for all $i = 1, \dots, a$; $j = 1, \dots, b$, where
$\mu_{ij} = \mu + \tau_i + \beta_j$
• We are interested in testing the equality of treatment
means 𝜇𝑖 = 𝜇 + 𝜏𝑖
𝐻0 : 𝜇1 = 𝜇2 = ⋯ = 𝜇𝑎 (null hypothesis)
𝐻1 : at least one 𝜇𝑖 ≠ 𝜇𝑗
• Alternatively we want to test if all the treatment effects
are zero
𝐻0 : 𝜏1 = 𝜏2 = ⋯ = 𝜏𝑎 = 0
𝐻1 : at least one 𝜏𝑖 ≠ 0
ANOVA Table for RCBD

𝑆𝑆𝑇 = 𝑆𝑆𝑇𝑟𝑒𝑎𝑡𝑚𝑒𝑛𝑡 + 𝑆𝑆𝐵𝑙𝑜𝑐𝑘𝑠 + 𝑆𝑆𝐸


ANOVA for RCBD
• Because there are N observations, SST has N - 1 degrees of
freedom.

• There are a treatments and b blocks, so SSTreatments and SSBlocks


have a - 1 and b -1 degrees of freedom, respectively.

• The error sum of squares is just a sum of squares between


cells minus the sum of squares for treatments and blocks.

• There are ab cells with ab - 1 degrees of freedom between


them, so SSE has ab - 1 - (a - 1) - (b - 1) = (a - 1)(b - 1) degrees
of freedom.
Calculations

$SS_T = \sum_{i=1}^{a}\sum_{j=1}^{b}\left(y_{ij} - \bar{y}_{..}\right)^2$
$SS_{Treatments} = b\sum_{i=1}^{a}\left(\bar{y}_{i\cdot} - \bar{y}_{..}\right)^2$
$SS_{Blocks} = a\sum_{j=1}^{b}\left(\bar{y}_{\cdot j} - \bar{y}_{..}\right)^2$
$SS_E = SS_T - SS_{Treatments} - SS_{Blocks}$

Calculations
• For ease of calculation, simplified computing forms of these sums of squares may be used.
Results and Conclusion
• If 𝐹0 > 𝐹 1−𝛼 , 𝑎−1 , 𝑎−1 𝑏−1 then conclude that null
hypothesis is rejected and the treatments have significant
effects
𝑀𝑆𝑇𝑟𝑒𝑎𝑡𝑚𝑒𝑛𝑡𝑠
where 𝐹0 =
𝑀𝑆𝐸
𝐹 1−𝛼 , 𝑎−1 , 𝑎−1 𝑏−1 is found from the F-distribution table,
usually 𝛼 = 0.05
• Effect of blocking factor may also be estimated from the ratio
𝑀𝑆𝐵𝑙𝑜𝑐𝑘𝑠
(but not usually compared with F-distribution, as
𝑀𝑆𝐸
blocking puts a restriction on randomization)
• Usually if the ratio is much greater than 1, it suggests the
blocking factor has a significant contribution in the variability
of the response, and suggests that blocking was a good idea
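A small worked sketch of the RCBD decomposition, using hypothetical data invented here for illustration ($a = 3$ treatments, $b = 2$ blocks):

```python
import numpy as np
from scipy import stats

# Hypothetical responses: rows = treatments (a=3), columns = blocks (b=2)
y = np.array([[10.0, 12.0],
              [14.0, 15.0],
              [19.0, 21.0]])
a, b = y.shape

grand = y.mean()
treat_means = y.mean(axis=1)
block_means = y.mean(axis=0)

# SS_T = SS_Treatments + SS_Blocks + SS_E
SS_T = ((y - grand) ** 2).sum()
SS_treat = b * ((treat_means - grand) ** 2).sum()
SS_block = a * ((block_means - grand) ** 2).sum()
SS_E = SS_T - SS_treat - SS_block            # dof = (a-1)(b-1)

MS_treat = SS_treat / (a - 1)
MS_E = SS_E / ((a - 1) * (b - 1))
F0 = MS_treat / MS_E
F_crit = stats.f.ppf(0.95, a - 1, (a - 1) * (b - 1))

print(f"F0 = {F0:.1f}, F_crit = {F_crit:.2f}")
```

With these numbers $F_0 \gg F_{crit}$, so the treatment effects would be declared significant; the block sum of squares is removed from the error term before the F-test is formed.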
Explanation
• Considering the error term $\epsilon_{ij} \sim N(0, \sigma^2)$, the expected values of the mean squares can be derived as
$$ E(MS_{Treatments}) = \sigma^2 + \frac{b\sum_{i=1}^{a}\tau_i^2}{a-1}, \quad E(MS_{Blocks}) = \sigma^2 + \frac{a\sum_{j=1}^{b}\beta_j^2}{b-1}, \quad E(MS_E) = \sigma^2 $$
• Hence if $F_0 = MS_{Treatments}/MS_E \gg 1$, it suggests significance of the treatment effects. A similar conclusion can be made for the blocking factor as well.
Estimation of Parameters
• The estimates of the parameters are given by:
$\hat{\mu} = \bar{y}_{..}$
$\hat{\tau}_i = \bar{y}_{i\cdot} - \bar{y}_{..}$, $i = 1, \dots, a$
$\hat{\beta}_j = \bar{y}_{\cdot j} - \bar{y}_{..}$, $j = 1, \dots, b$
$\hat{y}_{ij} = \bar{y}_{i\cdot} + \bar{y}_{\cdot j} - \bar{y}_{..}$
Latin Square Design
• There are several other types of designs that utilize the blocking principle.

• For example, suppose that an experimenter is studying the effects of five


different formulations of a rocket propellant used in aircrew escape
systems on the observed burning rate.

• Each formulation is mixed from a batch of raw material that is only large
enough for five formulations to be tested.

• Furthermore, the formulations are prepared by several operators, and


there may be substantial differences in the skills and experience of the
operators.

• Thus, it would seem that there are two nuisance factors to be “averaged
out” in the design: batches of raw material and operators.
Latin Square Design

• Notice the square arrangement, 5 levels each of raw material,


operators and treatments
• Latin Square Design is used to eliminate 2 nuisance factors
• Rows and columns impose 2 restrictions on randomization
Latin Square Design
• Some examples of latin squares are
Latin Square Design
• Following a similar formulation as RCBD, for $p$ levels each of the row factor, column factor, and the treatment, the statistical model is given by:
$$ y_{ijk} = \mu + \alpha_i + \tau_j + \beta_k + \epsilon_{ijk}, \quad i, j, k = 1, 2, \dots, p $$
where $y_{ijk}$ denotes the observation in the $i$-th row and $k$-th column receiving the $j$-th treatment; $\alpha_i$ is the row effect, $\tau_j$ the treatment effect, and $\beta_k$ the column effect.
• This is an effects model; the model is completely additive, and there is no interaction between rows, columns, and treatments.
ANOVA of Latin Square Design
• We are interested in determining the significance of
treatment levels
𝐻0 : 𝜏1 = 𝜏2 = ⋯ = 𝜏𝑎 = 0
𝐻1 : at least one 𝜏𝑖 ≠ 0
• The analysis of variance consists of partitioning the total
sum of squares of the N = p2 observations into
components for rows, columns, treatments, and error, for
example,
𝑆𝑆𝑇 = 𝑆𝑆𝑇𝑟𝑒𝑎𝑡𝑚𝑒𝑛𝑡 + 𝑆𝑆𝑅𝑜𝑤 + 𝑆𝑆𝐶𝑜𝑙𝑢𝑚𝑛 + 𝑆𝑆𝐸
ANOVA for Latin Square Design
ANOVA for Latin Square Design with
Replication
• A disadvantage of small latin squares is the lack of error degree of freedom.
• To resolve this, replication is used. Suppose 𝑛 replicates are used for each
combination.
• Case 1: design with same operator and raw material in each replicate
ANOVA for Latin Square Design with
Replication
• Case 2: Use same operator and different raw materials in each replicate or vice
versa
ANOVA for Latin Square Design with
Replication
• Case 3: Use different operator and different raw materials in each replicate
Graeco Latin Square Design
• Graeco-Latin Square Designs are used to control 3 sources
of variability, i.e. to block in 3 directions

The statistical model is given by (row $i$, column $l$, Latin letter $j$, Greek letter $k$):
$$ y_{ijkl} = \mu + \theta_i + \tau_j + \omega_k + \Psi_l + \epsilon_{ijkl}, \quad i, j, k, l = 1, 2, \dots, p $$
where $\theta_i$ is the row effect, $\tau_j$ the Latin-letter treatment effect, $\omega_k$ the Greek-letter effect, and $\Psi_l$ the column effect.
ANOVA of Graeco-Latin Square
Random Effects Model
and
Non Parametric
Method
Prof. Sayak Roychowdhury
Random Effects Model
• When a factor has a large number of factor levels, the
experimenter may choose 𝑎 of these levels from the
population of the levels.
• The factor is assumed to be random
• Statistical Model
𝑦𝑖𝑗 = 𝜇 + 𝜏𝑖 + 𝜖𝑖𝑗 ; 𝑖 = 1, . . 𝑎, 𝑗 = 1, . . 𝑛
Both 𝜏𝑖 and 𝜖𝑖𝑗 are random variables
𝜏𝑖 ~ 𝑁(0, 𝜎𝜏2 ) ; 𝜖𝑖𝑗 ~ 𝑁(0, 𝜎 2 )
𝜎𝜏2 and 𝜎 2 are variance components
𝑉 𝑦𝑖𝑗 = 𝜎𝜏2 + 𝜎 2
Random Effects Model
• In fixed effects model, all 𝑦𝑖𝑗 are independent
• In REM, 𝑦𝑖𝑗 s are only independent if they come from
different factor levels.
• 𝐶𝑜𝑣 𝑦𝑖𝑗 , 𝑦𝑖𝑗′ = 𝜎𝜏2 when 𝑗 ≠ 𝑗 ′
• 𝐶𝑜𝑣 𝑦𝑖𝑗 , 𝑦𝑖′𝑗′ = 0 when 𝑖 ≠ 𝑖 ′
• Observations within a specific factor level all have the same covariance: before the experiment is conducted, we expect the observations at that factor level to be similar, since they all share the same random component $\tau_i$.
Analysis of Variance
• Like fixed effects model
𝑆𝑆𝑇𝑜𝑡𝑎𝑙 = 𝑆𝑆𝑇𝑟𝑒𝑎𝑡𝑚𝑒𝑛𝑡𝑠 + 𝑆𝑆𝐸𝑟𝑟𝑜𝑟
• Instead of individual treatment effects as in the fixed effects model, we are more interested in the population of treatments
• Hypothesis statement:
𝐻0 : 𝜎𝜏2 = 0; 𝐻1 : 𝜎𝜏2 > 0
The calculations for ANOVA are the same as that of fixed
effects model.
If 𝐹0 > 𝐹1−𝛼,𝑎−1,𝑁−𝑎 then reject the null hypothesis
Estimation of Model Parameters
• It can be shown that
𝐸 𝑀𝑆𝑇𝑟𝑒𝑎𝑡𝑚𝑒𝑛𝑡𝑠 = 𝜎 2 + 𝑛𝜎𝜏2
𝐸 𝑀𝑆𝐸𝑟𝑟𝑜𝑟 = 𝜎 2
• Estimators of the variance components are given by:
$\hat{\sigma}^2 = MS_{Error}$
$$ \hat{\sigma}_\tau^2 = \frac{MS_{Treatment} - MS_{Error}}{n} $$
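Plugging in numbers from the etch-rate ANOVA ($MS_{Treatments} \approx 22290.2$, $MS_E = 333.70$, $n = 5$), and treating the power levels as if they were randomly sampled purely for illustration:

```python
# Variance-component estimates for the random effects model
# (numbers from the etch-rate one-way ANOVA, used here only for illustration)
MS_treatments = 22290.18
MS_error = 333.70
n = 5                       # replicates per level

sigma2_hat = MS_error                                  # estimate of sigma^2
sigma2_tau_hat = (MS_treatments - MS_error) / n        # estimate of sigma_tau^2

total = sigma2_hat + sigma2_tau_hat                    # V(y_ij) = sigma_tau^2 + sigma^2
print(f"sigma^2 = {sigma2_hat:.1f}, sigma_tau^2 = {sigma2_tau_hat:.1f}")
print(f"fraction of variance due to treatments: {sigma2_tau_hat / total:.3f}")
```

In a genuine REM the levels would be a random sample from a larger population; the point here is only the mechanics of the two estimators.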
Non-parametric Method
• Kruskal-Wallis Test:
When normality assumption is not justified, experimenter
may resort to non-parametric methods , that do not
depend on normality assumption
• Kruskal-Wallis Test, developed in 1952 is an alternative to
ANOVA
Kruskal-Wallis Test
• Steps:
• Rank the observations $y_{ij}$ in ascending order, and let $R_{ij}$ be the rank
• In case of ties, assign the average rank to each of the tied observations
• Test statistic: $H = \frac{1}{S^2}\left[\sum_{i=1}^{a}\frac{R_{i\cdot}^2}{n_i} - \frac{N(N+1)^2}{4}\right]$, where $n_i$ is the number of observations in the $i$-th treatment
• $R_{i\cdot}$ is the sum of the ranks in the $i$-th treatment
• $S^2 = \frac{1}{N-1}\left[\sum_{i=1}^{a}\sum_{j=1}^{n_i}R_{ij}^2 - \frac{N(N+1)^2}{4}\right]$ is the variance of the ranks
• If $H > \chi^2_{1-\alpha,\, a-1}$, the null hypothesis is rejected.
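Applied to the statapult distances, the test can be sketched with SciPy (`scipy.stats.kruskal` computes the tie-corrected version of $H$):

```python
from scipy import stats

# Distance travelled (inches) at three statapult settings
setting1 = [11, 13, 12, 10, 11]
setting2 = [17, 14, 13, 15, 15]
setting3 = [19, 17, 21, 23, 18]

H, p = stats.kruskal(setting1, setting2, setting3)
chi2_crit = stats.chi2.ppf(0.95, df=3 - 1)    # chi^2_{0.95, a-1}

print(f"H = {H:.2f}, chi2_crit = {chi2_crit:.3f}, p = {p:.4f}")
if H > chi2_crit:
    print("Reject H0 without relying on the normality assumption")
```

The conclusion agrees with the parametric ANOVA on the same data, which is the expected behaviour when the normality assumption happens to hold anyway.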
Factorial Design
Prof. Sayak Roychowdhury
Applications

DOE is applied to:
• Product development
• Improvement of existing products
• Process improvement
• Service improvement
• Troubleshooting
Process

• Controllable factors $x_1, \dots, x_k$ (yeast, flour, wood…)
• Input → Process → Output $y$
• Uncontrollable factors $z_1, \dots, z_m$ (pizza maker at store, taste preference of customer, environment)
Model
• Controllable factors $x_1, \dots, x_k$ (yeast, flour, wood…)
• Input → Model $Y = f(x)$ → Output $y$ (taste). Optimize!
• Uncontrollable factors $z_1, \dots, z_m$ (pizza maker at store, taste preference of customer, environment)
Some Purposes of Experimentation
• Determine
• Which variables are most influential on the output (Screening)
• Where to set the inputs to produce the output at the most
desirable level (Optimization)
• Where to set the influential inputs to reduce the variability in
the output (Robustness)
• Where to set the controllable inputs so that the effects of the
uncontrollable inputs are minimized (Robustness)

• Find the defining equation Y=f(x) in order to optimize the


process

Sequence of Steps
• Select factors and responses
• Select experimental design
• Run experiments
• Fit model form
• Determine significant factors
• Set unimportant factors to economical settings and plan
next experiment
• Optimize the model
Sequence of Steps
• Select factors and responses
• Select experimental design
• Run experiments
• Fit model form
• Determine significant factors
• Set unimportant factors to economical settings and plan
next experiment
• Optimize the model
Factors, Levels and Responses
▪ Problem Definition
▪ Select the factor levels that make the pizza delicious in taste and
consistent in shape.
▪ Factors, Levels, and Ranges of Interest
Factors (levels: low, high):
• Flour: Cheap, Expensive
• Yeast: Less, More
• Olive Oil: Virgin, Extra Virgin
• Kneading: Less, More
• Mixture: Old, New
• Wood: Unknown, Hickory
Responses: Taste, Consistency
Paper Helicopter
Factors (low, high):
• A Body height (cm): 1.5, 4
• B Foldover length (cm): 0.5, 1.5
• C Tail length (cm): 2, 5
• D Wing width (cm): 2, 4
• E Wing length (cm): 3, 6
• F Pitch angle (deg.): 0, 20
• G Paper type: lined, regular
(1 cm is 2 squares)
Factors, Levels and Responses
▪ Problem Definition
▪ Select the factor levels that make the helicopter stay in the air as long as
possible. Maybe also, for the helicopter to spin many times before it
lands.
▪ Factors, Levels, and Ranges of Interest
Factors, Levels and Responses

What are some factors, levels, and responses for the cycling time to ISE from the hostel?
Sequence of Steps
• Select factors and responses
• Select experimental design
• Run experiments
• Fit model form
• Determine significant factors
• Set unimportant factors to economical settings and plan
next experiment
• Optimize the model
Select Experimental Design

http://www.itl.nist.gov/div898/handbook/pri/section3/pri33.htm
• Full Factorial
• Fractional Factorial
• Plackett-Burman
• Box-Behnken
• Central Composite
• Latin Hypercube
Factorial Design
• For experiments that involve the study of the effects of two or more factors, factorial designs are the most efficient.
• Factorial design means that in each complete trial or replicate of the experiment, all possible combinations of the levels of the factors are investigated.
• For example, if there are $a$ levels of factor A and $b$ levels of factor B, each replicate contains all $ab$ treatment combinations.
• Factorial designs are often said to be crossed designs.
• The effect of a factor is defined to be the change in response produced by a change in the level of the factor.
• This is frequently called a main effect because it refers to the primary factors of interest in the experiment.
Main Effects
2 Factor 2 Level design without interaction
Run Factor A Factor B Response Y
1 + + 40
2 - + 20
3 + - 30
4 - - 10

Main effects:
$$ e_A = \bar{Y}_{A+} - \bar{Y}_{A-} = \frac{30 + 40}{2} - \frac{10 + 20}{2} = 20 $$
$$ e_B = \bar{Y}_{B+} - \bar{Y}_{B-} = \frac{20 + 40}{2} - \frac{10 + 30}{2} = 10 $$
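These effect calculations follow directly from the design's sign columns; a minimal sketch for the table above:

```python
import numpy as np

# 2^2 design: coded levels of A and B for the four runs, with responses y
A = np.array([+1, -1, +1, -1])
B = np.array([+1, +1, -1, -1])
y = np.array([40, 20, 30, 10], dtype=float)

# Main effect = mean response at the + level minus mean at the - level
e_A = y[A == +1].mean() - y[A == -1].mean()
e_B = y[B == +1].mean() - y[B == -1].mean()

print(f"e_A = {e_A}, e_B = {e_B}")
```

The same masking pattern extends to any $2^k$ design column, including interaction columns formed as products of the main-effect columns.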
Effects
2 Factor 2 Level design with interaction (difference in response in
the levels of 1 factor is not the same for all the levels of other factors)
Run Factor A Factor B AB Response Y
1 + + + 0
2 - + - 20
3 + - - 30
4 - - + 10

Main effects:
$$ e_A = \bar{Y}_{A+} - \bar{Y}_{A-} = \frac{30 + 0}{2} - \frac{10 + 20}{2} = 0^* $$
$$ e_B = \bar{Y}_{B+} - \bar{Y}_{B-} = \frac{20 + 0}{2} - \frac{10 + 30}{2} = -10 $$
$$ e_{AB} = \bar{Y}_{AB+} - \bar{Y}_{AB-} = \frac{10 + 0}{2} - \frac{20 + 30}{2} = -20 $$
Factorial Design to Fitting Regression Lines
• The linear regression model representation
𝑦 = 𝛽0 + 𝛽1 𝑥1 + 𝛽2 𝑥2 + 𝛽12 𝑥1 𝑥2 + 𝜖
Where 𝜖~𝑁 0, 𝜎 2
𝐸𝑓𝑓𝑒𝑐𝑡𝑗
• For 2 level factorial design 𝛽𝑗 =
2
• True for both main and interaction effects
• 𝛽0 = 𝐴𝑣𝑒𝑟𝑎𝑔𝑒 𝑜𝑓 𝑎𝑙𝑙 𝑟𝑒𝑠𝑝𝑜𝑛𝑠𝑒𝑠
• These estimates are equal to least square regression
estimates.
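A sketch for the interaction example above: the regression coefficients are half the effects computed from the table, $\beta_0$ is the average response, and the fitted model reproduces the four cell responses exactly.

```python
import numpy as np

# 2^2 design with interaction: coded columns and responses from the table
A  = np.array([+1, -1, +1, -1], dtype=float)
B  = np.array([+1, +1, -1, -1], dtype=float)
y  = np.array([0, 20, 30, 10], dtype=float)
AB = A * B

def effect(col, y):
    """Mean response at the + level minus mean at the - level."""
    return y[col == +1].mean() - y[col == -1].mean()

# beta_j = Effect_j / 2; beta_0 = average of all responses
b0 = y.mean()
bA, bB, bAB = effect(A, y) / 2, effect(B, y) / 2, effect(AB, y) / 2

y_hat = b0 + bA * A + bB * B + bAB * AB
print(f"b0={b0}, bA={bA}, bB={bB}, bAB={bAB}")
print("fitted:", y_hat)
```

With one replicate per cell the model is saturated, so the fit is exact; with $n > 1$ replicates these estimates coincide with least-squares regression, as the slide states.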
2 Factor Factorial Design

• Factor A has 𝑎 levels, factor B has 𝑏 levels, each combination


replicated n times.
• The experimental run order should be randomized
ANOVA for factorial design
• The effects model is given by
$$ y_{ijk} = \mu + \tau_i + \beta_j + (\tau\beta)_{ij} + \epsilon_{ijk}, \quad i = 1, \dots, a;\; j = 1, \dots, b;\; k = 1, \dots, n $$
• Testing hypothesis for equality of row treatments


𝐻0 : 𝜏1 = 𝜏2 = ⋯ = 𝜏𝑎 = 0 ; 𝐻1 : 𝑎𝑡 𝑙𝑒𝑎𝑠𝑡 1 𝜏𝑖 ≠ 0
• Testing hypothesis for equality of column treatments
𝐻0 : 𝛽1 = 𝛽2 = ⋯ = 𝛽𝑏 = 0; 𝐻1 : 𝑎𝑡 𝑙𝑒𝑎𝑠𝑡 1 𝛽𝑗 ≠ 0
• Testing hypothesis for interactions
𝐻0 : 𝜏𝛽 𝑖𝑗 = 0 for all 𝑖, 𝑗; 𝐻1 : 𝑎𝑡 𝑙𝑒𝑎𝑠𝑡 1 (𝜏𝛽)𝑖𝑗 ≠ 0
2 Factor Factorial Design ANOVA
Source | SS | dof | MS | F | F_crit
A | SS_A | a−1 | SS_A/(a−1) | MS_A/MS_E | F(0.95, a−1, ab(n−1))
B | SS_B | b−1 | SS_B/(b−1) | MS_B/MS_E | F(0.95, b−1, ab(n−1))
AB | SS_AB | (a−1)(b−1) | SS_AB/((a−1)(b−1)) | MS_AB/MS_E | F(0.95, (a−1)(b−1), ab(n−1))
Error | SS_Error | ab(n−1) | SS_Error/(ab(n−1)) | |
Total | SS_Total | abn−1 | | |

$SS_T = SS_A + SS_B + SS_{AB} + SS_E$

For a 2² design, a = 2 and b = 2.
Calculations
Estimators
• $\hat{\mu} = \bar{y}_{...}$
• $\hat{\tau}_i = \bar{y}_{i..} - \bar{y}_{...}$ for $i = 1, \dots, a$
• $\hat{\beta}_j = \bar{y}_{.j.} - \bar{y}_{...}$ for $j = 1, \dots, b$
• $\widehat{(\tau\beta)}_{ij} = \bar{y}_{ij.} - \bar{y}_{i..} - \bar{y}_{.j.} + \bar{y}_{...}$ for $i = 1, \dots, a$, $j = 1, \dots, b$
Fitted value: $\hat{y}_{ijk} = \hat{\mu} + \hat{\tau}_i + \hat{\beta}_j + \widehat{(\tau\beta)}_{ij} = \bar{y}_{ij.}$
Residual: $e_{ijk} = y_{ijk} - \hat{y}_{ijk}$
Advantages of Factorial Designs (FD)
• FDs are more efficient than one factor at a time
experiments
• FDs can reveal interaction effects, to avoid deriving
misleading conclusions
• FDs allow the effects of a factor to be estimated at several
levels of the other factors, yielding conclusions that are
valid over a range of experimental conditions
• FDs make the regression model fitting easy
2² Design
2³ Design
Design and Analysis of Experiments, D.C. Montgomery Ch 6.
2² Design with n Replicates
Orthogonality

1. Except column I, every column has an equal number of + and −.
2. The sum of products of any 2 columns is 0.
3. The product of any 2 columns yields another column of the table.
• All of these properties follow from orthogonality.
ANOVA
• Sums of squares for $2^k$ designs can be calculated by
$$ SS_{Factor} = \frac{(\text{Contrast})^2}{n \cdot 2^k}, \qquad \text{Effect} = \frac{2 \cdot \text{Contrast}}{n \cdot 2^k} = \frac{\text{Contrast}}{n \cdot 2^{k-1}} $$
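As a sketch, applying these formulas to the earlier 2² table (responses 40, 20, 30, 10; one replicate, $n = 1$) recovers the factor-A effect of 20 computed before:

```python
import numpy as np

# 2^2 design, n = 1 replicate: sign column of A and the responses
k, n = 2, 1
A = np.array([+1, -1, +1, -1])
y = np.array([40, 20, 30, 10], dtype=float)

contrast_A = (A * y).sum()                 # sum of (sign * response)
effect_A = contrast_A / (n * 2**(k - 1))   # Effect = Contrast / (n 2^{k-1})
SS_A = contrast_A**2 / (n * 2**k)          # SS = Contrast^2 / (n 2^k)

print(f"Contrast_A = {contrast_A}, Effect_A = {effect_A}, SS_A = {SS_A}")
```

The same three lines work for any column of a $2^k$ design table, including interaction columns.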
Variance
• If there are $n$ replicates in each of the $2^k$ runs, and $y_{i1}, y_{i2}, \dots, y_{in}$ are the observations of the $i$-th run, then the estimated variance of the $i$-th run is:
$$ S_i^2 = \frac{1}{n-1}\sum_{j=1}^{n}\left(y_{ij} - \bar{y}_i\right)^2 \quad \text{for } i = 1, \dots, 2^k $$

• The overall variance estimate is given by:
$$ S^2 = \frac{1}{2^k(n-1)}\sum_{i=1}^{2^k}\sum_{j=1}^{n}\left(y_{ij} - \bar{y}_i\right)^2 $$
Variance
• Variance of each effect estimate:
$$ V(\text{Effect}) = V\!\left(\frac{\text{Contrast}}{n2^{k-1}}\right) = \frac{1}{\left(n2^{k-1}\right)^2}V(\text{Contrast}) $$
• $V(\text{Contrast}) = n2^k\sigma^2$
• $\sigma^2$ is estimated by $S^2$
• Standard error of an effect:
$$ SE(\text{Effect}) = \frac{2S}{\sqrt{n2^k}} $$
Test of Significance of Effect
• To test the significance of estimated effects:
$$ t_0 = \frac{\text{Effect}}{SE(\text{Effect})} $$
Compare the test statistic with $t_{crit} = t_{\alpha/2,\, N-p}$, where
$N$ = total number of runs and $p$ = total number of parameters.
Blocking and
Confounding
Prof. Sayak Roychowdhury
Ref Book: Design and Analysis of Experiments by
D.C. Montgomery
Orthogonality

1. Except column I, every column has an equal number of + and −.
2. The sum of products of any 2 columns is 0.
3. The product of any 2 columns yields another column of the table.
• All of these properties follow from orthogonality.
22 Design with n Replicates
ANOVA
• Sum of squares of 2𝑘 designs can be calculated by
𝐶𝑜𝑛𝑡𝑟𝑎𝑠𝑡 2
𝑆𝑆𝐹𝑎𝑐𝑡𝑜𝑟 =
2𝑘 𝑛
2 𝐶𝑜𝑛𝑠𝑡𝑟𝑎𝑠𝑡
𝐸𝑓𝑓𝑒𝑐𝑡 =
𝑛2𝑘
Sparsity of Effects Principle
• Most systems are dominated by a few of the main effects and low-order interactions
• High-order interactions are mostly negligible
• An article by Li, Sudarshan and Frey (2006) pointed out that out of 43 published full factorial experiments:
• Most had 3–4 factors
• About 40% of the main effects were significant
• About 11% of 2-factor interactions were significant
• About 5% of 3-factor interactions were significant
Blocking and Confounding
• In many problems, it is not possible to perform a complete replicate in each block
• Confounding is a design technique for arranging a complete factorial experiment where the block size is smaller than the number of treatment combinations in one replicate
• This causes some of the treatment effects (usually high-order interactions) to be indistinguishable from the blocks (hence the name confounding)
• Here we consider 2^k designs in 2^p (p < k) incomplete blocks, i.e. the number of blocks can be 2, 4, 8, …
Confounding with 2 Blocks
• A 2-factor, 2-level factorial design is conducted in 2 blocks
• Notice how the blocks are confounded with the AB interaction
Confounding with 2 Blocks
• A 3-factor, 2-level factorial design is conducted in 2 blocks
Confounding with 2 Blocks
• Notice how the runs with a positive sign on ABC are placed in block 2 and those with a negative sign in block 1
Method of Defining Contrast
• Another method of defining blocks uses a linear combination of the factor levels:
L = α1·x1 + ⋯ + αk·xk
• In a 2^k system, αi = 0 or 1 and xi = 0 or 1
• The blocks are assigned based on L mod 2 (= 0 or 1)
• For a 2^3 design, to confound ABC, set α1 = α2 = α3 = 1
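The block assignment above can be sketched in a few lines — a minimal illustration of splitting a 2^3 design into 2 blocks by confounding ABC:

```python
from itertools import product

# L = x1 + x2 + x3 (mod 2), with x_i = 0 for the low level and 1 for the high.
blocks = {0: [], 1: []}
for x in product((0, 1), repeat=3):
    L = sum(x) % 2
    blocks[L].append(x)

def label(x):
    # conventional treatment-combination label, e.g. (1, 1, 0) -> "ab"
    return "".join(f for f, xi in zip("abc", x) if xi) or "(1)"

principal = sorted(label(x) for x in blocks[0])   # block containing (1)
```

The block with L = 0 comes out as {(1), ab, ac, bc} — the principal block — and the other block holds {a, b, c, abc}.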
Group Theory Approach
• The block containing the treatment combination (1) is called the principal block
• Any element of the principal block (except (1)) can be generated by multiplying 2 other elements of the principal block, e.g. ab·ac = a^2·bc = bc; ab·bc = a·b^2·c = ac, etc.
• Treatment combinations in the other block can be generated by multiplying one element of the new block by each element of the principal block,
e.g. b·(1) = b; b·ab = a; b·ac = abc; b·bc = c, etc.
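These multiplication rules can be sketched directly: squared letters cancel, which is exactly a symmetric difference on the sets of letters.

```python
# Treatment combinations multiply like group elements: a^2 = 1, so
# multiplication is a symmetric difference of letter sets.
def mult(u, v):
    su = set() if u == "(1)" else set(u)
    sv = set() if v == "(1)" else set(v)
    prod = su ^ sv                        # e.g. ab . ac = a^2 bc = bc
    return "".join(sorted(prod)) or "(1)"

# Generate the second block of the 2^3 / ABC split by multiplying every
# element of the principal block {(1), ab, ac, bc} by b:
principal = ["(1)", "ab", "ac", "bc"]
other = [mult("b", t) for t in principal]   # b, a, abc, c
```

This reproduces the slide's examples: mult("ab", "ac") gives "bc", and multiplying the principal block by b yields the other block {b, a, abc, c}.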
2^k Factorials in 4 Blocks
• When the number of factors is moderately large (k ≥ 4), a 2^k design can be constructed with four blocks
• There will be 2^(k−2) observations in each block
• For a 2^5 design, confounding ADE and BCE, it is possible to generate the 4 blocks in the following way
• Defining contrasts:
L1 = x1 + x4 + x5
L2 = x2 + x3 + x5
• Every treatment combination generates a particular pair of values of L1 mod 2 and L2 mod 2:
(L1, L2) ∈ {(0,0), (0,1), (1,0), (1,1)}
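The pair (L1 mod 2, L2 mod 2) sorts the 32 runs into the four blocks; a minimal sketch of the assignment for this 2^5 example:

```python
from itertools import product

# 2^5 design in 4 blocks, confounding ADE and BCE (and hence ABCD):
# L1 = x1 + x4 + x5 (mod 2), L2 = x2 + x3 + x5 (mod 2)
blocks = {}
for x in product((0, 1), repeat=5):
    key = ((x[0] + x[3] + x[4]) % 2, (x[1] + x[2] + x[4]) % 2)
    blocks.setdefault(key, []).append(x)
# Four blocks of 2^(5-2) = 8 runs each; the run (0,0,0,0,0), i.e. (1),
# lands in the principal block (L1, L2) = (0, 0).
```

Each of the four (L1, L2) pairs collects exactly 8 runs, and the principal block is the one containing (1).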
2^k Factorials in 4 Blocks
• The generalized interaction of ADE and BCE is also confounded:
ADE·BCE = ABCD·E^2 = ABCD
• Care must be taken in choosing the effects to be confounded; generally higher-order interactions are chosen.
Group Theory Approach for 4 Blocks
• The group theoretic approach can also be applied when
there are 4 blocks
• The product of 2 treatment combinations in the principal block yields another element of the principal block,
e.g. ad·bc = abcd, abe·bde = ad, etc.
• For a new block, select a treatment combination that is not in the principal block, and multiply it with all the combinations in the principal block,
e.g. b·(1) = b; b·ad = abd, etc.
• The general procedure for generating 4 blocks is to choose 2 effects to generate the blocks, then use the defining contrasts and the group theory approach
2^p Blocks
• The methods can be extended to 2^k factorial designs confounded in 2^p blocks (p < k), where each block contains 2^(k−p) runs
• Select p independent effects to be confounded, meaning that no chosen effect is a generalized interaction of the others
• Blocks are generated by the p defining contrasts L1, L2, …, Lp associated with these effects
• Exactly 2^p − p − 1 other effects will be confounded with blocks as generalized interactions
• For ANOVA, the sums of squares are calculated as if there were no blocks; then all the SS associated with confounded effects are added to obtain SS_Blocks
Suggested Blocking for 2^k Design
Partial Confounding
• When there are multiple replicates, the blocks can be
chosen in such a way, that some information can be
retained for confounded interactions.
• Consider the 2^3 design with 4 replicates, where each replicate can be accommodated in 2 blocks
• Completely confounded ABC
Partial Confounding
• Fully confounded ABC
• Partially confounded ABC
Partial Confounding
• Information on ABC can be obtained from replicates II, III, IV
• Information on AB can be obtained from replicates I, III, IV
• Information on BC can be obtained from replicates I, II, IV
• Information on AC can be obtained from replicates I, II, III
Fractional Factorial
Design
Prof. Sayak Roychowdhury
Ref Book: Design and Analysis of Experiments by
D.C. Montgomery
Fractional Factorial
• In a 2^6 full factorial design there are 64 runs; only 6 of the 63 degrees of freedom correspond to main effects, and 15 correspond to 2-factor interactions
• The remaining 42 degrees of freedom are for 3-factor and higher interactions
• If higher order interactions can be reasonably
ignored, then fractional factorial designs are more
economical.
• FF designs are used for screening experiments
Fractional Factorial
• The successful use of fractional factorial design is based
on three key ideas:
• Sparsity of Effects: Out of several variables, the system or
process is likely to be driven by some of the main effects
and low-order interactions
• The projection property: FF designs can be projected
onto larger designs in the subset of significant factors
• Sequential experimentation: Combine two or more FF
designs to construct sequentially larger designs
Factorial Designs
• Full factorials: k factors, 2 levels (high, low), 2^k runs per replicate
• We assume the response is approximately linear over the range of the factor levels.
Fractional factorials: 2^(k−p) runs, where p < k
• Regular fractional factorials (A- and D-optimal):
• All columns can be obtained by multiplying other columns
• Number of runs n = 2^m, where m is a positive integer (4, 8, 16, 32, …)
• Plackett–Burman designs (A- and D-optimal):
• Invented by Plackett and Burman (1946); irregular fractional factorial designs
• Number of runs is a multiple of 4 (12, 16, 20, 24, …)
• Chosen based on availability
• Rule of thumb: pick the smallest A-optimal design with enough columns
One-half Fraction of 2^k Design
• Consider the ½ fraction of the 2^3 design (the selected fraction)
• Only the runs with a + sign on ABC are selected, so ABC is called the generator
• I = ABC is called the defining relation
One-half Fraction of 2^k Design
• Estimation of the main effects of A, B and C
• Estimation of the interactions
• [A] = [BC], [B] = [AC], [C] = [AB]: it is impossible to separate these effects. This is called aliasing; A & BC, B & AC, C & AB are aliases.
• We are really estimating A + BC, B + AC, C + AB
One-half Fraction of 2^k Design
• In Minitab, the alias structure for the same design is shown in the following way:
• Aliases can be obtained by multiplying the factor with I = ABC:
A·I = A·ABC = A^2·BC = BC, and so on
• The fraction with I = +ABC is also called the principal fraction
Resolution of Design
• Alias: when the estimate of an effect also includes the influence of one or more other effects, e.g. C = AB in the 2^(3−1) design.
• A design is of resolution R if no p-factor effect is aliased with another effect containing fewer than R − p factors.
Resolution of Design
• A design is of resolution R if no p-factor effect is aliased with another effect containing fewer than R − p factors
• In general, the resolution of a two-level fractional factorial design equals the number of letters in the shortest word of the defining relation
• Fractional designs with the highest possible resolution consistent with the required degree of fractionation are usually deployed
• The higher the resolution, the less restrictive the assumptions required about which interactions are negligible to obtain a unique interpretation of the results
Construction of One-half Fraction
• The one-half fraction of the 2^k design with highest resolution may be constructed by first writing down the full 2^(k−1) factorial
• The k-th factor is then added with + and − signs matching those of the highest-order interaction ABC⋯(K−1)
• Below, factor C is added with levels identical to those of AB
• Another option is to confound the highest-order interaction to create 2 blocks; each block is a one-half fraction
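The construction can be sketched in one line: write the full 2^2 in A and B, then derive C from the AB column.

```python
from itertools import product

# Principal one-half fraction of 2^3: full 2^2 in A and B, then
# C = AB (generator C = AB, so the defining relation is I = +ABC).
half = [(a, b, a * b) for a, b in product((-1, 1), repeat=2)]
```

The four runs come out as c, a, b, abc — exactly the principal fraction, since a·b·c = +1 on every run and each column is balanced.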
Sequences of Fractional Factorials
• It is advisable to run a one-half fraction of experiment
first, analyse the result, and then run the other half
• For example, if there are k = 4 factors and you can afford 2^4 = 16 runs, it is better to run the half fraction 2^(4−1)_IV (8 runs) first and then decide on the next runs based on the results.
• Sequential experimentation loses information only on the highest-order interactions
NPP of Effects Example
A B C D E F G Time
1 -1 -1 -1 -1 1 1 0.25
1 1 -1 1 -1 -1 -1 0.6
-1 -1 -1 1 1 1 -1 1.74
-1 -1 1 1 -1 -1 1 1.17
-1 1 -1 -1 1 -1 1 1.3
1 -1 1 -1 1 -1 -1 0.93
1 1 1 1 1 1 1 1.32
-1 1 1 -1 -1 1 -1 0.7
Steps for NPP of Effects

Factor | Avg. at − | Avg. at + | Effect  | Coef. Est.
A      | 1.2275    | 0.775     | −0.4525 | −0.22625
B      | 1.0225    | 0.98      | −0.0425 | −0.02125
C      | 0.9725    | 1.03      | 0.0575  | 0.02875
D      | 0.795     | 1.2075    | 0.4125  | 0.20625
E      | 0.68      | 1.3225    | 0.6425  | 0.32125
F      | 1         | 1.0025    | 0.0025  | 0.00125
G      | 0.9925    | 1.01      | 0.0175  | 0.00875

i | (i − 0.5)/7 | Ordered effect | Factor | z = NORMINV((i − 0.5)/7, 0, 1)
1 | 0.0714      | −0.4525        | A      | −1.4652
2 | 0.2143      | −0.0425        | B      | −0.7916
3 | 0.3571      | 0.0025         | F      | −0.3661
4 | 0.5         | 0.0175         | G      | 0
5 | 0.6429      | 0.0575         | C      | 0.3661
6 | 0.7857      | 0.4125         | D      | 0.7916
7 | 0.9286      | 0.6425         | E      | 1.4652

• Plot the ordered effects on the X axis and the normal quantiles obtained above on the Y axis
• Factors that fall off the line are significant
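The quantile column can be reproduced with the standard library; the effect values below are taken from the table above, and NormalDist().inv_cdf plays the role of NORMINV:

```python
from statistics import NormalDist

# Effect estimates from the 8-run example above.
effects = {"A": -0.4525, "B": -0.0425, "C": 0.0575, "D": 0.4125,
           "E": 0.6425, "F": 0.0025, "G": 0.0175}

# Order the m = 7 effects, then pair the i-th smallest with the normal
# quantile z_i = Phi^{-1}((i - 0.5)/m).
m = len(effects)
ordered = sorted(effects.items(), key=lambda kv: kv[1])
points = [(est, NormalDist().inv_cdf((i - 0.5) / m), name)
          for i, (name, est) in enumerate(ordered, start=1)]
# Plot est (X) against z (Y); points far off the line (E, D, A) are significant.
```

The sorted order A, B, F, G, C, D, E and the quantiles ±1.4652, ±0.7916, ±0.3661, 0 match the table.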
Normal Probability Plot of Effects
[NPP of effects: ordered effects on the X axis vs normal quantiles on the Y axis; E and D at the top right and A at the bottom left fall off the line, while B, C, F, G lie near it.]
Example
• A chemical product is produced in a pressure vessel. Four
factors are A: Temperature, B: Pressure, C: Concentration
of Formaldehyde, D: Stirring Rate.
• Response: Filtration Rate (to be maximized)
• Additional objective: Reduce formaldehyde concentration
Example 1
2^4 Full Factorial Design
Source: Montgomery D.C., Design and Analysis of Experiments
Example 1
• 2^(4−1)_IV Fractional Factorial Design with defining relation I = ABCD
Source: Montgomery D.C., Design and Analysis of Experiments
A·ABCD = BCD; B·ABCD = ACD; C·ABCD = ABD; D·ABCD = ABC
AB·ABCD = CD; BC·ABCD = AD; CD·ABCD = AB; and so on
Example 1
• Calculation of effects
• Normal probability plot of effects
• ANOVA
SS_Factor = (Contrast)^2 / N
Effect = 2·Contrast / N
where N = total number of runs
One-Quarter Fractional Factorial
• For a moderately large number of factors k, the 2^(k−2) fractional factorial is often useful
• It is a full factorial in k − 2 factors, with 2 additional factors added through appropriately chosen interactions with the first k − 2 factors
• With 2 generators P and Q, I = P and I = Q are called the generating relations
• The complete defining relation is I = P = Q = PQ (PQ is the generalized interaction)
• The elements P, Q, PQ of the defining relation are also called words
• Aliases of an effect are produced by multiplying that effect by each word
One-Quarter Fractional Factorial
• Example: in a 2^(6−2) design, I = ABCE and I = BCDF are chosen as generating relations
• Complete defining relation: I = ABCE = BCDF = ADEF
• Aliases:
A + BCE + ABCDF + DEF
B + ACE + CDF + ABDEF
etc.
• To construct the 2^(6−2) design, start with a full factorial 2^4 design, then add the columns E = ABC and F = BCD
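Both steps — building the design and reading off aliases — can be sketched compactly; alias computation is again a symmetric difference of letter sets:

```python
from itertools import product

# 2^(6-2) design: full 2^4 in A, B, C, D with generators E = ABC, F = BCD.
design = [(a, b, c, d, a * b * c, b * c * d)
          for a, b, c, d in product((-1, 1), repeat=4)]

# Aliases of an effect come from multiplying it by each word of the
# complete defining relation I = ABCE = BCDF = ADEF (squared letters cancel).
WORDS = ("ABCE", "BCDF", "ADEF")
def aliases(effect):
    return ["".join(sorted(set(effect) ^ set(w))) for w in WORDS]
```

This reproduces the alias chains above: aliases("A") gives BCE, ABCDF, DEF, and every one of the 16 runs satisfies I = ABCE and I = BCDF by construction.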
General 2^(k−p) Design
• A 2^k factorial design containing 2^(k−p) runs is called a 1/2^p fraction of the 2^k design, or a 2^(k−p) fractional factorial design
• p independent generators are to be chosen
• There are 2^p − p − 1 generalized interactions
• Each effect has 2^p − 1 aliases
• Care should be taken so that effects of interest are not aliased with each other
• A reasonable criterion is that the resulting 2^(k−p) design should be of the highest possible resolution
Plackett – Burman Designs
• 2-level fractional factorial designs developed by Plackett and Burman to study k = N − 1 variables in N runs
• For N = 12, 20, 24, 28, 36 runs. These designs cannot be represented as cubes, so they are called non-geometric designs
• Main effects may be partially aliased with 2-factor interactions
• PB designs are non-regular designs. In regular designs, every effect can either be estimated distinctly or is completely aliased
• In non-regular designs, some information on aliased effects may be available
Plackett – Burman Designs
• Write the appropriate row from the table as a column
• A second column is then generated from the first by moving its elements down 1 position and placing the last element of column 1 in the first position of column 2
• The 3rd column is generated from the 2nd in the same way
• Generate k columns in this way
• A row of minus signs is then added
• This method applies only to the 12-, 20-, 24- and 36-run designs
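The cyclic construction can be sketched for the 12-run design. The generating row below is the commonly tabulated one for N = 12 (taken as an assumption here; check it against your PB table) — the construction itself follows the steps above:

```python
# Generating row for the 12-run Plackett-Burman design (assumed standard row).
row = [+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1]
k = len(row)                       # 11 factors

# Column 1 is the generating row written as a column; each later column is
# the previous one shifted down cyclically by one position.
cols = [[row[(i - j) % k] for i in range(k)] for j in range(k)]

# Assemble the 11 cyclic runs, then append the row of minus signs as run 12.
design = [[cols[j][i] for j in range(k)] for i in range(k)]
design.append([-1] * k)
```

The resulting 12 × 11 matrix is orthogonal: every column has six plus and six minus signs, and the dot product of any two distinct columns is zero.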
Plackett – Burman Designs
• PB designs have a complex alias structure
• In the 12-run design, every main effect is partially aliased with every 2-factor interaction not involving itself
• PB is a non-regular design. In a regular design, every effect can either be estimated independently of other effects or is completely aliased
Choosing Fractional Factorial Designs

Preferred | n = #runs | # factors | Design | # columns
*         | 4         | 3         | Reg.   | 3
          | 4         | 3         | PB     | 3
*         | 8         | 4–7       | Reg.   | 7
          | 8         | 4–7       | PB     | 7
          | 16        | 8–11      | Reg.   | 15
*         | 12        | 8–11      | PB     | 11
*         | 16        | 12–16     | Reg.   | 15
          | 16        | 12–16     | PB     | 15
Optimal Designs
• The volume of the joint confidence region that contains all the model regression coefficients is inversely proportional to the square root of the determinant of X′X
• Goal: minimize the variance of the estimates
• X′X is the information matrix
• A-optimal designs: minimize the trace (the sum of the main-diagonal elements) of the inverse of the information matrix; this minimizes the average variance of the estimates of the regression coefficients
• D-optimal designs: maximize |X′X|, which minimizes the generalized variance of the model regression coefficients
Optimal Designs
• G-optimal designs: minimize the maximum prediction variance V(ŷ)
• I-optimal designs: minimize the average prediction variance
• 2^k designs are A-, D-, G- and I-optimal for the first-order model and the first-order model with interactions
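Why 2^k designs are optimal for these models can be seen directly: the columns of the model matrix are mutually orthogonal, so X′X = N·I. A minimal sketch for the 2^2 design with terms I, A, B, AB:

```python
from itertools import product

# Model matrix X for the 2^2 design with columns I, A, B, AB.
rows = [(1, a, b, a * b) for a, b in product((-1, 1), repeat=2)]
p = len(rows[0])

# X'X computed by hand: entry (i, j) is the dot product of columns i and j.
xtx = [[sum(r[i] * r[j] for r in rows) for j in range(p)] for i in range(p)]
# Diagonal X'X = N*I makes |X'X| = N^p as large as possible (D-optimality)
# and trace((X'X)^-1) = p/N as small as possible (A-optimality).
```

Here X′X = 4·I, so |X′X| = 256 and trace((X′X)^−1) = 1 — no other 4-run arrangement of ±1 levels does better.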
Saturated and Supersaturated
• When k = N − 1, where k is the number of factors and N is the number of runs, the design is called a saturated design
• Examples: 2^(3−1)_III, 2^(7−4)_III, and the 15-factor, 16-run design
• When k > N − 1, the designs are called supersaturated designs
• One plausible way of creating a supersaturated design is to use Hadamard matrices (square orthogonal matrices)
• For more information, see Section 8.8 (Montgomery, DoE book)
Resolution III Design
• Resolution III designs can be used for up to k = N − 1 factors in N runs, where N is a multiple of 4, e.g. 2^(3−1)_III, 2^(7−4)_III
• The 2^(7−4)_III design is a 1/16th fraction of the 2^7 design
• It can be constructed by first writing the full factorial in A, B, C, then associating the 4 additional factors with aliases such as D = AB, E = AC, F = BC, G = ABC
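The saturated construction above can be sketched in one comprehension:

```python
from itertools import product

# Saturated 2^(7-4)_III design: full 2^3 in A, B, C, with the four added
# factors aliased as D = AB, E = AC, F = BC, G = ABC.
design = [(a, b, c, a * b, a * c, b * c, a * b * c)
          for a, b, c in product((-1, 1), repeat=3)]
# 7 factors in 8 runs: every column is balanced and any two columns are
# orthogonal, but each main effect is aliased with 2-factor interactions.
```

Checking the column properties confirms the design is orthogonal despite being resolution III.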
Resolution III Design
• 27−4
𝐼𝐼𝐼 can be used to obtain other resolution III designs for
studying designs with fewer than 7 factors and 8 runs
• 26−3
𝐼𝐼𝐼 can be obtained just by dropping the column G
• Defining relation of 26−3𝐼𝐼𝐼 can be obtained from defining
relation of 27−4𝐼𝐼𝐼 with any word containing letter G
dropped
• It is possible to have resolution III design with 15 factors in
16 runs, 215−11
𝐼𝐼𝐼
• First write 16 treatment combinations for a full 24 design
with factors A,B,C,D and then equating the remaining 11
factors as 2
Resolution of Design
Randomize
Randomized Order: To balance the effects of
extraneous and uncontrollable factors.
Important to make valid statistical
inferences!
Significant Factors
• Ways:
• Main effects plot, interactions plot
• Normal Probability Plot of Effects
• ANOVA (Analysis of Variance)
Steps for NPP of Effects
[NPP of effects: effect estimates on the X axis vs z score on the Y axis; E and D at the top right and A at the bottom left fall off the line.]
More chances to find insignificant factors for noise estimation.
A Few Tips for Experimentation
• Use your non-statistical knowledge of the problem
• Especially valuable for choosing factors and levels
• Using statistics is no substitute for thinking about the problem
• Keep the design and analysis as simple as possible
• KISS – Keep it Simple, Stupid!
• more complex = more potential for error
• Complex experiments and analyses are often fraught with errors
• Know the difference between statistical and practical
importance
• Every statistically significant difference is not important or practical to
implement
• Experiments are usually iterative
• Our knowledge increases with time. Expect to use several experiments to
arrive at the optimum process.
• Rule of thumb: don’t use more than 25% of resources in 1st Expt.
Definitions for DOE
• Balanced Design (Orthogonality) – An experiment design where each
level of a factor is repeated the same number of times for all
possible combinations of the levels of the other factors. A balanced
design is said to be orthogonal
• Block – A group of homogeneous experimental runs
• Blocking Variable – A factor, usually a noise variable, that is an
undesirable source of variability. A group of experimental runs at a
single level of the blocking variable is a block.
• Experimental Design – The formal plan for conducting the
experiment. This includes the choices of factors, levels, responses,
blocks, sample sizes, repetitions, replications & randomization.
• Factors (or inputs) – One of the controlled or uncontrolled variables
whose influence on a response (output) is being studied in an
experiment.
• Interaction – When the difference in the response between levels of
one factor varies as the level of the other factors is changed.
Definitions for DOE – continued
• k1 x k2 x k3 …-- The description of the basic design of a factorial
experiment. The number of “k’s” is the number of factors. The
value of the “k” is the number of levels of interest of that factor. For
example, a 2 x 3 x 3 experiment has three factors; one input has two
levels, and two have three levels. In this case, 2x3x3 = 18
combinations for this full factorial experiment.
• k-way Interaction – an interaction between k number of variables
• Level – Values of the factor being studied in the experiment.
• Main Effect – The change in the average response observed during
the change from one level to another for a single factor
• Test Run (Experimental Run) – A single combination of factor levels
that yields one or more observations of the responses
• Treatment Combination – An experimental run using a set of specific
levels in each input variable.
Advanced DOE Definitions
• Confounding – One or more effects that cannot be unambiguously
attributed to a single factor or interaction. Usually due to problems
in design.
• Fixed Factor – Factors whose levels are specifically chosen.
Conclusions about fixed factors generalize to only those levels.
Determination of the factor’s effect on the level of the output is
usually the goal of the experiment. (Knob settings, for example)
• Random Factor – Factors whose levels are selected randomly from a
larger population of possible levels. Determination of the factor’s
contribution to the overall variance of the system is the goal of this
experiment. (Selecting 3 machines out of 20, for example)
• Repetitions – consecutive experimental runs using the same
treatment combinations. With no change in setup between runs.
• Replications – experimental runs using the same treatment
combinations that were not run consecutively. Replications are
duplicate runs of the entire experiment. Sample size calculations
apply to replicates, not repetitions.
• http://www.itl.nist.gov/div898/handbook/pri/section7/pri7.htm
Factors, Levels and Responses
▪ Problem Definition
▪ Select the factor levels that make the pizza delicious in taste and
consistent in shape.
▪ Factors, Levels, and Ranges of Interest
Factors    | Levels (low, high)   | Responses
Flour      | Cheap, Expensive     | Taste
Yeast      | Less, More           | Consistency
Olive Oil  | Virgin, Extra Virgin |
Kneading   | Less, More           |
Mixture    | Old, New             |
Wood       | Unknown, Hickory     |
Including Cost Analysis (Cube Plot Analysis)

What if the overall task is to minimize cost while satisfying your customers?
• Hickory Wood is not much more expensive than Unknown Wood
• Expensive Flour costs significantly more than Cheap Flour
• Also, the kneading machine can be set at a high or low level without significantly increasing the energy usage (and hence, cost)
• Customer satisfaction ratings were consistently high for Taste levels of 8 or above and Crust Consistency levels of 6 or above
Examine all the model graphs carefully and pick the best pizza recipe that minimizes your cost while still ensuring high customer satisfaction. (After narrowing it down to the 4 significant factors)
ANOVA- Taste
ANOVA-Consistency