Group Number: 5

Group members                ID Number
1. GIZAW GETACHEW            NSR/432/15
2. KUTABA KUYLITA            NSR/582/15
3. MAHILET ABERA             NSR/609/15
4. BIBLE BABE                NSR/200/15
5. NAHUSENAY GETACHEW        NSR/730/15
6. TADELE TIGISTU            NSR/875/15
7. FAYISA DULOMA             NSR/1623/14
8. SIMEGN GASHYALEW          NSR/851/15
9. YOHANNIS KIDANU           NSR/1015/15

Submitted to: Mr. Biruk S.
Contents
Introduction
Estimation in probability
Mean square
Mean square estimation
Linear mean square estimation
Conclusion
References
INTRODUCTION
Mean square estimation is a widely used technique in fields such as econometrics, machine
learning, and signal processing for estimating the parameters of statistical models and making
predictions from observed data. Linear mean square estimation is a technique used in statistics
to estimate the parameters of a linear regression model.
The goal of linear mean square estimation is to find the best-fitting line (or plane in higher
dimensions) that minimizes the sum of the squared differences between the observed values and
the values predicted by the model.
Estimation in probability
For trials with categorical outcomes (such as noting the presence or absence of a term), one way
to estimate the probability of an event from data is simply to count the number of times an event
occurred divided by the total number of trials. This is referred to as the relative frequency of an
event.
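The relative-frequency rule above can be sketched in a few lines of Python; the coin-flip data below is invented purely for illustration:

```python
from collections import Counter

def relative_frequency(outcomes, event):
    """Estimate P(event) as (number of trials where event occurred) / (total trials)."""
    return Counter(outcomes)[event] / len(outcomes)

# Illustrative data: 10 coin flips, 7 of which came up heads
trials = ["H", "T", "H", "H", "T", "H", "T", "H", "H", "H"]
print(relative_frequency(trials, "H"))  # -> 0.7
```

As the number of trials grows, this estimate converges to the true probability of the event.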
There are two basic types of estimation problems. In the first type, we are interested in
estimating the parameters of one or more random variables; in the second type, we are interested
in estimating the value of an inaccessible random variable Y in terms of observations of an
accessible random variable X. Here we focus on the second type of estimation.
Mean square
The mean square (MS) is a statistical measure that represents the average of the squared
deviations from the mean of a dataset. It is calculated by summing the squared differences
between each data point and the mean, and then dividing the result by the number of data points.
MS = Σ(x − μ)² / n

Where:
x is a data point, μ is the mean of the dataset, and n is the number of data points.
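The formula translates directly into code; a minimal sketch, with made-up sample data:

```python
def mean_square(data):
    """MS = sum((x - mu)^2) / n: the average squared deviation from the mean."""
    n = len(data)
    mu = sum(data) / n
    return sum((x - mu) ** 2 for x in data) / n

print(mean_square([2, 4, 4, 4, 5, 5, 7, 9]))  # -> 4.0
```

Note that this quantity is the same as the population variance of the dataset.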
The mean square is commonly used in various statistical analyses and applications, such as:
1. Analysis of Variance (ANOVA):
- The mean square is used to calculate the F-statistic, which is used to determine the statistical
significance of the differences between group means.
2. Regression Analysis:
- The mean square is used to calculate the residual variance, which measures the goodness of
fit of a regression model.
3. Quality Control:
- The mean square is used to monitor the variability of a process and detect any changes or
deviations from the desired target.
4. Signal Processing:
- The mean square is used to measure the power of a signal, which is important in applications
such as audio and image processing.
The mean square is a useful metric because it provides a measure of the overall
variability or dispersion of a dataset. It is particularly sensitive to large deviations from
the mean, as the squared differences amplify the impact of outliers or extreme values.
The mean square is often used in conjunction with other statistical measures, such as the
standard deviation or the coefficient of variation, to provide a more comprehensive
understanding of the dataset.
Mean square estimation
Mean square estimation aims to minimize the mean square error between the estimated value and
the true value, making it a powerful tool for many applications.

Suppose X1, X2, ..., Xn represent a sequence of random variables for which one set of
observations is available, and let Y be a new random variable whose value is unknown.

The problem is to obtain a good estimate of the value of the inaccessible random variable Y in
terms of the observations of the accessible random variables X1, X2, ..., Xn:

Ŷ = g(X1, X2, ..., Xn)
Ŷ = g(X)
Suppose the sample space S is the set of all children in a community, the random variable Y is
the height of each child, and Y(c) is the height of a specific child c.

It follows that if we wish to estimate Y by a single number, that number must equal the mean of Y.

Now assume that each selected child is also weighed, and let X be the weight random variable.
The optimum estimate of Y is then the conditional mean:

Ŷ = g(X) = E(Y | X)

The estimation error and its mean square are

e(X) = Y − Ŷ = Y − g(X)
E[e²] = E[(Y − g(X))²], and for a given observation X = x,
E[(Y − g(x))² | X = x] = ∫ (y − g(x))² f(y|x) dy, integrated over −∞ < y < ∞.
Example
Let Y = X² and let X be a uniform random variable over (−1, 1). Find the mean square estimator
of Y.

Solution
g(X) = Ŷ = E(Y | X) = E(X² | X) = X²

E[e²] = E[(Y − g(X))²] = E[(X² − X²)²] = 0
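As a sanity check on this result, a short simulation (ours, not from the text) draws X uniformly on (−1, 1), sets Y = X², and evaluates the squared error of the estimator g(X) = X²:

```python
import random

random.seed(0)

# X ~ Uniform(-1, 1), Y = X^2; the mean square estimator is g(X) = E(Y|X) = X^2
xs = [random.uniform(-1, 1) for _ in range(10_000)]
ys = [x ** 2 for x in xs]

# Mean square error of the estimator g(X) = X^2
mse = sum((y - x ** 2) ** 2 for x, y in zip(xs, ys)) / len(xs)
print(mse)  # -> 0.0, since g(X) reproduces Y exactly
```

The error is exactly zero because Y is a deterministic function of X, so the conditional mean recovers Y without any residual uncertainty.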
Linear mean square estimation
Linear mean square estimation is a method used to estimate a linear relationship between
variables while minimizing the mean squared error. In the context of linear regression, the goal is
to find the best-fitting line that represents the relationship between the independent and
dependent variables.
The linear mean square estimation involves finding the coefficients of the linear equation (e.g., y
= mx + b) that minimize the sum of squared differences between the observed values and the
values predicted by the linear model. This is typically done using methods like least squares
estimation, where the sum of squared residuals is minimized to obtain the best-fitting line.
By minimizing the mean squared error, linear mean square estimation provides an optimal
estimate of the parameters of the linear model. This allows for predicting the dependent variable
based on the values of the independent variables with minimal error.
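For a single independent variable, the least-squares coefficients of y = mx + b have a well-known closed form (slope = covariance of x and y divided by variance of x); the sketch below uses a function name and sample points of our own choosing:

```python
def least_squares_line(xs, ys):
    """Fit y = m*x + b by minimizing the sum of squared residuals (closed form)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    m = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    b = mean_y - m * mean_x
    return m, b

# Points generated from y = 2x + 1 should recover m = 2, b = 1 exactly
m, b = least_squares_line([0, 1, 2, 3], [1, 3, 5, 7])
print(m, b)  # -> 2.0 1.0
```

With noisy data the fitted line no longer passes through every point, but it remains the line with the smallest possible sum of squared residuals.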
Conclusion
In conclusion, mean square estimation is a method used to quantify the accuracy of an estimator
by measuring the average squared difference between estimated and true values. In the context of
linear mean square estimation, the objective is to minimize this error by finding the best linear
estimator. A mean square error greater than zero indicates that the estimator does not reproduce
the true values exactly; a large error suggests that the estimation model should be refined to
improve accuracy and reliability.
References
Hwei P. Hsu, Theory and Problems of Probability, Random Variables, and Random Processes,
Schaum's Outline Series.
Athanasios Papoulis, Probability, Random Variables and Stochastic Processes.
Oliver C. Ibe, Fundamentals of Applied Probability and Random Processes, Elsevier Academic
Press, 2005.
Donald G. Childers, Probability and Random Processes, John Wiley and Sons, Inc., 2005.