Lecture 5: Expected Value and Common PDFs

This document summarizes key concepts in probability and statistics including expected value, mean, variance, convergence, law of large numbers, and central limit theorem. It also introduces common probability density functions such as Gaussian and Rayleigh distributions.


Lecture 5: Expectation and Common PDFs
Contents
 Probability formulation
 Expected Value definition & Properties
 Mean and Mean Squared
 Variance and Standard Deviation
 Correlation and Covariance
 Convergence
Definition
Stochastic convergence
Other convergences: Mean Squared, Probability, Distribution
 Law of large numbers
 Central limit theorem
 Common PDFs
Gaussian PDF and Rayleigh PDF
Probability formulation
 Example 1. Consider the thermal noise phenomenon in a metallic
resistor. Thermal energy agitates the charge carriers throughout the
structure, creating a randomly fluctuating voltage across the terminals
of the resistor.
Question of probability: What do we expect the absolute value of the
thermal noise voltage to be?
Answer: A mathematical definition of the concept of expected value is
developed to answer this question.

Expected Value
 Definition:
 The expected value of a random variable X, denoted by E[X], is a real
(nonrandom) number defined by
E[X] = \int_{-\infty}^{\infty} x \, dF_X(x)   (1)
Note:
 It is a probability-weighted average, over the entire sample space, of the
sample values of the random variable.
 For a continuous random variable, the expected value is
E[X] = \int_{-\infty}^{\infty} x f_X(x) \, dx   (2)
 For a discrete random variable, the expected value is given by
E[X] = \sum_{i} x_i P(X = x_i)   (3)

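As a small numerical sketch of definitions (2) and (3) (assuming NumPy is available; the die and the exponential density are illustrative choices, not from the slides):

```python
import numpy as np

# Discrete form (3): E[X] = sum_i x_i P(X = x_i), e.g. a fair six-sided die.
x_d = np.arange(1, 7)
p_d = np.full(6, 1.0 / 6.0)
e_discrete = np.sum(x_d * p_d)            # exact value: 3.5

# Continuous form (2): E[X] = integral of x f_X(x) dx,
# e.g. the exponential density f(x) = e^{-x} on [0, inf), whose mean is 1.
x_c = np.linspace(0.0, 50.0, 200_001)
e_continuous = np.trapz(x_c * np.exp(-x_c), x_c)

print(e_discrete, e_continuous)
```

The continuous integral is truncated at x = 50, where the exponential tail is negligible.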
Example
 Consider a discrete sample space S = {ζ_1, ζ_2, ..., ζ_n},

the probability function P(ζ_i) defined on it,

and a random variable X with sample values X(ζ_i).

The expected value of X is then given by
E[X] = \sum_{i} X(ζ_i) P(ζ_i)

Properties of Expectation
 Linearity:
The expected value of the random variable aX + bY is given by
E[aX + bY] = a E[X] + b E[Y]
where X and Y are two random variables; the operator E[\cdot], which maps
random variables into real numbers, is a linear transformation.
 Expected value of a function of a Random Variable:
 For any function g(\cdot) of a random variable X, it can be shown that the
random variable Y = g(X) has expected value
E[g(X)] = \int_{-\infty}^{\infty} g(x) f_X(x) \, dx   (4)
Notes:
 Equation (4) is known as the fundamental theorem of expectation.
 Alternatively, one can first determine the density function f_Y(y) of the
random variable Y = g(X) and then use the definition
E[Y] = \int_{-\infty}^{\infty} y f_Y(y) \, dy   (5)

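A quick numerical check that (4) and (5) agree (an illustrative sketch assuming NumPy; X uniform on [0, 1] and g(x) = x² are chosen for convenience, with true value E[X²] = 1/3):

```python
import numpy as np

# X ~ Uniform(0, 1), g(x) = x^2, so the true value is E[X^2] = 1/3.
x = np.linspace(0.0, 1.0, 100_001)
fX = np.ones_like(x)                 # uniform density on [0, 1]

# Fundamental theorem of expectation (4): E[g(X)] = integral of g(x) f_X(x) dx
e_fund = np.trapz(x**2 * fX, x)

# Definition (5) via the density of Y = X^2: f_Y(y) = 1 / (2 sqrt(y)) on (0, 1]
y = np.linspace(1e-9, 1.0, 100_001)
fY = 1.0 / (2.0 * np.sqrt(y))
e_density = np.trapz(y * fY, y)

print(e_fund, e_density)             # both ~ 1/3
```

The second route requires deriving f_Y first; equation (4) avoids that step entirely.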
Properties of Expectation (cont.)
 Set Indicator Function:
The set indicator function for an arbitrary event set A is
I_A(ζ) = 1 if ζ ∈ A, and I_A(ζ) = 0 otherwise   (6)
Notes:
1. The random variable I_A takes on the value 1 whenever the event A occurs and
the value 0 otherwise.
2. For the random variable I_A, it can be verified that
E[I_A] = P(A)   (*)
Therefore, every conceivable probability can be interpreted as an
expected value.

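Property (*) is the basis of Monte Carlo probability estimation: averaging indicator samples estimates P(A). A small sketch (assuming NumPy; the event A = {X > 1} for a standard Gaussian X, with P(A) ≈ 0.1587, is an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)

# Event A = {X > 1} for X ~ N(0, 1); P(A) = 1 - Phi(1) ~ 0.1587.
x = rng.standard_normal(1_000_000)
indicator = (x > 1.0).astype(float)   # I_A: 1 when the event occurs, else 0

p_estimate = indicator.mean()         # E[I_A] estimates P(A)
print(p_estimate)
```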
Characteristic Function
 The characteristic function of a random variable X, denoted by \Phi_X(\omega), is
given by
\Phi_X(\omega) = E[e^{j\omega X}] = \int_{-\infty}^{\infty} e^{j\omega x} f_X(x) \, dx   (7)
which corresponds to the expected value of the function e^{j\omega X} with real
parameter \omega.
Note:
Equation (7) is the conjugate Fourier transform of the density function f_X(x).
 The characteristic function (7) can yield the moments of the random
variable. The n-th moment of X, E[X^n], can be obtained by
E[X^n] = \frac{1}{j^n} \left. \frac{d^n \Phi_X(\omega)}{d\omega^n} \right|_{\omega = 0}   (8)

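Equation (8) can be sanity-checked numerically with plain Python complex arithmetic. The sketch below uses an illustrative exponential random variable with rate 1, whose characteristic function 1/(1 - jω) and first moment E[X] = 1 are standard results:

```python
# Exponential RV with rate 1 has characteristic function Phi(w) = 1 / (1 - j*w)
# and first moment E[X] = 1; equation (8) with n = 1 reads E[X] = Phi'(0) / j.
def phi(w):
    return 1.0 / (1.0 - 1j * w)

h = 1e-5
dphi = (phi(h) - phi(-h)) / (2 * h)   # central-difference derivative at w = 0
e_x = (dphi / 1j).real
print(e_x)                            # ~ 1.0
```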
Mean and Mean Square
 The mean of X, denoted by m_X, is given by m_X = E[X].
 The general n-th moment of a random variable:
E[X^n] = \int_{-\infty}^{\infty} x^n f_X(x) \, dx
 When n = 1, it is the average or mean value
E[X] = \int_{-\infty}^{\infty} x f_X(x) \, dx   (9)
o If the random variable is a voltage or current, the mean is the DC
bias component of the signal.
 When n = 2, it is the mean-squared value
E[X^2] = \int_{-\infty}^{\infty} x^2 f_X(x) \, dx   (10)
o The mean-square is proportional to the average power.
o Its square root is the rms value of the random voltage or current.

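The circuit interpretation above can be sketched with samples of a noisy voltage (assuming NumPy; the 2 V DC bias and 0.5 V noise level are illustrative values):

```python
import numpy as np

rng = np.random.default_rng(1)

# Model a voltage as a 2 V DC bias plus zero-mean Gaussian noise (sigma = 0.5 V).
v = 2.0 + 0.5 * rng.standard_normal(1_000_000)

dc = v.mean()                  # sample mean ~ E[V] = 2.0 (DC component)
mean_square = np.mean(v**2)    # ~ E[V^2] = 2^2 + 0.5^2 = 4.25 (total power)
rms = np.sqrt(mean_square)     # rms value of the random voltage
print(dc, mean_square, rms)
```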
Variance and Standard Deviation
 A measure of the width of a probability density function is the
standard deviation of X, denoted by \sigma_X:
\sigma_X = \sqrt{E[(X - m_X)^2]}   (11)
 The squared standard deviation, which is called the variance, can be
expressed as
\sigma_X^2 = E[(X - m_X)^2] = E[X^2] - m_X^2   (12)
 Notes:
 In electrical circuits, the variance is often related to the average power of
the AC component of a voltage or current.
 The standard deviation is the rms value of the AC component.

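Both forms of equation (12) can be checked on the same noisy-voltage model used above (a sketch assuming NumPy; the DC bias and noise level are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

v = 2.0 + 0.5 * rng.standard_normal(1_000_000)  # DC bias 2 V, AC noise sigma 0.5 V

mean = v.mean()
var_direct = np.mean((v - mean) ** 2)           # E[(V - m)^2]
var_identity = np.mean(v**2) - mean**2          # E[V^2] - m^2  (equation 12)

print(var_direct, var_identity)                 # both ~ 0.25 = AC power
```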
Correlation and Covariance
 The second joint moment of two random variables X and Y is called the correlation
and is given by
R_{XY} = E[XY]   (13)
 E[XY] can be interpreted as the inner product of two vectors.
 The covariance, denoted by C_{XY}, is the second joint central moment and is given
by
C_{XY} = E[(X - m_X)(Y - m_Y)]   (14)
 The relationship between correlation and covariance is given by
C_{XY} = R_{XY} - m_X m_Y   (15)
 Notes:
 If C_{XY} = 0, then X and Y are uncorrelated.
 \rho_{XY} = C_{XY} / (\sigma_X \sigma_Y) is referred to as the correlation coefficient, with -1 \le \rho_{XY} \le 1.
 If E[XY] = 0, then X and Y are orthogonal.
 If two random variables are statistically independent, they are uncorrelated, but not orthogonal
unless at least one has zero mean.

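A sample-based sketch of (13), (14), and (15) (assuming NumPy; the particular correlated pair below, Y = 2 + X + noise, is an illustrative construction with true \rho_{XY} = 1/\sqrt{1.25} ≈ 0.894):

```python
import numpy as np

rng = np.random.default_rng(3)

# Two correlated variables with nonzero means: Y depends linearly on X.
x = 1.0 + rng.standard_normal(1_000_000)
y = 2.0 + x + 0.5 * rng.standard_normal(1_000_000)

r_xy = np.mean(x * y)                               # correlation E[XY], eq. (13)
c_xy = np.mean((x - x.mean()) * (y - y.mean()))     # covariance, eq. (14)
rho = c_xy / (x.std() * y.std())                    # correlation coefficient

# Equation (15): C_XY = R_XY - m_X m_Y holds for the sample moments as well.
print(r_xy, c_xy, rho)
```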
Convergence
 Question:
"Would this sequence converge, and if so, in what sense?"
A sequence of random variables may converge in one of several probabilistic senses.
 A sequence of real nonrandom numbers s_n converges to a number s if and
only if, for every positive number \varepsilon (no matter how small), there exists a
positive integer N (sufficiently large) such that for all n > N the difference
between s_n and s is less than \varepsilon:
|s_n - s| < \varepsilon \quad \text{for all } n > N   (16)
This is abbreviated by \lim_{n \to \infty} s_n = s.

Stochastic Convergence
 A sequence of random variables {X_n(ζ)} is actually a family of sequences of
real numbers
{X_n(ζ_i), n = 1, 2, ...}, one sequence for each sample point ζ_i,
together with a sequence of joint probability distributions.
 Convergence Almost Surely (or with Probability 1):
X_n(ζ) \to X(ζ) as n \to \infty
for all ζ except possibly on a set of probability zero. Equivalently,
P\left( \lim_{n \to \infty} X_n = X \right) = 1   (17)
 Notes (for Probability 1):
There can be particular sample sequences {X_n(ζ_i)} that do not converge.
The probability of the event that the sequence does not converge is zero.

Other Convergences
 Convergence in Mean Square (or expected square convergence):
\lim_{n \to \infty} E[(X_n - X)^2] = 0   (18)
 The "limit in mean (square)" is often abbreviated as l.i.m. X_n = X.
 Convergence in Probability:
\lim_{n \to \infty} P(|X_n - X| > \varepsilon) = 0 \quad \text{for every } \varepsilon > 0   (19)
 Convergence in Distribution:
\lim_{n \to \infty} F_{X_n}(x) = F_X(x) \quad \text{for all continuity points } x \text{ of } F_X   (20)
 Notes:
Convergence (20) is the weakest of the four.
Convergences (17) and (18) are of comparable strength: each implies
convergence in probability, but neither implies the other.

Law of Large Numbers
 Let p = P(A) be the probability of occurrence of an event A, and let the random
sample mean (the random relative frequency of occurrence of A) be
\hat{p}_n = n_A / n, where n_A is the number of times A occurs in n trials.
 Weak Law of Large Numbers:
 The sequence of random variables \hat{p}_n converges in probability to the
nonrandom value p:
\lim_{n \to \infty} P(|\hat{p}_n - p| > \varepsilon) = 0 \quad \text{for every } \varepsilon > 0   (21)
 Strong Law of Large Numbers:
 The sequence of random variables \hat{p}_n converges with probability one to the
nonrandom value p:
P\left( \lim_{n \to \infty} \hat{p}_n = p \right) = 1   (22)

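The law of large numbers is easy to watch in simulation: the running relative frequency of an event settles toward its probability. A sketch (assuming NumPy; a fair coin with p = 0.5 is an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(4)

# Relative frequency of the event A = "heads" for a fair coin, p = 0.5.
flips = rng.integers(0, 2, size=1_000_000)
n = np.arange(1, flips.size + 1)
rel_freq = np.cumsum(flips) / n        # \hat{p}_n after n trials

# The running relative frequency approaches p as n grows.
print(rel_freq[99], rel_freq[-1])
```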
Central Limit Theorem
 An important application of the concept of convergence in distribution
is determining the limiting distribution of the partial sums of a
sequence of random variables:
S_n = \sum_{k=1}^{n} X_k   (23)
 Since the mean and variance of S_n can grow without bound as n \to \infty, the
standardized variables
Z_n = \frac{S_n - m_{S_n}}{\sigma_{S_n}}   (24)
are considered, where m_{S_n} and \sigma_{S_n}^2 are the mean and variance of S_n.
 If the X_k are independent and identically distributed, Z_n converges in distribution
to a Gaussian random variable with zero mean and unit variance. This result is
known as the central limit theorem.

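The standardization (24) can be demonstrated with sums of uniform variables (a sketch assuming NumPy; n = 30 and Uniform(0, 1) summands, which have mean 1/2 and variance 1/12, are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(5)

# Sum n = 30 iid Uniform(0, 1) variables; S_n has mean n/2 and variance n/12.
n, trials = 30, 200_000
s = rng.random((trials, n)).sum(axis=1)        # partial sums S_n, eq. (23)
z = (s - n * 0.5) / np.sqrt(n / 12.0)          # standardized Z_n, eq. (24)

# Z_n should be approximately N(0, 1): mean ~ 0, variance ~ 1,
# and P(Z_n <= 1) close to Phi(1) ~ 0.8413.
print(z.mean(), z.var(), np.mean(z <= 1.0))
```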
Common Probability Density Functions
 Gaussian distribution or Normal distribution:
Used to describe many physical processes.
Allows complete statistical analysis of systems in either linear or nonlinear
situations.
Completely determined by the values of its mean and variance only:
f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma} e^{-(x - m)^2 / (2\sigma^2)} \quad \text{for } -\infty < x < \infty
where \sigma is the standard deviation and m is the average (mean) of X.
Has only one maximally occurring value, its mean.
The Gaussian PDF is symmetrical about the mean.
The width of the PDF depends on the standard deviation.

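Evaluating the density directly confirms the listed properties: it integrates to one and peaks at the mean (a sketch assuming NumPy; the standard-normal parameters are illustrative):

```python
import numpy as np

def gaussian_pdf(x, m=0.0, sigma=1.0):
    """Gaussian density with mean m and standard deviation sigma."""
    return np.exp(-((x - m) ** 2) / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)

x = np.linspace(-10.0, 10.0, 100_001)
f = gaussian_pdf(x, m=0.0, sigma=1.0)

area = np.trapz(f, x)          # total probability ~ 1
peak_x = x[np.argmax(f)]       # the single maximum occurs at the mean
print(area, peak_x)
```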
Common Probability Density Functions
(cont.)
 Rayleigh PDF: describes, for example,
 The peak (envelope) value of a random voltage or current having a Gaussian PDF.
 The errors associated with the aiming of firearms, missiles, and other
projectiles, if the error in each of the two Cartesian coordinates (x, y)
has an independent Gaussian PDF.

It can be shown that
f_R(r) = \frac{r}{\sigma^2} e^{-r^2 / (2\sigma^2)} \quad \text{for } r \ge 0
with mean E[R] = \sigma \sqrt{\pi/2}.

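The aiming-error description can be simulated directly: the radial distance from two independent Gaussian coordinate errors follows the Rayleigh PDF (a sketch assuming NumPy; sigma = 2 is an illustrative value):

```python
import numpy as np

rng = np.random.default_rng(6)

# If X and Y are independent N(0, sigma^2), then R = sqrt(X^2 + Y^2)
# has a Rayleigh PDF with mean sigma * sqrt(pi / 2).
sigma = 2.0
x = sigma * rng.standard_normal(1_000_000)
y = sigma * rng.standard_normal(1_000_000)
r = np.sqrt(x**2 + y**2)

print(r.mean(), sigma * np.sqrt(np.pi / 2))   # both ~ 2.5066
```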
Summary
 The expected value of a random variable X, denoted by E[X], is a real
(nonrandom) number defined by
E[X] = \int_{-\infty}^{\infty} x \, dF_X(x)
 Linearity:
E[aX + bY] = a E[X] + b E[Y]
 Expected value of a function of a Random Variable:
 For any function g(\cdot) of a random variable X, the random variable
Y = g(X) has expected value
E[g(X)] = \int_{-\infty}^{\infty} g(x) f_X(x) \, dx
 A set indicator function I_A for an arbitrary event set A satisfies
E[I_A] = P(A)

Summary (cont.)
 Mean and Mean Squared: E[X] and E[X^2]
 Variance and Standard Deviation: \sigma_X^2 = E[X^2] - m_X^2, \quad \sigma_X = \sqrt{\sigma_X^2}
 Covariance and Correlation: C_{XY} = R_{XY} - m_X m_Y
 Convergence: almost surely, in mean square, in probability, in distribution

Summary (cont.)
 Law of Large Numbers:
Weak Law of Large Numbers: \hat{p}_n \to p in probability
Strong Law of Large Numbers: \hat{p}_n \to p with probability one
 Common PDFs:
Gaussian distribution:
f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma} e^{-(x - m)^2 / (2\sigma^2)} \quad \text{for } -\infty < x < \infty
Rayleigh distribution:
f_R(r) = \frac{r}{\sigma^2} e^{-r^2 / (2\sigma^2)} \quad \text{for } r \ge 0
