EEE 6542 - Lecture 4 Notes - BLANK - F2024

The lecture covers random variables, focusing on types such as Bernoulli, Binomial, Poisson, and continuous random variables like Uniform and Exponential. Key concepts include probability mass functions (PMF), probability density functions (PDF), expected values, and variances. The session also introduces the Central Limit Theorem and its significance in modeling random phenomena in engineering.


EEE 6542

Random Processes in Electrical Engineering

Lecture 4 – Random Variables – PART II

Ismail Uysal, Ph.D.


Associate Professor
Undergraduate Director
Department of Electrical Engineering
iuysal@usf.edu
Tampa Office Location: ENB II Building, 366
Office Hours: Bi-weekly 4:00PM – 5:00PM Thursdays (Microsoft Teams)

Stay in touch – Canvas is your main communications tool !


Announcements
• Quiz 2 available now – due next Tuesday
before class (September 24th, 5PM)
• Includes both Bayes’ rule & today’s lecture
• We are approximately 3 weeks away from the
midterm exam
– I will post example problems and study material
next week
– Recitation sessions are your best resource – use
them !
Announcements
• Office hours
– Thursdays 4:00PM – 5:00PM (Dr. Uysal)
– Mondays 1:00PM – 2:00PM (Rifat)
– Extended office hours TBD before midterm and
final exams
• Next week
– There is a chance the lecture could be online
(previously recorded)
– I will let you know on Monday (the day before).
Today
• Continue our discussion on random variables.
– Learn about different types of random variables.
– Continuous random variables !
A more practical definition of variance
Some Properties of Variance
VAR[c] = 0 where c is a constant

VAR [X + c] = ?

VAR [cX] = ?
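For reference, the two blanks work out to the standard identities (easy to check from the definition Var[X] = E[(X − E[X])²]):

\mathrm{Var}[X + c] = \mathrm{Var}[X] \quad\text{(shifting by a constant does not change the spread)}

\mathrm{Var}[cX] = c^2\,\mathrm{Var}[X] \quad\text{(scaling by } c \text{ scales the variance by } c^2\text{)}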
Let’s change and redo the last example to get things started…
Flipping the coin three times, define the RV as “the number of tails times the number of heads”. What if the coin was biased? Let’s say the chance of getting tails is 2/3 and heads is 1/3. Let’s sketch the PMF for this quickly, then do the same practice of finding and plotting the CDF, PMF, expected value, variance and standard deviation (a quick enumeration sketch is given below).
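A minimal Python sketch (mine, not part of the original slides) that enumerates the eight outcomes to check the PMF, expected value, variance, and standard deviation for this biased-coin RV:

# Enumerate all 8 outcomes of 3 flips of a biased coin with P(T) = 2/3, P(H) = 1/3,
# and tabulate the RV X = (number of tails) * (number of heads).
from itertools import product
from collections import defaultdict
from math import sqrt

p_tails = 2.0 / 3.0
pmf = defaultdict(float)

for flips in product("TH", repeat=3):
    prob = 1.0
    for f in flips:
        prob *= p_tails if f == "T" else (1.0 - p_tails)
    n_t = flips.count("T")
    pmf[n_t * (3 - n_t)] += prob    # X = (# tails) * (# heads)

mean = sum(x * p for x, p in pmf.items())
var = sum((x - mean) ** 2 * p for x, p in pmf.items())

print(dict(pmf))                    # expect P(X=0) = 1/3, P(X=2) = 2/3
print(mean, var, sqrt(var))         # expect E[X] = 4/3, Var[X] = 8/9, std ≈ 0.943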
Important Families of Discrete Random
Variables
• Certain discrete random variables arise in many diverse,
unrelated applications.
– Bernoulli, Binomial, Poisson
• Their pervasiveness is due to the fact that they model fundamental mechanisms underlying random behavior.
• We will examine essential random behavior of important
random variables as well as their PMFs.
Bernoulli Random
Variable
p_I(0) = 1 – p and p_I(1) = p
Let’s find the expected value and
variance !
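For reference, working these out from the PMF above gives the standard results:

E[I] = 0 \cdot (1-p) + 1 \cdot p = p, \qquad \mathrm{Var}[I] = E[I^2] - (E[I])^2 = p - p^2 = p(1-p)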
Binomial Random Variable
X = I_1 + I_2 + … + I_n
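The PMF that goes with this sum of n independent Bernoulli(p) trials (not reproduced on the slide; this is the standard Binomial(n, p) form) is:

P(X = k) = \binom{n}{k} p^k (1-p)^{n-k}, \quad k = 0, 1, \dots, n

with E[X] = np and \mathrm{Var}[X] = np(1-p), both following from the Bernoulli results above by linearity and independence.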
Poisson Random Variable
(major applications in networking/queuing theory to model the arrival process)
• Counts the number of occurrences of an event in a time period or in a region of space (e.g., packet arrivals, photon arrivals, …)
– Packet arrivals, failure events, faults in devices
The Poisson random variable has the PMF given below, where α is the average number of event occurrences per unit time.
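The slide’s formula image did not survive; the standard Poisson PMF is:

p_X(k) = \frac{\alpha^k}{k!} e^{-\alpha}, \quad k = 0, 1, 2, \dots

For a Poisson random variable both the expected value and the variance equal α.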
Continuous Random Variables

[Figure: measuring the voltage across a resistor as a random variable – example readings of 5 V, 3.2 V, and 0 V, with sketches of the corresponding pdf and cdf.]
Continuous Random Variables
Cumulative distribution function
(CDF)
Probability Density Function
Definition: The probability density function (PDF) of X (if it exists) is defined below.
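The defining relations (the slide’s formula images are missing; these are the standard forms):

F_X(x) = P(X \le x), \qquad f_X(x) = \frac{dF_X(x)}{dx}, \qquad F_X(x) = \int_{-\infty}^{x} f_X(t)\,dt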
Properties of the PDF
1.

2.

3. The probability of an interval is the area under f_X(x)
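Written out (standard properties; the slide’s formula images for 1 and 2 are blank):

1. f_X(x) \ge 0 \text{ for all } x
2. \int_{-\infty}^{\infty} f_X(x)\,dx = 1
3. P(a \le X \le b) = \int_a^b f_X(x)\,dx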


So what is the link between PMF and PDF?

A PMF is defined for a discrete RV and gives the exact probability that the RV takes a particular value.

A PDF is defined for a continuous RV and must be integrated to find the probability that the RV falls in a given interval.
Expected value and variance
Remember for discrete RV:
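For reference (the formula images did not survive), the discrete-case definitions and their continuous analogues are:

Discrete: \quad E[X] = \sum_k x_k\, p_X(x_k), \qquad \mathrm{Var}[X] = \sum_k (x_k - E[X])^2\, p_X(x_k)

Continuous: \quad E[X] = \int_{-\infty}^{\infty} x\, f_X(x)\,dx, \qquad \mathrm{Var}[X] = \int_{-\infty}^{\infty} (x - E[X])^2 f_X(x)\,dx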
Important families of continuous random
variables

• Remember how we used the PMF to describe important discrete random variables, like Bernoulli, Binomial, Poisson, etc.
• We use the PDF to describe important continuous random variables…
Uniform Random Variable
• A uniform random variable X is “equally likely” to take on any value in some
finite interval (a,b).
• They are natural choices for experiments in which some event is “equally
likely” to happen at any time or place within some interval.
• The random variable X that satisfies this condition is the uniform random variable. We write X ~ U(a, b). It has the probability density function given below.
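The standard Uniform(a, b) density (the slide’s formula image is missing):

f_X(x) = \frac{1}{b-a} \text{ for } a \le x \le b, \qquad f_X(x) = 0 \text{ otherwise}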
Uniform Random Variables: CDF
The cumulative distribution function F given below is easy to
compute either by integration of the pdf or by finding the area of
rectangles. Note that it has all the usual properties of a CDF: 0 to
the left, 1 to the right, increasing and continuous in between.
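The CDF referred to above, in its standard form:

F_X(x) = 0 \text{ for } x < a, \qquad F_X(x) = \frac{x-a}{b-a} \text{ for } a \le x \le b, \qquad F_X(x) = 1 \text{ for } x > b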
Uniform Random Variables: Expectation
Intuitively we anticipate E(X)=(a+b)/2, the midpoint of the interval.
Is that really so?
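Yes – a one-line check by integrating against the PDF:

E[X] = \int_a^b \frac{x}{b-a}\,dx = \frac{b^2 - a^2}{2(b-a)} = \frac{a+b}{2}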
Uniform random variables
Uniform Random Variables: Variance
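The standard result, obtained from E[X²] − (E[X])² (the slide’s worked steps are not in the notes):

E[X^2] = \int_a^b \frac{x^2}{b-a}\,dx = \frac{a^2 + ab + b^2}{3}, \qquad \mathrm{Var}[X] = \frac{a^2 + ab + b^2}{3} - \left(\frac{a+b}{2}\right)^2 = \frac{(b-a)^2}{12}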
Exponential Random Variables: PDF
• Let λ be a positive real number. We write X ~ exponential(λ) and say that X is an exponential random variable with parameter λ, with the PDF

f(x) = \lambda e^{-\lambda x} \text{ if } x \ge 0, \qquad f(x) = 0 \text{ otherwise}
CDF of Exponential RV
Show that F_X(x) = 1 - e^{-\lambda x} for x ≥ 0, and F_X(x) = 0 for x < 0.
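One way to show it, integrating the PDF (a standard one-line derivation):

F_X(x) = \int_0^x \lambda e^{-\lambda t}\,dt = \left[-e^{-\lambda t}\right]_0^x = 1 - e^{-\lambda x}, \quad x \ge 0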
Exponential Random Variables: PDF

A simple integration shows that the area under f(x) is 1:
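Namely (standard computation; the slide’s formula image is missing):

\int_0^{\infty} \lambda e^{-\lambda x}\,dx = \left[-e^{-\lambda x}\right]_0^{\infty} = 0 - (-1) = 1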


Exponential Random Variables: Expectation
So, if X ~ exponential(λ), then E[X] = 1/λ.
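The derivation (integration by parts, a standard step not shown in the notes):

E[X] = \int_0^\infty x\,\lambda e^{-\lambda x}\,dx = \left[-x e^{-\lambda x}\right]_0^\infty + \int_0^\infty e^{-\lambda x}\,dx = 0 + \frac{1}{\lambda} = \frac{1}{\lambda}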
Exponential Random Variables:
Applications
Suppose the wait time X for service at the post office has an
exponential distribution with a mean of 3 minutes. If you enter the
post office immediately behind another customer, what is the
probability you wait over 5 minutes?
Since E(X) = 1/λ = 3 minutes, then λ = 1/3, so the answer follows from the exponential CDF (worked out below).

Under the same conditions, what is the probability of waiting between 2 and 4 minutes?
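Working both out with F_X(x) = 1 - e^{-x/3} (my own evaluation of the slide’s setup):

P(X > 5) = 1 - F_X(5) = e^{-5/3} \approx 0.189

P(2 < X < 4) = F_X(4) - F_X(2) = e^{-2/3} - e^{-4/3} \approx 0.513 - 0.264 = 0.250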
• Example: Suppose you enter the post office and
have to choose one of two lines, each of which has
exactly one person ahead of you. The person in the
first line got there just ahead of you. The person in
the second line has already been there 10 minutes.
• Which line should you get in so as to be served
fastest if the waiting times are exponentially
distributed?
• Doesn’t matter !
Exponential Random Variables: The
Memoryless Property

The exponential random variable has an astonishing property. If X ~ exponential(λ) represents a waiting time, then the probability of waiting a given length of time is not affected by how long you have waited:

P(X > a + b \mid X > a) = P(X > b)
That is, if you have already waited a minutes, the probability you
wait b more minutes is the same as your initial probability of
waiting b minutes. This is known as the memoryless property.
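A short proof using the exponential CDF (standard, not shown on the slide):

P(X > a + b \mid X > a) = \frac{P(X > a + b)}{P(X > a)} = \frac{e^{-\lambda(a+b)}}{e^{-\lambda a}} = e^{-\lambda b} = P(X > b)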
Is there a link between Poisson and Exponential
RVs?

Yes! If the times between successive arrivals to a system (such as packets received at the receiver) are exponentially distributed, then the number of arrivals in a fixed time “t” has the Poisson distribution.
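A minimal simulation sketch (mine, not from the lecture; the rate lam, window t, and trial count are assumed example values) illustrating this link: exponential(lam) interarrival times produce arrival counts over a window of length t that follow a Poisson distribution with mean lam·t.

# Simulate exponential interarrival times and count arrivals in a fixed window,
# then compare the empirical frequencies with the Poisson PMF.
import math
import random

random.seed(0)
lam, t, trials = 2.0, 5.0, 100_000    # assumed example parameters

counts = []
for _ in range(trials):
    elapsed, n = 0.0, 0
    while True:
        elapsed += random.expovariate(lam)    # exponential interarrival time
        if elapsed > t:
            break
        n += 1
    counts.append(n)

mean_count = lam * t                           # Poisson mean = lam * t
for k in range(8, 13):
    empirical = counts.count(k) / trials
    poisson = math.exp(-mean_count) * mean_count**k / math.factorial(k)
    print(f"k={k}: simulated {empirical:.4f}  vs  Poisson PMF {poisson:.4f}")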
Gaussian/Normal Distribution
The Gaussian, or Normal, random variable is described in terms of two (real number) parameters: the mean m and the variance σ². The PDF and CDF are given below.
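The standard expressions (the slide’s formula images are missing):

f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x-m)^2 / (2\sigma^2)}, \qquad F_X(x) = \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(t-m)^2/(2\sigma^2)}\,dt

The CDF has no closed form; it is usually expressed through the standard normal CDF Φ or the Q-function.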
Gaussian distribution – PDF
(just wanted to dedicate a slide to this)

We often use the shorthand N(m, σ²) to denote the PDF of a Gaussian random variable X, and write X ~ N(m, σ²).

A Gaussian RV with m = 0 and σ = 1 is called a standard (unit) Gaussian random variable.
Gaussian (Normal) Random Variable
• Gaussian PDF shown below for zero mean and for several values of
σ > 0 [the standard deviation (STD)]
Sum of Normally Distributed Independent Random Variables

If X_i are statistically independent Gaussian RVs with means m_i and variances σ_i², then

Y = \sum_i X_i

is also a Gaussian RV with:

m_y = \sum_i m_i, \qquad \sigma_y^2 = \sum_i \sigma_i^2
Central Limit Theorem
Central Limit Theorem:
Assume a random variable Y is a sum of N independent random variables X_i:

Y = X_1 + X_2 + … + X_N

One of the reasons why the Gaussian RV is so important in science and engineering
is because if:
• the X_i are independent and identically distributed (i.i.d.) RVs
• with any distribution,
• with finite means and variances,
then as N approaches infinity, Y becomes a Gaussian RV. In practice, it doesn’t even need to reach infinity – anything with N ≳ 30 is sufficient to give a very Gaussian-looking distribution.

The CLT is a very important and useful result since many “noise” phenomena in communication and network systems are the sum of such i.i.d. RVs.
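A minimal sketch (mine, with assumed example values for N and the number of trials) of the CLT in action: standardized sums of N i.i.d. uniform(0, 1) variables behave like a standard normal.

# Sum N uniform(0, 1) variables many times, standardize, and check that the
# result has roughly zero mean, unit variance, and ~68% mass within one sigma.
import random
import statistics

random.seed(0)
N, trials = 30, 10_000            # N >= ~30 is "large enough" per the slide

sums = [sum(random.random() for _ in range(N)) for _ in range(trials)]

# Theoretical mean and std of the sum of N uniform(0, 1): N/2 and sqrt(N/12)
mu, sigma = N * 0.5, (N / 12) ** 0.5
z = [(s - mu) / sigma for s in sums]

print("sample mean of z:", round(statistics.mean(z), 3))            # ~0
print("sample std of z :", round(statistics.stdev(z), 3))           # ~1
print("P(|Z| <= 1)     :", sum(abs(v) <= 1 for v in z) / trials)    # ~0.68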
Obviously…
• There are many different kinds of continuous
(and discrete) random variables to model many
different types of random events.
• And I don’t expect you to memorize any of them.
• But one thing you need to notice is that they are
all characterized by their PDF.
• And their characteristic values such as “mean”
and “variance” are mostly system parameters in
their PDF.
Rayleigh Random Variable

The mean, variance, PDF, and CDF are below.

Y = \sqrt{X_1^2 + X_2^2}, \quad \text{with } m_1 = m_2 = 0 \text{ and } \sigma_i^2 = \sigma^2

p(y) = \frac{y}{\sigma^2}\, e^{-y^2/(2\sigma^2)}, \quad y \ge 0

F(y) = 1 - e^{-y^2/(2\sigma^2)}, \quad y \ge 0
Gamma Distribution

The random variable X with probability density function

f(x) = \frac{\lambda^r x^{r-1} e^{-\lambda x}}{\Gamma(r)}, \quad \text{for } x > 0

is a gamma random variable with parameters λ > 0 and r > 0.

Gamma Function: see the definition below.

r = 1 => the Gamma distribution becomes the exponential distribution

The Gamma distribution can also model the sum of squares of Rayleigh random variables!
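For completeness, the Gamma function referred to above (the slide’s formula image is missing) is

\Gamma(r) = \int_0^{\infty} x^{r-1} e^{-x}\,dx

and with r = 1, Γ(1) = 1, so the PDF reduces to f(x) = \lambda e^{-\lambda x} – exactly the exponential density, as noted above.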
Cauchy:

f_X(x) = \frac{\alpha/\pi}{\alpha^2 + x^2}, \quad -\infty < x < \infty

Laplace:

f_X(x) = \frac{1}{2\alpha}\, e^{-|x|/\alpha}, \quad -\infty < x < \infty
What’s the difference between Cauchy &
Gaussian?

Cauchy has a narrower peak but longer tails compared to Gaussian !
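A small Python check (mine, not from the lecture) of the heavier-tails claim, comparing P(|X| > x) for a standard normal and a standard (α = 1, zero-centered) Cauchy:

# Tail probabilities: the Cauchy tail decays like 1/x, the Gaussian tail decays
# much faster, so the Cauchy places far more probability far from the center.
import math

def normal_tail(x):
    # P(|Z| > x) for Z ~ N(0, 1), written via the error function
    return 1.0 - math.erf(x / math.sqrt(2.0))

def cauchy_tail(x):
    # P(|X| > x) for a standard Cauchy: CDF is 1/2 + arctan(x)/pi
    return 2.0 * (0.5 - math.atan(x) / math.pi)

for x in (1, 3, 5):
    print(f"P(|X| > {x}): normal {normal_tail(x):.2e}   cauchy {cauchy_tail(x):.2e}")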


Next week
• The real fun (sorrow? ) begins !
