
Lecture 3

Background/Random Processes
• Eigenvalues and Eigenvectors
• Optimization Theory
• Ensemble Averages, Jointly Distributed
RVs, Joint Moments

1
Eigenvalues and Eigenvectors
• Let A be an n x n matrix and consider the following
set of homogeneous linear equations

(A − λI)v = 0 (2.44)

where λ is a constant
• In order for a nonzero solution vector v to exist, the
matrix (A − λI) must be singular => det(A − λI) must
be zero

2
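The condition above can be checked numerically. A minimal sketch (the 2x2 matrix is a hypothetical example, not from the slides): the roots of the characteristic polynomial match the eigenvalues computed directly.

```python
import numpy as np

# Hypothetical 2x2 example: the eigenvalues are the roots of det(A - lam*I) = 0.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Characteristic polynomial of a 2x2 matrix: lam^2 - trace(A)*lam + det(A)
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
roots = np.sort(np.roots(coeffs))        # roots of p(lam)
eigvals = np.sort(np.linalg.eigvals(A))  # eigenvalues computed directly

assert np.allclose(roots, eigvals)       # both routes agree: eigenvalues 1 and 3
```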
Eigenvalues and Eigenvectors
• Characteristic polynomial of A: the nth-order
polynomial p(λ) = det(A − λI) in λ
• Eigenvalues of A: the n roots, λi, of p(λ) = 0
• For each eigenvalue λi, the matrix (A − λiI) will be
singular and there will be at least one nonzero vector vi
that solves Eq. (2.44), i.e.,

Avi = λivi

• Eigenvectors of A: the n vectors, vi


• For any nonzero constant α, αvi will also be an eigenvector

3
Eigenvalues and Eigenvectors

• If A is an n x n singular matrix, then there are nonzero
solutions to the homogeneous equation

Av = 0 (2.48)

and λ = 0 is an eigenvalue of A.
• Furthermore, there will be k = n − r(A) linearly
independent solutions to Eq. (2.48), where r(A) is the
rank of A => A will have
• r(A) nonzero eigenvalues
• n − r(A) zero eigenvalues
4
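A quick numerical check of the rank/zero-eigenvalue count. The 3x3 matrix below is a hypothetical example (the count n − r(A) comes out exactly here because this example is diagonalizable):

```python
import numpy as np

# Hypothetical singular matrix: row 2 = 2 * row 1, so r(A) = 2 < n = 3.
A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 0.0, 1.0]])

r = np.linalg.matrix_rank(A)                    # r(A) = 2
eigvals = np.linalg.eigvals(A)
n_zero = int(np.sum(np.isclose(eigvals, 0.0)))  # numerically zero eigenvalues

assert n_zero == A.shape[0] - r                 # n - r(A) zero eigenvalues
```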
Eigenvalues and Eigenvectors

• Eigenvalue Decomposition: For any n x n matrix A having n
linearly independent eigenvectors,

A = V Λ V^(-1)

• where matrix V contains the eigenvectors of A as its columns
and Λ is a diagonal matrix containing the corresponding
eigenvalues

5
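The decomposition can be verified with NumPy. A small sketch using a hypothetical symmetric matrix (symmetric matrices always have n linearly independent eigenvectors):

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

lam, V = np.linalg.eig(A)   # columns of V are the eigenvectors v_i
Lam = np.diag(lam)          # diagonal matrix of the eigenvalues

# Eigenvalue decomposition: A = V Lam V^{-1}
A_rebuilt = V @ Lam @ np.linalg.inv(V)
assert np.allclose(A_rebuilt, A)
```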
Eigenvalues and Eigenvectors

[Slides 6–11: further properties and worked examples of eigenvalues and
eigenvectors; the equations on these slides were not preserved in the extraction]
Optimization Theory
• Minimization (or maximization) of a function of one or more
variables
• Simplest form: Finding the minimum of a scalar function f(x)
of a single real variable x
• Assuming the objective function (OF) f(x) to be differentiable, the
stationary points of f(x), i.e., local and global minima, must
satisfy the following conditions

f'(x) = 0 (2.67)

and, for a minimum, f''(x) > 0

12
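A minimal sketch of the two conditions on a hypothetical strictly convex objective f(x) = x^2 − 4x + 5 (my example, not from the slides):

```python
def f(x):
    return x**2 - 4*x + 5

def fprime(x):
    return 2*x - 4          # f'(x); f''(x) = 2 > 0, so the stationary point is a minimum

x_star = 2.0                # solves f'(x) = 0
assert fprime(x_star) == 0.0
# the stationary point is the global minimum of this convex function
assert all(f(x_star) <= f(x_star + d) for d in (-1.0, -0.1, 0.1, 1.0))
```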
Optimization Theory
• f(x) strictly convex: there is only one solution to Eq. (2.67),
which is the global minimum of f(x)
• f(x) not convex: each stationary point must be checked to see
whether it is the global minimum or not
• When the OF f(z) is a function of a complex variable z and is
differentiable (analytic): the stationary points of f(z) are found
in the same way as in the real case
• What if f(z) is not differentiable?
• Example:

f(z) = |z|^2 = z z*

f(z) has a unique minimum at z = 0 but it is not differentiable
since it is a function of z and z*. Any function dependent on z*
is not differentiable w.r.t. z

13
Optimization Theory
• Solution 1: Express f(z) in terms of its real and imaginary parts,

f(z) = f(x, y), where z = x + jy

and minimize f(x, y) w.r.t. these two variables
• Solution 2 (more elegant): Treat z and z* as independent
variables and minimize f(z, z*) w.r.t. both z and z*

Treating f(z, z*) = z z* as a function of z with z* constant:

∂f(z, z*)/∂z = z* (2.69)

Treating f(z, z*) as a function of z* with z constant:

∂f(z, z*)/∂z* = z (2.70)
14
Optimization Theory
Setting both derivatives to zero and solving the pair of
equations

z* = 0 and z = 0,

we obtain the solution z = 0

• Therefore, for a real-valued function such as f(z) = z z*,
we can set either Eq. (2.69) or (2.70) to zero and solve for z.

15
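The z*-derivative is exactly what a complex gradient-descent step uses. A sketch for f(z) = z z*, with a hypothetical step size mu (my choice, not from the slides):

```python
# Descend along df/dz* = z (Eq. (2.70)) for f(z) = z z* = |z|^2.
mu = 0.5                     # hypothetical step size, 0 < mu < 1
z = 3.0 + 4.0j               # arbitrary starting point
for _ in range(60):
    z = z - mu * z           # z_{k+1} = z_k - mu * (df/dz*)

assert abs(z) < 1e-12        # iterates shrink toward the unique minimum z = 0
```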
Optimization Theory
• How to find the minimum of a function of two or more
variables?
• Scalar function of n variables:

f(x) = f(x1, x2, ..., xn)

• Requires computing the gradient: the vector of partial derivatives

∇x f(x) = [∂f/∂x1, ∂f/∂x2, ..., ∂f/∂xn]^T

• The gradient points in the direction of the maximum rate of change of
f(x)
• The gradient is zero at the stationary points of f(x)
16
Optimization Theory
• Necessary condition for a point x to be a stationary point of
f(x):

∇x f(x) = 0 (2.71)

• For this stationary point to be a minimum, the Hessian matrix
Hx must be positive definite,

Hx > 0

where Hx is the n x n matrix of 2nd-order partial derivatives
with the (i, j)th element given by

{Hx}ij = ∂²f(x) / ∂xi ∂xj
17
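Both conditions can be checked on a hypothetical quadratic f(x) = (1/2) x^T H x, whose gradient is Hx and whose Hessian is the constant matrix H:

```python
import numpy as np

H = np.array([[2.0, 1.0],
              [1.0, 4.0]])   # Hessian of f(x) = 0.5 * x^T H x

def grad(x):
    return H @ x             # gradient of the quadratic

x_star = np.zeros(2)
assert np.allclose(grad(x_star), 0.0)     # Eq. (2.71): gradient vanishes at x = 0

# positive definite Hessian (all eigenvalues > 0) => x_star is a minimum
assert np.all(np.linalg.eigvalsh(H) > 0)
```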
Optimization Theory
• f(x) strictly convex: the solution to Eq. (2.71) is unique and is the
global minimum of f(x)
• Function of complex vectors f(z, z*): treating z and z* as
independent variables, the stationary points may be obtained
using the theorem that, for a real-valued f(z, z*), the gradient
w.r.t. z* points in the direction of the maximum rate of change
• From the theorem it follows that the stationary points of
f(z, z*) are solutions to the equation

∇z* f(z, z*) = 0

18
Optimization Theory
• Example of a constrained minimization problem (encountered
in array processing):

min_z z^H R z s.t. z^H a = 1

• We need to find the vector z that minimizes the OF, which is a
quadratic form, subject to the linear equality constraint given
above, where a is a given complex vector

20
Optimization Theory
• Solution: Introduce a Lagrange multiplier λ and minimize the
unconstrained objective function

Q(z) = z^H R z + λ(1 − z^H a)

To minimize the OF, we set its gradient w.r.t. z* equal to zero:

∇z* Q(z) = Rz − λa = 0 => z = λ R^(-1) a (2.74)

The value of the Lagrange multiplier λ is then found from the
constraint

z^H a = 1 (2.75)
22
Optimization Theory
Substituting Eq. (2.74) into (2.75),

λ a^H R^(-1) a = 1 => λ = 1 / (a^H R^(-1) a) (2.76)

where a^H R^(-1) a is real and positive for positive definite R.
Finally, substituting Eq. (2.76) into (2.74),

z = R^(-1) a / (a^H R^(-1) a)

which is the vector z that minimizes the OF

23
Optimization Theory
Substituting this vector z into the quadratic form (the main OF),
we obtain the minimum value of the quadratic form:

min z^H R z = 1 / (a^H R^(-1) a)

where the RHS follows from the constraint, Eq. (2.75)

24
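The closed-form solution above can be verified numerically. A sketch with a hypothetical positive definite R and complex vector a (my values, not from the lecture):

```python
import numpy as np

R = np.array([[2.0, 0.5],
              [0.5, 1.0]])        # hypothetical positive definite matrix
a = np.array([1.0, 1.0j])         # hypothetical constraint vector

Rinv_a = np.linalg.solve(R, a)
denom = np.vdot(a, Rinv_a)        # a^H R^(-1) a (real and positive here)
z = Rinv_a / denom                # z = R^(-1) a / (a^H R^(-1) a)

assert np.isclose(np.vdot(z, a), 1.0)       # constraint z^H a = 1 is satisfied

min_val = np.vdot(z, R @ z).real            # minimum of the quadratic form
assert np.isclose(min_val, (1.0 / denom).real)

# a feasible perturbation (w^H a = 0) can only increase the quadratic form
w = np.array([1.0, -1.0j])
assert np.isclose(np.vdot(w, a), 0.0)
assert np.vdot(z + w, R @ (z + w)).real >= min_val
```

Note that `np.vdot` conjugates its first argument, so `np.vdot(z, a)` computes z^H a.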
Random Processes

25
Ensemble Averages
• Mean, or expected value
• Expected value of a discrete random variable (RV) x that
assumes the value αk with probability Pr{x = αk}:

E{x} = Σk αk Pr{x = αk}

• For continuous RVs with pdf fx(α):

E{x} = ∫ α fx(α) dα
26
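The discrete formula can be illustrated by simulation: the sample mean of repeated draws approaches the ensemble average. The two-valued RV below is a hypothetical example:

```python
import random

# Hypothetical discrete RV: x = 1 with prob 0.3, x = 4 with prob 0.7.
values, probs = [1.0, 4.0], [0.3, 0.7]
expected = sum(a * p for a, p in zip(values, probs))   # E{x} = 3.1

random.seed(0)
samples = random.choices(values, weights=probs, k=200_000)
sample_mean = sum(samples) / len(samples)

# the sample mean converges to the ensemble average E{x}
assert abs(sample_mean - expected) < 0.05
```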
Ensemble Averages
• For an RV x with pdf fx(α), if y = g(x):

E{y} = E{g(x)} = ∫ g(α) fx(α) dα

• Variance of RV x:

Var{x} = E{(x − E{x})^2}

• For complex RVs, the mean-square value is

E{|x|^2} = E{x x*}

• and the variance is given by

Var{x} = E{|x − E{x}|^2}
28
Ensemble Averages
• Expectation is a linear operator:

E{ax + by} = a E{x} + b E{y}

• Using linearity, the variance can be expressed as

Var{x} = E{x^2} − (E{x})^2

• When E{x} = 0, Var{x} = E{x^2}
29
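The linearity-based variance formula can be checked by simulation. A sketch using a hypothetical uniform RV on [0, 1] (so E{x} = 1/2 and Var{x} = 1/12):

```python
import random

# Hypothetical continuous RV: uniform on [0, 1].
random.seed(1)
samples = [random.uniform(0.0, 1.0) for _ in range(200_000)]

mean = sum(samples) / len(samples)                    # estimate of E{x}
mean_sq = sum(s * s for s in samples) / len(samples)  # estimate of E{x^2}
var = mean_sq - mean**2                               # Var{x} = E{x^2} - (E{x})^2

assert abs(mean - 0.5) < 0.01
assert abs(var - 1.0 / 12.0) < 0.01
```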
Jointly Distributed RVs
• Joint distribution function for two RVs x(1) and x(2):

Fx(1),x(2)(α1, α2) = Pr{x(1) ≤ α1, x(2) ≤ α2}

Definition: the probability that x(1) is less than or equal to α1
and x(2) is less than or equal to α2

• Joint density function for two RVs:

fx(1),x(2)(α1, α2) = ∂²Fx(1),x(2)(α1, α2) / ∂α1 ∂α2
30
Jointly Distributed RVs
• Statistical characterization of complex RVs: If z = x + jy is a
complex RV and α = a + jb is a complex number, then the
distribution function for z is the joint distribution function of
its real and imaginary parts:

Fz(α) = Pr{x ≤ a, y ≤ b}

• Joint distribution function for n RVs:

Fx(1),...,x(n)(α1, ..., αn) = Pr{x(1) ≤ α1, ..., x(n) ≤ αn}

• Joint density function for n RVs: the nth mixed partial
derivative of the joint distribution function,

fx(1),...,x(n)(α1, ..., αn) = ∂^n Fx(1),...,x(n)(α1, ..., αn) / ∂α1 ··· ∂αn
31
Joint Moments
• Correlation: the 2nd-order joint moment

rxy = E{x y*}

• Covariance:

cxy = Cov{x, y} = E{(x − E{x})(y − E{y})*}

• Correlation coefficient (normalized covariance, invariant to
scaling):

ρxy = cxy / (σx σy)

32
Joint Moments
• Due to the normalization, the correlation coefficient is bounded
by one in magnitude: |ρxy| ≤ 1

33
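The bound can be illustrated empirically. A sketch with a hypothetical correlated pair y = 0.8x + 0.6n (x, n independent standard normal, so the true correlation coefficient is 0.8):

```python
import numpy as np

# Hypothetical correlated pair with theoretical rho = 0.8.
rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
y = 0.8 * x + 0.6 * rng.standard_normal(100_000)

cov = np.mean((x - x.mean()) * (y - y.mean()))   # sample covariance c_xy
rho = cov / (x.std() * y.std())                  # normalized: rho = c_xy / (s_x s_y)

assert abs(rho) <= 1.0          # the normalization bounds |rho| by one
assert abs(rho - 0.8) < 0.01    # matches the theoretical value here
```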
