ML Lecture 16
CSE343/CSE543/ECE363/ECE563
Lecture 16 | Take your own notes during lectures
Vinayak Abrol <abrol@iiitd.ac.in>
Gaussian Mixture Model (GMM)
A GMM is a parametric probability density function represented as a weighted sum of Gaussian component densities.
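In symbols (standard notation, not reproduced from the slide), with mixture weights π_k, means μ_k, and covariances Σ_k:

$$p(x \mid \theta) = \sum_{k=1}^{K} \pi_k\,\mathcal{N}(x \mid \mu_k, \Sigma_k), \qquad \pi_k \ge 0,\quad \sum_{k=1}^{K} \pi_k = 1.$$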
Note: if you know the z_i's (the cluster assignments), the inner sum over components disappears, and the problem of taking the log of a sum goes away!
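Concretely, with the z_i known, the complete-data log-likelihood decomposes as

$$\log p(X, Z \mid \theta) = \sum_{i=1}^{N} \log\big(\pi_{z_i}\,\mathcal{N}(x_i \mid \mu_{z_i}, \Sigma_{z_i})\big),$$

a plain sum of logs that splits across components and can be maximized in closed form.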
GMM: The Likelihood Function
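For data X = {x_1, …, x_N}, the quantity to maximize is the log-likelihood

$$\ell(\theta) = \sum_{i=1}^{N} \log \sum_{k=1}^{K} \pi_k\,\mathcal{N}(x_i \mid \mu_k, \Sigma_k),$$

where the log of a sum is exactly what blocks a closed-form solution.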
Procedure: starting from the estimate θ^t at iteration t, we construct a surrogate lower bound A(θ, θ^t) on the log-likelihood.
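The defining properties of this surrogate (the standard EM bound) are that it lies below ℓ(θ) everywhere and touches it at the current estimate, so maximizing it can only improve the likelihood:

$$A(\theta, \theta^t) \le \ell(\theta)\ \ \forall\,\theta, \qquad A(\theta^t, \theta^t) = \ell(\theta^t), \qquad \theta^{t+1} = \arg\max_{\theta} A(\theta, \theta^t).$$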
Back to GMM
Bayes Rule
E Step: Assignment
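By Bayes rule, the E-step assignment is the posterior responsibility of component k for point x_i under the current parameters θ^t:

$$r_{ik} = p(z_i = k \mid x_i, \theta^t) = \frac{\pi_k^t\,\mathcal{N}(x_i \mid \mu_k^t, \Sigma_k^t)}{\sum_{k'=1}^{K} \pi_{k'}^t\,\mathcal{N}(x_i \mid \mu_{k'}^t, \Sigma_{k'}^t)}.$$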
Back to GMM: the log-likelihood has a sum-log-sum structure. The trick is to rewrite the inner sum as an expectation, turning sum-log-sum into sum-log-E (i.e., convert the sum into an average over the latent assignment).
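Written out (a standard derivation, using the responsibilities r_{ik} from the E step): multiply and divide the inner sum by r_{ik}, read it as an expectation, and apply Jensen's inequality, since log is concave:

$$\ell(\theta) = \sum_i \log \sum_k r_{ik}\,\frac{\pi_k\,\mathcal{N}(x_i \mid \mu_k, \Sigma_k)}{r_{ik}} \;\ge\; \sum_i \sum_k r_{ik} \log \frac{\pi_k\,\mathcal{N}(x_i \mid \mu_k, \Sigma_k)}{r_{ik}} = A(\theta, \theta^t).$$

The log now sits inside the sum over k, so the bound separates across parameters.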
And we are back to the lecture on SVMs. Can you guess what we can do? (As there, we face a constrained maximization, here over the mixture weights with Σ_k π_k = 1, so we bring in a Lagrange multiplier.)
Here we use the index k′ so as not to confuse it with the sum over k.
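A sketch of that move: enforce Σ_{k′} π_{k′} = 1 with a Lagrange multiplier λ, differentiate the bound with respect to π_k, and solve (λ comes out to N):

$$\frac{\partial}{\partial \pi_k}\Big(\sum_i \sum_{k'} r_{ik'} \log \pi_{k'} + \lambda\big(1 - \textstyle\sum_{k'} \pi_{k'}\big)\Big) = 0 \;\Rightarrow\; \pi_k = \frac{1}{N}\sum_i r_{ik}.$$

The same objective gives the remaining M-step updates in closed form:

$$\mu_k = \frac{\sum_i r_{ik}\,x_i}{\sum_i r_{ik}}, \qquad \Sigma_k = \frac{\sum_i r_{ik}\,(x_i - \mu_k)(x_i - \mu_k)^{\top}}{\sum_i r_{ik}}.$$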
GMM Likelihood in Detail
Visualization of EM procedure in GMM
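The slide itself is a figure; as a concrete stand-in, here is a minimal NumPy/SciPy sketch of the EM loop being visualized (K, the iteration count, the initialization, and the small covariance regularizer are my own choices, not from the lecture):

import numpy as np
from scipy.stats import multivariate_normal

def em_gmm(X, K, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    N, d = X.shape
    pi = np.full(K, 1.0 / K)                      # uniform initial weights
    mu = X[rng.choice(N, size=K, replace=False)]  # means from random data points
    Sigma = np.stack([np.eye(d)] * K)             # identity initial covariances
    for _ in range(n_iter):
        # E step: responsibilities r[i, k] = p(z_i = k | x_i) via Bayes rule
        r = np.stack([pi[k] * multivariate_normal.pdf(X, mu[k], Sigma[k])
                      for k in range(K)], axis=1)
        r /= r.sum(axis=1, keepdims=True)
        # M step: closed-form updates of weights, means, and covariances
        Nk = r.sum(axis=0)
        pi = Nk / N
        mu = (r.T @ X) / Nk[:, None]
        for k in range(K):
            diff = X - mu[k]
            Sigma[k] = (r[:, k, None] * diff).T @ diff / Nk[k] + 1e-6 * np.eye(d)
    return pi, mu, Sigma, r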
Covariance Matrix: same vs different | full vs diagonal
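For reference (not from the slides), scikit-learn's GaussianMixture exposes exactly these choices through its covariance_type parameter, which makes the trade-off easy to try on data:

import numpy as np
from sklearn.mixture import GaussianMixture

X = np.random.randn(500, 2)  # toy data; substitute your own

# 'full':      each component has its own full covariance     (different, full)
# 'tied':      all components share a single full covariance  (same, full)
# 'diag':      each component has its own diagonal covariance (different, diagonal)
# 'spherical': each component has one scalar variance         (isotropic)
for cov_type in ["full", "tied", "diag", "spherical"]:
    gmm = GaussianMixture(n_components=3, covariance_type=cov_type, random_state=0).fit(X)
    print(cov_type, gmm.bic(X))  # BIC lets you compare the four fits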
Thanks