GENERATIVE LEARNING ALGORITHMS
PRESENTERS
• ANNAY SAXENA
• ANISH KUMAR
• ANANT KUMAR
WHAT ARE GENERATIVE LEARNING ALGORITHMS?
Definition:
Generative models learn the joint probability P(x, y) and can generate new samples from the data distribution.
• Parametric Models
• Non-Parametric Models
• Deep Generative Models
PARAMETRIC GENERATIVE MODELS
Parametric generative models are statistical models that assume the underlying data distribution can be described using a fixed set of parameters. These models learn the parameters from the data and use them to generate new samples.
TYPES OF PARAMETRIC GENERATIVE MODELS
• Gaussian Mixture Models
• Hidden Markov Models
• Bayesian Networks
GAUSSIAN MIXTURE MODELS
What is GMM?
A Gaussian Mixture Model assumes that the data is generated from a mixture of several
Gaussian distributions, where each Gaussian distribution represents a cluster or a
component of the data.
Concept of GMM
• Data is divided into multiple Gaussian distributions.
• Each Gaussian distribution has its own mean, variance, and weight.
• The sum of weights equals 1.
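The generative view described above can be sketched in plain Python: pick a component according to the weights, then draw from that component's Gaussian. The weights, means, and standard deviations below are illustrative, not fit to any data:

```python
import random

random.seed(0)  # for reproducibility

# Illustrative 1-D mixture: each component has a weight (πk), a mean (μk),
# and a standard deviation; the weights sum to 1.
components = [
    {"weight": 0.6, "mean": 0.0, "std": 1.0},
    {"weight": 0.4, "mean": 5.0, "std": 0.5},
]

def sample_gmm(components, n):
    """Draw n samples: choose a component by its weight, then sample from it."""
    weights = [c["weight"] for c in components]
    samples = []
    for _ in range(n):
        c = random.choices(components, weights=weights)[0]
        samples.append(random.gauss(c["mean"], c["std"]))
    return samples

samples = sample_gmm(components, 1000)
```

In practice the weights, means, and covariances are learned from data (commonly with the expectation-maximization algorithm) rather than set by hand.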
GAUSSIAN MIXTURE MODELS
Mathematical Formulation
p(x) = Σ (k=1 to K) πk · N(x | μk, Σk)
• K is the number of Gaussian components.
• πk is the mixing coefficient for the kth Gaussian, with Σ πk = 1.
• N(x | μk, Σk) is the Gaussian distribution with mean μk and covariance Σk.
BAYESIAN NETWORKS
A Bayesian network represents a set of variables and their conditional dependencies as a directed acyclic graph.
• Key Concepts
• Conditional Independence: The graph structure encodes which variables are independent of others given their parents in the graph.
• Joint Probability Distribution: The joint distribution of all variables can be factored as the product of the conditional probabilities of each variable given its parents in the graph.
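The factorization of the joint distribution can be made concrete with a small hypothetical network, Cloudy → Sprinkler and Cloudy → Rain, so that P(C, S, R) = P(C) · P(S | C) · P(R | C). The probability tables below are made-up numbers for illustration:

```python
# Hypothetical conditional probability tables for a 3-node network.
P_C = {True: 0.5, False: 0.5}
P_S_given_C = {True:  {True: 0.1, False: 0.9},   # P(S | C=True)
               False: {True: 0.5, False: 0.5}}   # P(S | C=False)
P_R_given_C = {True:  {True: 0.8, False: 0.2},   # P(R | C=True)
               False: {True: 0.2, False: 0.8}}   # P(R | C=False)

def joint(c, s, r):
    """Joint probability as the product of each node's conditional given its parents."""
    return P_C[c] * P_S_given_C[c][s] * P_R_given_C[c][r]

# The eight joint probabilities form a valid distribution: they sum to 1.
total = sum(joint(c, s, r)
            for c in (True, False)
            for s in (True, False)
            for r in (True, False))
```

The factorization means only 1 + 2 + 2 = 5 numbers are needed here, instead of the 7 independent entries a full joint table over three binary variables would require.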
BAYESIAN NETWORKS
Learning in Bayesian Networks
• Structure Learning: Involves finding the best graph structure that represents the relationships among variables.
• Parameter Learning: Once the structure is fixed, the parameters of the conditional distributions are learned using methods like maximum likelihood estimation or Bayesian inference.
BAYESIAN NETWORKS
Advantages
• Interpretability: The graphical structure provides insights into the relationships between variables.
• Efficiency: Efficient algorithms exist for querying the network, making it useful for decision support systems.
Applications
• Medical Diagnosis: Can model the relationships between symptoms and diseases.
• Decision Support Systems: Used in expert systems for reasoning under uncertainty.
NON-PARAMETRIC GENERATIVE MODELS
• Definition: Non-parametric generative models do
not assume a fixed number of parameters or a fixed
form of the data distribution
• Key Idea: Unlike parametric models, the number of
parameters in non-parametric models increases as
more data is available, providing greater flexibility
• Popular Non-Parametric Models
• Kernel Density Estimation
• k-Nearest Neighbors
KERNEL DENSITY ESTIMATION
• Definition: KDE is a method used to estimate the probability density function of a random
variable non-parametrically
• Key Concepts
• The data is smoothed by summing up contributions from "kernels" placed at each data point
• The width of the kernels is controlled by a bandwidth parameter, which defines the
smoothness of the density estimate
• Mathematical Representation:
f̂(x) = (1 / (n·h)) Σ (i=1 to n) K((x − xi) / h)
Where,
• n is the number of data points
• K is the kernel function, typically a Gaussian
• h is the bandwidth, which controls the smoothness of the estimate
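The formula above translates directly into code. Here is a minimal sketch with a Gaussian kernel; the sample data and the bandwidth h are illustrative:

```python
import math

def gaussian_kernel(u):
    """Standard normal density, the usual choice for K."""
    return math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)

def kde(x, data, h):
    """Density estimate at x: average the kernels centered at each data point."""
    n = len(data)
    return sum(gaussian_kernel((x - xi) / h) for xi in data) / (n * h)

data = [1.0, 1.2, 2.5, 2.7, 3.0]
density = kde(2.0, data, h=0.5)  # higher near clusters of points
```

A smaller h produces a spikier estimate that follows individual points; a larger h smooths the bumps together, which is the bias–variance trade-off controlled by the bandwidth.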
KERNEL DENSITY ESTIMATION
• Regression: kernels can also be used for non-parametric regression, weighting nearby observations by the kernel
• Advantages
• Simple to implement and understand
• Non-parametric and flexible, adapting to the data distribution
K-NEAREST NEIGHBORS
• Limitations
• Sensitive to the choice of k, and performance can vary based on the distance metric used
• Computationally expensive for large datasets, as distances need to be calculated for each point
• Applications
• Pattern recognition and image classification
• Recommendation systems
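A minimal k-NN classifier illustrates both the simplicity and the cost noted above: every query scans all training points. The toy points and labels below are made up:

```python
import math
from collections import Counter

def knn_classify(query, train, k=3):
    """Majority vote among the k training points closest to the query."""
    # Sorting by distance touches every training point, hence the cost
    # on large datasets noted in the limitations.
    nearest = sorted(train, key=lambda p: math.dist(query, p[0]))[:k]
    labels = [label for _, label in nearest]
    return Counter(labels).most_common(1)[0][0]

# (features, label) pairs
train = [((1.0, 1.0), "A"), ((1.2, 0.8), "A"),
         ((4.0, 4.0), "B"), ((4.2, 3.9), "B"), ((3.8, 4.1), "B")]

result = knn_classify((1.1, 0.9), train)  # neighbors are mostly "A"
```

Swapping `math.dist` (Euclidean) for another metric, or changing k, can change the prediction, which is exactly the sensitivity the limitations describe.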
DEEP GENERATIVE MODELS
• Definition: Deep Generative Models use neural networks to generate new data samples that resemble the training data
• Key Features
• Learn the underlying distribution of the data
• Can generate new instances that look like they come from the original dataset
Applications
• Data Augmentation: Generative models can create additional data for training
• Anomaly Detection: GMMs and other generative models can model "normal" data and detect deviations
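The anomaly-detection idea can be sketched with the simplest possible density model: fit a single Gaussian to the "normal" data and flag points whose log-likelihood falls below a threshold. The data values and the threshold below are illustrative:

```python
import math

def gaussian_logpdf(x, mean, std):
    """Log-density of a 1-D Gaussian."""
    return -0.5 * ((x - mean) / std) ** 2 - math.log(std * math.sqrt(2 * math.pi))

# Fit mean and std to "normal" observations (illustrative sensor readings).
normal_data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.05]
mean = sum(normal_data) / len(normal_data)
std = (sum((x - mean) ** 2 for x in normal_data) / len(normal_data)) ** 0.5

def is_anomaly(x, threshold=-5.0):
    """Flag points whose log-likelihood under the fitted model is too low."""
    return gaussian_logpdf(x, mean, std) < threshold
```

The same recipe extends to a full GMM: score each point under the fitted mixture density and flag the low-likelihood ones.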