L25 - Naïve Bayes
Background
• There are three methods to establish a classifier
a) Model a classification rule directly
Examples: k-NN, decision trees, perceptron, SVM
b) Model the probability of class memberships given input data
Example: multi-layered perceptron with the cross-entropy cost
c) Make a probabilistic model of the data within each class
Example: naïve Bayes
[Figure: example by Dieter Fox]
Probabilistic Classification
• Establishing a probabilistic model for classification
– Discriminative model: model the class posterior P(C|X) directly from the input data
– Generative model: model the class-conditional distribution P(X|C) together with the class prior P(C), and convert them into a posterior, as shown below
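As a reminder (this relation is standard Bayes' rule, added here for reference rather than taken from the slide), the generative quantities determine the posterior that classification needs:

\[
P(C \mid X) = \frac{P(X \mid C)\, P(C)}{P(X)}, \qquad P(X) = \sum_{c} P(X \mid c)\, P(c)
\]

The discriminative route models P(C|X) directly and never needs P(X|C) or P(X).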
Naïve Bayes
• Bayes classification: assign an input X to the class C with the largest posterior P(C|X), which by Bayes' rule is proportional to P(X|C) P(C)
To create a classifier model, find the probability of the given set of inputs for every possible value of the class variable C and pick the class with the maximum probability. Under the naïve (conditional-independence) assumption this can be expressed mathematically as:

c* = argmax over c of P(C = c) · P(x1 | C = c) · P(x2 | C = c) · ... · P(xn | C = c)

where x1, ..., xn are the observed feature values.
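A minimal Python sketch of this decision rule (the function and argument names are illustrative, not from the slides): given class priors and per-feature likelihood tables, it scores each class by the product above and returns the argmax.

def naive_bayes_predict(x, priors, likelihoods):
    """Pick the class c that maximises P(c) * prod_i P(x_i | c).

    x:           tuple of observed feature values (x1, ..., xn)
    priors:      dict mapping class -> P(class)
    likelihoods: dict mapping class -> list of per-feature dicts,
                 each mapping feature value -> P(value | class)
    """
    best_class, best_score = None, -1.0
    for c, prior in priors.items():
        score = prior
        for i, value in enumerate(x):
            # P(x_i | c); a value never seen with class c gets probability 0 here
            score *= likelihoods[c][i].get(value, 0.0)
        if score > best_score:
            best_class, best_score = c, score
    return best_class

In practice the product is usually computed as a sum of log-probabilities to avoid numerical underflow, and unseen feature values are smoothed (e.g. Laplace smoothing) rather than given probability zero.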
• Step 1: Convert the data set into a frequency table.
• Step 2: Create a likelihood table by computing the probabilities from the frequency counts.
• Step 3: Use the naïve Bayes equation to calculate the posterior probability for each class (a code sketch of these steps follows the list).
– The class with the highest posterior probability is the outcome of the prediction.
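A short Python sketch of these three steps for a single categorical feature (the function names and the (feature value, class label) row format are assumptions for illustration, not taken from the slides):

from collections import Counter, defaultdict

def fit_tables(rows):
    """Steps 1-2: build the frequency table and turn it into a likelihood table.

    rows: list of (feature_value, class_label) pairs, e.g. ("Sunny", "Yes").
    Returns class priors P(c) and likelihoods P(value | c).
    """
    class_counts = Counter(label for _, label in rows)          # Step 1: frequency of each class
    joint_counts = Counter(rows)                                # Step 1: frequency of (value, class)
    total = len(rows)

    priors = {c: n / total for c, n in class_counts.items()}    # Step 2: P(c)
    likelihoods = defaultdict(dict)
    for (value, c), n in joint_counts.items():
        likelihoods[c][value] = n / class_counts[c]             # Step 2: P(value | c)
    return priors, likelihoods

def posterior(value, priors, likelihoods):
    """Step 3: P(c | value) for every class, via Bayes' rule."""
    scores = {c: priors[c] * likelihoods[c].get(value, 0.0) for c in priors}
    evidence = sum(scores.values()) or 1.0                      # P(value), used for normalisation
    return {c: s / evidence for c, s in scores.items()}

With the weather example below, rows would be pairs such as ("Sunny", "Yes"), and posterior("Sunny", priors, likelihoods) returns the probabilities that the example works out by hand.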
• Problem: players will play if the weather is sunny. Is this statement correct?
• Here we have
– P(Sunny | Yes) = 3/9 = 0.33
– P(Sunny) = 5/14 = 0.36
– P(Yes) = 9/14 = 0.64
• Now, by Bayes' rule, P(Yes | Sunny) = P(Sunny | Yes) · P(Yes) / P(Sunny) = 0.33 × 0.64 / 0.36 ≈ 0.60. Since this posterior is larger than P(No | Sunny) ≈ 0.40, the prediction is that the players will play, so the statement is correct.
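A quick check of this arithmetic in Python, using only the numbers quoted above (this check is an addition, not part of the original slides):

# Probabilities quoted on the slide
p_sunny_given_yes = 3 / 9    # P(Sunny | Yes)
p_sunny = 5 / 14             # P(Sunny)
p_yes = 9 / 14               # P(Yes)

# Bayes' rule: P(Yes | Sunny) = P(Sunny | Yes) * P(Yes) / P(Sunny)
p_yes_given_sunny = p_sunny_given_yes * p_yes / p_sunny
print(round(p_yes_given_sunny, 2))   # 0.6, so "play" is the more probable outcome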