6. Naive Bayes
§ Naïve Bayes classifiers are a family of simple "probabilistic classifiers" based on applying Bayes' theorem with strong (naïve) independence assumptions between the features.
§ Naïve Bayes has been studied extensively since the 1960s.
§ Naïve Bayes classifiers are highly scalable, requiring a number of parameters linear in the number of variables (features/predictors) in a learning problem.
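
As a quick illustration, here is a minimal sketch (assuming scikit-learn and NumPy are available): a Gaussian Naive Bayes model stores one mean and one variance per class and per feature, so its learned parameter arrays grow linearly with the number of features.

# Parameter count of Gaussian Naive Bayes grows linearly in the features.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))      # 100 samples, 5 features
y = rng.integers(0, 2, size=100)   # 2 classes

clf = GaussianNB().fit(X, y)
print(clf.theta_.shape)  # (2, 5): one mean per class per feature
print(clf.var_.shape)    # (2, 5): one variance per class per feature
# Doubling the number of features doubles the parameters: O(classes * features).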
(Figure: an example instance with Age = 35 and Salary = 15K being fed to the classifier to produce a predicted result.)
§ Bayes' Theorem finds the probability of an event occurring given the probability of another event that has already occurred:
P(A|B) = P(B|A) * P(A) / P(B)
§ where A and B are events and P(B) ≠ 0.
§ Basically, we are trying to find the probability of event A, given that event B is true. Event B is also termed the evidence.
§ P(A) is the prior probability of A, i.e. the probability of the event before the evidence is seen. The evidence is an attribute value of an unknown instance (here, it is event B).
§ P(A|B) is the posterior probability of A, i.e. the probability of the event after the evidence is seen.
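
The theorem is easy to check numerically. Below is a minimal sketch in plain Python; the function name bayes_posterior is an illustrative choice, and the example numbers are the ones from the weather example worked out later in this section.

def bayes_posterior(p_b_given_a, p_a, p_b):
    """Bayes' theorem: P(A|B) = P(B|A) * P(A) / P(B), with P(B) != 0."""
    assert p_b != 0, "P(B) must be non-zero"
    return p_b_given_a * p_a / p_b

# P(Yes | Sunny) = P(Sunny | Yes) * P(Yes) / P(Sunny) = 0.33 * 0.64 / 0.36
print(bayes_posterior(0.33, 0.64, 0.36))  # ~0.59 (0.60 with exact fractions)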
§ Now, with regard to our dataset, we can apply Bayes' theorem in the following way:
P(y|X) = P(X|y) * P(y) / P(X)
§ where y is the class variable and X is a dependent feature vector (of size n) where:
X = (x1, x2, …, xn)
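
Under the naive independence assumption, this posterior factorizes as P(y | x1, …, xn) ∝ P(y) * P(x1|y) * … * P(xn|y), and prediction picks the class with the largest score. A sketch in plain Python (the prior and likelihood values here are hypothetical placeholders):

from math import prod

def nb_score(prior, likelihoods):
    # likelihoods = [P(x1|y), P(x2|y), ..., P(xn|y)] for one class y
    return prior * prod(likelihoods)

scores = {
    "yes": nb_score(0.64, [0.33, 0.50]),  # illustrative numbers only
    "no":  nb_score(0.36, [0.40, 0.20]),
}
print(max(scores, key=scores.get))  # class with the highest posterior score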
§ Below is a training data set of weather and the corresponding target variable ‘Play’ (suggesting the possibility of playing). We need to classify whether players will play or not based on the weather conditions.
§ Step 1: Convert the data set into a frequency table. (Frequency table: counts of each Weather value, Sunny/Overcast/Rainy, against Play = Yes/No.)
§ Step 2: Create a Likelihood table by finding the probabilities, e.g. Overcast probability = 4/14 = 0.29 and probability of playing = 9/14 = 0.64.
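
Steps 1 and 2 can be reproduced in a few lines of plain Python. The 14-row weather/Play data set below is the standard version of this example, reconstructed here as an assumption; it yields exactly the probabilities quoted above.

from collections import Counter

weather = ["Sunny"] * 5 + ["Overcast"] * 4 + ["Rainy"] * 5
play = ["No", "No", "Yes", "Yes", "Yes",   # Sunny: 2 No, 3 Yes
        "Yes", "Yes", "Yes", "Yes",        # Overcast: 4 Yes
        "Yes", "Yes", "No", "No", "No"]    # Rainy: 2 Yes, 3 No
n = len(weather)

freq = Counter(zip(weather, play))             # Step 1: frequency table
print(freq[("Overcast", "Yes")])               # 4

p_overcast = Counter(weather)["Overcast"] / n  # Step 2: likelihood table
p_yes = Counter(play)["Yes"] / n
print(round(p_overcast, 2), round(p_yes, 2))   # 0.29 0.64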
§ Step 3: Now, use the Naive Bayes equation to calculate the posterior probability for each class. The class with the highest posterior probability is the outcome of the prediction.
§ Here we have P(Sunny | Yes) = 3/9 = 0.33, P(Yes) = 9/14 = 0.64 and P(Sunny) = 5/14 = 0.36.
§ Which means P(Yes | Sunny) = 0.33 * 0.64 / 0.36 = 0.60, which is the higher probability, so we predict that players will play when the weather is sunny.
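
As a final check, Step 3 in plain Python, continuing the reconstructed data set above:

p_sunny_given_yes = 3 / 9           # from the likelihood table
p_yes, p_sunny = 9 / 14, 5 / 14

p_yes_given_sunny = p_sunny_given_yes * p_yes / p_sunny
p_no_given_sunny = 1 - p_yes_given_sunny   # only two classes here
print(round(p_yes_given_sunny, 2))         # 0.6
print("Play" if p_yes_given_sunny > p_no_given_sunny else "Don't play")  # Play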