
6. Naïve Bayes

This document discusses Naive Bayes classifiers, which are a simple probabilistic method for classification based on applying Bayes' theorem. Naive Bayes classifiers make strong independence assumptions between predictor variables. The document explains Bayes' theorem, how Naive Bayes works by calculating probabilities from training data, and provides an example of classifying weather to predict if players will play or not. Finally, it briefly introduces different types of Naive Bayes classifiers such as Gaussian, Multinomial, and Bernoulli Naive Bayes.


Machine Learning

Sunbeam Infotech www.sunbeaminfo.com


Naïve Bayes


Overview
§ Naïve Bayes classifiers are a family of simple "probabilistic classifiers" based on applying Bayes' theorem with strong (naïve) independence assumptions between the features

§ They are among the simplest Bayesian network models

§ Naïve Bayes has been studied extensively since the 1960s

§ Naïve Bayes classifiers are highly scalable, requiring a number of parameters linear in the number of variables (features/predictors) in a learning problem

(Figure: a sample record with Age = 35 and Salary = 15K being classified to a result)


Bayes Theorem

§ Bayes' Theorem finds the probability of an event occurring given the probability of another event that has already occurred

§ Bayes' theorem is stated mathematically as the following equation:

  P(A | B) = P(B | A) * P(A) / P(B)

§ where A and B are events and P(B) ≠ 0.
§ Basically, we are trying to find the probability of event A, given that event B is true. Event B is also termed the evidence.
§ P(A) is the prior probability of A, i.e. the probability of the event before the evidence is seen. The evidence is an attribute value of an unknown instance (here, it is event B).
§ P(A | B) is the posterior probability of A, i.e. the probability of the event after the evidence is seen.
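The theorem can be checked with a direct calculation; the probabilities below are made-up numbers purely for illustration:

```python
def bayes_posterior(p_b_given_a, p_a, p_b):
    """Posterior P(A|B) = P(B|A) * P(A) / P(B); valid only when P(B) != 0."""
    if p_b == 0:
        raise ValueError("P(B) must be non-zero")
    return p_b_given_a * p_a / p_b

# Illustrative (assumed) numbers: P(B|A) = 0.8, P(A) = 0.3, P(B) = 0.4
posterior = bayes_posterior(0.8, 0.3, 0.4)
print(round(posterior, 2))  # 0.6
```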



Bayes Theorem

§ Now, with regard to our dataset, we can apply Bayes' theorem in the following way:

  P(y | X) = P(X | y) * P(y) / P(X)

§ where y is the class variable and X is a dependent feature vector (of size n), where:

  X = (x1, x2, x3, ..., xn)

(Figure: an example record with Exp = 15 whose Salary class is to be predicted)
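Under the naïve independence assumption, P(X | y) factorises into a product of per-feature terms, so the unnormalised posterior is P(y) * Π P(xi | y). A minimal sketch; the priors and conditional probabilities below are invented for illustration, not taken from the slides' dataset:

```python
# Unnormalised naive Bayes posterior: P(y) * product of P(x_i | y).
# All probabilities here are illustrative assumptions.
priors = {"Yes": 9 / 14, "No": 5 / 14}
cond = {
    "Yes": {"Weather=Sunny": 3 / 9, "Humidity=High": 3 / 9},
    "No": {"Weather=Sunny": 2 / 5, "Humidity=High": 4 / 5},
}

def unnormalised_posterior(cls, features):
    score = priors[cls]
    for f in features:
        score *= cond[cls][f]  # naive assumption: features independent given class
    return score

features = ["Weather=Sunny", "Humidity=High"]
scores = {c: unnormalised_posterior(c, features) for c in priors}
prediction = max(scores, key=scores.get)  # class with the largest posterior
```

Only the relative sizes of the scores matter for prediction, so dividing by P(X) can be skipped.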


How does it work?

§ Below is a training data set of weather with the corresponding target variable 'Play' (indicating whether a game was played). We need to classify whether players will play or not based on the weather condition.
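The table itself is an image in the original slides; the rows below are a reconstruction inferred from the counts quoted later (P(Sunny|Yes) = 3/9, P(Sunny) = 5/14, P(Yes) = 9/14), so treat them as an assumption rather than the slide's exact table:

```python
# (weather, play) observations -- reconstructed, see note above
dataset = [
    ("Sunny", "Yes"), ("Sunny", "Yes"), ("Sunny", "Yes"), ("Sunny", "No"), ("Sunny", "No"),
    ("Overcast", "Yes"), ("Overcast", "Yes"), ("Overcast", "Yes"), ("Overcast", "Yes"),
    ("Rainy", "Yes"), ("Rainy", "Yes"), ("Rainy", "No"), ("Rainy", "No"), ("Rainy", "No"),
]
print(len(dataset))  # 14 observations
```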



How does it work?

§ Step 1: Convert the data set into a frequency table



How does it work?

§ Step 2: Create a likelihood table by computing the probabilities, e.g. probability of Overcast = 4/14 ≈ 0.29 and probability of playing = 9/14 ≈ 0.64

§ Step 3: Now, use the naive Bayes equation to calculate the posterior probability for each class. The class with the highest posterior probability is the outcome of the prediction.
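Steps 1 and 2 can be sketched with `collections.Counter`; the dataset is the same reconstruction as above, inferred from the quoted probabilities and therefore an assumption:

```python
from collections import Counter

# Reconstructed (weather, play) observations (assumption, see earlier note)
dataset = [
    ("Sunny", "Yes"), ("Sunny", "Yes"), ("Sunny", "Yes"), ("Sunny", "No"), ("Sunny", "No"),
    ("Overcast", "Yes"), ("Overcast", "Yes"), ("Overcast", "Yes"), ("Overcast", "Yes"),
    ("Rainy", "Yes"), ("Rainy", "Yes"), ("Rainy", "No"), ("Rainy", "No"), ("Rainy", "No"),
]

# Step 1: frequency table of (weather, play) pairs
freq = Counter(dataset)

# Step 2: likelihood table entries
n = len(dataset)
p_overcast = Counter(w for w, _ in dataset)["Overcast"] / n  # 4/14
p_play = Counter(p for _, p in dataset)["Yes"] / n           # 9/14
print(round(p_overcast, 2), round(p_play, 2))  # 0.29 0.64
```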



How does it work?

§ Problem: Players will play if the weather is sunny. Is this statement correct?

§ We can solve it using the method of posterior probability discussed above.

P(Yes | Sunny) = P(Sunny | Yes) * P(Yes) / P(Sunny)

§ Here we have

P(Sunny | Yes) = 3/9 = 0.33
P(Sunny) = 5/14 = 0.36
P(Yes) = 9/14 = 0.64

§ Which means P(Yes | Sunny) = 0.33 * 0.64 / 0.36 = 0.60, which is higher than the posterior probability of not playing, so the prediction is that players will play.
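The arithmetic above can be checked directly; the exact value is 3/5:

```python
# Worked example from the slide: P(Yes|Sunny) = P(Sunny|Yes) * P(Yes) / P(Sunny)
p_sunny_given_yes = 3 / 9   # ~0.33
p_sunny = 5 / 14            # ~0.36
p_yes = 9 / 14              # ~0.64
p_yes_given_sunny = p_sunny_given_yes * p_yes / p_sunny
print(round(p_yes_given_sunny, 2))  # 0.6
```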



Types of Naïve Bayes
§ Gaussian Naïve Bayes classifier: assumes continuous features follow a normal (Gaussian) distribution within each class

§ Multinomial Naïve Bayes: suited to discrete count features, such as word counts in text classification

§ Bernoulli Naïve Bayes: suited to binary/boolean features, modelling the presence or absence of each feature
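As a minimal sketch of the Gaussian variant (in practice, ready-made implementations such as scikit-learn's GaussianNB, MultinomialNB, and BernoulliNB would be used), a single continuous feature is modelled with a per-class normal distribution; the salary figures below are made up for illustration:

```python
import math

# Per-class training values for one continuous feature (illustrative data)
train = {"Yes": [30.0, 35.0, 40.0], "No": [10.0, 12.0, 15.0]}

def gaussian_pdf(x, mean, var):
    """Density of the normal distribution N(mean, var) at x."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def predict(x):
    """Class with the largest prior * Gaussian likelihood for value x."""
    n_total = sum(len(v) for v in train.values())
    scores = {}
    for cls, values in train.items():
        mean = sum(values) / len(values)
        var = sum((v - mean) ** 2 for v in values) / len(values)
        prior = len(values) / n_total
        scores[cls] = prior * gaussian_pdf(x, mean, var)
    return max(scores, key=scores.get)

print(predict(33.0))  # Yes -- closer to the Yes-class mean
```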

