
Hidden Markov Model

POS tagging using Generative Model


Types of Models

❖2 types
❖Generative model
❖Discriminative model
Motivation

❖Variability is a part of nature


❖Variability leads to uncertainty when drawing conclusions from data
❖This motivates us to take a probabilistic approach
❖to modeling & reasoning
Motivation
Example

❖Two friends, Rahul & Ashok


❖Rahul jogs, goes to the office, or cleans his residence
❖He chooses the task according to the weather
❖He tells Ashok what he did
❖Ashok infers the weather from the task
❖Ashok believes the weather {Rainy, Sunny} behaves as a Markov chain
Example

❖On each day Rahul performs a task


❖With a certain probability
❖Depending on the weather
❖If Rahul tells Ashok he cleaned his home,
❖Ashok deduces from past experience that it was a rainy day
Example

❖The states and observations are:

❖states = ('Rainy', 'Sunny')

❖observations = ('jog', 'work', 'clean')

❖And the start probability is:

❖start_probability = {'Rainy': 0.6, 'Sunny': 0.4}


Example

Start probabilities: Start → Rainy (0.6), Start → Sunny (0.4)

❖transition_probability = {
❖ 'Rainy' : {'Rainy': 0.7, 'Sunny': 0.3},
❖ 'Sunny' : {'Rainy': 0.4, 'Sunny': 0.6},
❖ }

[State diagram: Rainy → Rainy (0.7), Rainy → Sunny (0.3), Sunny → Rainy (0.4), Sunny → Sunny (0.6)]
Example

❖emission_probability = {

❖ 'Rainy' : {'jog': 0.1, 'work': 0.4, 'clean': 0.5},

❖ 'Sunny' : {'jog': 0.6, 'work': 0.3, 'clean': 0.1},

❖ }
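The parameters above form a complete HMM; a minimal runnable sketch (variable names follow the slides) that scores the joint probability of one weather sequence together with one task sequence:

```python
# HMM parameters from the Rahul & Ashok example.
states = ('Rainy', 'Sunny')
observations = ('jog', 'work', 'clean')

start_probability = {'Rainy': 0.6, 'Sunny': 0.4}

transition_probability = {
    'Rainy': {'Rainy': 0.7, 'Sunny': 0.3},
    'Sunny': {'Rainy': 0.4, 'Sunny': 0.6},
}

emission_probability = {
    'Rainy': {'jog': 0.1, 'work': 0.4, 'clean': 0.5},
    'Sunny': {'jog': 0.6, 'work': 0.3, 'clean': 0.1},
}

def joint_probability(state_seq, obs_seq):
    """P(states, observations) = start prob * emissions * transitions."""
    p = start_probability[state_seq[0]] * emission_probability[state_seq[0]][obs_seq[0]]
    for prev, cur, obs in zip(state_seq, state_seq[1:], obs_seq[1:]):
        p *= transition_probability[prev][cur] * emission_probability[cur][obs]
    return p

# e.g. P(Rainy, Rainy ; clean, clean) = 0.6*0.5 * 0.7*0.5
print(joint_probability(('Rainy', 'Rainy'), ('clean', 'clean')))  # 0.105
```

This is the "score one sequence" building block; the Viterbi algorithm later in the deck searches over all state sequences instead of scoring one.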
Example
Applications of HMM

❖Computational finance
❖Speech analysis
❖Speech recognition
❖Speech synthesis
❖Part-of-speech tagging
❖Document separation in scanning solutions
❖Machine translation
❖Handwriting recognition
❖Time series analysis
❖Activity recognition
❖Sequence classification
❖Transportation forecasting
Example 2
Problem

❖Let W = w1, w2, …, wn be the words in the corpus (observed)


❖T = t1 … tn the corresponding tags (unknown)

❖i.e., find argmax P(t1 … tn | w1 … wn)
Using Bayes Theorem
Further simplification
Further simplification
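The simplification referred to here is the standard bigram-HMM derivation (the slide images are not included, so this is a sketch using the symbols defined above, with $t_0$ a start marker):

$$\hat{T} = \arg\max_{t_1 \ldots t_n} P(t_1 \ldots t_n \mid w_1 \ldots w_n)
= \arg\max_{t_1 \ldots t_n} \frac{P(w_1 \ldots w_n \mid t_1 \ldots t_n)\, P(t_1 \ldots t_n)}{P(w_1 \ldots w_n)}
\approx \arg\max_{t_1 \ldots t_n} \prod_{i=1}^{n} P(w_i \mid t_i)\, P(t_i \mid t_{i-1})$$

dropping the denominator (constant in T), and assuming each word depends only on its own tag and each tag only on the previous tag.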
What is this model?
Computing probability values
Computing probability values
Disambiguating “race”
Disambiguating “race”

❖P(VB|TO) vs. P(NN|TO)
❖P(NR|VB) vs. P(NR|NN)
❖P(race|VB) vs. P(race|NN)
❖Multiply the probabilities in each case
❖Choose the tag with the higher probability
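The comparison can be carried out numerically. The slides give no probability values, so the numbers below are illustrative placeholders (they follow the well-known textbook "to race tomorrow" example), not estimates from this course's corpus:

```python
from math import prod

# Placeholder probabilities for "to race tomorrow" (TO race/? tomorrow/NR).
# VB reading: race is a verb; NN reading: race is a noun.
p_vb = [0.83,    # P(VB | TO)
        0.0027,  # P(NR | VB)
        0.00012] # P(race | VB)
p_nn = [0.00047, # P(NN | TO)
        0.0012,  # P(NR | NN)
        0.00057] # P(race | NN)

score_vb = prod(p_vb)
score_nn = prod(p_nn)
print('VB' if score_vb > score_nn else 'NN')  # VB wins with these values
```

With these values the verb reading wins by roughly three orders of magnitude, which matches the intuition that "to" is almost always followed by a verb.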
Disambiguating “race”
What is this model?
What is this model?
What is HMM?

❖Tag transition probabilities p(ti|ti-1)


❖Word likelihood probabilities (emission) p(wi|ti)
❖Together, these constitute an HMM
How is Markov model different from HMM

❖Given that today is sunny:


❖What is the probability that tomorrow is sunny
❖and the day after is rainy?
❖P(qn+2 = R, qn+1 = S)
❖= P(qn+1 = S | qn = S) * P(qn+2 = R | qn+1 = S, qn = S)
❖By the Markov assumption this can be approximated as
❖= P(qn+1 = S | qn = S) * P(qn+2 = R | qn+1 = S)
❖= 0.8 * 0.05 = 0.04
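The two-step computation can be checked directly; the transition values 0.8 (S→S) and 0.05 (S→R) come from this slide:

```python
# Transition probabilities out of Sunny, as used on the slide.
trans = {'S': {'S': 0.8, 'R': 0.05}}

# P(q_{n+2} = R, q_{n+1} = S | q_n = S) under the Markov assumption:
# multiply the one-step transition probabilities along the path S -> S -> R.
prob = trans['S']['S'] * trans['S']['R']
print(round(prob, 4))  # 0.04
```

Note that in a plain Markov model the states are directly observed, so this product is all there is; in an HMM the same chain is hidden and we only see the emissions.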
How is Markov model different from HMM
Elements of HMM model
Graphical representation
Graphical representation
Walking through the states to find the best
path
Possible Approach

❖Compute the probability of each possible tag sequence


❖This is not efficient
❖We need a solution that does not grow exponentially with the
number of words
❖We will use the Viterbi algorithm to do POS tagging in an efficient
manner
Question
Question

❖Calculate transition and emission probabilities for a set of


sentences:

❖Mary Jane can see will


❖Spot will see Mary
❖Will Jane spot Mary?
❖Mary will pat Spot
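A counting sketch for this exercise. The slides do not show the tag assignments, so the tagging below (Mary/Jane/Spot as nouns, can/will as modals, see/spot/pat as verbs, with "Will" a noun when it is a name and a modal otherwise) is an assumption for illustration:

```python
from collections import Counter, defaultdict

# Assumed tagging (N = noun, M = modal, V = verb) of the four sentences.
tagged = [
    [('Mary', 'N'), ('Jane', 'N'), ('can', 'M'), ('see', 'V'), ('Will', 'N')],
    [('Spot', 'N'), ('will', 'M'), ('see', 'V'), ('Mary', 'N')],
    [('Will', 'M'), ('Jane', 'N'), ('spot', 'V'), ('Mary', 'N')],
    [('Mary', 'N'), ('will', 'M'), ('pat', 'V'), ('Spot', 'N')],
]

tag_counts = Counter()
emit = defaultdict(Counter)   # emit[tag][word]  -> count
trans = defaultdict(Counter)  # trans[prev][tag] -> count; '<s>' marks sentence start

for sent in tagged:
    prev = '<s>'
    for word, tag in sent:
        tag_counts[tag] += 1
        emit[tag][word.lower()] += 1
        trans[prev][tag] += 1
        prev = tag

# Relative-frequency estimates, e.g. the emission P(mary | N):
p_mary_given_n = emit['N']['mary'] / tag_counts['N']
print(p_mary_given_n)  # 4 occurrences of "mary" out of 9 nouns
```

The transition table is read the same way, e.g. trans['<s>']['N'] / 4 gives the probability that a sentence starts with a noun under this tagging.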
Solution
Emission Probability
Transition Probability
Emission Probability
Solution
Transition Probability
Transition Probability
Solution
Question
Question

❖Due to COVID you were locked in a room for
several days and asked about the weather
outside. The only piece of evidence is
whether the person who brings your daily
meal is carrying an umbrella or not.
❖The following probabilities are given:
❖On a sunny day P(umbrella) = 0.1
❖On a rainy day P(umbrella) = 0.8
❖On a foggy day P(umbrella) = 0.3
Part -1

❖Suppose the day you were locked in was sunny. The next day the
caretaker carried an umbrella into the room. Assuming that the
prior probability of the caretaker carrying an umbrella on any
day is 0.5, what is the probability that the second day was rainy?
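A sketch of the Bayes computation for Part 1. The solution slides are images, so the weather-transition value P(Rainy | Sunny) = 0.05 is an assumption taken from the earlier Markov-chain slide:

```python
# Evidence model given in the problem.
p_umbrella_given = {'Sunny': 0.1, 'Rainy': 0.8, 'Foggy': 0.3}

p_rainy_given_sunny = 0.05  # assumed transition, from the earlier slide
p_umbrella_prior = 0.5      # given: prior probability of an umbrella on any day

# Bayes' rule: P(Rainy on day 2 | umbrella on day 2, day 1 sunny)
#   = P(umbrella | Rainy) * P(Rainy | Sunny) / P(umbrella)
p = p_umbrella_given['Rainy'] * p_rainy_given_sunny / p_umbrella_prior
print(round(p, 2))  # 0.08
```

Part 2 chains the same update over two days, multiplying in P(no umbrella | weather) = 1 − P(umbrella | weather) for day 3.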
Part -2

❖Suppose the day you were locked in was sunny. The


caretaker brought an umbrella on day 2 but not on day 3.
Assuming the prior probability of bringing an umbrella is 0.5, what
is the probability that it is foggy on day 3?
Part -1 Solution
Part -2 Solution
Part -2 Solution
Part -2 Solution
Part -2 Solution
HMM Recap
