Hidden Markov Models


Transcript of Hidden Markov Models

Page 1: Hidden Markov Models

Hidden Markov Models

戴玉書

L. R. Rabiner and B. H. Juang, "An Introduction to Hidden Markov Models"

Ara V. Nefian and Monson H. Hayes III, "Face Detection and Recognition Using Hidden Markov Models"

Page 2: Hidden Markov Models

Outline

Markov Chain & Markov Models
Hidden Markov Models
HMM Problems
- Evaluation
- Decoding
- Learning
Application

Page 3: Hidden Markov Models

Outline

Markov Chain & Markov Models
Hidden Markov Models
HMM Problems
- Evaluation
- Decoding
- Learning
Application

Page 4: Hidden Markov Models

Markov chain property: the probability of each subsequent state depends only on the previous state:

$$P(s_{i_k} \mid s_{i_1}, s_{i_2}, \dots, s_{i_{k-1}}) = P(s_{i_k} \mid s_{i_{k-1}})$$

The probability of a state sequence therefore factorizes as

$$
\begin{aligned}
P(s_{i_1}, s_{i_2}, \dots, s_{i_k})
&= P(s_{i_k} \mid s_{i_1}, \dots, s_{i_{k-1}})\, P(s_{i_1}, \dots, s_{i_{k-1}}) \\
&= P(s_{i_k} \mid s_{i_{k-1}})\, P(s_{i_1}, \dots, s_{i_{k-1}}) \\
&= P(s_{i_k} \mid s_{i_{k-1}})\, P(s_{i_{k-1}} \mid s_{i_{k-2}}) \cdots P(s_{i_2} \mid s_{i_1})\, P(s_{i_1})
\end{aligned}
$$
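As a minimal illustration of this factorization (not part of the original slides), the Python sketch below scores a state sequence under a small, made-up transition matrix A and initial distribution pi.

```python
import numpy as np

# Hypothetical 3-state Markov chain; the numbers are illustrative only.
pi = np.array([0.5, 0.3, 0.2])           # pi[i]   = P(s_i) at the first step
A  = np.array([[0.7, 0.2, 0.1],          # A[i, j] = P(s_j | s_i)
               [0.3, 0.5, 0.2],
               [0.2, 0.3, 0.5]])

def chain_probability(states):
    """P(s_i1, ..., s_ik) = P(s_i1) * prod_k P(s_ik | s_ik-1)."""
    p = pi[states[0]]
    for prev, cur in zip(states[:-1], states[1:]):
        p *= A[prev, cur]
    return p

print(chain_probability([0, 1, 1, 2]))   # 0.5 * 0.2 * 0.5 * 0.2 = 0.01
```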

Page 5: Hidden Markov Models

Markov Models

[Slide figure: a state-transition diagram with three states connected by transition arrows]

Page 6: Hidden Markov Models

Outline

Markov Chain & Markov Models
Hidden Markov Models
HMM Problems
- Evaluation
- Decoding
- Learning
Application

Page 7: Hidden Markov Models

Hidden Markov Models

If you don't have complete state information, but only observations at each state:

N = number of states: $\{s_1, s_2, \dots, s_N\}$
M = number of observables: $\{v_1, v_2, \dots, v_M\}$

Hidden state sequence: $q_1, q_2, q_3, q_4, \dots$
Observation sequence: $o_1, o_2, o_3, o_4, \dots$

Page 8: Hidden Markov Models

Hidden Markov Models

State: {·, ·, ·} (three states, drawn as icons on the slide)
Observable: {·, ·} (two observable symbols, drawn as icons on the slide)

[Slide figure: diagram linking the states to the observables with the probabilities 0.1, 0.9, 0.8, 0.2, 0.3 and 0.7]

Page 9: Hidden Markov Models

Hidden Markov Models

A model is specified as $M = (A, B, \pi)$, where

- Observation probabilities: $B = \{b_i(v_m)\}$, $b_i(v_m) = P(v_m \mid s_i)$
- Transition probabilities: $A = \{a_{ij}\}$, $a_{ij} = P(s_j \mid s_i)$
- Initial probabilities: $\pi = (\pi_i)$, $\pi_i = P(s_i)$
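A minimal way to hold these three parameter sets in code, assuming NumPy arrays and purely illustrative numbers (a 2-state, 2-symbol model not taken from the slides):

```python
import numpy as np

# M = (A, B, pi) for a hypothetical 2-state, 2-symbol HMM.
A  = np.array([[0.9, 0.1],        # a_ij     = P(s_j | s_i)
               [0.2, 0.8]])
B  = np.array([[0.7, 0.3],        # b_i(v_m) = P(v_m | s_i); rows index states
               [0.1, 0.9]])
pi = np.array([0.6, 0.4])         # pi_i     = P(s_i) at t = 1

# Each row of A and B, and pi itself, must sum to 1.
assert np.allclose(A.sum(axis=1), 1)
assert np.allclose(B.sum(axis=1), 1)
assert np.isclose(pi.sum(), 1)
```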

Page 10: Hidden Markov Models

Outline

Markov Chain & Markov Models
Hidden Markov Models
HMM Problems
- Evaluation
- Decoding
- Learning
Application

Page 11: Hidden Markov Models

Evaluation: determine the probability that a particular sequence of symbols O was generated by a given model M.

For a fixed state sequence $Q = q_1 q_2 \cdots q_T$:

$$P(Q \mid M) = \pi_{q_1}\, a_{q_1 q_2}\, a_{q_2 q_3} \cdots a_{q_{T-1} q_T}$$

$$P(O \mid Q, M) = \prod_{t=1}^{T} P(o_t \mid q_t, M) = b_{q_1}(o_1)\, b_{q_2}(o_2) \cdots b_{q_T}(o_T)$$

Summing over all possible state sequences:

$$P(O \mid M) = \sum_{\text{all } Q} P(O \mid Q, M)\, P(Q \mid M)$$
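Taken literally, this definition sums over all $N^T$ state sequences. The sketch below (an assumed helper, not from the slides) does exactly that; it is only feasible for tiny models, but it is handy for checking the forward recursion on the next slide. Observations are assumed to be encoded as symbol indices into the columns of B.

```python
import itertools
import numpy as np

def evaluate_brute_force(O, A, B, pi):
    """P(O | M) = sum over all state sequences Q of P(O | Q, M) * P(Q | M).

    Exponential in T; use only for very small N and T."""
    N, T = A.shape[0], len(O)
    total = 0.0
    for Q in itertools.product(range(N), repeat=T):
        p = pi[Q[0]] * B[Q[0], O[0]]                # pi_q1 * b_q1(o_1)
        for t in range(1, T):
            p *= A[Q[t - 1], Q[t]] * B[Q[t], O[t]]  # a_{q_{t-1} q_t} * b_{q_t}(o_t)
        total += p
    return total
```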

Page 12: Hidden Markov Models

Forward recursion

Define $\alpha_t(i) = P(o_1, o_2, \dots, o_t, q_t = s_i \mid M)$.

Initialization:
$$\alpha_1(i) = \pi_i\, b_i(o_1)$$

Forward recursion:
$$\alpha_{t+1}(j) = \Big[\sum_{i=1}^{N} \alpha_t(i)\, a_{ij}\Big]\, b_j(o_{t+1})$$

Termination:
$$P(O \mid M) = P(o_1, o_2, \dots, o_T \mid M) = \sum_{i=1}^{N} \alpha_T(i)$$
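A direct NumPy translation of this recursion, under the same array conventions as the earlier sketches (rows of A index the source state, columns of B index the observation symbol):

```python
import numpy as np

def forward(O, A, B, pi):
    """Forward recursion: alpha[t, i] = P(o_1..o_t, q_t = s_i | M).

    Returns the full alpha table and P(O | M) = sum_i alpha[T-1, i]."""
    N, T = A.shape[0], len(O)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, O[0]]                       # initialization
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, O[t]]   # recursion
    return alpha, alpha[-1].sum()                    # termination
```

For small problems, forward(O, A, B, pi)[1] should match evaluate_brute_force(O, A, B, pi), which is a convenient sanity check. In practice, long sequences need scaling or log-space arithmetic to avoid underflow; this sketch omits that.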

Page 13: Hidden Markov Models

Backward recursion

Define $\beta_t(i) = P(o_{t+1}, o_{t+2}, \dots, o_T \mid q_t = s_i, M)$.

Initialization:
$$\beta_T(i) = 1$$

Backward recursion:
$$\beta_t(i) = \sum_{j=1}^{N} a_{ij}\, b_j(o_{t+1})\, \beta_{t+1}(j)$$

Termination:
$$P(O \mid M) = \sum_{i=1}^{N} P(o_1, \dots, o_T \mid q_1 = s_i)\, P(q_1 = s_i) = \sum_{i=1}^{N} \pi_i\, b_i(o_1)\, \beta_1(i)$$
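The matching backward pass, again as a hedged NumPy sketch (no scaling, single observation sequence):

```python
import numpy as np

def backward(O, A, B, pi):
    """Backward recursion: beta[t, i] = P(o_{t+1}..o_T | q_t = s_i, M)."""
    N, T = A.shape[0], len(O)
    beta = np.zeros((T, N))
    beta[-1] = 1.0                                    # initialization: beta_T(i) = 1
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, O[t + 1]] * beta[t + 1])  # recursion
    p_obs = (pi * B[:, O[0]] * beta[0]).sum()         # termination: P(O | M)
    return beta, p_obs
```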

Page 14: Hidden Markov Models

Outline

Markov Chain & Markov Models
Hidden Markov Models
HMM Problems
- Evaluation
- Decoding
- Learning
Application

Page 15: Hidden Markov Models

Decoding: given a sequence of symbols O, determine the most likely sequence of hidden states Q that led to the observations.

We want to find the state sequence Q which maximizes $P(Q \mid o_1, o_2, \dots, o_T)$.

Page 16: Hidden Markov Models

Viterbi algorithm

General idea: if the best path ending in $q_t = s_j$ goes through $q_{t-1} = s_i$, then it must coincide with the best path ending in $q_{t-1} = s_i$.

[Slide figure: trellis column at time $t-1$ with states $s_1, \dots, s_i, \dots, s_N$, each connected to state $s_j$ at time $t$ by the transition probabilities $a_{1j}, \dots, a_{ij}, \dots, a_{Nj}$]

Page 17: Hidden Markov Models

Viterbi algorithm

Define $\delta_t(i) = \max_{q_1, \dots, q_{t-1}} P(q_1, \dots, q_{t-1}, q_t = s_i, o_1, o_2, \dots, o_t)$.

Initialization:
$$\delta_1(i) = \max P(q_1 = s_i, o_1) = \pi_i\, b_i(o_1)$$

Forward recursion:
$$\delta_t(j) = \max_i \big[\delta_{t-1}(i)\, a_{ij}\, b_j(o_t)\big]$$

Termination:
$$\max_i \big[\delta_T(i)\big]$$

gives the probability of the most likely state sequence; the sequence itself is recovered by backtracking through the maximizing choices.
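A compact NumPy sketch of the recursion, with the back-pointers needed to recover the path (the array layout is assumed, matching the earlier forward() sketch):

```python
import numpy as np

def viterbi(O, A, B, pi):
    """Most likely state sequence for observations O, plus its probability."""
    N, T = A.shape[0], len(O)
    delta = np.zeros((T, N))             # delta[t, i] = best path prob. ending in s_i at t
    psi   = np.zeros((T, N), dtype=int)  # back-pointer to the best predecessor state
    delta[0] = pi * B[:, O[0]]
    for t in range(1, T):
        scores   = delta[t - 1][:, None] * A        # scores[i, j] = delta_{t-1}(i) * a_ij
        psi[t]   = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) * B[:, O[t]]
    # Backtrack from the best final state.
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return path[::-1], delta[-1].max()
```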

Page 18: Hidden Markov Models

Viterbi algorithm

Page 19: Hidden Markov Models

Outline

Markov Chain & Markov Models
Hidden Markov Models
HMM Problems
- Evaluation
- Decoding
- Learning
Application

Page 20: Hidden Markov Models

Learning problem: given a coarse structure of the model, determine the HMM parameters $M = (A, B, \pi)$ that best fit the training data.

$$b_i(v_m) = P(v_m \mid s_i) = \frac{\text{number of times observation } v_m \text{ occurs in state } s_i}{\text{number of times in state } s_i}$$

$$a_{ij} = P(s_j \mid s_i) = \frac{\text{number of transitions from state } s_i \text{ to state } s_j}{\text{number of transitions out of state } s_i}$$

$$\pi_i = \text{expected frequency in state } s_i \text{ at time } t = 1$$
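These counting formulas apply directly when the state sequences are known (the supervised case). Below is a sketch of that case, under assumed list-of-index-sequences inputs; when the states are hidden, the Baum-Welch re-estimation on the following slides replaces these hard counts with expected counts.

```python
import numpy as np

def estimate_from_labelled_data(state_seqs, obs_seqs, N, M):
    """Count-based estimates when the state sequences are observed.

    state_seqs / obs_seqs: lists of equal-length index sequences.
    No smoothing: assumes every state, transition and symbol occurs in the data."""
    A, B, pi = np.zeros((N, N)), np.zeros((N, M)), np.zeros(N)
    for states, obs in zip(state_seqs, obs_seqs):
        pi[states[0]] += 1
        for prev, cur in zip(states[:-1], states[1:]):
            A[prev, cur] += 1                 # transition s_i -> s_j
        for s, o in zip(states, obs):
            B[s, o] += 1                      # symbol v_m emitted in state s_i
    # Normalize counts into probabilities.
    A  /= A.sum(axis=1, keepdims=True)
    B  /= B.sum(axis=1, keepdims=True)
    pi /= pi.sum()
    return A, B, pi
```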

Page 21: Hidden Markov Models

Baum-Welch algorithm

Initialize a model $M_0$ and iteratively re-estimate its parameters.

Define the variable $\xi_t(i, j)$ as the probability of being in state $s_i$ at time $t$ and in state $s_j$ at time $t+1$, given the observation sequence $o_1, o_2, \dots, o_T$:

$$\xi_t(i, j) = P(q_t = s_i,\, q_{t+1} = s_j \mid o_1, o_2, \dots, o_T)
= \frac{P(q_t = s_i,\, o_1 \cdots o_t)\, a_{ij}\, b_j(o_{t+1})\, P(o_{t+2}, \dots, o_T \mid q_{t+1} = s_j)}{P(o_1, o_2, \dots, o_T)}
= \frac{\alpha_t(i)\, a_{ij}\, b_j(o_{t+1})\, \beta_{t+1}(j)}{\sum_i \sum_j \alpha_t(i)\, a_{ij}\, b_j(o_{t+1})\, \beta_{t+1}(j)}$$

Page 22: Hidden Markov Models

Baum-Welch algorithm

Define the variable $\gamma_t(i)$ as the probability of being in state $s_i$ at time $t$, given the observation sequence $o_1, o_2, \dots, o_T$:

$$\gamma_t(i) = \sum_{j=1}^{N} \xi_t(i, j)$$

Re-estimation formulas:

$$\hat{a}_{ij} = \frac{\sum_{t=1}^{T-1} \xi_t(i, j)}{\sum_{t=1}^{T-1} \gamma_t(i)}$$

$$\hat{b}_j(v_m) = \frac{\sum_{t=1,\, o_t = v_m}^{T} \gamma_t(j)}{\sum_{t=1}^{T} \gamma_t(j)}$$

$$\hat{\pi}_i = \gamma_1(i)$$
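Putting the two slides together, one re-estimation step can be sketched as follows, reusing the forward() and backward() helpers from the earlier sketches (single observation sequence, no scaling, so this is illustrative rather than production-ready):

```python
import numpy as np

def baum_welch_step(O, A, B, pi):
    """One Baum-Welch re-estimation step for a single observation sequence."""
    N, T = A.shape[0], len(O)
    alpha, p_obs = forward(O, A, B, pi)
    beta, _ = backward(O, A, B, pi)

    # xi[t, i, j] = P(q_t = s_i, q_{t+1} = s_j | O, M)
    xi = np.zeros((T - 1, N, N))
    for t in range(T - 1):
        num = alpha[t][:, None] * A * B[:, O[t + 1]][None, :] * beta[t + 1][None, :]
        xi[t] = num / num.sum()

    # gamma[t, i] = P(q_t = s_i | O, M) = sum_j xi[t, i, j] = alpha * beta / P(O | M)
    gamma = alpha * beta / p_obs

    new_pi = gamma[0]
    new_A  = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
    new_B  = np.zeros_like(B)
    for m in range(B.shape[1]):
        new_B[:, m] = gamma[np.array(O) == m].sum(axis=0) / gamma.sum(axis=0)
    return new_A, new_B, new_pi
```

Iterating this step until P(O | M) stops improving is the usual training loop.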

Page 23: Hidden Markov Models

Outline

Markov Chain & Markov Models
Hidden Markov Models
HMM Problems
- Evaluation problem
- Decoding problem
- Learning problem
Application

Page 24: Hidden Markov Models

Example 1: character recognition. The structure of hidden states:

[Slide figure: left-to-right model $s_1 \rightarrow s_2 \rightarrow s_3$]

Observation = number of islands in the vertical slice.

Page 25: Hidden Markov Models

Example 1: character recognition. After character image segmentation, the following sequence of island numbers was observed in 4 slices: {1, 3, 2, 1}.
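Recognition then reduces to the evaluation problem: the observed sequence is scored against each character's trained HMM and the best-scoring model wins. A hedged sketch, assuming the forward() helper above and a hypothetical dict of per-character models:

```python
# Island counts {1, 3, 2, 1} mapped to symbol indices {0, 2, 1, 0},
# assuming the observable alphabet is {1 island, 2 islands, 3 islands}.
observations = [0, 2, 1, 0]

def recognize(observations, models):
    """models: hypothetical dict mapping a character label to its trained (A, B, pi)."""
    scores = {label: forward(observations, A, B, pi)[1]
              for label, (A, B, pi) in models.items()}
    return max(scores, key=scores.get)   # label of the highest-likelihood model
```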

Page 26: Hidden Markov Models

Example 2: face detection & recognition. The structure of hidden states:

Page 27: Hidden Markov Models

Example 2: face detection. A set of face images is used to train a single HMM.

N = 6 states

Images: 48, training images: 9, correct detection: 90%, image size: 60×90 pixels

Page 28: Hidden Markov Models

Example 2: face recognition. Each individual in the database is represented by an HMM face model. A set of images representing different instances of the same face is used to train each HMM.

N = 6 states

Page 29: Hidden Markov Models

Example 2- face recognition

Images: 400, training set: half of the images, individuals: 40, image size: 92×112 pixels