
Hidden Markov Model

Presented by
Om Prakash Mahato (059/MSCKE/069)
IOE, Pulchowk Campus

HMM Overview
Machine learning method
Makes use of state machines
Based on probabilistic models
Useful in problems having sequential steps
Can only observe the output from states, not the states themselves
Example: speech recognition
Observe: acoustic signals
Hidden states: phonemes (distinctive sounds of a language)

State machine: (diagram not reproduced here)

Observable Markov Model

HMM Components
A set of states (x's)
A set of possible output symbols (y's)
A state transition matrix (a's): the probability of making a transition from one state to the next
An output emission matrix (b's): the probability of emitting/observing a given symbol at a particular state
An initial probability vector: the probability of starting at a particular state (not always shown; sometimes assumed to be 1)
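As a concrete sketch (not from the original slides), these components can be written down directly as arrays. All names and numeric values below are invented for a hypothetical two-coin model, similar to the coin-toss example later in this deck:

```python
import numpy as np

# Hypothetical two-coin HMM (illustrative values only).
states = ["fair", "biased"]   # hidden states (x's)
symbols = ["H", "T"]          # observable output symbols (y's)

# State transition matrix (a's): A[i, j] = P(next state j | current state i)
A = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Output emission matrix (b's): B[i, k] = P(symbol k | state i)
B = np.array([[0.5, 0.5],     # the fair coin
              [0.8, 0.2]])    # the biased coin favours heads

# Initial probability vector: pi[i] = P(first state is i)
pi = np.array([0.5, 0.5])

# Each row of A and B, and pi itself, must sum to 1.
assert np.allclose(A.sum(axis=1), 1.0)
assert np.allclose(B.sum(axis=1), 1.0)
assert np.isclose(pi.sum(), 1.0)
```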

The Hidden Markov Model: Definitions
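The body of this slide did not survive extraction. In the notation of the Rabiner tutorial listed in the references, the standard definitions it presumably presented are (a reconstruction, not the slide's actual content):

$$\lambda = (A, B, \pi), \qquad a_{ij} = P(q_{t+1} = S_j \mid q_t = S_i), \qquad b_j(k) = P(O_t = v_k \mid q_t = S_j), \qquad \pi_i = P(q_1 = S_i)$$

where $q_t$ is the hidden state at time $t$, $O_t$ the observed symbol, $S_1, \ldots, S_N$ the states, and $v_1, \ldots, v_M$ the output symbols.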

Observable Markov Model Example


State transition matrix

Weather
Once each day, the weather is observed
State 1: rain, State 2: cloudy, State 3: sunny
         Rainy   Cloudy   Sunny
Rainy    0.4     0.3      0.3
Cloudy   0.2     0.6      0.2
Sunny    0.1     0.1      0.8

(row = today's weather, column = tomorrow's weather; each row sums to 1)

What is the probability that the weather for the next 7 days will be:
sun, sun, rain, rain, sun, cloudy, sun?

Each state corresponds to a physical observable event
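A minimal sketch of this calculation (assuming, as in Rabiner's version of the example, that today's weather is known to be sunny; that conditioning is not stated on the slide). Because the states themselves are observed, the probability of the 7-day sequence is just the product of the corresponding one-day transition probabilities:

```python
import numpy as np

# Transition matrix from the slide: row = today's weather, column = tomorrow's.
states = ["rain", "cloudy", "sunny"]
A = np.array([[0.4, 0.3, 0.3],
              [0.2, 0.6, 0.2],
              [0.1, 0.1, 0.8]])
idx = {s: i for i, s in enumerate(states)}

# The next 7 days, conditioned on today being sunny (assumed known).
sequence = ["sunny", "sunny", "rain", "rain", "sunny", "cloudy", "sunny"]

prob = 1.0
prev = idx["sunny"]
for day in sequence:
    prob *= A[prev, idx[day]]   # probability of this one-day transition
    prev = idx[day]

print(prob)   # 0.8 * 0.8 * 0.1 * 0.4 * 0.3 * 0.1 * 0.2 ≈ 1.5e-4
```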

Hidden Markov Model Example


Coin toss:
A heads/tails sequence is generated with 2 coins
You are in a room, with a wall
A person behind the wall flips a coin and tells you the result
The coin selection and the toss itself are hidden
You cannot observe the events, only the output (heads, tails) from the events

The problem is then to build a model to explain the observed sequence of heads and tails.
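One standard way to score a candidate model against an observed heads/tails sequence is the forward algorithm (not named on the slide). A minimal sketch, with all parameter values invented for illustration; the two hidden states play the role of the two coins:

```python
import numpy as np

# Hypothetical two-coin model: hidden state = which coin is being flipped,
# observation = the reported result, encoded as 0 for heads and 1 for tails.
A  = np.array([[0.7, 0.3],    # probabilities of keeping or switching coins
               [0.4, 0.6]])
B  = np.array([[0.5, 0.5],    # coin 1 is fair
               [0.9, 0.1]])   # coin 2 is heavily biased towards heads
pi = np.array([0.5, 0.5])     # either coin is equally likely at the start

def sequence_likelihood(obs):
    """Forward algorithm: P(observations | model), summing over all hidden coin sequences."""
    alpha = pi * B[:, obs[0]]          # initialise with the first observation
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # one transition step, then absorb the next observation
    return alpha.sum()

# Example: the sequence heads, heads, tails, heads.
print(sequence_likelihood([0, 0, 1, 0]))
```

Fitting A, B, and pi to observed data, rather than fixing them by hand as above, is what the training step mentioned later in the deck refers to.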

HMM Uses
Speech recognition
Recognizing spoken words and phrases

Text processing
Parsing raw records into structured records

Bioinformatics
Protein sequence prediction

Financial
Stock market forecasts (price pattern prediction)
Comparison shopping services

HMM Advantages / Disadvantages


Advantages
Effective
Can handle variations in record structure (optional fields, varying field ordering)

Disadvantages
Requires training using annotated data
Not completely automatic
May require manual markup
Size of the training data may be an issue

References
Rabiner, L. R. (1989). A Tutorial on Hidden Markov Models and Selected Applications in Speech Recognition. Proceedings of the IEEE, 77(2), 257-286.
http://en.wikipedia.org/wiki/Hidden_Markov_model
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2766791/

Thank you!
