Module 3
The Naive Bayes classifier assumes that all features are conditionally
independent of each other, given the class label. This is the “naive”
assumption, and it simplifies the computation of the likelihood.
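Formally, for features x_1, …, x_n and a class C, the naive assumption lets the class-conditional likelihood factor into a product of per-feature terms:

$$P(x_1, \ldots, x_n \mid C) = \prod_{i=1}^{n} P(x_i \mid C)$$

so the posterior is proportional to $P(C)\prod_{i} P(x_i \mid C)$, and each factor can be estimated separately from the training data.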
Example: In classifying whether a person will buy a car based on age and income, Naive Bayes assumes that age and income are conditionally independent given the class (buy or not buy), even though they may in fact be correlated (younger people may have lower incomes). The assumption simplifies the calculations by ignoring this relationship.
To classify a new instance using Naive Bayes, the classifier computes the
posterior probability for each class and assigns the instance to the class
with the highest probability.
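A minimal sketch of this decision rule in Python for the car example. The probability tables (`priors`, `likelihoods`) and the helper `classify` are illustrative assumptions, not values or code from the module:

```python
# Hypothetical probability tables (illustrative numbers only).
priors = {"buy": 0.4, "not_buy": 0.6}             # P(C)
likelihoods = {                                    # P(feature = value | C)
    "buy":     {("age", "young"): 0.3, ("income", "low"): 0.2},
    "not_buy": {("age", "young"): 0.6, ("income", "low"): 0.7},
}

def classify(instance):
    """Return the class with the highest (unnormalized) posterior."""
    best_class, best_score = None, 0.0
    for c, prior in priors.items():
        score = prior
        for feature_value in instance:
            score *= likelihoods[c][feature_value]  # naive independence
        if score > best_score:
            best_class, best_score = c, score
    return best_class

print(classify([("age", "young"), ("income", "low")]))  # -> "not_buy"
```

Because the denominator P(x) is the same for every class, comparing the unnormalized products P(C)·∏P(x_i|C) is enough to pick the winning class.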
The Viterbi algorithm finds the most likely sequence of hidden states that produces a given sequence of observations.
Steps:
1. Initialization: for each state i, set δ_1(i) = π_i · b_i(o_1).
2. Recursion: for t = 2, …, T, compute δ_t(j) for every state j by extending the best path into each state, and record the best predecessor ψ_t(j).
3. Termination: the probability of the best path is max_i δ_T(i).
4. Backtracking: follow the recorded predecessors ψ from the best final state to recover the full state sequence.
Formula:
δ_t(j) = max_i [ δ_{t-1}(i) · a_ij ] · b_j(o_t)
where a_ij is the probability of moving from state i to state j and b_j(o_t) is the probability of emitting observation o_t in state j.
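A compact Viterbi sketch in Python, using the weather example introduced below. The initial and Sunny-day emission probabilities match the notes; the transition matrix and Rainy-day emission values are assumptions, since the notes do not reproduce them:

```python
import math

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most likely hidden-state path for the observations.
    Log-probabilities are used to avoid numerical underflow."""
    # Initialization: delta_1(i) = pi_i * b_i(o_1)
    V = [{s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]]) for s in states}]
    back = [{}]
    # Recursion: delta_t(j) = max_i [delta_{t-1}(i) * a_ij] * b_j(o_t)
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for j in states:
            best_i = max(states, key=lambda i: V[t - 1][i] + math.log(trans_p[i][j]))
            V[t][j] = (V[t - 1][best_i] + math.log(trans_p[best_i][j])
                       + math.log(emit_p[j][obs[t]]))
            back[t][j] = best_i           # remember psi_t(j)
    # Termination and backtracking
    last = max(states, key=lambda s: V[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.insert(0, back[t][path[0]])
    return path

states = ["Sunny", "Rainy"]
start_p = {"Sunny": 0.8, "Rainy": 0.2}                      # pi (from the notes)
trans_p = {"Sunny": {"Sunny": 0.7, "Rainy": 0.3},           # assumed values
           "Rainy": {"Sunny": 0.4, "Rainy": 0.6}}           # assumed values
emit_p = {"Sunny": {"Walking": 0.6, "Shopping": 0.3, "Cleaning": 0.1},  # from the notes
          "Rainy": {"Walking": 0.1, "Shopping": 0.4, "Cleaning": 0.5}}  # assumed values

print(viterbi(["Walking", "Shopping", "Cleaning"], states, start_p, trans_p, emit_p))
# -> ['Sunny', 'Rainy', 'Rainy'] with these illustrative numbers
```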
Applications of HMMs
HMMs are widely used for sequence problems such as speech recognition, part-of-speech tagging, handwriting recognition, and biological sequence analysis.
Example: Inferring the Weather from Daily Activities
Suppose we cannot observe the weather directly, but we can observe a person's daily activity and want to infer the hidden weather sequence. The model has the following components:
1. States (S)
In this example, the hidden states are the weather conditions. The two
possible states are:
Sunny (S1)
Rainy (S2)
2. Observations (O)
The observations are what we can see or measure, which give us some
idea about the current state. Here, we have three activities:
Walking (O1)
Shopping (O2)
Cleaning (O3)
3. Transition Matrix (A)
The transition matrix (A) defines the probabilities of moving from one state (weather condition) to another state. Each entry is the probability that tomorrow's weather is a given state, given today's state, so each row of the matrix sums to 1.
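A hypothetical transition matrix of the right shape for the two weather states (the values are illustrative assumptions, not the example's actual numbers) might look like:

```python
# Hypothetical transition probabilities: rows are today's weather,
# columns are tomorrow's. Values are illustrative only.
A = {"Sunny": {"Sunny": 0.7, "Rainy": 0.3},
     "Rainy": {"Sunny": 0.4, "Rainy": 0.6}}
assert all(abs(sum(row.values()) - 1.0) < 1e-9 for row in A.values())  # rows sum to 1
```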
4. Emission Matrix (B)
The emission matrix (B) defines the probability of observing each activity given the current weather state. This means:
On a Sunny day:
o There is a 60% chance that the person will go Walking.
o There is a 30% chance that the person will go Shopping.
o There is a 10% chance that the person will stay home Cleaning.
On a Rainy day, the corresponding row of the matrix gives the probability of each of the three activities.
5. Initial Probabilities (π)
The initial probability vector (π) gives the probability of each state on the first day: there is an 80% chance that the first day is Sunny and a 20% chance that it is Rainy.
6. Likelihood Calculation using the Forward Algorithm
The forward algorithm computes the likelihood of an observation sequence given the model, P(O | λ), by summing over all possible hidden-state paths instead of enumerating them one by one.
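A minimal forward-algorithm sketch for this example. As before, the initial and Sunny-day emission probabilities come from the notes, while the transition matrix and Rainy-day emissions are assumed placeholders:

```python
def forward(obs, states, start_p, trans_p, emit_p):
    """Return P(obs | model), summing over all hidden-state paths."""
    # Initialization: alpha_1(i) = pi_i * b_i(o_1)
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    # Recursion: alpha_t(j) = (sum_i alpha_{t-1}(i) * a_ij) * b_j(o_t)
    for o in obs[1:]:
        alpha = {j: sum(alpha[i] * trans_p[i][j] for i in states) * emit_p[j][o]
                 for j in states}
    # Termination: P(O | model) = sum_i alpha_T(i)
    return sum(alpha.values())

states = ["Sunny", "Rainy"]
start_p = {"Sunny": 0.8, "Rainy": 0.2}                      # pi (from the notes)
trans_p = {"Sunny": {"Sunny": 0.7, "Rainy": 0.3},           # assumed values
           "Rainy": {"Sunny": 0.4, "Rainy": 0.6}}           # assumed values
emit_p = {"Sunny": {"Walking": 0.6, "Shopping": 0.3, "Cleaning": 0.1},  # from the notes
          "Rainy": {"Walking": 0.1, "Shopping": 0.4, "Cleaning": 0.5}}  # assumed values

print(forward(["Walking", "Shopping", "Cleaning"], states, start_p, trans_p, emit_p))
# ≈ 0.0439 with these illustrative numbers
```

Unlike Viterbi, which keeps only the single best path via a max, the forward algorithm sums over all paths, which is why it yields the total likelihood of the observations.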