
(IT413P) Pattern Recognition, Grade Four, Dr. Nagham Mekky

This document contains notes from a Pattern Recognition course. It discusses three types of machine learning: supervised learning, unsupervised learning, and reinforcement learning. It also provides an introduction to Bayesian decision theory, which is a statistical approach to pattern classification that quantifies the tradeoffs between classification decisions using probabilities and costs. The key concepts of Bayesian decision theory discussed include priors, likelihoods, posteriors, states of nature, evidence, and risk minimization.


Mansoura University

Faculty of Computers and Information


Department of Information Technology
First Semester, 2020-2021

[IT413P] Pattern Recognition


Grade: Four
Dr. Nagham Mekky
CHAPTER (1)
INTRODUCTION
LEARNING AND ADAPTATION
 Supervised learning
 A teacher provides a category label for each pattern in the
training set.
 Unsupervised learning
 The system forms clusters or “natural groupings” of the
unlabeled input patterns.
 Reinforcement learning (learning with a critic)
 No desired category signal is given; instead, the only
teaching feedback from the critic is whether the tentative
category is right or wrong.
LEARNING TYPES

 Supervised learning (with target vectors): the training data comprise examples of
the input vectors along with their corresponding target vectors. Examples:
1. Classification: the assignment of each input vector to one of a finite number of
discrete categories or classes.
2. Regression: the desired output consists of one or more continuous variables.
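The two supervised tasks above can be sketched on toy data. This is a minimal illustration, not part of the course material: the feature values and targets below are made up, classification is shown with a 1-nearest-neighbor rule, and regression with an ordinary least-squares line fit.

```python
import numpy as np

# Hypothetical training data: a 1-D feature with both kinds of targets.
X_train = np.array([1.0, 1.5, 2.0, 6.0, 6.5, 7.0])
y_class = np.array([0, 0, 0, 1, 1, 1])   # discrete class labels (classification)
y_reg = 2.0 * X_train + 1.0              # continuous targets (regression)

def classify_1nn(x):
    """Classification: assign x the label of its nearest training point."""
    return y_class[np.argmin(np.abs(X_train - x))]

def fit_linear():
    """Regression: least-squares fit y = a*x + b to the continuous targets."""
    A = np.vstack([X_train, np.ones_like(X_train)]).T
    a, b = np.linalg.lstsq(A, y_reg, rcond=None)[0]
    return a, b

print(classify_1nn(1.8))          # -> 0 (nearest neighbors are class 0)
a, b = fit_linear()
print(round(a, 2), round(b, 2))   # recovers the slope and intercept
```

The only difference between the two problems is the target type: a finite label set versus real-valued outputs.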
LEARNING TYPES

 Unsupervised learning (without target vectors): the training data consist of a set of
input vectors x without any corresponding target values. Examples:
1. Clustering: discover groups of similar examples within the data.
2. Density estimation: determine the distribution of the data within the input space.
3. Visualization: project the data from a high-dimensional space down to two or three
dimensions.
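Clustering, the first example above, can be sketched with a minimal k-means loop on unlabeled 1-D data. The data and cluster means below are invented for illustration; no labels are used anywhere.

```python
import numpy as np

rng = np.random.default_rng(0)
# Two well-separated hypothetical 1-D clusters of unlabeled points.
data = np.concatenate([rng.normal(0.0, 0.3, 50), rng.normal(5.0, 0.3, 50)])

def kmeans_1d(x, k=2, iters=20):
    """Minimal k-means: alternate nearest-center assignment and mean update."""
    centers = x[rng.choice(len(x), k, replace=False)]  # init from data points
    for _ in range(iters):
        # Assign each point to its nearest center.
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        # Move each center to the mean of its assigned points (keep it if empty).
        centers = np.array([x[labels == j].mean() if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return centers, labels

centers, labels = kmeans_1d(data)
print(np.sort(centers))  # one center near each true cluster mean
```

The algorithm discovers the two "natural groupings" purely from the geometry of the inputs.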
LEARNING TYPES

 Reinforcement learning (maximizing a reward): the problem of finding
suitable actions to take in a given situation in order to maximize a reward. Here the
learning algorithm is not given examples of optimal outputs; it must discover them
by a process of trial and error.
 A general feature of reinforcement learning is the trade-off between
exploration, in which the system tries out new kinds of actions to see how
effective they are, and exploitation, in which the system makes use of actions
that are known to yield a high reward.
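The exploration/exploitation trade-off can be sketched with an epsilon-greedy agent on a two-armed bandit. This is an illustrative toy, not from the slides: the reward probabilities are made up, and epsilon is the fraction of steps spent exploring.

```python
import random

random.seed(1)
true_means = [0.3, 0.7]   # hypothetical reward probabilities, unknown to the agent

est = [0.0, 0.0]          # running reward estimates (used for exploitation)
counts = [0, 0]
epsilon = 0.1             # exploration rate

for _ in range(5000):
    if random.random() < epsilon:
        arm = random.randrange(2)     # explore: try a random action
    else:
        arm = est.index(max(est))     # exploit: action known to pay best so far
    reward = 1.0 if random.random() < true_means[arm] else 0.0
    counts[arm] += 1
    est[arm] += (reward - est[arm]) / counts[arm]  # incremental mean update

print(est.index(max(est)))  # the agent should discover by trial and error that arm 1 is better
```

With no exploration the agent could lock onto the first arm that happens to pay off; the occasional random action is what lets it find the truly better one.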
CHAPTER (2)
BAYESIAN DECISION
THEORY
INTRODUCTION

 Bayesian decision theory is a fundamental statistical approach to the
problem of pattern classification.
 This approach is based on quantifying the tradeoffs between various
classification decisions using probabilities and the costs that accompany
such decisions.
 It makes the assumption that the decision problem is posed in
probabilistic terms, and that all of the relevant probability values are
known.
BAYESIAN DECISION THEORY

 Design classifiers to make decisions that minimize an
expected “risk”.
 The simplest risk is the classification error (i.e., assuming that all
misclassification costs are equal).
 When misclassification costs are not equal, the risk can include the
cost associated with different misclassifications.
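A minimal numeric sketch of risk minimization with unequal costs, using invented priors, likelihoods, and a loss matrix (none of these numbers come from the course):

```python
# Hypothetical two-class problem at one observed feature value x.
priors = [0.7, 0.3]        # P(w1), P(w2)
likelihoods = [0.5, 0.5]   # p(x|w1), p(x|w2)
# loss[i][j] = cost of deciding class i when the true class is j;
# here misclassifying a true w2 is assumed 10x worse than misclassifying w1.
loss = [[0.0, 10.0],
        [1.0, 0.0]]

# Posteriors via Bayes' rule: P(wj|x) = p(x|wj) P(wj) / p(x).
evidence = sum(l * p for l, p in zip(likelihoods, priors))
post = [l * p / evidence for l, p in zip(likelihoods, priors)]

# Conditional risk of each decision: R(a_i|x) = sum_j loss[i][j] * P(wj|x).
risks = [sum(loss[i][j] * post[j] for j in range(2)) for i in range(2)]
decision = risks.index(min(risks))   # choose the minimum-risk action
print(post, risks, decision)
```

Note the effect of the costs: the posterior favors w1 (0.7 vs 0.3), yet the minimum-risk decision is w2, because deciding w1 risks the expensive error. With equal costs the rule reduces to picking the largest posterior, i.e., minimizing classification error.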
TERMINOLOGY
 State of nature ω (class label):
 e.g., ω1 for sea bass, ω2 for salmon

 Probabilities P(ω1) and P(ω2) (priors):


 e.g., prior knowledge of how likely it is to get a sea bass or a salmon

 Probability density function p(x) (evidence):


 e.g., how frequently we will measure a pattern with feature value x (e.g., x corresponds to
lightness)
TERMINOLOGY (CONT’D)

Conditional probability density p(x|ωj) (likelihood):


 e.g., how frequently we will measure a pattern with feature value x given that the
pattern belongs to class ωj
[Figure: lightness distributions for the salmon and sea-bass populations]
TERMINOLOGY (CONT’D)

 Conditional probability P(ωj|x) (posterior):


 e.g., the probability that the fish belongs to class ωj given feature x.

 Ultimately, we are interested in computing P(ωj|x) for each class ωj.
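Putting the terminology together, the posteriors can be computed from priors and class-conditional likelihoods via Bayes' rule. The Gaussian lightness parameters and priors below are invented illustration values for the sea-bass/salmon example, not figures from the course:

```python
import math

def gaussian(x, mu, sigma):
    """Class-conditional likelihood p(x|wj), modeled as a 1-D Gaussian."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

priors = {"sea bass": 0.6, "salmon": 0.4}                 # P(wj), assumed
params = {"sea bass": (8.0, 1.0), "salmon": (4.0, 1.0)}   # (mean, std) of lightness

def posterior(x):
    """P(wj|x) = p(x|wj) P(wj) / p(x), with evidence p(x) = sum_j p(x|wj) P(wj)."""
    joint = {c: gaussian(x, *params[c]) * priors[c] for c in priors}
    evidence = sum(joint.values())
    return {c: j / evidence for c, j in joint.items()}

post = posterior(5.0)                # posteriors at an observed lightness of 5.0
best = max(post, key=post.get)       # minimum-error decision: largest posterior
print(best)                          # -> salmon (5.0 is much closer to its mean)
```

The posteriors always sum to one, and deciding on the class with the largest posterior minimizes the probability of error.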