Pattern Recognition and Machine Learning: Fuzzy Sets in Pattern Recognition Debrup Chakraborty Cinvestav

Pattern recognition involves tasks like clustering, classification, and feature analysis. Fuzzy clustering allows mixed or overlapping clusters by assigning data points to multiple clusters with varying degrees of membership. The fuzzy c-means algorithm aims to minimize an objective function to find fuzzy partitions and cluster prototypes. It involves iteratively updating a partition matrix and prototype vectors until convergence. Fuzzy k-nearest neighbor classification generalizes the crisp k-NN rule by taking the average of fuzzy class labels of neighbors to assign fuzzy labels to new data points. Fuzzy rule-based classifiers extract rules from data clusters in the input space to classify new points based on proximity to cluster centroids.

© Attribution Non-Commercial (BY-NC)

Pattern Recognition and

Machine Learning
(Fuzzy Sets in Pattern Recognition)
Debrup Chakraborty
CINVESTAV
Pattern Recognition (Recapitulation)

Data
Object Data
Relational Data

Pattern Recognition Tasks


1) Clustering: Finding groups in data
2) Classification: Partitioning the feature space
3) Feature Analysis: Feature selection, Feature ranking,
Dimensionality Reduction
Fuzzy Clustering

Why?
Mixed pixels: e.g., a pixel in a remotely sensed image may cover
several land-cover types at once, so it belongs partly to more than
one cluster.
Fuzzy Clustering

Suppose we have a data set X = {x1, x2, …, xn} ⊂ R^p.

A c-partition of X is a c × n matrix U = [U1 U2 … Un] = [u_ik], where
U_k denotes the k-th column of U.
There can be three types of c-partitions, whose columns
correspond to three types of label vectors.
Three sets of label vectors in R^c:
N_pc = { y ∈ R^c : y_i ∈ [0, 1] ∀ i; y_i > 0 for at least one i }   (Possibilistic label)
N_fc = { y ∈ N_pc : Σ_{i=1}^c y_i = 1 }   (Fuzzy label)
N_hc = { y ∈ N_fc : y_i ∈ {0, 1} ∀ i }   (Hard label)
The three corresponding types of c-partitions are:

M_pcn = { U ∈ R^{c×n} : U_k ∈ N_pc ∀ k; 0 < Σ_{k=1}^n u_ik < n ∀ i }

M_fcn = { U ∈ M_pcn : U_k ∈ N_fc ∀ k }

M_hcn = { U ∈ M_fcn : U_k ∈ N_hc ∀ k }

These are the Possibilistic, Fuzzy and Hard c-partitions
respectively.
An Example

Let X = {x1 = peach, x2 = plum, x3 = nectarine}.

A nectarine is a peach-plum hybrid.
Typical c = 2 partitions of these objects are:

      U1 ∈ M_h23           U2 ∈ M_f23           U3 ∈ M_p23
    x1   x2   x3         x1   x2   x3         x1   x2   x3
   1.0  0.0  0.0        1.0  0.2  0.4        1.0  0.2  0.5
   0.0  1.0  1.0        0.0  0.8  0.6        0.0  0.8  0.6
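The three partition types above can be told apart mechanically by checking their defining conditions. A minimal NumPy sketch (the function name `partition_type` and the tolerance handling are illustrative choices, not from the slides; the row condition 0 < Σ_k u_ik < n of M_pcn is not checked here):

```python
import numpy as np

def partition_type(U, tol=1e-9):
    """Classify a c-by-n partition matrix as hard, fuzzy, or possibilistic.

    Possibilistic: entries in [0, 1], every column has a positive entry.
    Fuzzy: additionally, each column sums to 1.
    Hard: additionally, every entry is 0 or 1.
    """
    U = np.asarray(U, dtype=float)
    if U.min() < -tol or U.max() > 1 + tol or (U.max(axis=0) <= tol).any():
        return None  # not even a possibilistic partition
    cols_sum_to_one = np.allclose(U.sum(axis=0), 1.0, atol=1e-6)
    binary = np.all((np.abs(U) < tol) | (np.abs(U - 1) < tol))
    if cols_sum_to_one and binary:
        return "hard"
    if cols_sum_to_one:
        return "fuzzy"
    return "possibilistic"

# The three peach/plum/nectarine partitions from the example:
U1 = [[1.0, 0.0, 0.0], [0.0, 1.0, 1.0]]
U2 = [[1.0, 0.2, 0.4], [0.0, 0.8, 0.6]]
U3 = [[1.0, 0.2, 0.5], [0.0, 0.8, 0.6]]
print(partition_type(U1), partition_type(U2), partition_type(U3))
# → hard fuzzy possibilistic
```

Note that U3 fails the fuzzy test only because its third column sums to 1.1, which is exactly what makes it possibilistic rather than fuzzy.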
The Fuzzy c-means algorithm

The objective function:

J_m(U, V) = Σ_{i=1}^c Σ_{k=1}^n u_ik^m D_ik^2

where U ∈ M_fcn, V = (v1, v2, …, vc), v_i ∈ R^p is the i-th prototype,
m > 1 is the fuzzifier, and

D_ik^2 = ||x_k − v_i||^2

The objective is to find the U and V which minimize J_m.

Using the Lagrange multiplier technique, one can derive the following
update equations for the partition matrix and the prototype vectors:

1) u_ij = [ Σ_{k=1}^c ( D_ij / D_kj )^{2/(m−1)} ]^{−1}   ∀ i, j

2) v_i = ( Σ_{k=1}^n u_ik^m x_k ) / ( Σ_{k=1}^n u_ik^m )   ∀ i
Algorithm
Input: X ⊂ R^p
Choose: 1 < c < n, 1 < m < ∞, ε = tolerance, max iterations = N
Guess: V0
Begin
  t ← 1
  tol ← high value
  Repeat while (t ≤ N and tol > ε)
    Compute Ut from Vt−1 using (1)
    Compute Vt from Ut using (2)
    Compute tol ← ||Vt − Vt−1|| / (p·c)
    t ← t + 1
  End Repeat
  Output: Vt, Ut
End
(The initialization can also be done on U.)
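The alternating scheme above can be sketched in a few lines of NumPy. This is a minimal illustration, not a reference implementation: the function name, the random initialization of V0 from the data, and the small distance floor guarding against division by zero are all assumed design choices.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, eps=1e-5, max_iter=100, seed=0):
    """Fuzzy c-means: alternate updates (1) and (2) until the
    prototypes move less than eps. X is n-by-p; returns (V, U)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    # Guess V0: c distinct points drawn from the data
    V = X[rng.choice(n, size=c, replace=False)].copy()
    U = None
    for _ in range(max_iter):
        # D[i, k] = ||x_k - v_i||, floored to avoid division by zero
        D = np.linalg.norm(X[None, :, :] - V[:, None, :], axis=2)
        D = np.fmax(D, 1e-12)
        # Update (1): u_ik = 1 / sum_j (D_ik / D_jk)^(2/(m-1))
        U = 1.0 / np.sum((D[:, None, :] / D[None, :, :]) ** (2.0 / (m - 1.0)),
                         axis=1)
        # Update (2): prototypes are weighted means with weights u_ik^m
        W = U ** m
        V_new = (W @ X) / W.sum(axis=1, keepdims=True)
        tol = np.linalg.norm(V_new - V) / (p * c)
        V = V_new
        if tol < eps:
            break
    return V, U

# Two well-separated blobs; the prototypes should land near the centres
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(5, 0.3, (50, 2))])
V, U = fuzzy_c_means(X, c=2)
print(np.allclose(U.sum(axis=0), 1.0))   # columns of U sum to 1 → True
```

Since each column of U sums to 1, the result is a fuzzy c-partition (U ∈ M_fcn), consistent with the definition above.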
Discussions

A batch mode algorithm
Converges only to local minima of J_m
As m → 1+, u_ik → {0, 1}: FCM → HCM (hard c-means)
As m → ∞, u_ik → 1/c ∀ i and k
Choice of m matters (m = 2 is a common default)
Fuzzy Classification
K-nearest neighbor algorithm: voting on crisp labels

Class 1: (1, 0, 0)^T    Class 2: (0, 1, 0)^T    Class 3: (0, 0, 1)^T

[Figure: a query point z surrounded by labeled neighbors from the three classes]
K-nn Classification (continued)

The crisp K-nn rule can be generalized to generate fuzzy labels.
Take the average of the class labels of the neighbors. With six
neighbors of z (two from class 1, three from class 2, one from class 3):

D(z) = [ 2·(1,0,0)^T + 3·(0,1,0)^T + 1·(0,0,1)^T ] / 6 = (0.33, 0.50, 0.17)^T

This method can also be used when the neighbors carry fuzzy or
possibilistic labels.
K-nn Classification (continued)

Suppose the six neighbors of z have fuzzy labels:

        x1     x2     x3     x4     x5     x6
       0.9    0.9    0.3    0.03   0.2    0.3
       0.0    0.1    0.6    0.95   0.8    0.0
       0.1    0.0    0.1    0.02   0.0    0.7

Averaging the six label vectors:

D(z) = (1/6) Σ_{k=1}^6 label(x_k) = (0.44, 0.41, 0.15)^T
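Both the crisp and the fuzzy worked examples are the same computation: a column-wise mean of the neighbors' label vectors. A short sketch (the helper name `fuzzy_knn_label` is illustrative):

```python
import numpy as np

def fuzzy_knn_label(neighbor_labels):
    """Fuzzy k-NN label assignment: the label of z is the mean of its
    neighbors' (possibly fuzzy) label vectors, one column per neighbor."""
    L = np.asarray(neighbor_labels, dtype=float)
    return L.mean(axis=1)

# Crisp example: 2 neighbors from class 1, 3 from class 2, 1 from class 3
crisp = np.repeat(np.eye(3), [2, 3, 1], axis=1)
print(np.round(fuzzy_knn_label(crisp), 2))   # [0.33 0.5  0.17]

# Fuzzy-label example from the slides (columns are x1..x6)
fuzzy = np.array([[0.9, 0.9, 0.3, 0.03, 0.2, 0.3],
                  [0.0, 0.1, 0.6, 0.95, 0.8, 0.0],
                  [0.1, 0.0, 0.1, 0.02, 0.0, 0.7]])
print(np.round(fuzzy_knn_label(fuzzy), 2))   # [0.44 0.41 0.15]
```

Because the mean of vectors whose entries each lie in [0, 1] again lies in [0, 1] componentwise, the output is itself a valid (possibilistic) label vector.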
Fuzzy Rule Based Classifiers
Rule 1:
If x is CLOSE to a1 and y is CLOSE to b1, then (x, y) is in class 1.
Rule 2:
If x is CLOSE to a2 and y is CLOSE to b2, then (x, y) is in class 2.
How to get such rules?
An expert may provide us with classification rules.
We may extract rules from training data:
clustering in the input space is a possible way to extract initial rules.

[Figure: two clusters in the (x, y) plane; projecting each cluster onto
the axes gives the fuzzy sets Ax, Ay and Bx, By used in the rules
"If x is CLOSE TO Ax and y is CLOSE TO Ay then Class is 1" and
"If x is CLOSE TO Bx and y is CLOSE TO By then Class is 2"]
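One concrete way such rules can fire is sketched below. The Gaussian shape of the CLOSE TO membership, the spread parameter, the product t-norm for combining the two antecedents, and the example centroids are all assumptions for illustration; the slides leave these choices open.

```python
import numpy as np

def close_to(v, centre, spread=1.0):
    """Gaussian membership in the fuzzy set 'CLOSE TO centre'
    (an assumed shape; spread is a hypothetical design parameter)."""
    return np.exp(-((v - centre) ** 2) / (2.0 * spread ** 2))

def rule_based_classify(point, centroids, spread=1.0):
    """Fire one rule per cluster centroid: the firing strength is the
    product (a common t-norm) of the per-coordinate memberships;
    the point gets the class of the strongest rule."""
    x, y = point
    strengths = [close_to(x, ax, spread) * close_to(y, ay, spread)
                 for ax, ay in centroids]
    return int(np.argmax(strengths)) + 1   # classes numbered from 1

# Hypothetical centroids (a1, b1) and (a2, b2) found by clustering
centroids = [(0.0, 0.0), (4.0, 4.0)]
print(rule_based_classify((0.5, -0.2), centroids))   # 1
print(rule_based_classify((3.6, 4.3), centroids))    # 2
```

With centroids obtained from fuzzy c-means, the prototype vectors v_i would play the role of the (a_i, b_i) pairs here, tying the rule extraction back to the clustering step.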
