Machine Learning
Algorithms and Applications
Mohssen Mohammed
Muhammad Badruddin Khan
Eihab Bashier Mohammed Bashier
CRC Press
Taylor & Francis Group
Boca Raton London New York
MATLAB® is a trademark of The MathWorks, Inc. and is used with permission. The MathWorks does not warrant the accuracy of the text or exercises in this book. This book's use or discussion of MATLAB® software or related products does not constitute endorsement or sponsorship by The MathWorks of a particular pedagogical approach or particular use of the MATLAB® software.
CRC Press
Taylor & Francis Group
6000 Broken Sound Parkway NW, Suite 300
Boca Raton, FL 33487-2742
This book contains information obtained from authentic and highly regarded sources. Reasonable efforts
have been made to publish reliable data and information, but the author and publisher cannot assume
responsibility for the validity of all materials or the consequences of their use. The authors and publishers
have attempted to trace the copyright holders of all material reproduced in this publication and apologize to
copyright holders if permission to publish in this form has not been obtained. If any copyright material has
not been acknowledged please write and let us know so we may rectify in any future reprint.
Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced, transmit-
ted, or utilized in any form by any electronic, mechanical, or other means, now known or hereafter invented,
including photocopying, microfilming, and recording, or in any information storage or retrieval system,
without written permission from the publishers.
For permission to photocopy or use material electronically from this work, please access www.copyright.com (http://www.copyright.com/) or contact the Copyright Clearance Center, Inc. (CCC), 222 Rosewood
Drive, Danvers, MA 01923, 978-750-8400. CCC is a not-for-profit organization that provides licenses and
registration for a variety of users. For organizations that have been granted a photocopy license by the CCC,
a separate system of payment has been arranged.
Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and are used
only for identification and explanation without intent to infringe.
Contents
Preface
Acknowledgments
Authors
Introduction
1 Introduction to Machine Learning
    1.1 Introduction
    1.2 Preliminaries
        1.2.1 Machine Learning: Where Several Disciplines Meet
        1.2.2 Supervised Learning
        1.2.3 Unsupervised Learning
        1.2.4 Semi-Supervised Learning
        1.2.5 Reinforcement Learning
        1.2.6 Validation and Evaluation
    1.3 Applications of Machine Learning Algorithms
        1.3.1 Automatic Recognition of Handwritten Postal Codes
        1.3.2 Computer-Aided Diagnosis
        1.3.3 Computer Vision
Index
Preface
Acknowledgments
Authors
Introduction
Humans have long dreamed of creating machines with the same level of intelligence as their own. Many science fiction movies have expressed this dream, such as Artificial Intelligence; The Matrix; The Terminator; I, Robot; and Star Wars.
Chapter 4
Naïve Bayesian Classification
4.1 Introduction
Naïve Bayesian classifiers [1] are simple probabilistic classifiers based on applying Bayes' theorem with the assumption of strong (naïve) independence among the features. The following equation [2] states Bayes' theorem in mathematical terms:
P(A|B) = \frac{P(A)\,P(B|A)}{P(B)}

where:
A and B are events
P(A) and P(B) are the prior probabilities of A and B without regard to each other
P(A|B) is the posterior probability of A given that B has occurred
P(B|A) is the probability of observing B given that A has occurred
Given a feature vector X and K possible classes c_1, \ldots, c_K, the classifier assigns X to the class c_k whose posterior probability is the largest, that is,

P(c_k \mid X) > P(c_j \mid X) \quad \text{for } 1 \le j \le K,\ j \ne k
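As a minimal illustration of this decision rule (a sketch only, not the book's listing; the priors and likelihoods below are placeholder numbers), the winning class can be found as follows:

% Placeholder values for two classes: P(c_k) and P(X | c_k).
prior      = [0.6, 0.4];
likelihood = [0.02, 0.05];
posterior  = prior .* likelihood;   % proportional to P(c_k | X); the
                                    % evidence P(X) is the same for all k
[~, k] = max(posterior)             % index of the class assigned to X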
4.2 Example
To demonstrate the concept of the naïve Bayesian classifier,
we will again use the following dataset:
4.4 Likelihood
Let X be the new example for which we want to predict whether Play = Yes or Play = No. The conditional probabilities estimated from the dataset are as follows:
Outlook
P(Sunny | Play = Yes) = 2/9        P(Sunny | Play = No) = 3/5
P(Overcast | Play = Yes) = 4/9     P(Overcast | Play = No) = 0
P(Rain | Play = Yes) = 3/9         P(Rain | Play = No) = 2/5

Temperature
P(Hot | Play = Yes) = 2/9          P(Hot | Play = No) = 2/5
P(Mild | Play = Yes) = 4/9         P(Mild | Play = No) = 2/5
P(Cool | Play = Yes) = 3/9         P(Cool | Play = No) = 1/5

Humidity
P(High | Play = Yes) = 3/9         P(High | Play = No) = 4/5
P(Normal | Play = Yes) = 6/9       P(Normal | Play = No) = 1/5

Windy
P(True | Play = Yes) = 3/9         P(True | Play = No) = 3/5
P(False | Play = Yes) = 6/9        P(False | Play = No) = 2/5
Using these tables, two likelihoods are needed for the new example X:
1. P(X | Play = Yes)
2. P(X | Play = No)
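For instance, for the new example X = (Outlook = Sunny, Temperature = Hot, Humidity = High, Windy = False), the one classified in the MATLAB code later in this chapter, the naïve independence assumption gives, using the unsmoothed estimates above,

P(X \mid Play = Yes) = 2/9 \times 2/9 \times 3/9 \times 6/9 \approx 0.0110
P(X \mid Play = No) = 3/5 \times 2/5 \times 4/5 \times 2/5 \approx 0.0768

Multiplying by the priors P(Play = Yes) = 9/14 and P(Play = No) = 5/14 gives values proportional to the posteriors, approximately 0.0071 and 0.0274, so this example would be assigned to Play = No.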
The estimate P(Overcast | Play = No) = 0 would force the whole product to zero for any example with Outlook = Overcast. Applying add-one (Laplace) smoothing to the counts yields the adjusted conditional probabilities:

Outlook
P(Sunny | Play = Yes) = 3/12       P(Sunny | Play = No) = 4/8
P(Overcast | Play = Yes) = 5/12    P(Overcast | Play = No) = 1/8
P(Rain | Play = Yes) = 4/12        P(Rain | Play = No) = 3/8

Temperature
P(Hot | Play = Yes) = 3/12         P(Hot | Play = No) = 3/8
P(Mild | Play = Yes) = 5/12        P(Mild | Play = No) = 3/8
P(Cool | Play = Yes) = 4/12        P(Cool | Play = No) = 2/8

Humidity
P(High | Play = Yes) = 4/11        P(High | Play = No) = 5/7
P(Normal | Play = Yes) = 7/11      P(Normal | Play = No) = 2/7

Windy
P(True | Play = Yes) = 4/11        P(True | Play = No) = 4/7
P(False | Play = Yes) = 7/11       P(False | Play = No) = 3/7
To compute the posterior probability for the new example, we need the following:
1. Prior probability
2. Likelihood
3. Evidence
6/16 × 0.00574 = 0.002152
    xx{i} = out{i};
end

fy = zeros(nc,1);
num_of_rec_for_each_class = zeros(nc,1);
for i = 1:nc
    for j = 1:tot_rec
        if (yy{j} == yu{i})
            num_of_rec_for_each_class(i) = num_of_rec_for_each_class(i) + 1;
        end
    end
end

% ... (intermediate lines of the original listing are not reproduced) ...
% Turn the per-class count tables into conditional probability tables:
prob_table(:,:,1) = prob_table(:,:,1)./num_of_rec_for_each_class(1);
prob_table(:,:,2) = prob_table(:,:,2)./num_of_rec_for_each_class(2);
Class : P
    Outlook:      overcast, Rain, Sunny
    Temperature:  cool, hot, mild
    Humidity:     high, normal
    Windy:        false, true

Class : N
    Outlook:      overcast, Rain, Sunny
    Temperature:  cool, hot, mild
    Humidity:     high, normal
    Windy:        false, true
A = {'sunny', 'hot', 'high', 'false'};  % the new example to be classified
A1 = find(ismember(rec_unique_value{1}, A{1}));  % position of 'sunny' among
A11 = 1;                                         % the Outlook values
A2 = find(ismember(rec_unique_value{2}, A{2}));  % position of 'hot' among
A21 = 2;                                         % the Temperature values
A3 = find(ismember(rec_unique_value{3}, A{3}));  % position of 'high' among
A31 = 3;                                         % the Humidity values
A4 = find(ismember(rec_unique_value{4}, A{4}));  % position of 'false' among
A41 = 4;                                         % the Windy values
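Although the remainder of the listing is not reproduced here, the looked-up indices would typically be combined into a prediction along the following lines. This is a sketch only: the layout prob_table(attribute, valueIndex, class) and the use of fy as the vector of class priors are assumptions based on the fragments above, not the book's code.

% Sketch: multiply, for each class, the conditional probabilities of the
% observed attribute values, then weight by the class prior.
valueIndex = [A1, A2, A3, A4];       % indices found above
score = zeros(nc, 1);
for c = 1:nc
    score(c) = fy(c);                % assumed prior of class c
    for attrib = 1:4
        score(c) = score(c) * prob_table(attrib, valueIndex(attrib), c);
    end
end
[~, predictedClass] = max(score);    % class with the largest posterior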
References
1. Good, I. J. The Estimation of Probabilities: An Essay on Modern
Bayesian Methods. Cambridge: MIT Press, 1965.
2. Kendall, M. G. and Stuart, A. The Advanced Theory of Statistics.
London: Griffin, 1968.
Chapter 5
The k-Nearest Neighbors Classifiers
5.1 Introduction
In pattern recognition, the k-nearest neighbors algorithm (or k-NN for short) is a nonparametric method used for classification and regression [1]. In both cases, the input consists of the k closest training examples in the feature space. The output depends on whether k-NN is used for classification or regression:
1. In k-NN classification, the output is a class label, assigned by a majority vote of the k nearest neighbors.
2. In k-NN regression, the output is the average of the values of the k nearest neighbors.
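A minimal from-scratch sketch of the classification case (illustrative only; the book uses MATLAB's ClassificationKNN later in this chapter) is:

function label = knnClassifySketch(X, y, query, k)
% Classify one query point by a majority vote of its k nearest
% training examples under the Euclidean distance.
% X: N-by-m training features, y: N-by-1 numeric class labels,
% query: 1-by-m point to classify, k: number of neighbors.
d = sqrt(sum(bsxfun(@minus, X, query).^2, 2));  % distances to the query
[~, idx] = sort(d);                             % order neighbors by distance
label = mode(y(idx(1:k)));                      % majority vote
end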
5.2 Example
Suppose that we have two-dimensional data consisting of circles, squares, and diamonds, as in Figure 5.1. Each diamond is to be classified as either a circle or a square, and k-NN is a good choice for this classification task.
The k-NN method is an instance-based learning method: it stores the labeled training examples and defers the classification decision until a new example has to be classified.
Figure 5.1 Two classes of square and circle data, with unclassified diamonds.
It is clear that

f : X \to K

with f(x_j) = C_j, where X = \{x_1, \ldots, x_N\} is a subset of some space Y. Given an unclassified point x_s = (a_{s1}, a_{s2}, \ldots, a_{sm}) \in Y, we would like to find C_s \in K such that f(x_s) = C_s. At this point, we have two scenarios [2]:
clear; clc;
EcoliData = load('ecoliData.txt');    % Loading the ecoli dataset
EColiAttrib = EcoliData(:, 1:end-1);  % ecoli attributes
EColiClass = EcoliData(:, end);       % ecoli classes

% Fitting the ecoli data with the k-nearest neighbors method.
% (A commented-out variant in the original listing adds the pair
%  'DistanceWeight', 'Inverse'; that weighted version is used later
%  in this chapter.)
knnmodel = ClassificationKNN.fit(EColiAttrib(1:280,:), EColiClass(1:280),...
    'NumNeighbors', 5)

Pred = knnmodel.predict(EColiAttrib(281:end,:));
knnAccuracy = 1 - length(find(EColiClass(281:end) - Pred))/...
    length(EColiClass(281:end));
knnAccuracy = knnAccuracy * 100

knnAccuracy =

   98.2143
approximated as

f(x_s) = \frac{1}{k} \sum_{j=1}^{k} w(x_s, x_{s_j}) \cdot f(x_{s_j})
knnmodel = ClassificationKNN.fit(EColiAttrib(1:280,:), EColiClass(1:280),...
    'NumNeighbors', 5, 'DistanceWeight', 'Inverse');

knnAccuracy =

   98.2143
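The weighted vote behind the 'DistanceWeight', 'Inverse' option can also be written out directly. The following is an illustrative sketch of an inverse-distance-weighted k-NN vote, not the toolbox implementation:

function label = knnWeightedSketch(X, y, query, k)
% Inverse-distance-weighted k-NN classification of one query point.
d = sqrt(sum(bsxfun(@minus, X, query).^2, 2));  % distances to the query
[ds, idx] = sort(d);
w  = 1 ./ max(ds(1:k), eps);     % inverse-distance weights (avoid 1/0)
yk = y(idx(1:k));                % labels of the k nearest neighbors
classes = unique(yk);
votes = zeros(size(classes));
for i = 1:numel(classes)
    votes(i) = sum(w(yk == classes(i)));   % weighted vote for each class
end
[~, j] = max(votes);
label = classes(j);
end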
References
1. Hastie, T., Tibshirani, R., and Friedman, J. The Elements of
Statistical Learning: Data Mining, Inference, and Prediction,
2nd Ed., New York: Springer-Verlag, February 2009.
2. Vapnik, V. N. The Nature of Statistical Learning Theory. 2nd
Ed., New York: Springer-Verlag, 1999.
Chapter 6
Neural Networks
[Figure: a biological neuron, showing the dendrites, the cell body, the axon, and the myelin sheaths.]
6.1.1 Perceptrons
A perceptron is the simplest kind of ANN, invented in 1957 by Frank Rosenblatt. A perceptron is a neural network that consists of a single neuron.
[Figure: a perceptron. The inputs x_1, x_2, \ldots, x_m, with weights w_1, w_2, \ldots, w_m, feed a single processor that produces the output y.]
The processor computes the weighted sum of the inputs,

\sum_{j=1}^{m} w_j \cdot x_j = w_1 x_1 + \cdots + w_m x_m,

and then applies an activation function to it. Two common choices are the step activation function

f(x) = \begin{cases} 1 & \text{if } x > 0 \\ 0 & \text{if } x \le 0 \end{cases}

and the sign activation function

f(x) = \begin{cases} 1 & \text{if } x > 0 \\ -1 & \text{if } x \le 0 \end{cases}
Figure 6.3 Activation functions: (a) step activation function, (b) sign
activation function, (c) linear activation function, and (d) sigmoid
activation function.
The sigmoid activation function is given by

f(x) = \frac{1}{1 + e^{-x}}
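The four activation functions of Figure 6.3 are easy to express as MATLAB anonymous functions; the names below are illustrative, not taken from the book:

stepFn    = @(x) double(x > 0);        % step activation function
signFn    = @(x) sign(x);              % sign activation function
linearFn  = @(x) x;                    % linear activation function
sigmoidFn = @(x) 1 ./ (1 + exp(-x));   % sigmoid activation function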
[Figure: a perceptron with a bias term b. The output is y = f\left(b + \sum_{j=1}^{m} w_j \cdot x_j\right).]
w = \begin{pmatrix} w_1 \\ \vdots \\ w_m \\ b \end{pmatrix}, \qquad x_j = \begin{pmatrix} a_{j1} \\ \vdots \\ a_{jm} \\ 1 \end{pmatrix}

and generate random values for the (m + 1)-dimensional vector w. We notice that
w^T \cdot x_j = w_1 a_{j1} + \cdots + w_m a_{jm} + b

For a training example x_j with desired output y_j, the classification error is

E_j = y_j - \mathrm{sign}\left(w^T \cdot x_j\right)
The error E_j can be either 2, 0, or −2. The first and third values (the nonzero values) indicate that the feature vector x_j has been misclassified. Knowing the error helps us adjust the weights.
To readjust the weights, we define a learning rate parameter \alpha \in (0, 1). This parameter determines how fast the weights are changed and, hence, how fast the perceptron learns during the training phase. Given the learning rate \alpha, the correction to the weight vector is

\Delta w = \alpha x_j E_j

w_{new} = w_{old} + \Delta w
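A minimal MATLAB sketch of this training rule (not the book's training listing, which is not reproduced in this excerpt; the function name and loop structure are illustrative) could look like this:

function w = PerceptronTrainingSketch(TrainingSet, Class, alpha, epochs)
% Trains a perceptron with the update rule dw = alpha * x_j * E_j.
% TrainingSet is N-by-m, Class is N-by-1 with values in {-1, +1}.
[N, m] = size(TrainingSet);
X = [TrainingSet, ones(N, 1)];           % append 1 for the bias term b
w = rand(m + 1, 1);                      % random initial weights and bias
for epoch = 1:epochs
    for j = 1:N
        xj = X(j, :)';                   % (m+1)-dimensional column vector
        Ej = Class(j) - sign(w' * xj);   % error: 2, 0, or -2
        w  = w + alpha * Ej * xj;        % weight correction
    end
end
end

The book's testing routine, which applies a trained weight vector to a testing set, follows.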
function [PredictedClass, Accuracy] = PerceptronTesting(TestingSet, Class, w)
%% The outputs are a vector of predicted classes and the prediction
%% accuracy as a percentage. The Accuracy is the percentage of the ratio
%% between the correctly classified instances in the testing set and the
%% total number of instances in the testing set.
%% TestingSet is an N by m matrix and Class are the corresponding classes
%% for the feature vectors in the testing set matrix. The vector w is
%% the vector of weights obtained during the training phase.
[N, m] = size(TestingSet);
PredictedClass = zeros(N, 1);
for j = 1:N
    x = TestingSet(j,:);
    wsum = sum(w.*x);   % assumes w and x have the same length and
                        % orientation (bias column included in TestingSet)
    if wsum > 0
        PredictedClass(j) = 1;
    else
        PredictedClass(j) = -1;
    end
end
Error = Class - PredictedClass;
Accuracy = (1 - length(find(Error))/length(Error))*100;
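A quick way to exercise this function (using synthetic, linearly separable data and the illustrative training sketch from above; none of this is the book's experiment):

rng(1);
Xp = randn(50, 2) + 2;                 % class +1 cluster
Xn = randn(50, 2) - 2;                 % class -1 cluster
Data  = [Xp; Xn];
Class = [ones(50, 1); -ones(50, 1)];
order = randperm(100);                 % shuffle before splitting
Data  = Data(order, :);
Class = Class(order);
w = PerceptronTrainingSketch(Data(1:80,:), Class(1:80), 0.1, 20);
% PerceptronTesting multiplies w elementwise with each row of the testing
% set, so append the bias column and pass w as a row vector:
TestSet = [Data(81:end,:), ones(20, 1)];
[Pred, Acc] = PerceptronTesting(TestSet, Class(81:end), w')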
Figure 6.5 The XOR binary operation: True instances are plotted in
circles and false instances are plotted with squares.
[Figure: a multilayer feed-forward network with an input layer, hidden layers, and an output layer.]

In such a network, the weights between the input layer and the hidden layer, and between the hidden layer and the output layer, are adjusted during training. Two sigmoid functions are commonly used as activation functions:
1. S_1(x) = \dfrac{1}{1 + e^{-x}}
2. S_2(x) = \dfrac{1 - e^{-x}}{1 + e^{-x}}
The derivatives of these functions are

\frac{dS_1(x)}{dx} = S_1(x)\left(1 - S_1(x)\right)

and

\frac{dS_2(x)}{dx} = \frac{1}{2}\left(1 - S_2(x)^2\right)

The second sigmoid function S_2(x) is a scaled form of the hyperbolic tangent, S_2(x) = \tanh(x/2). Choosing sigmoid functions guarantees the continuity and differentiability of the error function. The graphs of the two sigmoid functions S_1(x) and S_2(x) are shown in Figure 6.7.
[Figure 6.7: Graphs of the two sigmoid functions S_1(x) = \dfrac{1}{1 + e^{-x}} and S_2(x) = \dfrac{1 - e^{-x}}{1 + e^{-x}}.]
The error function is half the sum of squared differences between the desired outputs y_j and the predicted outputs p_j:

E = \frac{1}{2} \sum_{j} \left(y_j - p_j\right)^2
\nabla E = \begin{pmatrix} \dfrac{\partial E}{\partial w_1} \\ \vdots \\ \dfrac{\partial E}{\partial w_L} \end{pmatrix}

In the gradient descent method, the update of the weight vector is proportional to the negative of the gradient. That is,

\Delta w_j = -\alpha \frac{\partial E}{\partial w_j}, \quad j = 1, \ldots, L
o_i = S_1\left(\sum_{k} w_{ik} z_k\right)
we get:

\frac{\partial E}{\partial w_{ij}} = \frac{\partial E}{\partial o_i} \cdot \frac{\partial o_i}{\partial w_{ij}} = -\left(u_i - o_i\right) o_i \left(1 - o_i\right) z_j

\Delta w_{ij} = \alpha \left(u_i - o_i\right) o_i \left(1 - o_i\right) z_j
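A compact MATLAB sketch of one such gradient step for a single layer of sigmoid output units follows; the variable names (W for the weight matrix, z for the previous-layer activations, u for the desired outputs) are illustrative, not taken from the book:

% Example sizes: 2 output units fed by 4 hidden activations.
alpha = 0.5;  z = rand(4, 1);  u = [1; 0];  W = randn(2, 4);
S1    = @(x) 1 ./ (1 + exp(-x));     % logistic sigmoid
o     = S1(W * z);                   % o_i = S1( sum_k w_ik z_k )
delta = (u - o) .* o .* (1 - o);     % (u_i - o_i) o_i (1 - o_i)
W     = W + alpha * (delta * z');    % dw_ij = alpha (u_i - o_i) o_i (1 - o_i) z_j

The chapter then turns to MATLAB's neural network toolbox, applying it to the ecoli data in the script below.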
clear; clc;
A = load('EColi1.txt');   % Loading the ecoli data, with the classes in the
                          % last column
%B = A(1:end, 2:end);
C = A(1:end, 1:end-1)';   % C is the matrix of the feature vectors
T = A(:, end)';           % T is the vector of classes
net = feedforwardnet;     % Initializing a neural network 'net'
net = configure(net, C, T);
hiddenLayerSize = 10;     % Setting the number of hidden neurons to 10 for
                          % the recognition network
net.divideParam.trainRatio = 0.7;  % Ratio of training data is 70%
net.divideParam.valRatio = 0.2;    % Ratio of validation data is 20%
net.divideParam.testRatio = 0.1;   % Ratio of testing data is 10%
[net, tr] = train(net, C, T);      % Training the network; the resulting
                                   % model is the output net
outputs = net(C);                  % applying the model to the data
errors = gsubtract(T, outputs);    % computing the classification errors
performance = perform(net, T, outputs)
view(net)

The outputs of the above script are as follows:

>> performance =

    0.7619
Figure 6.8 The output neural network model for classifying the
ecoli data.
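The performance value reported by perform is the network's error measure rather than a classification accuracy. If the classes in T are coded as integers, an accuracy can also be computed from the outputs; this is a sketch, not part of the book's script:

predicted = round(outputs);                       % nearest integer class code
accuracy  = 100 * sum(predicted == T) / numel(T)  % percentage of exact matches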