

COMPUTER SCIENCE, TECHNOLOGY AND APPLICATIONS

ADVANCED DECISION SCIENCES BASED ON DEEP LEARNING AND ENSEMBLE LEARNING ALGORITHMS:
A PRACTICAL APPROACH USING PYTHON
No part of this digital document may be reproduced, stored in a retrieval system or transmitted in any form or
by any means. The publisher has taken reasonable care in the preparation of this digital document, but makes no
expressed or implied warranty of any kind and assumes no responsibility for any errors or omissions. No
liability is assumed for incidental or consequential damages in connection with or arising out of information
contained herein. This digital document is sold with the clear understanding that the publisher is not engaged in
rendering legal, medical or any other professional services.
COMPUTER SCIENCE, TECHNOLOGY AND APPLICATIONS

Additional books and e-books in this series can be found on Nova's website under the Series tab.
COMPUTER SCIENCE, TECHNOLOGY AND APPLICATIONS

ADVANCED DECISION SCIENCES BASED ON DEEP LEARNING AND ENSEMBLE LEARNING ALGORITHMS:
A PRACTICAL APPROACH USING PYTHON

S. SUMATHI, PHD
SURESH RAJAPPA, PHD
L. ASHOK KUMAR, PHD
AND
SUREKHA PANEERSELVAM, PHD
Copyright © 2021 by Nova Science Publishers, Inc.

All rights reserved. No part of this book may be reproduced, stored in a retrieval system or transmitted
in any form or by any means: electronic, electrostatic, magnetic, tape, mechanical photocopying,
recording or otherwise without the written permission of the Publisher.

We have partnered with Copyright Clearance Center to make it easy for you to obtain permissions to
reuse content from this publication. Simply navigate to this publication’s page on Nova’s website and
locate the “Get Permission” button below the title description. This button is linked directly to the title’s
permission page on copyright.com. Alternatively, you can visit copyright.com and search by title, ISBN,
or ISSN.

For further questions about using the service on copyright.com, please contact:
Copyright Clearance Center
Phone: +1-(978) 750-8400 Fax: +1-(978) 750-4470 E-mail: [email protected].

NOTICE TO THE READER


The Publisher has taken reasonable care in the preparation of this book, but makes no expressed or
implied warranty of any kind and assumes no responsibility for any errors or omissions. No liability is
assumed for incidental or consequential damages in connection with or arising out of information
contained in this book. The Publisher shall not be liable for any special, consequential, or exemplary
damages resulting, in whole or in part, from the readers’ use of, or reliance upon, this material. Any parts
of this book based on government reports are so indicated and copyright is claimed for those parts to the
extent applicable to compilations of such works.

Independent verification should be sought for any data, advice or recommendations contained in this
book. In addition, no responsibility is assumed by the Publisher for any injury and/or damage to persons
or property arising from any methods, products, instructions, ideas or otherwise contained in this
publication.

This publication is designed to provide accurate and authoritative information with regard to the subject
matter covered herein. It is sold with the clear understanding that the Publisher is not engaged in
rendering legal or any other professional services. If legal or any other expert assistance is required, the
services of a competent person should be sought. FROM A DECLARATION OF PARTICIPANTS
JOINTLY ADOPTED BY A COMMITTEE OF THE AMERICAN BAR ASSOCIATION AND A
COMMITTEE OF PUBLISHERS.

Additional color graphics may be available in the e-book version of this book.

Library of Congress Cataloging-in-Publication Data

ISBN: (eBook)

Published by Nova Science Publishers, Inc. · New York


CONTENTS

Preface vii
Acknowledgments ix
Chapter 1 Introduction 1
Chapter 2 Deep Learning 37
Chapter 3 Convolutional Neural Networks 89
Chapter 4 Recurrent Neural Networks 145
Chapter 5 Ensemble Learning 177
Chapter 6 Implementing DL and Ensemble Learning Models:
Real World Use Cases 217
Appendix 333
Suggested Reading 343
About the Authors 349
Index 355
PREFACE

For a long time, deep learning and data science were academic disciplines with only a theoretical approach to real-world problems. For prominent applications such as computer vision, face recognition, and speech recognition, artificial neural networks and machine learning components were too narrow; indeed, neural networks were considered almost outdated for these applications. In the past seven to eight years, however, deep learning and data science have grown massively, diving into diverse application areas such as statistical modeling, speech recognition, voice-to-text conversion, natural language processing, computer vision, and many more. Name any engineering application, from gaming to medicine to physics, and deep learning is applied there. It has almost made scientists and research aspirants feel that survival is impossible without these sophisticated learning mechanisms.
ACKNOWLEDGMENTS

The authors are always thankful to the Almighty for perseverance and
achievements.
The authors owe their gratitude to Shri L. Gopalakrishnan, Managing
Trustee, PSG Institutions, and gratitude to Dr. K. Prakasan, Principal In
Charge, PSG College of Technology, Coimbatore, India, for their
wholehearted cooperation and great encouragement in this successful
endeavor.
Dr. Sumathi owes much to her daughter, S. Priyanka, who contributed a
great deal of time and assumed much responsibility in helping to complete
this book. She is grateful and proud of the strong support given by her
husband, Mr. Sai Vadivel. Dr. Sumathi would like to extend wholehearted thanks to her parents, who eased her family commitments and offered constant support. She is incredibly thankful to her brother, Mr. M. S. Karthikeyan, who has always been a "stimulator" for her progress. Finally, she wishes to thank her parents-in-law for their great moral support.
Dr. Suresh Rajappa would like to thank his wife, Mrs. Padmini
Govindarajan, for her unconditional support and time whenever needed
during the book's writing. Dr. Suresh also thanks his twin daughters Ms.
Dharshini Suresh and Ms. Varshini Suresh, for their continued
encouragement for the book. He would also like to extend his gratitude to
his present and former colleagues at KPMG, especially Mr. Vimal Kumar
Mehta, for his advice. Finally, Dr. Suresh would like to thank Mr. Srihari Govindarajan, Principal at Kloudlogic Inc., for helping him proofread the materials for this book and for keeping his sanity.
Dr. L. Ashok Kumar would like to take this opportunity to acknowledge those people who helped him in completing this book. He is thankful to all the research scholars and students who are doing their projects and research work with him. But the writing of this book was possible mainly because of the support of his family members, parents, and sisters. Most importantly, he is very grateful to his wife, Y. Uma Maheswari, for her constant support during the writing; without her, none of this would have been possible. He expresses special gratitude to his daughter, A. K. Sangamithra, for her smiling face and support, and dedicates this work to her.
Dr. Surekha P. would like to thank her parents, her husband, Mr. S. Srinivasan, and her daughter, Saisusritha, who shouldered extra responsibilities during the months spent writing this book. They did this with a long-term vision, depth of character, and a positive outlook that are truly befitting of their names. Dr. Surekha offers her humble pranams at the lotus feet of Amma, Mata Amritanandamayi.
The authors wish to thank all their friends, colleagues, and research
assistants who have been with them in all their endeavors with their
excellent, unforgettable help and assistance in successfully executing the
work.
Chapter 1

INTRODUCTION

LEARNING OUTCOMES

At the end of this chapter, the reader will be able to:

 Understand the need for deep learning and ensemble learning algorithms in data sciences;
 Appreciate the fundamentals of machine learning;
 Have knowledge of linear algebra concepts related to deep learning;
 Know the fundamental concepts of neural networks and the differences between a shallow neural network and a deep neural network;
 Identify the application areas of deep learning models.

Keywords: deep neural networks, shallow neural networks, linear algebra

1.1. INTRODUCTION

In the present-day scenario, the challenging objectives placed in front of humans require that computers by themselves can perceive, process, and make decisions in a more human-like manner. With the advancements in machine and deep learning techniques, computers have become more intuitive to the real-world randomness present in the environment. This 'intelligence' has aided them in achieving solutions to these tasks.
Machine Learning algorithms are mathematical models based on
training data used in the decision-making process without specific
instruction. Deep learning is a subset of Machine Learning that mimics the
working of the human brain in processing data for use in detecting objects,
recognizing speech, and making decisions.
Deep learning algorithms employ deep architectures of multiple layers to extract features from raw data through a hierarchical progression of learned features, navigating from the bottom layers upwards without prior knowledge of any rule. The task becomes more challenging when dealing with a vast amount of data. Thus deep learning helps significantly to avoid the feature engineering process, which is time- and resource-consuming. The edge presented by deep learning is its ability to learn without human supervision, drawing from data that is both unstructured and unlabelled. The Convolutional Neural Network (CNN) is a particular type of neural network designed to understand the intrinsic properties of images implicitly. With images as inputs, these networks are trained to perform a specific functionality to obtain the desired output. In this chapter, the need
for deep learning and ensemble learning algorithms in data sciences is
elaborated. The fundamentals of machine learning, linear algebra concepts
related to deep learning, the evolution of deep learning models, fundamental
concepts of neural networks are explained. The differences between a
shallow neural network and a deep neural network are highlighted based on
relevant concepts such as activation functions and propagation of error. An
introduction to the ensemble learning approaches is also briefed, and the
broad application areas of deep learning models are summarized.

1.2. RATIONALE

Deep learning has evolved from machine learning and artificial intelligence (AI) and imitates how human beings acquire knowledge. Deep learning is a distinctive element in the field of data science. It revolves around predictive modeling and statistics, where research involves a large amount of data. It helps a data scientist analyze and find suitable outcomes from this large amount of data, thus making the process simpler and faster. While conventional machine learning algorithms and neural network algorithms are relatively simple (often linear) models that can be used for prediction applications, deep learning algorithms are hierarchical models with a high level of complexity and abstraction that can be used in automatic prediction applications.
Andrew Ng, co-founder and chairman of Coursera, says, "Deep Learning is an amazing tool that is helping numerous groups create exciting AI applications," and "It is helping us build self-driving cars, accurate speech recognition, computers that can understand images, and much more."
Though it has been half a century since the first neural network was developed, deep learning models have become truly powerful only in recent times. The reasons for this are:

 availability of and access to a large amount of digital data
 availability of powerful, high-speed Graphics Processing Units (GPUs)

With these resources, deep learning has gained more popularity and
application in diverse disciplines throughout the technological community.

1.3. LINEAR ALGEBRA FOR DECISION SCIENCE

In decision science, linear algebra plays a prominent role since it involves studying linear sets of equations and transformations. The concepts of linear algebra arise in the analysis of rotations in space, the solution of coupled differential equations, least-squares fitting, the determination of a circle passing through three given points, and many other problems in mathematics, physics, and engineering. Without linear algebra, there is no machine learning; conversely, machine learning is deeply understood in terms of the fundamental mathematics of linear algebra.
Linear algebra serves as a prerequisite to machine learning since it deals with the mathematics of data in terms of matrices and vectors. It is used to understand operations on lines, planes, and vector spaces, and the mappings that lead to the linear transforms among linear equations.
Let us consider the image given in Figure 1. The moment we see the image, our brain identifies it as an apple: our brain has been trained, through millions of years of evolution, to identify the item in the image automatically. If we expect a computer to think and recognize the image like a human, the task becomes complicated. To enable this, the image must be stored in some form in the computer, and attributes must be defined through which the computer can identify the item in the image correctly. The information about the image, its pixel intensities, is stored as 0s and 1s in a well-organized structure called a MATRIX. The operations concerning the image attributes then revolve around this matrix.

Figure 1. Apple – An illustration.
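To make the matrix representation concrete, here is a minimal sketch (not from the book; the pixel values are invented for illustration) of a tiny grayscale image stored as a NumPy matrix:

```python
import numpy as np

# A tiny 4x4 grayscale "image": each entry is a pixel intensity
# (0 = black, 255 = white). The values are purely illustrative.
image = np.array([[  0,  64, 128, 255],
                  [ 64, 128, 255, 128],
                  [128, 255, 128,  64],
                  [255, 128,  64,   0]], dtype=np.uint8)

print(image.shape)   # (4, 4): height x width
print(image[0, 3])   # 255: the top-right pixel is white
```

A real photograph is stored the same way, only with many more rows and columns (and one such matrix per color channel).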

Even if we consider the simplest form of a neuron, the inputs for an image or text are stored in the form of a matrix, and the weights and biases for a particular network are stored in the form of matrices. So any operation on these matrices involves the application of linear algebra concepts. Once a given problem is represented in matrix form, procedures such as addition, scalar multiplication, matrix multiplication, transpose, adjoint, and the inverse of a matrix become more accessible.
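As an illustrative sketch (not from the book), these basic matrix operations look like this in NumPy:

```python
import numpy as np

# Two small matrices to illustrate the basic operations
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

S = A + B                 # matrix addition
C = 2 * A                 # scalar multiplication
P = A @ B                 # matrix multiplication
T = A.T                   # transpose
Ainv = np.linalg.inv(A)   # inverse (A must be non-singular)

# A matrix multiplied by its inverse gives the identity matrix
print(np.allclose(A @ Ainv, np.eye(2)))  # True
```

These few operations are the building blocks of every forward pass and weight update discussed later in the book.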
Another essential aspect of any data science paradigm is the concept of eigenvalues and eigenvectors. It is applied mainly when we handle a large amount of data in machine learning and data science. Let us consider a 3 x 3 square matrix A for which we need to compute the eigenvalues and eigenvectors.

        [  0   1   0 ]
    A = [  3   0   2 ]
        [-12  -7  -6 ]

Let us see the Python code for computing the Eigenvalues and
Eigenvectors for the above example:

import numpy as np

A = np.array([[0, 1, 0], [3, 0, 2], [-12, -7, -6]])
print('The given square matrix is\n', A)

# finding eigenvalues and eigenvectors for the given square matrix
EigVal, EigVect = np.linalg.eig(A)

# printing Eigenvalues
print('The Eigenvalues for the given square matrix:\n', EigVal)

# printing Eigenvectors
print('The Eigen Vector for the given square matrix:\n', EigVect)

The output of the above example:

The given square matrix is
[[  0   1   0]
 [  3   0   2]
 [-12  -7  -6]]

The Eigenvalues for the given square matrix:
[-1. -2. -3.]

The Eigen Vector for the given square matrix:
[[-0.57735027 -0.43643578  0.22941573]
 [ 0.57735027  0.87287156 -0.6882472 ]
 [ 0.57735027 -0.21821789  0.6882472 ]]

It should also be recalled that the eigenvalues of a symmetric matrix are always real, and the eigenvectors of a symmetric matrix (belonging to distinct eigenvalues) are always orthogonal. The following example generates a random matrix of order n; note, however, that a random integer matrix is generally not symmetric, so the property does not strictly apply to it:

import numpy as np

n = 5  # order of the random matrix
A = np.random.randint(0, 5, (n, n))
print(A)

A:
[[0 4 0 4 3]
[3 0 4 2 1]
[4 1 0 0 2]
[3 4 1 2 1]
[3 1 2 2 1]]
# finding eigenvalues and eigenvectors for the given square matrix
EigVal, EigVect = np.linalg.eig(A)

# printing Eigenvalues
print('The Eigenvalues for the given square matrix:\n', EigVal)

The Eigenvalues for the given square matrix:

[ 9.86402919+0.j         -3.03650521+1.50706405j -3.03650521-1.50706405j
 -0.39550939+1.12030595j -0.39550939-1.12030595j]
# To check the orthogonality of eigenvectors, we need to show that the
# dot product of two eigenvectors is zero

# eigenvector 1 - read the first column
EigVectCol1 = EigVect[:,0]
print('Column 1 of the computed Eigen Vector:\n', EigVectCol1)

# eigenvector 2 - read the fifth column (index 4)
EigVectCol4 = EigVect[:,4]
print('Column 4 of the computed Eigen Vector:\n', EigVectCol4)

# evaluating the dot product
dp = EigVectCol1 @ EigVectCol4
print('Dot product of two Eigenvectors:\n', dp)

Column 1 of the computed Eigen Vector:
[-0.51132958+0.j -0.43757932+0.j -0.33563115+0.j -0.51294648+0.j
 -0.41388891+0.j]

Column 4 of the computed Eigen Vector:
[ 0.28444625+0.07105779j -0.36741701-0.28196317j -0.47120519+0.01240114j
  0.57298415-0.j          -0.28505442+0.26036061j]

Dot product of two Eigenvectors:
(-0.002449896777988918-0.024875274571690694j)
The dot product of the two eigenvectors is close to zero. Note, however, that the randomly generated matrix above is not symmetric in general, so its eigenvalues may be complex (as the output shows) and its eigenvectors need not be exactly orthogonal; the real-eigenvalue and orthogonality properties are guaranteed only for symmetric matrices, which can be obtained by symmetrizing, e.g., replacing A with A + A.T.
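As a corrective sketch (not from the book), the symmetric-matrix property can be verified directly by symmetrizing the random matrix first; `np.linalg.eigh`, NumPy's eigensolver specialized for symmetric matrices, then returns real eigenvalues and an orthogonal eigenvector matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
M = rng.integers(0, 5, (5, 5))
A = M + M.T                      # A is symmetric by construction

# eigh is the eigensolver designed for symmetric (Hermitian) matrices
eigval, eigvect = np.linalg.eigh(A)

print(np.isrealobj(eigval))                         # True: eigenvalues are real
print(np.allclose(eigvect.T @ eigvect, np.eye(5)))  # True: orthogonal columns
```

Using `eigh` rather than `eig` on a symmetric matrix is both faster and numerically cleaner, since the orthogonality of the eigenvectors is enforced by the algorithm itself.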

1.3.1. Eigenvectors in Data Science

In machine learning and data science, much of the data manipulation rests on the concept of eigenvalues and eigenvectors. For example, the Principal Component Analysis (PCA) algorithm, one of the best feature extraction algorithms in machine learning, works on the concept of eigenvectors. In PCA, when a large amount of data is involved, there is a need for dimensionality reduction, since the data may contain redundant features. In addition, a large number of features leads to poor algorithmic efficiency and occupies ample memory. So, to identify the predominant characteristics, eigenvectors are needed. The flowchart in Figure 2 illustrates the reduction of dimensions in PCA using eigenvectors.

Normalize the data based on its mean →
Evaluate the covariance matrix →
Evaluate the eigenvectors of the covariance matrix →
Orient the data based on the eigenvectors

Figure 2. Reduction of dimension in PCA.

The data is first normalized: the mean of each column is computed and subtracted from every entry in that column, and to bring the data into a specific range, each column is also divided by its standard deviation. Then the covariance matrix is evaluated by multiplying the centered data matrix by its transpose. Covariance gives us the measure by which the data dimensions vary with respect to each other.
The eigenvalues and eigenvectors of the covariance matrix are then computed. The eigenvectors define new axes through the scattered data points, and each eigenvalue measures the spread of the data along its eigenvector. One column of the eigenvector matrix is orthogonal to the others, as discussed in the previous section. If the original data is multiplied by the eigenvectors, the result is a new set of principal axes that gives us the main components. The data is now oriented along these new axes, which are based on the principal components.
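The flowchart steps above can be sketched in NumPy (an illustrative sketch with synthetic data, not code from the book):

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))          # 100 samples, 3 features

# 1. Normalize: subtract each column's mean and divide by its std. dev.
Xc = (X - X.mean(axis=0)) / X.std(axis=0)

# 2. Covariance matrix of the centered data
C = Xc.T @ Xc / (len(Xc) - 1)

# 3. Eigen-decomposition of the covariance matrix
eigval, eigvect = np.linalg.eigh(C)    # eigh: C is symmetric
order = np.argsort(eigval)[::-1]       # sort axes by decreasing variance
eigvect = eigvect[:, order]

# 4. Orient the data along the top-2 principal axes
X_reduced = Xc @ eigvect[:, :2]
print(X_reduced.shape)  # (100, 2)
```

Keeping only the eigenvectors with the largest eigenvalues reduces the dimensionality while retaining the directions along which the data varies most.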

1.4. FUNDAMENTALS OF MACHINE LEARNING

Machine learning makes it possible to forecast outputs more accurately. It builds algorithms that use the statistics of the input data to predict the result. All social media websites use machine learning to decide what to display in the feed. A simple illustration is given in Figure 3.
The procedure of machine learning involves searching through data to look for patterns and programming accordingly. Some machine learning examples include suggested advertisements and fraud detection.
Some of the machine learning methods are:

 The supervised machine learning algorithm
Supervised learning deals with known, categorized data so that it can then classify uncategorized data. In supervised learning, a sample set contains input data paired with the desired output data. Based on this, new test data can easily be categorized.
For example, consider an application that identifies an animal as a herbivore, a carnivore, or an omnivore. Using supervised learning, the system already knows the classifications of known animals; whenever a new animal is entered into the system as input, the system automatically predicts its category.

Figure 3. A machine learning scenario (the model's predictions are compared against the ground truth, or "answer key").

 The unsupervised machine learning algorithm
In unsupervised learning, the sample set data is unknown and unlabeled. The data cannot be used directly, as we are unaware of the outputs. The algorithm simply works by finding similarities between the data points and grouping similar ones together.
For example, customers can be categorized based on which products they purchase: customers who buy similar products are placed in the same category.
 The semi-supervised machine learning algorithm
Semi-supervised learning is a mixture of supervised and unsupervised learning, as its data set contains both categorized and uncategorized data. The aim is to predict new data more effectively and accurately than from the labeled data alone.
For example, suppose you wish to buy a product and are watching ads related to it, and then you look up a review of the same product from a different company. Here the categorized data is the ads the user was already watching, and based on these, new ads are displayed.
 Reinforcement machine learning algorithm
There is no classified data; the machine learns from its own experience and predictions. Instead, an agent (for example, a robotic avatar) takes actions, explores all the possible scenarios, and settles on the best one. Examples include games like Hangman.

Traditional programming has become obsolete; developers now use many advanced styles of programming, and machine learning is one of them. Machine learning is all about implementing algorithms: inputs and outputs are fed into the algorithm to create a program. This program decreases code complexity and predicts accurate outcomes.
For example, in an online product purchasing application, whenever a client places an order, we first check whether the client will pay on time, based on the client's past transactions. The input is the client's personal and purchase details, and the output is whether the client pays on time or late. Machine learning predicts the output based on the client's historical information.
Deep learning is a branch of artificial intelligence that uses the working pattern of the human brain to process and manipulate data. Deep learning uses neural networks to predict the output patterns.
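As a toy sketch of the supervised animal-classification example above (not from the book; the two "features" are invented for illustration), a nearest-centroid classifier learns from labeled examples and predicts the category of a new animal:

```python
import numpy as np

# Hypothetical features per animal: [teeth sharpness, plant fraction of diet]
X_train = np.array([[0.9, 0.1], [0.8, 0.2],   # carnivores
                    [0.1, 0.9], [0.2, 0.8],   # herbivores
                    [0.5, 0.5], [0.6, 0.4]])  # omnivores
y_train = np.array([0, 0, 1, 1, 2, 2])
labels = ['carnivore', 'herbivore', 'omnivore']

# "Training": compute the mean feature vector (centroid) of each class
centroids = np.array([X_train[y_train == c].mean(axis=0) for c in range(3)])

def predict(x):
    # assign the new animal to the class with the nearest centroid
    return labels[np.argmin(np.linalg.norm(centroids - x, axis=1))]

print(predict(np.array([0.85, 0.15])))  # carnivore
```

The essential supervised-learning pattern is visible even in this tiny sketch: fit a model to labeled examples, then apply it to unseen inputs.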

1.4.1. How Deep Learning Works

To understand how deep learning works, let's take the example of a shortest-route calculation application. Whenever we are in a rush and want to reach a destination as soon as possible, we search for the shortest route.
The system inputs required from the user are:

 Starting point
 Destination

As mentioned earlier, deep learning works on a neural network; it has nodes like neurons, and all the nodes are interlinked with each other. The network has three kinds of layers, namely the input layer, the hidden layer, and the output layer, as shown in Figure 4.

Source: Sumathi, Sai, and Surekha Paneerselvam. Computational Intelligence Paradigms: Theory & Applications Using MATLAB. CRC Press, 2010.

Figure 4. Interconnection between layers in a deep network.

 Input Layer: This layer consists of the records entered by the user, i.e., the starting point and destination.
 Hidden Layer: This layer contains all the calculation and implementation, i.e., with respect to the user's starting point and destination it calculates the shortest path, covering minimum time and kilometers. There can be one or multiple hidden layers; deep learning applies here, with more than one hidden layer.
 Output Layer: This layer consists of the final output for the user, i.e., it displays the shortest route.
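A minimal sketch (not from the book; the weights are random placeholders rather than a trained route model) of how data flows through the three layers:

```python
import numpy as np

rng = np.random.default_rng(1)

x = np.array([0.3, 0.7])         # input layer: two user-supplied values

W1 = rng.normal(size=(2, 4))     # weights: input -> hidden (4 hidden neurons)
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))     # weights: hidden -> output (one score)
b2 = np.zeros(1)

h = np.tanh(x @ W1 + b1)         # hidden layer: weighted sum + activation
y = h @ W2 + b2                  # output layer: final prediction

print(y.shape)  # (1,)
```

A "deep" network simply stacks several such hidden layers between the input and output, with training adjusting the weight matrices.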

1.4.2. How Are Artificial Intelligence, Deep Learning, and Machine Learning Interconnected?

Artificial intelligence, deep learning, and machine learning are similar in that they all work on big data and rely on modern programming approaches to predict outputs. The interrelation between the three is that deep learning is a subset of machine learning, and machine learning is a subset of artificial intelligence.

Figure 5 (a timeline, 1950s–2010s) depicts deep learning (DL) nested within machine learning (ML), which is in turn nested within artificial intelligence (AI). Early AI achieved breakthroughs in rule-based systems, heuristic search, and formal knowledge representation and inference, but hit scalability limits because manually encoded knowledge is fragile, succeeding only in restricted domains and smaller problems. ML's statistical algorithms enable machines to learn patterns and anomalies from data and then make predictions, classifications, and decisions, but successful applications were limited by the need for huge amounts of manually labeled training data and for data "features" engineered by human experts. DL's multi-layered "deep" neural architectures pre-learn data features without human engineering, enabling learning at many abstraction levels with accuracy and error rates comparable to humans for image recognition and language understanding, though the complex "black box" mechanics of deep learning make it hard to explain how and why decisions are made; it is pushing the frontier of AI research toward higher-level cognitive activities such as reasoning, planning, feeling, consciousness, and ethics.

Figure 5. AI vs ML vs DL.

So, the picture says it all (Figure 5): AI is the broadest field, and it first exploded in the late 1950s, a drastic change in the data science industry. Then, in the late 1980s, machine learning was introduced, which enhanced the techniques used by artificial intelligence. Lastly, deep learning was introduced, bringing artificial intelligence and machine learning to the next level.
1.5. HISTORY OF DEEP LEARNING

Alexey Grigoryevich Ivakhnenko and Valentin Grigorʹevich Lapa were among the earliest people to implement deep learning concepts, in 1965. They implemented activation functions that included polynomials, and the results were analyzed using statistical methods. However, passing data from one layer to the next was slow because of the computation time involved. During the 1970s, Kunihiko Fukushima was the first to apply convolutional neural networks with multiple pooling and convolutional layers. In 1979, Fukushima developed the Neocognitron, a neural network with a hierarchical multilayer structure. This network was capable of recognizing visual patterns. It resembled modern versions but was trained with a reinforcement strategy of recurring activation across multiple layers, which gained strength over time.
The concepts behind the Neocognitron were subsequently applied to multifarious tasks. The use of top-down connections and new learning techniques allowed an assortment of neural networks to be devised. When more than one pattern is presented simultaneously, the Selective Attention Model can separate and recognize individual patterns by shifting its attention from one to the next. A modern Neocognitron can not only recognize patterns with missing data but can also complete the picture by filling in the missing data. This process is known as inference.
Deep learning models that learn from errors in training evolved from the theory proposed by Seppo Linnainmaa in 1970. This backpropagation of errors was later applied to neural networks in 1985, when Rumelhart, Williams, and Hinton demonstrated backpropagation in a neural network with distributed representations. Conceptually, this discovery revived the question within cognitive psychology of whether human cognition relies on symbolic logic (computationalism) or distributed representations (connectionism). Finally, in 1989, Yann LeCun gave the first practical demonstration of backpropagation at Bell Labs. He combined Convolutional Neural Networks (CNN) with backpropagation to classify handwritten digits.
Introduction 15

The remarkable transformation of deep learning occurred in 1999, when computers began getting faster at processing data and Graphics Processing Units (GPUs) were developed. With GPUs processing images, data processing rates were found to increase faster than predicted. During this time, support vector machines evolved into serious competitors to neural networks. Although a neural network could be slow compared with a support vector machine, neural networks offered better outcomes using the same data. In addition, neural networks showed significant improvement whenever the training data was increased.
During 2000, it was observed that, in networks with gradient-based learning methods, the features in the lower layers of a network were not learned by the upper layers. The cause of this limitation was the behavior of certain activation functions. The solutions to such problems were layer-by-layer pre-training and the development of long short-term memory.
In 2001, the Gartner group described data as three-dimensional: the increasing volume of data, the increasing speed of data, and the increasing range of data sources and types. This characterization of data led to the concepts of Big Data analytics.
Stanford professor Fei-Fei Li proposed and launched ImageNet in 2009, a free database of more than 14 million labeled images. These images were used for training neural networks. She said, "Our vision was that Big Data would change the way machine learning works. Data drives learning." By the year 2011, the speed of GPUs had increased significantly, enabling CNNs to be trained faster and without layer-by-layer pre-training. AlexNet was one of the CNNs that evolved in this period; its architecture used the Rectified Linear Unit (ReLU) activation, enhancing computation speed. In addition, in 2012, Google Brain launched 'The Cat Experiment,' which explored the difficulties of unsupervised learning algorithms and the prospect of training deep learning models, which are largely supervised, with unsupervised methods.

Table 1. Evolution of Deep Learning

Year Model
1943 The first mathematical model of a neural network – McCulloch-Pitts Neural Network
1950 The prediction of machine learning – Turing
1952 First machine learning programs - Arthur Samuel
1957 Setting the foundation for deep neural networks - Frank Rosenblatt
1960 Control theory – Backpropagation of errors - Henry J. Kelley
1965 The earliest working model of deep learning networks - Alexey
Ivakhnenko and V.G. Lapa
1979-80 An ANN learns how to recognize visual patterns - Kunihiko
Fukushima
1982 The creation of the Hopfield Networks - John Hopfield
1985 A program learns to pronounce English words - Terry Sejnowski
1986 Improvements in shape recognition and word prediction - David
Rumelhart, Geoffrey Hinton, and Ronald J. Williams
1989 Machines read handwritten digits - Yann LeCun
1989 Q-learning - Christopher Watkins
1993 A ‘very deep learning’ task is solved - Jürgen Schmidhuber
1995 Support vector machines - Corinna Cortes and Vladimir Vapnik
1997 Long short-term memory was proposed - Jürgen Schmidhuber and
Sepp Hochreiter
1998 Gradient-based learning - Yann LeCun
2009 Launch of ImageNet - Fei-Fei Li
2011 Creation of AlexNet - Alex Krizhevsky
2012 The Cat Experiment
2014 DeepFace
2014 Generative Adversarial Networks (GAN) - Ian Goodfellow
2016 Powerful machine learning products

In 2014, Facebook's deep learning system known as DeepFace was developed and released, recognizing human faces with 97.35% accuracy. During the same year, Ian Goodfellow and his team introduced Generative Adversarial Networks (GAN). He described GANs as one of the most exciting ideas of the decade in machine learning. GANs enable models to tackle unsupervised learning, which is more or less the end goal in the artificial intelligence community. A GAN uses two competing networks: the first takes in data and attempts to create indistinguishable samples, while the second receives both the real data and the created samples and determines whether each data point is genuine or generated.

Cray Inc., in 2016, offered powerful machine learning and deep learning products and solutions based on Microsoft's neural-network software on its XC50 supercomputers with 1,000 Nvidia Tesla P100 graphics processing units. These systems proved able to perform deep learning tasks on data in a fraction of the time they used to take – hours instead of days.
Today, DL is present among us in ways we may not even have imagined.
For instance, Google’s voice and image recognition, Netflix and Amazon’s
recommendation engines, Apple’s Siri, automatic email and text replies,
chatbots, and many more. Table 1 summarizes the evolution of DL over the
past 60 years.

1.6. FUNDAMENTALS OF NEURAL NETWORKS

Artificial Neural Networks (ANN) are models that replicate the neural
structure of the brain. This class of machine learning models has been
biologically inspired and used in several computation applications. They
have gained popularity due to their capability in terms of less sensitivity to
noise, low-cost implementation, and ability to provide satisfactory results in
several real-world problems. ANNs have evolved to approximate the complex, versatile, and robust structure of the human brain. The fundamental
processing element of an ANN is a neuron, which is configured to receive
inputs, process them, perform a non-linear operation on them, and obtain the
final result. This process in the artificial neuron is analogous to the neuron
in the human brain. From an application perspective, inputs are given to the
network comprised of a set of neurons. Every input is, in turn, multiplied by
a connection weight. Finally, the summation of the resulting products is fed
through a transfer function (activation function) to produce an output.
A simple neural network consists of input, output, and hidden layers, as
shown in Figure 6. The input layer is a set of neurons that receives inputs
from the outside world, while the neurons perform computation and send
information to the outside world from the output layer. A set of neurons
between the input layer and the output layer form the hidden layer. Some
networks are feed-forward where the input is presented at the input layer,
and the output is obtained from the output layer after processing. There are
feedback architectures where the output information is again sent back to the
hidden layer for further processing. In some networks, the neurons of the output layer compete with one another to select the best result. Once the neural
network architecture is framed, the model is ready for the training or learning
process. The initial weights are chosen randomly to train a neural network.
Training or learning is classified under two broad categories - unsupervised
learning and supervised learning. In unsupervised learning, the network has
to make sense of the inputs without help from the outside environment. The
weights of the neurons are modified during the training process. This does
not happen at once, but it occurs over several iterations determined by
learning rules. Supervised learning involves providing the network with the
desired output by manually “grading” the network’s performance or
delivering the desired outputs with the inputs.

Figure 6. A Simple Neural Network.

The ANNs learn or train themselves at a rate called the learning rate,
which controls the learning pace. With a slow learning rate, the computation
time for the learning process to happen is more prolonged. In contrast, with
faster learning rates, the network may not be able to make the fine
discriminations possible with a system that learns more slowly. Hence a
tradeoff is required in deciding the choice of the learning rate. Learning is
undoubtedly more complex than the simplifications represented by the
learning laws. Some of the learning laws are Hopfield Law, Hebb’s Rule,
Delta rule, Extended Delta rule, Competitive learning rule, Correlation
learning rule, Boltzmann Learning Law, Outstar learning rule, and Memory-
Based Learning Law.
Most of the neural network architectures have been modeled based on
the problems to which they are applied. Most of the applications of ANN
fall into four different categories, namely, (i) Prediction, (ii) Classification,
(iii) Data Association, and (iv) Data Conceptualization. ANNs such as
Perceptron, Back Propagation, Delta Bar Delta, Extended Delta Bar Delta,
Directed Random search, and Higher Order Neural Networks belong to the
category of prediction networks. Learning Vector Quantization, Counter-
propagation network, and Probabilistic Neural Networks serve as excellent
classification algorithms. The Data association networks are Hopfield
network, Boltzmann Machine, Hamming Network, and Bi-directional
Associative Memory. Adaptive Resonance Network and Self Organizing
Map belong to the data conceptualization group of networks. A detailed
description of these networks, including the architecture, training, and
testing algorithm, can be found in our previous book on “Computational
Intelligence Paradigms: Theory & Applications using MATLAB,” published by
CRC Press in 2010.

1.6.1. Advantages

Some of the benefits of ANN include:

• requires less formal statistical training
• ability to implicitly detect complex non-linear relationships between dependent and independent variables
• ability to detect all possible interactions between predictor variables
• availability of multiple training algorithms

1.6.2. Disadvantages

Disadvantages of ANN include

• “black box” nature of ANN
• greater computational burden
• proneness to overfitting
• practical nature of model development

1.6.3. Applications

• Image processing
• Signal processing
• Weather prediction and forecasting
• Pattern recognition
• Classification, etc.

1.7. SHALLOW NEURAL NETWORKS

When we hear the term “Neural Networks,” we generally assume that there are many hidden layers, but there is also a type of neural network with only one or two hidden layers, called a Shallow Neural Network. Understanding a shallow neural network gives an understanding of what goes on inside a deep neural network. In this section, a mathematical representation of the shallow neural network is discussed. Figure 7 shows a typical shallow neural network with one input, one output, and one hidden layer.

Figure 7. A typical shallow network.

Here A1 and A2 are the two inputs; Z1, Z2, Z3, Z4, and Z5 form the hidden layer; and Y1 is the output layer. The fundamental component of a neural network is referred to as a neuron. Given an input, these neurons produce an output and pass that output as an input to the successive layer. A neuron can be considered as a combination of two parts:

• The part that computes the output Z, using the inputs and the weights.
• The part that performs the activation on Z to produce the output A of the neuron.

z = w^T x + b (1)

a = σ(z) (2)

In the above diagram, the hidden layer comprises five neurons, each performing the above two calculations. The five neurons present within the hidden layer of the above shallow neural network compute the following:

z_1^[1] = w_1^[1]T x + b_1^[1]
z_2^[1] = w_2^[1]T x + b_2^[1]
z_3^[1] = w_3^[1]T x + b_3^[1] (3)
z_4^[1] = w_4^[1]T x + b_4^[1]
z_5^[1] = w_5^[1]T x + b_5^[1]

a_1^[1] = σ(z_1^[1])
a_2^[1] = σ(z_2^[1])
a_3^[1] = σ(z_3^[1]) (4)
a_4^[1] = σ(z_4^[1])
a_5^[1] = σ(z_5^[1])

where x is the input vector consisting of 2 features, w_j^[i] denotes the weights associated with neuron j in layer i, b_j^[i] denotes the bias associated with neuron j in layer i, z_j^[i] denotes the intermediate output of neuron j in layer i, and a_j^[i] is the final output of neuron j in layer i. σ is the sigmoid activation function, defined as

σ(x) = 1 / (1 + e^(−x)) (5)

As we can see, the five equations each for z and a (Equations (3) and (4)) seem excessive, so we vectorize them as below (for the first hidden layer):

z^[1] = W^[1] x + b^[1] (6)

A^[1] = σ(z^[1]) (7)



The intermediate outputs z are computed in a single matrix multiplication, and the activations A are likewise computed in a single matrix operation. A neural network is usually built using numerous hidden layers. Now that the computations that occur in a specific layer are defined, we can see how the whole neural network computes the output for a given input x. These are also referred to as the “forward-propagation” equations. The output layer can be computed as

z^[2] = W^[2] A^[1] + b^[2] (8)

A^[2] = σ(z^[2]) = ŷ (9)

where ŷ is the predicted output.
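A minimal NumPy sketch of the forward-propagation equations (6) to (9) for the network of Figure 7 (two inputs, five hidden neurons, one output) is given below; the weight values are random placeholders rather than trained parameters.

```python
import numpy as np

def sigmoid(z):
    # Equation (5): sigma(x) = 1 / (1 + e^(-x))
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Network of Figure 7: 2 inputs, 5 hidden neurons, 1 output neuron.
W1 = rng.standard_normal((5, 2))   # hidden-layer weights, one row per neuron
b1 = np.zeros((5, 1))              # hidden-layer biases
W2 = rng.standard_normal((1, 5))   # output-layer weights
b2 = np.zeros((1, 1))              # output-layer bias

x = np.array([[0.5], [-1.2]])      # one input sample with 2 features

# Equations (6)-(7): hidden layer
z1 = W1 @ x + b1
a1 = sigmoid(z1)

# Equations (8)-(9): output layer; y_hat is the predicted output
z2 = W2 @ a1 + b2
y_hat = sigmoid(z2)
```

Because the sigmoid squashes its input, y_hat always lies strictly between 0 and 1, which is why it can be read as a probability.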

1.7.1. Activation Functions

Simply put, a neural network is a set of mathematical equations and weights. Activation functions can be leveraged to make neural networks resilient and to make them perform well in different scenarios. These “activation functions” introduce non-linear properties into the network. The following section explains why activation functions are pivotal for any neural network, using the shallow neural network as an example. As shown before, the shallow neural network without an activation function can be represented as

z^[1] = W^[1] x + b^[1] (10)

ŷ = z^[2] = W^[2] z^[1] + b^[2] (11)

Incorporating equation (10) into equation (11), we get

ŷ = z^[2] = W^[2] (W^[1] x + b^[1]) + b^[2] (12)

ŷ = (W^[2] W^[1]) x + (W^[2] b^[1] + b^[2]) (13)

or

ŷ = z^[2] = W_new x + b_new (14)

The output becomes a linear combination of a new weight matrix W_new, the input x, and a new bias b_new, meaning that the neurons and the weights present in the hidden layer have no influence. Consequently, to introduce non-linearity into the network, activation functions are introduced. Many activation functions can be used, including Sigmoid, Tanh, ReLU, and Leaky ReLU, to name a few. It is not a requirement to use the same activation function for all layers: one activation function can be selected for a specific layer and a different one for another layer, and so on.
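The collapse expressed by equation (14) is easy to check numerically: with the activation function removed, two stacked layers compute exactly one linear map. A small NumPy check with arbitrary random weights:

```python
import numpy as np

rng = np.random.default_rng(1)
W1, b1 = rng.standard_normal((5, 2)), rng.standard_normal((5, 1))
W2, b2 = rng.standard_normal((1, 5)), rng.standard_normal((1, 1))
x = rng.standard_normal((2, 1))

# Two "layers" with no activation function, as in equations (10)-(12):
y_two_layers = W2 @ (W1 @ x + b1) + b2

# Equations (13)-(14): the equivalent single linear layer.
W_new = W2 @ W1
b_new = W2 @ b1 + b2
y_one_layer = W_new @ x + b_new

assert np.allclose(y_two_layers, y_one_layer)
```

The assertion passes for any choice of weights, confirming that depth buys nothing without a non-linear activation between the layers.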

1.7.2. Weight Initialization

As discussed earlier, the weight matrix W of a neural network is randomly initialized. One might ask why W cannot be initialized with 0's or any other specific value. This can be explained with the help of our shallow neural network. Suppose W1 and W2, the weight matrices of layer one and layer two respectively, were initialized with 0 or any other identical value. If the weight matrices are identical, the activations of the neurons in the hidden layer would be the same, and the derivatives of those activations would be identical as well. Consequently, the neurons in the hidden layer would modify the weights identically, i.e., there would be no relevance in having multiple neurons in that hidden layer. But the goal is to have each neuron in the hidden layer be unique, have different weights, and work as a unique function. Accordingly, the weights should be randomly initialized.

The best practice for initialization is Xavier's initialization, which can be defined as

w^[l] ~ N(μ = 0, σ² = 1/n^[l−1]) (15)

The weight matrix w of a given layer l is drawn randomly from a normal distribution whose mean μ is set to zero and whose variance σ² is the reciprocal of the number of neurons in layer l−1.

b^[l] = 0 (16)

The bias for all the layers is initialized to zero.
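Equations (15) and (16) can be sketched in NumPy as follows; the helper name and the layer-size list are illustrative choices, not from the text.

```python
import numpy as np

def initialize_parameters(layer_sizes, seed=0):
    """Xavier initialization per equations (15)-(16):
    weights ~ N(0, 1/n^[l-1]), biases = 0."""
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_sizes)):
        n_prev, n_curr = layer_sizes[l - 1], layer_sizes[l]
        # Standard deviation is sqrt of the variance 1/n^[l-1]
        params[f"W{l}"] = rng.normal(0.0, np.sqrt(1.0 / n_prev),
                                     size=(n_curr, n_prev))
        params[f"b{l}"] = np.zeros((n_curr, 1))
    return params

# The shallow network of Figure 7: 2 inputs, 5 hidden neurons, 1 output.
params = initialize_parameters([2, 5, 1])
```

Scaling the variance by 1/n^[l−1] keeps the magnitude of each neuron's weighted sum roughly constant across layers, which helps gradients neither vanish nor explode early in training.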

1.7.3. Forward and Backward Propagation

As discussed in the previous section, the weights of a neural network are randomly initialized. For the neural network to predict accurately, the weights need to be updated. The technique by which we update these weights is called Gradient Descent, which will be dealt with in detail in later chapters. This can be shown in a computational graph:

The forward propagation (arrows from left to right and top to bottom) is used to calculate the output for a given input x. On the other hand, the backward propagation (arrows from right to left and bottom to top) is used to update the weights w^[1] and w^[2] and the respective biases b^[1] and b^[2]. During backpropagation, the derivatives of the loss function with respect to the inputs of each step are computed; the loss function is shown below.

L(ŷ, y) = −[y log ŷ + (1 − y) log(1 − ŷ)] (17)
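To make equation (17) and the gradient-descent update concrete, the sketch below trains a single sigmoid neuron (the logistic-regression special case mentioned later in Section 1.8.1) on a toy OR dataset. The dataset, learning rate, and iteration count are illustrative choices; the shortcut dL/dz = a − y holds specifically for a sigmoid output paired with this loss.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy OR dataset: columns are the m = 4 samples, rows are the 2 features.
X = np.array([[0.0, 1.0, 0.0, 1.0],
              [0.0, 0.0, 1.0, 1.0]])
y = np.array([[0, 1, 1, 1]])
m = X.shape[1]

w = np.zeros((2, 1))   # a single neuron, so zero initialization is harmless
b = 0.0
lr = 0.5               # learning rate (an arbitrary illustrative value)

for _ in range(2000):
    a = sigmoid(w.T @ X + b)                                   # forward pass
    loss = -np.mean(y * np.log(a) + (1 - y) * np.log(1 - a))   # equation (17)
    dz = a - y                 # dL/dz for sigmoid + cross-entropy
    dw = (X @ dz.T) / m        # gradient w.r.t. the weights
    db = float(np.mean(dz))    # gradient w.r.t. the bias
    w -= lr * dw               # gradient-descent updates
    b -= lr * db

predictions = (sigmoid(w.T @ X + b) > 0.5).astype(int)
```

After training, thresholding the neuron's output at 0.5 reproduces the OR labels, and the loss has dropped far below its initial value of log 2.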

1.8. DEEP NEURAL NETWORKS

In this section, a brief overview is presented in terms of network depth, forward and backward propagation, and deep representations. A detailed discussion of Deep Neural Networks is presented in Chapter 2.

1.8.1. Deep L-layer Neural Network

In this section, a simple representation of a neural network and its deeper counterparts is presented. The Perceptron is one of the simplest forms of neural network, with a simple step activation function. Logistic regression is another simple form of neural network, which uses the sigmoid activation function. These two shallow models are limited to the classification of linearly separable problems.
Later, one hidden layer was introduced into the simplest form of neural network, known as the two-layer model. Though no longer limited to linearly separable problems, it could not handle complex data sets. Further hidden layers were then introduced to test the network's capability on complex data. As the complexity of the data increased, the number of hidden layers increased, which led to the L-layer Deep Neural Network. Table 2 shows the simple models with an increasing number of hidden layers.

Table 2. Architectural models of ANNs

No Hidden Layer

One hidden layer

Two hidden layers

Five Hidden layers

Notations used:

• L is the number of layers in the neural network
• n[l] is the number of units in layer l
• a[l] is the activations in layer l
• w[l] is the weights for z[l]

Figure 8. A network with six layers (5 hidden layers).



In the network given in Figure 8, there are 5 hidden layers and one output
layer. The number of units in each layer is given as:

n[1]=4; n[2]=4; n[3]=4; n[4]=4; n[5]=3; n[6]=1

The procedure involved in implementing the network involves the following steps:

1. Initialize the network parameters
2. Obtain the activation function and output of each layer – forward propagation
3. Evaluate the loss function
4. Propagate the error backward
5. Update the parameters
6. Train the model
7. Test the model

Here L denotes the number of layers. For the network shown in Figure 8:

L = 6; denotes the number of layers
n[1] = 4; four units in the first layer
n[2] = 4; four units in the second layer
n[3] = 4; four units in the third layer
n[4] = 4; four units in the fourth layer
n[5] = 3; three units in the fifth layer
n[6] = 1; one unit in the sixth layer

a[l] represents the activation function in each layer
w[l] represents the weight for computing the output z[l] in each layer

The input to any arbitrary layer is the activations from the previous, (l−1)th, layer, and the layer's output is its own activations. The weights and the bias used in the network are initialized before training. The weights are initialized to random values to avoid similar outputs at every neuron in a layer, and the bias is initialized to zero. The rules followed for the calculation are given as:

z^[l] = W^[l] a^[l−1] + b^[l] (18)

a^[l] = g^[l](z^[l]) (19)

The activation and the output are evaluated for each layer and then
backpropagated.

1.8.2. Forward and Backward Propagation

Using equations (18) and (19), the forward propagation is evaluated. To evaluate the backward propagation, the weights are updated using their derivatives. During backpropagation, the input to a layer is represented as da^[l] and its output as da^[l−1], while the weight and bias gradients are represented as dW^[l] and db^[l] respectively. The equations involved in computing the parameters in backward propagation are:

dz^[l] = da^[l] * g^[l]′(z^[l]) (20)

dW^[l] = (1/m) * dz^[l] * a^[l−1]T (21)

da^[l−1] = w^[l]T * dz^[l] (22)

This is the simplest method by which a deep learning model is built, but it is the choice of proper parameters and hyperparameters that yields promising performance when training large networks. The parameters of a deep learning model are the weights and biases, while the hyperparameters include the learning rate, the number of iterations, the number of hidden layers, the type of activation function, and the number of neurons in each hidden layer.
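A minimal NumPy sketch of equations (18) to (22) for the network of Figure 8 is given below. Three assumptions are made purely for illustration: the sigmoid is used as g^[l] in every layer, the input size is taken as 2 (the figure does not state it), and a db term is computed alongside dW, with the initial da taken from the loss of equation (17).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_params(layer_sizes, seed=0):
    # Xavier weights (eq. 15) and zero biases (eq. 16)
    rng = np.random.default_rng(seed)
    p = {}
    for l in range(1, len(layer_sizes)):
        p[f"W{l}"] = rng.normal(0, np.sqrt(1 / layer_sizes[l - 1]),
                                size=(layer_sizes[l], layer_sizes[l - 1]))
        p[f"b{l}"] = np.zeros((layer_sizes[l], 1))
    return p

def forward(X, params, L):
    """Equations (18)-(19) applied layer by layer, caching activations."""
    cache = {"a0": X}
    for l in range(1, L + 1):
        z = params[f"W{l}"] @ cache[f"a{l-1}"] + params[f"b{l}"]
        cache[f"a{l}"] = sigmoid(z)
    return cache[f"a{L}"], cache

def backward(y, cache, params, L):
    """Equations (20)-(22): propagate da from layer L back to layer 1."""
    m = y.shape[1]
    aL = cache[f"a{L}"]
    da = -(y / aL - (1 - y) / (1 - aL))   # dL/da for the loss in eq. (17)
    grads = {}
    for l in range(L, 0, -1):
        a = cache[f"a{l}"]
        dz = da * a * (1 - a)                             # eq. (20), sigmoid'
        grads[f"dW{l}"] = (dz @ cache[f"a{l-1}"].T) / m   # eq. (21)
        grads[f"db{l}"] = dz.sum(axis=1, keepdims=True) / m
        da = params[f"W{l}"].T @ dz                       # eq. (22)
    return grads

# Layer sizes of Figure 8; the input size of 2 is assumed, not stated.
sizes = [2, 4, 4, 4, 4, 3, 1]
L = len(sizes) - 1                                        # L = 6
params = init_params(sizes)
X = np.random.default_rng(1).standard_normal((2, 5))      # 5 toy samples
y = np.array([[0, 1, 0, 1, 1]])

y_hat, cache = forward(X, params, L)
grads = backward(y, cache, params, L)
```

Each gradient has the same shape as the parameter it updates, so a training step is simply `params[k] -= lr * grads["d" + k]` for every key.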

1.8.3. Deep Representations

From the history of neural networks, it is evident that they have provided promising results for several real-world applications. To solve complex problems, neural networks need to be deep, with a large number of hidden layers; the network goes deeper as the relations within the data get more complex. In an image processing application, the first layer may be involved in identifying the basic features or edges of the image. As the network progresses deeper, it identifies more complex features, requiring the network to be deeper still. For example, edges can initially detect the outlines of several faces, but exact facial detection is possible only with neurons that identify an eye or a nose, i.e., the facial features. Thus deep representations are required for classification problems with deeper features.
As an example, in face recognition, deep representations evolve according to the following flow:
Input Image → Edges → Facial features → More facial features → Face → Desired output

1.9. ENSEMBLE LEARNING

Ensemble methods, also known as committee-based learning or learning multiple classifier systems, train multiple hypotheses to solve the same problem. Among the most typical ensemble examples is the random forest, where multiple decision trees are used to predict the result. Ensemble methods combine several tree-based algorithms to achieve better predictive performance than a single tree algorithm could. The main goal of an ensemble model is for a group of weak learners to come together and form a strong learner, thus increasing the accuracy of the model. In addition, the bagging and boosting models make the ensemble learn and improve from the previous classification or training. The concepts, algorithms, and Python-based examples are discussed in detail in Chapter 5 of this book.
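The claim that a group of weak learners can form a strong learner is easy to illustrate with a small simulation. The sketch below is a toy majority-vote demonstration, not the bagging or boosting algorithms of Chapter 5; the 70% individual accuracy and committee size are arbitrary choices, and the vote helps only because the simulated learners' errors are independent.

```python
import numpy as np

rng = np.random.default_rng(7)

n_samples, n_learners, p_correct = 200, 25, 0.7
y_true = rng.integers(0, 2, size=n_samples)   # random binary labels

# Simulate a committee of 25 independent weak learners: each outputs the
# true label with probability 0.7 and the wrong label otherwise.
hit = rng.random((n_learners, n_samples)) < p_correct
predictions = np.where(hit, y_true, 1 - y_true)

# Accuracy of each weak learner on its own (around 0.7 each).
individual_acc = (predictions == y_true).mean(axis=1)

# Majority vote across the committee (n_learners is odd, so no ties).
ensemble_pred = (predictions.mean(axis=0) > 0.5).astype(int)
ensemble_acc = (ensemble_pred == y_true).mean()
```

With 25 independent learners at 70% accuracy, the majority vote is correct whenever at least 13 of them are, which happens far more often than any single learner is right; the ensemble accuracy lands well above the individual average.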

1.10. REAL WORLD EXAMPLES

Deep Learning has evolved and found a role during recent years in multi-disciplinary areas of engineering and science. Some of the thrust and promising areas in which DL has been applied and proved to be powerful are self-driving cars, natural language processing, image and visual recognition, fraud detection, virtual assistants, healthcare, developmental disorders in children, and many more.

1.10.1. Self Driving Cars

In self-driving cars, deep learning has emerged as a driving force behind autonomous vehicles. These intelligent algorithms have found their application in handling huge amounts of data: DL models can be developed, trained, and tested to handle such extensive data in a safe environment. One of the significant challenges in driverless cars is handling unprecedented situations; the challenge to be addressed is to develop a model that ensures safe driving in different environmental conditions. A large amount of data comes in from cameras, geo-satellites, sensors, telematics, etc., and sophisticated models are required to identify paths, navigate through complex traffic, understand signs, and adapt to real-time situations such as roadblocks. Research in this area has shown that DL models are becoming capable of handling such situations.

1.10.2. Natural Language Processing

Computational algorithms capable of automatically analyzing and representing human language fall under Natural Language Processing (NLP). Amazon's voice assistant is based on the concepts of NLP, and NLP is also being used in machine translation. The complexities involved in a language, in its syntax, expressions, semantics, etc., are intricate even for humans to learn; mastering them takes a great deal of training, which humans acquire from birth. Deep learning models developed around this situation ease the task of providing appropriate responses. Tasks such as visual question answering, text classification, sentiment analysis, language modeling, etc., are gaining thrust with the evolution of deep learning. Initially, SVMs and logistic regression were used, but they were found to be time-consuming and complex. Therefore, Convolutional Neural Networks, Recurrent Neural Networks, Reinforcement Learning, and Memory Augmented Networks are gaining more popularity in NLP.

1.10.3. Image and Visual Recognition

Image recognition is the process of identifying an image and classifying it into one of several predefined classes; it helps in distinguishing one object from another. The broader field is called computer vision, in which image classification is one of the primary tasks. Various applications under image and visual recognition are image classification with localization, object detection, semantic segmentation, instance segmentation, pattern recognition, and many more. DL has found a wide range of uses across these applications, since it teaches machines to learn by example. One significant advantage is that DL models do not require features to be extracted separately; instead, they perform feature engineering during the training phase of the algorithm.

1.10.4. Fraud Detection

Fraudulent transactions in banking are increasing day by day in the digitized world in which we live. Most of our financial transactions are performed online, and there are vast chances of confidential data getting leaked. Banks and financial sectors are trying hard to find tools and techniques to detect fraudulent actions in real-time scenarios to assure security for their customers. Typical transactions are learned through DL models (autoencoders), and any abnormal transaction can be detected with such a sophisticated learning model.

1.10.5. Virtual Assistants

Virtual assistants like Amazon's Alexa, Cortana, Google Assistant, and Apple's Siri are used to interact with humans efficiently, eventually providing us a secondary human interaction experience. These virtual assistants use deep learning models to learn more about their users, such as favorite hangout spots, favorite restaurants, songs, and movie genres; they learn and understand the commands given by the human being using NLP and execute them. They are also trained to take notes, convert voice to text, book movie tickets, assist in managing hospital appointments, and handle meeting reminders, email reminders, etc.

1.10.6. Healthcare

NVIDIA states, “From medical imaging to analyzing genomes to discovering new drugs, the entire healthcare industry is in a state of transformation, and GPU computing is at its heart. GPU-accelerated applications and systems are delivering new efficiencies and possibilities, empowering physicians, clinicians, and researchers passionate about improving the lives of others to do their best work.” DL models have proven their performance in the early diagnosis of life-threatening diseases, addressing the shortfall of physicians, aiding pathology results, predicting the future risk of diseases, etc. Although DL is used enormously in clinical research to find cures for untreatable diseases, physicians' skepticism and the lack of massive datasets still pose challenges to deep learning in medicine.

1.10.7. Developmental Disorders in Children

Autism, speech disorders, and physical development disorders are problems that affect the quality of life of children. Physicians have always wondered if there were early detection mechanisms capable of diagnosing and treating such children. One of the novel applications of DL is the early detection of such developmental disorders in small children and infants. Researchers and scientists at the Computer Science and Artificial Intelligence Laboratory at MIT and the Massachusetts General Hospital Institute of Health Professions have developed and implemented a system that can identify language and speech disorders even before kindergarten, when most of these cases traditionally start coming to light. Several researchers are also carrying out studies applying deep learning algorithms to identify autism spectrum disorder (ASD) patients by training on brain imaging data, thus exploring brain activation patterns in differently-abled children.

SUMMARY

Deep learning is built on predictive modeling and statistics, where researchers use large amounts of data. This has allowed data scientists to analyze and develop suitable models for handling such data, making the process simpler and faster. While conventional machine learning algorithms and shallow neural networks are comparatively simple models used for prediction applications, deep learning algorithms are hierarchical models with a high level of complexity and abstraction, used in automatic prediction applications. A critical aspect of any data science paradigm is to represent the data in a standard form and to understand the mathematical relations between the data parameters. It therefore becomes essential to understand the concept of Eigenvalues and Eigenvectors in matrix linear algebra and their role in dimensionality reduction while handling extensive data. The evolution of the architecture models from 1943 to date has been presented as part of the history of NN models. The fundamental machine learning categories, namely supervised, unsupervised, and reinforcement learning, are briefed for the reader to recall the basic concepts and the algorithms in each category. One needs to appreciate the underlying differences between AI, ML, and DL before applying them to a research problem. The chapter also throws light on shallow neural networks, activation functions, and the learning rules used while back-propagating the error, enabling the reader to differentiate between deep and shallow NN models. A brief introduction to ensemble learning approaches is also covered; they are discussed in detail in Chapter 5. Some of the broad research areas where DL has proven its performance, such as self-driving cars, natural language processing, image and visual recognition, fraud detection, virtual assistants, healthcare, developmental disorders in children, etc., are briefed.
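The eigen-analysis and dimensionality reduction mentioned above can be sketched in Python with NumPy. The data matrix below is purely illustrative (not taken from the book); the idea is to form the covariance matrix of centered data, decompose it into Eigenvalues and Eigenvectors, and project onto the leading directions, which is the core of PCA.

```python
import numpy as np

# Toy data matrix: 5 samples, 3 features (illustrative values only).
X = np.array([[2.0, 0.5, 1.0],
              [1.5, 1.0, 0.5],
              [3.0, 1.5, 2.0],
              [2.5, 0.5, 1.5],
              [1.0, 1.0, 1.0]])

# Center the data and form the covariance matrix of the features.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)

# Eigen-decomposition; eigh is appropriate for symmetric matrices.
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# Sort by descending eigenvalue; the leading eigenvectors span the
# directions of greatest variance in the data.
order = np.argsort(eigenvalues)[::-1]
eigenvalues, eigenvectors = eigenvalues[order], eigenvectors[:, order]

# Project onto the top two principal directions (dimensionality reduction).
X_reduced = Xc @ eigenvectors[:, :2]
print(eigenvalues)
print(X_reduced.shape)
```

Review questions 9 and 10 at the end of this chapter ask for exactly this computation, first by hand and then in Python, so the two results can be compared.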

REVIEW QUESTIONS

1. Why do we need Deep learning models?
2. What are the different machine learning algorithms? List them.
3. Identify suitable real-world applications that can be implemented using supervised learning algorithms.
4. List the algorithms that belong to the unsupervised learning category and mention relevant real-world applications.
5. What is the role of reinforcement learning in data sciences? Explain with a suitable example.
6. Differentiate between Deep Learning, Machine Learning, and Artificial Intelligence.
36 S. Sumathi, S. Rajappa, L. Ashok Kumar and P. Surekha

7. What is the role of the Artificial Neural Network in Deep Learning, Machine Learning, and Artificial Intelligence?
8. Identify the recent DL models developed with their relevant
applications.
9. Choose a data set of your choice, represent the data in matrix form, and compute the Eigenvalues and Eigenvectors manually.
10. For the same data set chosen in Question 9, compute the
Eigenvalues and Eigenvectors using Python and compare the
results.
11. What are all the activation functions applied to shallow neural
networks?
12. How is error backpropagated in shallow neural networks?
13. Compare shallow and deep neural networks.
14. List a few recent application areas where DL has proved to have
improved performance over other traditional methods.
Chapter 2

DEEP LEARNING

LEARNING OUTCOMES

At the end of this chapter, the reader will be able to:

 Understand the evolution of the Deep Learning Neural Network models and the basic concepts of Deep Learning
 Appreciate the implementation aspects of Deep Learning, namely, the data set, bias and variance, regularisation and dropout, ReLU, Softmax, and so on
 Know the basic training details of a Deep learning model, such as the number of hidden layers, activation functions, weight initialization, learning rates, hyperparameter tuning, and learning methods
 Get started with the tools, namely Tensorflow and Keras, and the Microsoft Azure AI and Machine Learning framework

Keywords: deep learning, bias, variance, dropout, softmax, training, Tensorflow, Keras

2.1. INTRODUCTION

In recent times, deep learning has had an extraordinary effect on different industries and business domains. Deep learning is a branch of AI algorithms that applies a variety of algorithms for processing data (for the most part, continuous data). These algorithms imitate the human reasoning process, thereby creating abstractions. Several layers of algorithms are involved in processing data, understanding human speech, and recognizing objects visually. The data related to a problem is passed through each layer of the deep learning network, and the output of the previous layer acts as the input to the next layer. Deep learning networks are neural networks in which the first layer is the input layer and the last layer is the output layer, while the middle layers are called hidden layers. Each layer comprises an activation function through which the learning process is performed. In this chapter, the authors discuss a brief history of deep learning, basic terminologies of deep learning, training a deep learning model, Autoencoders, an introduction to Tensorflow and Keras, Convolutional Neural Networks, AlexNet, VGGNet, Residual Networks, Inception Networks, and Recurrent Neural Networks. The theoretical concepts of these topics are elaborated with relevant Python codes for the reader to implement and gain hands-on experience.
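The layered computation described above can be sketched in a few lines of NumPy: an input layer feeds a hidden layer, the hidden layer's activation output feeds the output layer. The weights here are random and the layer sizes are arbitrary illustrative choices, not values from the text; the point is only the layer-by-layer flow and the role of the activation functions (ReLU and Softmax, both discussed later in this chapter).

```python
import numpy as np

def relu(z):
    # ReLU activation: passes positive values, zeroes out negatives.
    return np.maximum(0.0, z)

def softmax(z):
    # Softmax turns raw scores into probabilities (numerically stabilized).
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)

# A tiny network: 4 inputs -> 5 hidden units -> 3 outputs.
W1, b1 = rng.normal(size=(4, 5)), np.zeros(5)
W2, b2 = rng.normal(size=(5, 3)), np.zeros(3)

x = rng.normal(size=(1, 4))          # one input sample
hidden = relu(x @ W1 + b1)           # hidden layer; its output feeds the next layer
output = softmax(hidden @ W2 + b2)   # output layer: class probabilities

print(output)
```

In a framework such as Keras (introduced later in this chapter), each of these matrix-multiply-plus-activation steps corresponds to one layer of the model.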

Figure 9. Biological Neuron and Artificial Neuron.


Lord Constantine and one man he told it to. And nobody will. If that
old woman spoke, she—well, she knows and will not speak.’
‘And you back up this murderer?’ I cried.
‘Murderer?’ he repeated questioningly. ‘Indeed, sir, it was an accident
done in hot blood. It was the old man’s fault, because he tried to sell
the island.’
‘He did sell the island,’ I corrected; ‘and a good many other people
will hear of what happened to him.’
He looked at me again, smiling.
‘If you shouted it in the hearing of every man in Neopalia, what
would they do?’ he asked scornfully.
‘Well, I should hope,’ I returned, ‘that they’d hang Constantine to the
tallest tree you’ve got here.’
‘They would do this,’ he said with a nod; and he began to sing softly
the chant I had heard the night before.
I was disgusted at his savagery, but I said coolly:
‘And the Lady?’
‘The Lady believes what she is told, and will do as her cousin bids
her. Is she not his affianced wife?’
‘The deuce she is!’ I cried in amazement, fixing a keen scrutiny on
Vlacho’s face. The face told me nothing.
‘Certainly,’ he said gently. ‘And they will rule the island together.’
‘Will they, though?’ said I. I was becoming rather annoyed. ‘There
are one or two obstacles in the way of that. First, it’s my island.’
He shrugged his shoulders again. ‘That,’ he seemed to say, ‘is not
worth answering.’ But I had a second shot in the locker for him, and
I let him have it for what it was worth. I knew it might be worth
nothing, but I tried it.
‘And secondly,’ I went on, ‘how many wives does Constantine
propose to have?’
A hit! A hit! A palpable hit! I could have sung in glee. The fellow was
dumbfoundered. He turned red, bit his lip, scowled fiercely.
‘What do you mean?’ he blurted out, with an attempt at blustering
defiance.
‘Never mind what I mean. Something, perhaps, that the Lady
Euphrosyne might care to know. And now, my man, what do you
want of me?’
He recovered his composure, and stated his errand with his old cool
assurance; but the cloud of vexation still hung heavy on his brow.
‘On behalf of the Lady of the island—’ he began.
‘Or shall we say her cousin?’ I interrupted.
‘Which you will,’ he answered, as though it were not worth while to
wear the mask any longer. ‘On behalf, then, of my Lord Constantine,
I am to offer you safe passage to your boat, and a return of the
money you have paid—’
‘How’s he going to pay that?’
‘He will pay it in a year, and give you security meanwhile.’
‘And the condition is that I give up the island?’ I asked; I began to
think that perhaps I owed it to my companions to acquiesce in this
proposal however distasteful it might be to me.
‘Yes,’ said Vlacho, ‘and there is one other small condition, which will
not trouble you.’
‘What’s that? You’re rich in conditions.’
‘You’re lucky to be offered any. It is that you mind your own
business.’
‘I came here for the purpose,’ I observed.
‘And that you undertake, for yourself and your companions, on your
word of honour, to speak to nobody of what has passed on the
island or of the affairs of the Lord Constantine.’
‘And if I won’t give this promise?’
‘The yacht is in our hands; Demetri and Spiro are our men; there will
be no ship here for two months.’ The fellow paused, smiling at me. I
took the liberty of ending his period for him.
‘And there is,’ I said, returning his smile, ‘as we know by now, a
particularly sudden and fatal form of fever in the island.’
‘Certainly you may chance to find that out,’ said he.
‘But is there no antidote?’ I asked, and I showed him the butt of my
revolver in the pocket of my coat.