Handbook of Machine Learning for Computational Optimization: Applications and Case Studies (Demystifying Technologies for Computational Excellence), 1st Edition, Vishal Jain (Editor)
Handbook of Machine Learning for Computational Optimization

Demystifying Technologies for Computational Excellence: Moving Towards Society 5.0
Series Editors
Vikram Bali and Vishal Bhatnagar
This series encompasses research work in the fields of Data Science, Edge Computing, Deep Learning, Distributed Ledger Technology, Extended Reality, Quantum Computing, Artificial Intelligence, and various other related areas, such as natural language processing and technologies, high-level computer vision, cognitive robotics, automated reasoning, multivalent systems, symbolic learning theories and practice, knowledge representation and the semantic web, intelligent tutoring systems, AI, and education.
The prime reason for developing and growing this new book series is to focus on the latest technological advancements: their impact on society, the challenges faced in implementation, and the drawbacks or reverse impact on society due to technological innovations. With these technological advancements, every individual has personalized access to all services, and all devices are connected and communicating with each other, thanks to technology making our lives simpler and easier. These aspects will help us overcome the drawbacks of existing systems and help in building new systems with the latest technologies that will serve society in various ways, proving Society 5.0 to be one of the biggest revolutions of this era.
Edited by
Vishal Jain, Sapna Juneja, Abhinav Juneja, and
Ramani Kannan
First edition published 2022
by CRC Press
6000 Broken Sound Parkway NW, Suite 300, Boca Raton, FL 33487-2742
© 2022 selection and editorial matter, Vishal Jain, Sapna Juneja, Abhinav Juneja, and Ramani
Kannan; individual chapters, the contributors
Reasonable efforts have been made to publish reliable data and information, but the author and publisher cannot assume responsibility for the validity of all materials or the consequences of their use. The authors and publishers have attempted to trace the copyright holders of all material reproduced in this publication and apologize to copyright holders if permission to publish in this form has not been obtained. If any copyright material has not been acknowledged please write and let us know so we may rectify in any future reprint.
Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced, transmitted, or utilized in any form by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying, microfilming, and recording, or in any information storage or retrieval system, without written permission from the publishers.
For permission to photocopy or use material electronically from this work, access www.copyright.com or contact the Copyright Clearance Center, Inc. (CCC), 222 Rosewood Drive, Danvers, MA 01923, 978-750-8400. For works that are not available on CCC please contact mpkbookspermissions@tandf.co.uk
Trademark notice: Product or corporate names may be trademarks or registered trademarks and are used only for identification and explanation without intent to infringe.
DOI: 10.1201/9781003138020
Typeset in Times
by codeMantra
Contents

Preface vii
Editors xi
Contributors xiii
Index 279
Preface
Machine learning has been a trusted technology for decades and has flourished on a global scale, touching the lives of each one of us. Modern-day decision making and processes all depend on machine learning technology to make mature short-term and long-term decisions. Machine learning is blessed with phenomenal support from the research community and has landmark contributions, which enable it to find new applications every day. The dependency of human processes on machine learning-driven systems encompasses all spheres of current state-of-the-art systems, given the level of reliability it offers. There is huge potential in this domain to make the best use of machines in order to ensure optimal prediction, execution, and decision making. Although machine learning is not a new field, it has evolved with the ages, and the research community around the globe has made remarkable contributions to the growth of, and trust in, the applications that incorporate it. The predictive and futuristic approach associated with machine learning makes it a promising tool for business processes as a sustainable solution. There is ample scope in the technology to propose and devise newer algorithms, more efficient and reliable, that give machine learning an entirely new dimension in discovering certain latent domains of applications it may support. This book addresses the issues that can resolve modern-day computational bottlenecks, which need smarter and optimal machine learning-based intervention to make processes even more efficient. It presents innovative and improvised machine learning techniques which can complement, enrich, and optimize the existing glossary of machine learning methods. The book also has contributions focusing on application-based, innovative, optimized machine learning solutions, which will give readers a vision of how innovation using machine learning may aid in the optimization of human and business processes.
We have tried to knit this book as a read for all, wherein learners and researchers will get insights about the possible dimensions to explore in their specific areas of interest. The chapter-wise description is as follows:
Chapter 1 explores the basic concepts of random variables (single and multiple), and their role and applications in the specified areas of machine learning.
Chapter 2 demonstrates the Wigner-Ville transformation technique to extract time-frequency domain features from typical and atypical EMG signals: myopathy (a muscle disorder) and amyotrophic lateral sclerosis (a neuro disorder). Nature-inspired feature selection algorithms, namely the whale optimization algorithm (WOA), genetic algorithm (GA), bat algorithm (BA), firefly optimization (FA), and particle swarm optimization (PSO), are utilized to determine the relevant features from the constructed features.
Chapter 3 presents various machine learning (ML) algorithms which can be used for breast cancer detection. Since these techniques are commonly used in many areas, they are also used for making decisions regarding diagnosis and clinical studies.
Chapter 4 measures the efficiency of, and thoroughly explores the scope for, optimal utilization of the input resources owned by depots of the RSRTC. The new slack model (NSM) of DEA is used, as it enumerates the slacks for input and output variables. The model satisfies the radial properties, unit invariance, and translation invariance. This study enables policy-makers to evaluate inputs for consistent output up to the optimum level and improve the performance of the inefficient depots.
Chapter 5 presents a binary error control coding scheme using weight-based codes. This method is widely used for classification and employs the K-nearest neighbour algorithm. The chapter also discusses the role of the distance matrix in Hamming code evaluation.
Chapter 6 uses MRI images of the brain to create deep neural network models that can discriminate between different types of tumours. To perform this task, deep learning is used: a type of machine learning in which the lower levels of a layered model are responsible for the many higher-level definitions that appear in the levels above them.
Chapter 7 focuses on creating an affordable and effective warning system for drivers that is able to detect warning sign boards and speed limits in front of the moving vehicle, and prompt the driver to slow to safer speeds if required. The software internally works on a modern deep learning-based neural network, YOLO (You Only Look Once), with certain modifications, which allows it to detect road signs quickly and accurately on low-powered ARM CPUs.
Chapter 8 presents an approach for the classification of lung cancer based on the associated risks (high risk, low risk). The study was conducted using a lung cancer classification scheme, studying micrographs and classifying them with a deep neural network within a machine learning (ML) framework.
Chapter 9 presents a statistical feedback evaluation system that allows the design of an effective questionnaire using statistical knowledge of the text. In this questionnaire, the questions and their weights are not pre-decided. It is established that questionnaire-based feedback systems are traditional and quite straightforward, but such systems are very static and restrictive. The proposed statistical feedback evaluation system helps users and manufacturers find the appropriate item as per their choices.
Chapter 10 presents experimental work based on data collected, for various parameters, by the scientific measuring analytical software tool Air Veda and by IoT-based sensors capturing humidity and temperature data from atmospheric air at certain intervals of time, in order to learn the patterns of pollution increase or decrease in the atmosphere of the nearby area.
Chapter 11 concerns neural network representations and defining suitable problems for neural network learning. It covers numerous alternative designs for the primitive units making up an artificial neural network, such as perceptron units, sigmoid units, and linear units. The chapter also covers the learning algorithms for training single units. The backpropagation algorithm for multilayer perceptron training is described in detail, along with general issues such as the representational capabilities of ANNs, overfitting problems, and alternatives to the backpropagation algorithm.
Chapter 12 proposes a system which makes use of the machine learning approach to predict a student's performance. Based on a student's current performance and some measurable past attributes, the end result can be predicted to classify them as good or bad performers. The predictive models will make students aware of whether they are likely to struggle during the final examinations.
Chapter 13 presents a study that assists in assessing the awareness status of people regarding TB and its mitigation, and serves as a contribution to the field of health informatics. Indeed, the majority of participants claimed that they had low awareness of TB and its associated issues in their communities. The participants were from Kano state, a strategic location in the northern part of Nigeria, which means that the result of the experiment can represent the major opinions of northern residents.
Chapter 14 deals with psychological data related to depression, anxiety, and stress, to study how classification and analysis are carried out on imbalanced data. The proposed work not only provides practical information about balancing techniques like SMOTE, but also reveals strategies for dealing with the behaviour of many existing classification algorithms, such as SVM, Random Forest, and XGBoost, on imbalanced datasets.
Chapter 15 proposes the construction of a segmented mask of an MRI (magnetic resonance image) using a CNN approach with the implementation of the ResNet framework. The understanding of the ResNet framework using a layered approach will provide extensive anatomical information from higher-dimensional images for precise clinical analysis for the curative treatment of patients.
Editors
Vishal Jain is an Associate Professor in the Department of CSE at Sharda University, Greater Noida, India. He earlier worked with Bharati Vidyapeeth's Institute of Computer Applications and Management (BVICAM), New Delhi, India (affiliated with Guru Gobind Singh Indraprastha University, and accredited by the All India Council for Technical Education). He first joined BVICAM as an Assistant Professor. Before that, he worked for several years at the Guru Premsukh Memorial College of Engineering, Delhi, India. He has more than 350 research citations on Google Scholar (h-index 9, i10-index 9). He has authored more than 70 research papers in reputed conferences and journals, including Web of Science and Scopus. He has authored and edited more than 10 books with various reputed publishers, including Springer, Apple Academic Press, Scrivener, Emerald, and IGI Global. His research areas include information retrieval, the semantic web, ontology engineering, data mining, ad hoc networks, and sensor networks. He received a Young Active Member Award for the year 2012-2013 from the Computer Society of India, a Best Faculty Award for the year 2017, and a Best Researcher Award for the year 2019 from BVICAM, New Delhi.
Sapna Juneja is a Professor at IMS, Ghaziabad, India. Earlier, she worked as a Professor in the Department of CSE at IITM Group of Institutions and at BMIET, Sonepat. She has more than 16 years of teaching experience. She completed her doctorate and master's in Computer Science and Engineering at M.D. University, Rohtak, in 2018 and 2010, respectively. Her broad area of research is software reliability of embedded systems. Her areas of interest include Software Engineering, Computer Networks, Operating Systems, Database Management Systems, and Artificial Intelligence. She has guided several research theses of UG and PG students in Computer Science and Engineering. She is editing a book on recent technological developments.
Ramani Kannan is currently working as a Senior Lecturer at the Center for Smart Grid Energy Research, Institute of Autonomous Systems, Universiti Teknologi PETRONAS (UTP), Malaysia. Dr. Kannan completed his Ph.D. (Power Electronics and Drives) at Anna University, India, in 2012; his M.E. (Power Electronics and Drives) at Anna University, India, in 2006; and his B.E. (Electronics and Communication) at Bharathiar University, India, in 2004. He has more than 15 years of experience in prestigious educational institutes. Dr. Kannan has published more than 130 papers in various reputed national and international journals and conferences. He is an editor, co-editor, guest editor, and reviewer for various books, including with Springer Nature and Elsevier. He received the best presenter award at CENCON 2019, the IEEE Conference on Energy Conversion (CENCON 2019), Indonesia.
Contributors

Shivi Agarwal
Department of Mathematics
BITS Pilani
Pilani, Rajasthan, India

Amala Ann K. A.
Data Science Department
CHRIST (Deemed to be University)
Bangalore, India

Renu Jain
University Institute of Engineering and Technology
CSJM University
Kanpur, Uttar Pradesh, India

Abhinav Juneja
KIET Group of Institutions
Ghaziabad, Uttar Pradesh, India
CONTENTS
1.1 Introduction 2
1.2 Random Variable 3
1.2.1 Definition and Classification 3
1.2.1.1 Applications in Machine Learning 4
1.2.2 Describing a Random Variable in Terms of Probabilities 4
1.2.2.1 Ambiguity with Reference to Continuous Random Variable 5
1.2.3 Probability Density Function 6
1.2.3.1 Properties of pdf 6
1.2.3.2 Applications in Machine Learning 7
1.3 Various Random Variables Used in Machine Learning 7
1.3.1 Continuous Random Variables 7
1.3.1.1 Uniform Random Variable 7
1.3.1.2 Gaussian (Normal) Random Variable 8
1.3.2 Discrete Random Variables 10
1.3.2.1 Bernoulli Random Variable 10
1.3.2.2 Binomial Random Variable 11
1.3.2.3 Poisson Random Variable 12
1.4 Moments of Random Variable 13
1.4.1 Moments about Origin 13
1.4.1.1 Applications in Machine Learning 13
1.4.2 Moments about Mean 14
1.4.2.1 Applications in Machine Learning 14
1.5 Standardized Random Variable 15
1.5.1 Applications in Machine Learning 15
1.6 Multiple Random Variables 16
1.6.1 Joint Random Variables 17
1.6.1.1 Joint Cumulative Distribution Function (Joint CDF) 17
1.6.1.2 Joint Probability Density Function (Joint pdf) 17
1.6.1.3 Statistically Independent Random Variables 18
1.6.1.4 Density of Sum of Independent Random Variables 18
1.6.1.5 Central Limit Theorem 19
DOI: 10.1201/9781003138020-1
1.1 INTRODUCTION
Predicting the future using knowledge about the past is the fundamental objective of machine learning.
In a digital communication system, a binary data generation scheme referred to as differential pulse code modulation (DPCM) works on a similar principle: based on the past behaviour of the signal, its future value is predicted, using a predictor. A tapped delay line filter serves the purpose. The higher the order of the predictor, the better the prediction, i.e. the smaller the prediction error.[1]
Thus, machine learning, even though not referred to by this name earlier, was, and is, an integral part of the technical world.
The same prediction error, with reference to a DPCM system, is now addressed as the confidence interval in connection with machine learning. A smaller prediction error implies a better prediction, and as far as machine learning is concerned, the probability of the predicted value lying within the tolerable limits of error (which is the confidence interval) should be large; this probability is a metric for the accuracy of prediction.
The machine learning methodology involves building a statistical model for a particular task, based on knowledge of the past data. This collected past data, with reference to a task, is referred to as the data set.
This way of developing models to predict "what is going to happen", based on what has "happened", is predictive modelling.
In detective analysis also, the "happened", i.e. past data, is used, but there is no necessity of predicting what is "going to happen".
For example, 30-35 years ago, the Reynolds 045 pen ruled the market for a long time, specifically in South India. Presently, its sales are not that significant. If it is required to study the journey of the pen from past to present, detective analysis is to be performed, since there is no necessity of predicting its future sales. Similarly, a study of "Why did the sales of a particular model of automobile come down?" also belongs to the same category.
The data set referred to above is used by the machine to learn; hence, it is also referred to as the training data set. After learning, the machine faces the test data. Using the knowledge gained through learning, it should act on the test data to resolve the assigned task.
In predictive modelling, if the learning mechanism of the machine is supervised by somebody, then the mode of learning is referred to as supervised learning. That supervising "somebody" is the training data set, also referred to as labelled training data, where each labelled data element Di is mapped to a data element Do. Many such pairs of elements are the learning resources for the machine and are used to build the model, using which the machine predicts; i.e. this knowledge about the mapping helps the machine to map the test data pairs (input-output pairs).
It can be inferred about the supervised learning that there is a target variable
which is to be predicted.
Example: Based on the symptoms of a patient, it is to be predicted whether he/she is suffering from a particular disease. To enable this prediction, past history or statistics are used: which patients, with which (similar) symptoms, were categorized under which disease. This past data (both symptoms and categorization) is the training data set that supervises the machine in its process of prediction. Here, the target variable is the disease of the patient, which is to be predicted.
In the unsupervised learning mechanism, the training data is considered to be unlabelled, i.e. only Di. The major functionality of unsupervised learning is pattern identification.
Some of the tasks under unsupervised learning are:
Clustering: Group all the people wearing white (or near-white) shirts.
Density Estimation: If points are randomly distributed along an axis, the regions along the axis with minimum/moderate/maximum numbers of points need to be estimated.
It can be inferred about the unsupervised learning that there is no target variable
which is to be predicted.
With reference to the previous example of the patient with ill-health, all the people with a particular symptom of ill-health need to be grouped; however, the disease of the patient need not be predicted, which is the target variable with reference to supervised learning.
Consider, for example, the experiment of throwing two fair dice, where each outcome (a pair of faces) is mapped to the real value given by their sum. Table 1.1 gives the pairs of all possible outcomes with the corresponding real values mapped.
TABLE 1.1
Sample Space and Mapped Values
Pair in the Sample Space Real Value
(1,1) 2
(1,2), (2,1) 3
(1,3),(2,2),(3,1) 4
(1,4),(2,3),(3,2),(4,1) 5
(1,5),(2,4),(3,3),(4,2),(5,1) 6
(1,6),(2,5),(3,4),(4,3),(5,2),(6,1) 7
(2,6),(3,5),(4,4),(5,3),(6,2) 8
(3,6),(4,5),(5,4),(6,3) 9
(4,6),(5,5),(6,4) 10
(5,6),(6,5) 11
(6,6) 12
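The mapping of Table 1.1 can be reproduced with a short sketch (Python is our choice here; variable names are illustrative), enumerating the sample space of two fair dice and counting how many outcomes map to each sum:

```python
from itertools import product
from collections import Counter
from fractions import Fraction

# Sample space of two fair dice: all 36 ordered pairs of faces.
sample_space = list(product(range(1, 7), repeat=2))

# Map each outcome to the real value X = sum of the two faces.
counts = Counter(a + b for a, b in sample_space)

# pmf of X: P(X = n) = (number of pairs summing to n) / 36
pmf = {n: Fraction(c, 36) for n, c in counts.items()}

print(pmf[7])             # 1/6 -- six pairs sum to 7, the most likely value
print(sum(pmf.values()))  # 1  -- probabilities over all sums add to one
```

The most likely sum, 7, is produced by six of the 36 pairs, matching the longest row of Table 1.1.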
This X is referred to as a random variable, taking all the real values mentioned. Thus, a random variable can be considered as a rule by which a real value is assigned to each outcome of the experiment.
If X = f(ω) possesses a countably infinite range (the points in the range are large in number, but can be counted), X is referred to as a discrete random variable (categorical, with reference to machine learning).
On the other hand, if the range of the function is uncountably infinite (large in number, and cannot be counted), X is referred to as a continuous random variable[2,3] (similar terminology is used with reference to machine learning also).
The probabilities assigned by a discrete random variable satisfy:

(i) 0 ≤ P(X = n) ≤ 1   (ii) Σₙ P(X = n) = 1
TABLE 1.2
Probability Distribution

xᵢ (value taken by X):   1     2     3     4     ...
Probability:             1/2   1/4   1/8   1/16  ...
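The distribution of Table 1.2 can be checked against both pmf properties with a short sketch, assuming the tabulated pattern continues as P(X = n) = 1/2ⁿ (exact fractions avoid floating-point noise):

```python
from fractions import Fraction

# Assumed pmf for Table 1.2: P(X = n) = 1/2**n for n = 1, 2, 3, ...
def pmf(n):
    return Fraction(1, 2**n)

# Property (i): every probability lies in [0, 1].
assert all(0 <= pmf(n) <= 1 for n in range(1, 50))

# Property (ii): the geometric series 1/2 + 1/4 + ... converges to 1;
# the partial sum through n = 49 is exactly 1 - 1/2**49.
partial = sum(pmf(n) for n in range(1, 50))
print(float(partial))  # very close to 1.0
```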
P(x₁ < X ≤ x₂) = F_X(x₂) − F_X(x₁)

f_X(x) = lim_{δ→0} P(x ≤ X < x + δ)/δ

The certainty, or the probability, with which X is in the interval (x, x + δ) is P(x ≤ X < x + δ), and the denominator δ is the width (length) of the interval. Thus, f_X(x) is the probability normalized by the width of the interval and can be interpreted as probability divided by width.

From the properties of the CDF, P(x ≤ X < x + δ) = F_X(x + δ) − F_X(x).

Hence, f_X(x) = lim_{δ→0} [F_X(x + δ) − F_X(x)]/δ = (d/dx) F_X(x), i.e. the rate of change of the CDF is referred to as the pdf.[2]
The pdf satisfies the following properties:

f(x) ≥ 0

∫_{−∞}^{∞} f(x) dx = 1

F_X(x) = ∫_{−∞}^{x} f(λ) dλ

∫_{a}^{b} f(x) dx = P(a < X < b)
The physical significance of the uniform density is that the random variable X can lie in any interval of a given width within the limits (α, β) with the same probability; the density is 1/(β − α), and for any confidence interval (k, k + δ), where α < k < β, the confidence level is P(k < X < k + δ) = δ/(β − α).

• Any confidence level P(α₁ < X < β₁) for a given confidence interval within (α, β) can be obtained as ∫_{α₁}^{β₁} [1/(β − α)] dx.
• The CDF of a uniformly distributed (continuous) random variable X is

F_X(x) = 0 for x < α
F_X(x) = (x − α)/(β − α) for α ≤ x ≤ β
F_X(x) = 1 for x > β

• A uniform random variable is said to be symmetric about its mean (α + β)/2.[6]
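The uniform density and CDF translate directly into code; the limits α = 2, β = 10 below are arbitrary illustrative values:

```python
# Uniform random variable on (alpha, beta): every subinterval of the
# same width delta has the same probability delta/(beta - alpha).
alpha, beta = 2.0, 10.0

def uniform_pdf(x):
    return 1.0 / (beta - alpha) if alpha < x < beta else 0.0

def uniform_cdf(x):
    if x < alpha:
        return 0.0
    if x > beta:
        return 1.0
    return (x - alpha) / (beta - alpha)

# Confidence level for the interval (4, 5): width 1 over total width 8.
p = uniform_cdf(5) - uniform_cdf(4)
print(p)  # 0.125

# Symmetry about the mean (alpha + beta)/2 = 6: half the mass lies below.
assert uniform_cdf(6) == 0.5
```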
The Gaussian (normal) random variable X has the density f_X(x) = [1/√(2πσ²)] e^{−(x − m)²/(2σ²)}, where m and σ² are the mean and variance of X, respectively.
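A minimal sketch of the Gaussian density, cross-checked against the standard library's NormalDist; the mean m = 4 and σ = 2 are arbitrary illustrative values:

```python
import math
from statistics import NormalDist

# Gaussian density: f(x) = (1/sqrt(2*pi*sigma**2)) * exp(-(x - m)**2 / (2*sigma**2))
m, sigma = 4.0, 2.0

def gaussian_pdf(x):
    return math.exp(-(x - m) ** 2 / (2 * sigma ** 2)) / math.sqrt(2 * math.pi * sigma ** 2)

# Cross-check against the standard library's implementation.
nd = NormalDist(mu=m, sigma=sigma)
assert math.isclose(gaussian_pdf(5.0), nd.pdf(5.0))

# The density is symmetric about the mean m.
assert math.isclose(gaussian_pdf(m - 1), gaussian_pdf(m + 1))

print(gaussian_pdf(m))  # peak value, 1/sqrt(2*pi*sigma**2)
```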
Typical examples of a Bernoulli trial are:

• A single flip of a coin, where the possible outcomes are head and tail
• Identifying the survival of a person in an accident: survived or not

A Bernoulli random variable X takes only two values, either 0 or 1, and can assume only one value at a time. p and q (= 1 − p) are the probabilities of success and failure, respectively, where success is the outcome whose probability is to be computed; the pmf is P(X = m) = pᵐ q^(1−m), m = 0, 1.

For example, when an unbiased coin is tossed, if it is required to compute the probability of getting a head (represented as m = 1), then P(X = 1) = (1/2)¹ (1/2)^(1−1) = 1/2. Here, success is getting a head.

The CDF of a Bernoulli random variable is

F_X(x) = 0 for x < 0
F_X(x) = q for 0 ≤ x < 1
F_X(x) = p + q = 1 for x ≥ 1 [6]

• When the variable to be predicted is binary valued, i.e. the test data is to be categorized into one of the two available classes, the classification is binary classification, and the performance of such algorithms can be analysed using a Bernoulli process.[5]
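The Bernoulli pmf and CDF above can be sketched directly (function names are ours):

```python
# Bernoulli pmf: P(X = m) = p**m * q**(1 - m) for m in {0, 1}, with q = 1 - p.
def bernoulli_pmf(m, p):
    q = 1 - p
    return p ** m * q ** (1 - m)

# Piecewise CDF: 0 below 0, q on [0, 1), and p + q = 1 from 1 onwards.
def bernoulli_cdf(x, p):
    q = 1 - p
    if x < 0:
        return 0.0
    if x < 1:
        return q       # only the outcome X = 0 has occurred
    return p + q       # = 1

# Fair coin, success = head (m = 1):
print(bernoulli_pmf(1, 0.5))  # 0.5
assert bernoulli_pmf(0, 0.5) + bernoulli_pmf(1, 0.5) == 1.0
```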
Consider r independent trials of an experiment with only two possible outcomes, success and failure, whose probabilities are p and q, respectively, such that p + q = 1. Here, X is the number of times the success occurs in the experiment.

X is the binomial random variable, since the probability below is the coefficient of the kth term in the binomial expansion of (p + q)ʳ.

If it is required to find the probability of a tail occurring four times when a fair coin is tossed ten times, then that probability is P(X = 4) = ¹⁰C₄ (1/2)⁴ (1/2)⁶.

The probability of having success k times in r trials of the experiment, in a random order, is

P(X = k) = ʳCₖ pᵏ q^(r−k) for k = 0, 1, 2, …, r, and 0 otherwise,

and this is the pmf of X.
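The coin example can be verified numerically with the standard library's `math.comb` (the function name below is ours):

```python
from math import comb

# Binomial pmf: P(X = k) = C(r, k) * p**k * q**(r - k), q = 1 - p.
def binomial_pmf(k, r, p):
    if not 0 <= k <= r:
        return 0.0
    return comb(r, k) * p ** k * (1 - p) ** (r - k)

# Exactly four tails in ten tosses of a fair coin:
# C(10, 4) * (1/2)**4 * (1/2)**6 = 210/1024
p4 = binomial_pmf(4, 10, 0.5)
print(p4)  # 0.205078125

# The pmf sums to 1 over k = 0..r (the binomial expansion of (p + q)**r).
assert abs(sum(binomial_pmf(k, 10, 0.5) for k in range(11)) - 1.0) < 1e-12
```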
TABLE 1.3
Variable Encoding

     Outlet Size   Small   Medium   Big
1    Big           0       0        1
2    Big           0       0        1
3    Medium        0       1        0
4    Small         1       0        0
5    Medium        0       1        0
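The encoding of Table 1.3 can be reproduced with a short sketch (a hypothetical `one_hot` helper; real pipelines would typically use a library encoder):

```python
# One-hot encoding of the categorical variable "Outlet Size" from Table 1.3.
categories = ["Small", "Medium", "Big"]

def one_hot(value):
    # Emit a 1 in the position of the matching category, 0 elsewhere.
    return [1 if value == c else 0 for c in categories]

data = ["Big", "Big", "Medium", "Small", "Medium"]
encoded = [one_hot(v) for v in data]
for row in encoded:
    print(row)
# rows match Table 1.3: [0, 0, 1], [0, 0, 1], [0, 1, 0], [1, 0, 0], [0, 1, 0]
```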
To quantify this spread, the probability-weighted mean of the squared deviations, Σᵢ (sᵢ − ŝ)², is defined. For a random variable X taking the values 2 and 6 with equal probability (mean m = 4), its value is (2 − 4)²(1/2) + (6 − 4)²(1/2) = 4. Its positive square root is 2, which is the value required.

Thus, E[(X − m)²] is an indication of the average amount of variation of the values taken by the random variable with reference to its mean, and hence is its variance (σ²); the standard deviation (σ) is its positive square root.

The third central moment of X is E[(X − m)³] and is referred to as its skew. The normalized skew, i.e. the coefficient of skewness, is given as E[(X − m)³]/σ³ and is a measure of the symmetry of the density of X.

A random variable with a density symmetric about its mean will have a zero coefficient of skewness.

If more of the values taken by the random variable lie to the right of its mean, the corresponding density function is said to be right-skewed and the coefficient of skewness will be positive (>0). Similarly, a left-skewed density function can also be specified, and its coefficient of skewness will be negative (<0).[1]
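These central moments can be computed directly; we assume, as in the worked numbers above, that X takes the values 2 and 6 with probability 1/2 each:

```python
import math

# X takes the values 2 and 6 with probability 1/2 each, so its mean is 4.
values = [2, 6]
probs = [0.5, 0.5]

mean = sum(v * p for v, p in zip(values, probs))
variance = sum((v - mean) ** 2 * p for v, p in zip(values, probs))
std_dev = math.sqrt(variance)

# Coefficient of skewness: E[(X - m)**3] / sigma**3.
skew = sum((v - mean) ** 3 * p for v, p in zip(values, probs)) / std_dev ** 3

print(mean, variance, std_dev)  # 4.0 4.0 2.0
print(skew)                     # 0.0 -- the distribution is symmetric about its mean
```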
Note: Identically distributed random variables will have identical moments.
TABLE 1.4
Data in Nonuniform Scaling

Loan Amount (Rs. in lakhs)   EMI (Rs. in thousands)   Income (Rs. in hundreds)
30                           50                       1600
40                           40                       1200
50                           30                       800

TABLE 1.5
Statistical Parameters of the Data

Variable      Mean (m)   Standard deviation (σ) (amount by which the value taken by the variable differs from its mean)
Loan amount   40         10
EMI           40         10
Income        1200       400

TABLE 1.6
Scaled Data

Loan Amount (Rs. in lakhs)   EMI (Rs. in thousands)   Income (Rs. in hundreds)
(30 − 40)/10 = −1            (50 − 40)/10 = 1         (1600 − 1200)/400 = 1
(40 − 40)/10 = 0             (40 − 40)/10 = 0         (1200 − 1200)/400 = 0
(50 − 40)/10 = 1             (30 − 40)/10 = −1        (800 − 1200)/400 = −1
• All these data are subjected to scaling to make them to be on the same scale,
such that they are comparable. Thus, feature pre-processing requires scal-
ing of the variables.
• Standard scaling is a scaling method, where the variables are scaled as per the formula X′ = (X − m)/σ.
• Table 1.5 gives the computations of mean and standard deviation for differ-
ent variables of Table 1.4.
• Table 1.6 represents the above data subjected to scaling.
• It appears that all the scaled variables appear on the same reference scale.
• Thus, the concept of standardized random variable is used in feature
scaling.
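The scaling of Tables 1.4-1.6 can be reproduced as follows; note that we use the sample standard deviation (divisor n − 1), which is what yields σ = 10 and σ = 400 in Table 1.5 for these three-row columns:

```python
import math

# The three variables of Table 1.4, on very different scales.
table = {
    "loan":   [30, 40, 50],       # Rs. in lakhs
    "emi":    [50, 40, 30],       # Rs. in thousands
    "income": [1600, 1200, 800],  # Rs. in hundreds
}

def standard_scale(xs):
    m = sum(xs) / len(xs)
    # Sample standard deviation (divisor n - 1), matching Table 1.5.
    sigma = math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))
    return [(x - m) / sigma for x in xs]

scaled = {k: standard_scale(v) for k, v in table.items()}
print(scaled["loan"])    # [-1.0, 0.0, 1.0]
print(scaled["income"])  # [1.0, 0.0, -1.0]
# After scaling, all three variables sit on the same reference scale.
```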
1.6.1.1.1 Properties
1. F_XY(−∞, −∞) = 0
2. F_XY(∞, ∞) = 1
3. F_XY(−∞, y) = 0
4. F_XY(x, −∞) = 0
5. F_XY(x, ∞) = F_X(x)
6. F_XY(∞, y) = F_Y(y)
7. P(x₁ < X ≤ x₂, y₁ < Y ≤ y₂) = F_XY(x₂, y₂) − F_XY(x₁, y₂) − F_XY(x₂, y₁) + F_XY(x₁, y₁)
8. 0 ≤ F_XY(x, y) ≤ 1
1. ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = 1
2. ∫_{y=−∞}^{∞} f_XY(x, y) dy = f(x), which is the marginal density of X
3. ∫_{x=−∞}^{∞} f_XY(x, y) dx = f(y), which is the marginal density of Y
4. ∫_{−∞}^{x} ∫_{−∞}^{y} f_XY(x, y) dx dy = F_XY(x, y)
5. P(x₁ < X < x₂, y₁ < Y < y₂) = ∫_{x₁}^{x₂} ∫_{y₁}^{y₂} f_XY(x, y) dx dy
6. F_X(x) = ∫_{−∞}^{x} ∫_{y=−∞}^{∞} f_XY(x, y) dy dx
7. F_Y(y) = ∫_{x=−∞}^{∞} ∫_{−∞}^{y} f_XY(x, y) dy dx [12]
For discrete random variables X and Y taking the values x₁, …, xₙ and y₁, …, yₙ, the joint pmf can be arranged as the matrix

P(X, Y) =
         y₁          y₂          …   yₙ
x₁   p(x₁, y₁)   p(x₁, y₂)   …   p(x₁, yₙ)
x₂   p(x₂, y₁)   p(x₂, y₂)   …   p(x₂, yₙ)
⋮        ⋮           ⋮        …      ⋮
xₙ   p(xₙ, y₁)   p(xₙ, y₂)   …   p(xₙ, yₙ)

3. Σᵢ p(xᵢ, yⱼ) = p(yⱼ)
4. Σⱼ p(xᵢ, yⱼ) = p(xᵢ)
• F_XY(x, y) = F_X(x) F_Y(y)
• f_XY(x, y) = f_X(x) f_Y(y)

For the sum Z = X + Y of two independent random variables,

f_Z(z) = f_X(x) * f_Y(y) = ∫_{−∞}^{∞} f_X(x) f_Y(z − x) dx = ∫_{−∞}^{∞} f_X(z − y) f_Y(y) dy,

which is the convolution of their individual density functions. This principle can be extended to any number of independent random variables.
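The convolution result can be illustrated numerically by discretizing the integral. In the sketch below (an assumed example, not from the chapter), both densities are uniform on [0, 1), for which the sum Z = X + Y is known to have a triangular density on [0, 2] peaking at z = 1:

```python
import numpy as np

# Density of Z = X + Y for independent X and Y, via a discretized
# convolution of their sampled densities.
dx = 0.001
x = np.arange(0.0, 1.0, dx)
f_X = np.ones_like(x)                 # uniform density on [0, 1)
f_Y = np.ones_like(x)

# Discrete version of f_Z(z) = integral of f_X(x) f_Y(z - x) dx:
f_Z = np.convolve(f_X, f_Y) * dx
z = np.arange(len(f_Z)) * dx          # support of Z
```

The result integrates to 1 and peaks at z = 1 with height 1, as the triangular density should.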
For two random variables X and Y, there are three second-order joint moments.

With the standardized variables X′ = (X − m_x)/σ_x and Y′ = (Y − m_y)/σ_y, the correlation coefficient is

ρ_XY = E[X′Y′] = E[((X − m_x)/σ_x) · ((Y − m_y)/σ_y)] = E[(X − m_x)(Y − m_y)] / (σ_x σ_y) = σ_XY / (σ_x σ_y)
• Thus, the correlation coefficient can be used as a metric for measuring the linear relation between the variables.
• Variance and standard deviation are measures of the spread of a data set around its mean and are one-dimensional measures. When dealing with a two-dimensional data set, the relation between the two variables (e.g. the number of hours a tailor spends stitching and the number of shirts stitched) is studied using covariance. The covariance of one random variable with itself is simply its variance. In the case of n variables, an n × n covariance matrix is used for the statistical analysis of all possible pairs of variables.[9,13,14]
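A short sketch of these ideas, using made-up numbers that echo the tailor example above (NumPy's cov and corrcoef use the sample convention):

```python
import numpy as np

# Covariance matrix and correlation coefficient for a two-dimensional data
# set: hours a tailor spends stitching vs. shirts stitched (numbers made up).
hours  = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
shirts = np.array([1.0, 2.0, 2.0, 4.0, 5.0])

C = np.cov(hours, shirts)                 # 2 x 2 covariance matrix
rho = np.corrcoef(hours, shirts)[0, 1]    # correlation coefficient

# rho is the covariance normalized by the two standard deviations:
rho_check = C[0, 1] / np.sqrt(C[0, 0] * C[1, 1])
```

The diagonal of C holds the two variances; the off-diagonal entry is the covariance, so ρ falls in [−1, 1] by construction.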
The conditional probability of an event A, with the other event B being the condition, is denoted as p(A|B) = p(A ∩ B)/p(B) = p(A, B)/p(B), which is the probability of event A under the occurrence of event B.
1. f_{X|Y}(x|y) is always non-negative
2. ∫_{−∞}^{∞} f_{X|Y}(x|y) dx = 1

Similar properties hold for discrete variables, but are defined with discrete summation.[15]
• Bayes' theorem is stated as p(A|B) = p(B|A) p(A) / p(B).
• Let the data set consist of various symptoms leading to corona/malaria/
typhoid.
Random Variables in Machine Learning 23
• Then, the probability p(having a specific disease | Symptom S), i.e. the probability of suffering from a specific disease given a specific symptom S, is the conditional probability.
• If all the hypotheses have equal a priori probability, then the above conditional probability can be obtained from the probability of having those symptoms knowing the disease, i.e. p(Symptom S | having the specific disease). This probability is referred to as the maximum likelihood (ML) of the specific hypothesis.[7,9]
• Then, the required conditional probability is

p(having a specific disease | Symptom S) = p(Symptom S | having the specific disease) · p(having the specific disease) / p(Symptom S)
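With made-up numbers (the chapter gives none), the posterior for each disease hypothesis can be computed directly from Bayes' theorem:

```python
# Bayes' theorem applied to the symptom/disease illustration. All the
# probabilities below are assumed values for this sketch only.
priors = {"corona": 0.2, "malaria": 0.3, "typhoid": 0.5}      # p(disease)
likelihood = {"corona": 0.8, "malaria": 0.4, "typhoid": 0.1}  # p(S | disease)

# Total probability of showing symptom S (law of total probability):
p_s = sum(likelihood[d] * priors[d] for d in priors)

# Posterior p(disease | S) = p(S | disease) * p(disease) / p(S):
posterior = {d: likelihood[d] * priors[d] / p_s for d in priors}
```

The posteriors sum to 1, and the hypothesis with the largest product of prior and likelihood wins.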
f_Y(y) = Σ_i f_X(x)|_{x = x_i} · |dx/dy|_i [3]
TABLE 1.7
Linear Regression Transformation of Variables[10]

Nonlinear Relations        Reduced to Linear Law
y = p·x^n                  log(y) = log(p) + n·log(x) ⟹ Y = nX + C, with Y = log(y), X = log(x), C = log(p)
y = m·x^n + C              Y = mX + C, with X = x^n, Y = y
y = p·x^n + q·log(x)       Y = aX + b, with Y = y/log(x), X = x^n/log(x), a = p, b = q
y = p·e^(qx)               Y = mx + c, with Y = log(y), m = q·log(e), c = log(p)
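The first row of Table 1.7 can be illustrated with a short fit: reducing y = p·x^n to a straight line in (log x, log y) and recovering p and n by ordinary linear regression. The data below are synthetic, generated with assumed values p = 2 and n = 1.5; natural logarithms are used, which works identically as long as the base is consistent:

```python
import numpy as np

# Reducing the power law y = p * x**n (first row of Table 1.7) to a straight
# line log(y) = n * log(x) + log(p), then fitting by linear regression.
p_true, n_true = 2.0, 1.5          # assumed ground truth for this sketch
x = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
y = p_true * x**n_true

# Fit a degree-1 polynomial to (X, Y) = (log x, log y):
n_hat, logp_hat = np.polyfit(np.log(x), np.log(y), 1)
p_hat = np.exp(logp_hat)           # intercept C = log(p), so p = exp(C)
```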
1.8 CONCLUSION
Thus, random variables play a vital role in the fields of machine learning and artificial intelligence. Since predicting the future values of a variable involves some amount of uncertainty, the theory of probability and random variables forms an essential building block of the algorithms used to teach a machine to perform tasks based on learning from experience. Random variables are equally central to the theory of signal estimation.[11]
REFERENCES
1. Bhagwandas P. Lathi and Zhi Ding – Modern Digital and Analog Communication
Systems, Oxford University Press, New York, International Fourth Edition, 2010.
2. Scott L. Miller and Donald G. Childers – Probability and Random Processes with
Applications to Signal Processing and Communications, Academic Press, Elsevier
Inc., Boston, MA, 2004.
3. Henry Stark and John W. Woods – Probability and Random Processes with Applications
to Signal Processing, Pearson, Upper Saddle River, NJ, Third Edition, 2002.
4. Kevin P. Murphy – Machine Learning: A Probabilistic Perspective, The MIT Press,
Cambridge, MA, 2012.
5. Jose Unpingco – Python for Probability, Statistics, and Machine Learning, Springer,
Cham, 2016.
6. Steven M. Kay – Intuitive Probability and Random Processes using MATLAB, Springer,
New York, 2006.
7. Peter D. Hoff – A First Course in Bayesian Statistical Methods, Springer, New York,
2009.
8. Shai Shalev-Shwartz and Shai Ben-David – Understanding Machine Learning: From
Theory to Algorithms, Cambridge University Press, New York, 2014.
9. Bernard C. Levy – Principles of Signal Detection and Parameter Estimation, Springer,
Cham, 2008.
10. Michael Paluszek and Stephanie Thomas – MATLAB Machine Learning, Apress,
New York, 2017.
11. Robert M. Gray and Lee D. Davisson – An Introduction to Statistical Signal Processing,
Cambridge University Press, Cambridge, 2004.
12. Friedrich Liese and Klaus-J. Miescke – Statistical Decision Theory: Estimation,
Testing and Selection, Springer Series in Statistics, Springer, New York, 2008.
13. James O. Berger – Statistical Decision Theory and Bayesian Analysis, Springer-Verlag,
New York Inc., New York, Second Edition, 2013.
14. Ruisi He and Zhiguo Ding (Eds.) – Applications of Machine Learning in Wireless
Communications, IET Telecommunication Series 81, The Institution of Engineering
and Technology, London, 2019.
15. Robert M. Fano – Transmission of Information: A Statistical Theory of Communications,
The MIT Press, Cambridge, MA, 1961.
"I am," remarked Toby, as he lifted his glass, "a prophet in a small
way. Old boy, your hand. To the health of our double marriage--and
no heeltaps."
CHAPTER XII.
ARS AMORIS.
She smiled significantly, and simple Mrs. Valpy, seeing that the
companion was looking at Toby and her daughter, who were
amusing themselves at the piano, misinterpreted the smile, and
therefore spoke according to her misinterpretation.
Mrs. Belswin, thus being appealed to, started, smiled politely, and
assented with much outward show of interest to the remark of the
old lady.
"It's so nice for Toby to have his home here," pursued Mrs. Valpy,
with much satisfaction; "because, you know, our place is not far
from the vicarage, so I shall not be parted from my daughter."
The other woman started, and laid her hand on her breast, as if to
still the beating of her heart.
"Yes; it would be a terrible thing to part with your only child," she
said in a low voice. "I know what the pain of such a separation is."
"You have parted from your child, then?" said Mrs. Valpy,
sympathetically.
"Well, no; not exactly;" she said, still in the same low voice; "but--
but my little daughter--my little daughter died many years ago."
It was very hard for her to lie like this when her daughter was only a
few yards away, chatting to Maxwell at the window; but Mrs. Belswin
looked upon such necessary denial as punishment for her sins, and
accepted it accordingly.
"Oh, yes, I think so. Besides, now you have that dear girl, Kaituna,
and she seems very fond of you."
"Yes."
She could say no more. The strangeness of the situation excited her
to laughter, to that laughter which is very near tears, and she was
afraid to speak lest she should break down.
"And then Sir Rupert will be so glad to find his daughter has such a
good friend."
The mention of the hated name restored Mrs. Belswin to her usual
self, and with a supercilious glance at the blundering woman who
had so unconsciously wounded her, she answered in her ordinary
manner--
"I hope so! But I'm afraid I shall not have an opportunity of seeing
Sir Rupert at once, as I go to town shortly, on business."
"But you will return?"
"No, perhaps not! At all events I think you will like Sir Rupert."
"I'm certain. Such a gentlemanly man. Quite young for his age. I
wonder he does not marry again."
"Perhaps he had enough of matrimony with his first wife," said Mrs.
Belswin, coolly.
"Yes! Simply worshipped her. She died in New Zealand when Kaituna
was a baby, I believe, and Sir Rupert told me how this loss had
overshadowed his life."
The conversation was becoming a little difficult for her to carry on,
as she dare not disclose herself yet, and did not care about
exchanging complimentary remarks on the subject of a man she
detested so heartily.
At this moment Toby struck a chord on the piano, and Tommy burst
out laughing, so, with ready wit, Mrs. Belswin made this interruption
serve as an excuse to break off the conversation.
"The young people seem to be merry," she said to Mrs. Valpy, and
rising to her feet, "I must go over and see what the joke is about."
"Wherever do you learn such slang?" said Mrs. Belswin, with a smile.
"Toby."
"Do you really?" said Tommy, laughing. "Well, I at present speak the
President's American, so go right along, stranger, and look slippy
with the barrel organ."
"If your mother hears you," remonstrated Mrs. Belswin, "she will----"
"As a chaperon you should hunt them out," said Miss Valpy,
mischievously.
"Suppose I give the same advice to your mother," replied Mrs.
Belswin, dryly.
"Don't," said Toby, in mock horror; "as you are strong be merciful."
"I must collect my ideas first," replied Toby, running his fingers over
the piano. "Wait till the spirit moves me."
Mrs. Belswin had resumed her seat near the sleeping form of Mrs.
Valpy, and was thinking deeply, though her thoughts, judging from
the savage expression in her fierce eyes, did not seem to be very
agreeable ones, while Tommy leaned over the piano watching Toby's
face as he tried to seek inspiration from her smiles.
Outside on the short dry grass of the lawn, Kaituna was strolling,
accompanied by Archie Maxwell. The grass extended for some
distance in a gentle slope, and was encircled by tall trees, their
heavy foliage drooping over the beds of flowers below. Beyond, the
warm blue of the sky, sparkling with stars, and just over the
trembling tree-tops the golden round of the moon. A gentle wind
was blowing through the rustling leaves, bearing on its faint wings
the rich odours of the flowers, and the lawn was strewn with aerial
shadows that trembled with the trembling of the trees. Then the
white walls of the vicarage, the sloping roof neutral tinted in the
moonlight, the glimmer of the cold shine on the glass of the upstair
windows, and below, the yellow warm light streaming out of the
drawing-room casements on the gravelled walk, the lawn beyond,
and the figures of the two lovers moving like black shadows through
the magical light. A nightingale began to sing deliciously, hidden in
the warm dusk of the leaves, then another bird in the distance
answered the first. The hoot of an owl sounded faintly through the
air, the sharp whirr of a cricket replied, and all the night seemed full
of sweet sounds.
"What I miss very much in the sky here," said Kaituna, looking up at
the stars, "is the Southern Cross."
"You don't seem very sure, Mr. Maxwell. What about South
America?"
"I thought I had told you that I had changed my mind about South
America."
"I believe you said something about putting off your journey till the
end of the year."
Duplicity on the part of the woman, who knew perfectly well the
event to which the young man referred.
It was only to gain time for reflection, as she knew that a declaration
of love trembled on his lips, but with feminine coquetry could not
help blowing hot to his cold.
"Ah!" said Archie, with a long breath, when the fierce cry had rung
out for the last time, "that is the way to win a bride."
Kaituna thought so too, although she did not make any remark, but
the shrill savagery of the song had stirred her hereditary instincts
profoundly, and even in the dim moonlight Archie could see the
distension of her nostrils, and the flash of excitement that sparkled
in her eyes. It gave him an idea, and throwing himself on his knees,
he began to woo her as fiercely and as freely as ever her dusky
ancestors had been wooed in the virgin recesses of New Zealand
woods.
"Kaituna, I love you! I love you. You must have seen it; you must
know it. This is no time for timid protestations, for doubtful sighing.
Give me your hands." He seized them in his strong grasp. "I am a
man, and I must woo like a man. I love you! I love you! I wish you
to be my wife. I am poor, but I am young, and with you beside me, I
can do great things. Say that you will marry me."
"But my father!"
He sprang to his feet, still holding her hands, and drew her forcibly
towards him.
"Your father may consent--he may refuse. I do not care for his
consent or his refusal. Say you will be my wife, and no human being
shall come between us. I have no money. I will gain a fortune for
you. I have no home--I will make one for you. Youth, love, and God
are on our side, and we are made the one for the other. You must
not say no! You shall not say no. You are the woman needed to
complete my life; and God has given you to me. Lay aside your
coquetry, your hesitations, your fears. Speak boldly to me as I do to
you. Let no false modesty--no false pride--no maidenly dread come
between us. I love you, Kaituna. Will you be my wife?"
"Kaituna!"
One who has been in strange lands, and ventured his life in far
countries, is by no means anxious to court again the dangers he has
so happily escaped. The traveller, telling his tales by his lately gained
fireside, shudders as he remembers the perils he has dared, the
risks he has encountered, and is thankful for his present safety, so
thankful indeed that he is unwilling to place his life for the second
time at the disposal of chance.
It was somewhat after this fashion that Mrs. Belswin viewed her
present security in contrast to her past jeopardy. She had been a
free-lance, an adventuress, an unprotected woman at the mercy of
the world, so hard and pitiless to such unfortunates; but now she
had found a home, a refuge, a daughter's love, a bright oasis in the
desert of affliction, and she dreaded to be driven out of this peaceful
paradise, which held all that made her life worth having, into a
stormy world once more. Through perils more deadly than those of
savage lands, through storms more terrible than those of the ocean,
she had passed into a haven of tranquillity; but now that she was
tasting of the pleasures of hope and repose, it seemed as though
she would once more be driven forth to battle with her fellow-
creatures.
Her quondam husband held her fate in his hand. He had right and
might on his side, and she knew that she could expect no mercy
from one whom she had so deeply wronged. Had the positions been
reversed she felt that she would not have scrupled to enforce the
powers she possessed, and, therefore, never for a moment dreamed
that her husband would act otherwise. All she knew was that she
was now in Paradise, that she enjoyed her daughter's affection,
ignorant as that daughter was of the mother's identity, and that the
husband of her youth, and the father of her dearly-loved child would
expel her from this hardly won Paradise as soon as he discovered
her therein.
This being the case, she did not waste time in asking for a mercy not
likely to be granted, but set herself to work to find out some means
of retaining her position in defiance of her husband's enmity and
hatred. After her conversation with Mrs. Valpy, she saw that Rupert
Pethram had glossed over the affair of the divorce in order to avoid
all suspicion of scandal against himself and the mother of his child,
for he was unwilling that the child should suffer for the sin of her
parent. This was certainly a point in her favour, as by threatening to
denounce the whole affair if she was not allowed to retain her
position she could force him to acquiesce in her demand, in order to
avoid scandal.
But then if he, though keeping the terrible affair secret from the
outside world, told Kaituna all about her mother's disgrace, thus
destroying the love which the girl had for the memory of one whom
she thought was dead--it would be too terrible, as she could urge
nothing in extenuation of her sin, and would be forced to blush
before her own child. No, nothing could be done in that way. Should
she throw herself on the mercy of the man she had wronged? Alas!
she knew his stern nature well enough to be aware of the hopeless
folly of such an attempt. Looking at the whole affair in whatever way
that suggested itself to her fertile brain, she saw no means of
retaining her position, her child or her newly-found respectability,
except by enlisting the sympathy of Ferrari and----
But it was too terrible. It was a crime. Guilty as she was, to do this
would render her still more guilty. Even if she succeeded in getting
her husband out of the way, and it was not discovered by the law,
there was still Ferrari to be reckoned with. It would give him a
strong hold over her, which he would use to force her into marriage,
and then she would be still separated from her child, so that the
crime she contemplated would be useless.
To see this woman raging up and down her bedroom was a pitiful
sight. Flinging herself on her knees she would pray to God to soften
the heart of her husband, then, realising how futile was the hope,
she would start to her feet and think again of the crime she
contemplated committing with the assistance of her Italian lover. She
raged, she wept, she sighed, she implored. Her mood changed with
every tick of the clock; from hope she fell into despair; from despair
she changed once more to hope--tears, imprecations, prayers,
threats, she tried them all in their turn, and the result was always
the same--absolute failure. She was dashing herself in vain against
an adamantine wall, for in her calmer moments she saw how
helpless she was against the position held by her husband--a
position approved of by law, approved of by the world. She could do
nothing, and she knew it.
Still, Ferrari!
Yes, she would go up and see him, for perhaps he could solve the
riddle which thus perplexed her so terribly. He would demand his
price, she knew him well enough for that. Well, she would pay it in
order to still retain possession of her child. Let her accomplish her
present desire and the future would take care of itself. So, Mrs.
Belswin, summoning all her philosophy to her aid, composed her
features, and told Kaituna that she was going up to London on
business.
"But papa will be here next week," said the girl in dismay.
"Yes; I'm sorry to go at such a time, dear," replied Mrs. Belswin, with
an immovable countenance, "but it is a very important matter that
takes me away."
"Oh, I'm glad of that," said Kaituna, with a flush; "you know I want
you to help me gain papa's consent to my marriage with Archie."
Mrs. Belswin smiled bitterly as she kissed her daughter, knowing how
weak was the reed upon which the girl leaned. She ask Rupert
Pethram to consent to the marriage--she dare to demand a favour of
the man she had wronged for the child she had forsaken! She
almost laughed as she thought of the terrible irony of the situation,
but, restraining herself with her usual self-command, bade the girl
hope for the best.
"How strangely you talk," said Kaituna, rather puzzled; "if you come
back in a fortnight you will be sure to see papa."
"Of course, dear! of course. I was only thinking that some
unforeseen accident----"
"Perhaps I shall never see her again," she said, with a groan,
throwing herself back in her seat. "But no; that will never happen;
even if Rupert does turn me out of the house he will not tell Kaituna
anything to destroy her belief in her mother, so I shall some day
meet her with her husband."
Her lips curled as she said this, knowing well that Sir Rupert would
never give his consent to the marriage, and then she clenched her
hands with a frown.
And with this firm determination she left her husband's house--the
house in which she should have reigned a happy mistress and
mother, and the house into which she had crept like a disguised
thief, the house which she, in the mad instinct of her savage nature,
intended to deprive of its master.
While waiting on the railway platform for the London train, she saw
Samson Belk.
The relations between these two were peculiar. Ever since he had
seen her at his mother's cottage, Belk had followed her everywhere
like her shadow, much to Mrs. Belswin's astonishment, for, candid in
all things to herself, she could not conceive how a handsome young
man could leave younger women for one verging on middle age. Yet
such was the case. This bucolic man had fallen passionately in love,
and adored her with all the sullen ardour of his obstinate nature. He
was slow-witted, dull-headed, and it took a long time for an idea to
penetrate into his brain, but once the idea was there, nothing could
get it out again. This woman, so different from all he had known,
who spoke in a commanding way, who flashed her eyes fiercely on
all, as if they were her slaves, had, without a word, without a sign,
brought to his knees this uncultured man, who knew nothing of the
deference due to the sex, and whose only attributes were great
physical strength and a handsome exterior. Formerly, owing to these
advantages, he had gained admiration from all women, and in return
had treated them with brutal indifference, or scarcely veiled
contempt; but now the positions were reversed, and he was the
abject slave of this imperious queen, who looked down at him with
disdain. It was a case of Samson like wax in the hands of Delilah--of
Hercules subjugated by Omphale; and Samson Belk, with all his
virile strength, his handsome face, his stalwart figure, was crouching
like a dog at the feet of Mrs. Belswin.
"Well, Mr. Belk," she said, indifferently, "what are you doing here?"
"Thank you, but there is no need. The porters will attend to all that,"
replied the lady, graciously. "But you don't look very well, Mr. Belk. I
suppose you've been drinking."
"That's just as well. You know Sir Rupert returns next week, and if
he found you to be dissipated, he'd dismiss you on the spot."
"Would he?" said Belk, sullenly. "Let him if he likes. You seem to
know Sir Rupert, madam."
Mrs. Belswin was not going to discuss this subject with a servant like
Belk, so she turned indifferently away as the train came into the
station, and left him standing there, looking in sullen admiration at
her graceful form in the dark garments she now affected.
He blurted out this question with a deep flush, and Mrs. Belswin
stared at him with undisguised astonishment. She could not
understand the reason of this man's deference, for she judged it
impossible that he could be so deeply in love with her as all his
actions seemed to denote. Good-natured, however, when not
crossed in any way, she replied politely, as the train moved off--
She laughed at the idea, and taking up the Telegraph began to read,
but suddenly laid it down with a nervous start.
"Ferrari loves me! Belk loves me! I love neither, but only my child.
Rupert stands between me and my happiness. Which of these men
will remove him out of my path? Ferrari--a subtle Italian, Belk--a
brutal Saxon. Humph! The fox and the lion over again--craft and
strength! I can depend on them both, and Rupert----"
"Number One is the greater number; if I assisted Number Two it would become the
lesser."
While his voice lasted he was well aware that he could command an
excellent income which satisfied him completely; for when he grew
old and songless he was quite prepared to return to Italy, and live
there the happy-go-lucky life of his youth on polenta and sour wine.
In his impulsive southern fashion he loved Mrs. Belswin madly; but,
strangely enough, it never for a moment occurred to him to save
money against his possible marriage with her. If he starved, she
would starve; if he made money, she would share it; and if she
objected to such a chequered existence, Signor Ferrari was quite
confident enough in his own powers of will and persuasion to be
satisfied that he could force her to accept his view of the matter.
This was the Ferrari philosophy, and no bad one either as times go,
seeing that a singer's livelihood depends entirely upon the caprice of
the public. As long as he could get enough to eat, be the food rich
or plain, a smoke, and plenty of sleep, the world could go hang for
all he cared. He lived in the present, never thought about the past,
and let the future take care of itself; so altogether managed to
scramble through life in a leisurely, selfish manner eminently
egotistical in fashion.
He did so, and was just in the middle of the first verse when Mrs.
Belswin made her appearance, upon which he stopped abruptly, and
came forward to greet her with theatrical effusion.
"Stella dora! once more you shine," he cried, seizing her hands, with
a passionate look in his dark eyes. "Oh, my life! how dear it is to see
thee again."
Mrs. Belswin drew her hand away sharply and frowned, for in her
present irritable state of mind the exaggerated manner of Ferrari
jarred on her nerves.
"How can that be acting, cruel one, which is the truth?" replied
Ferrari, reproachfully, rising from his knees. "Thou knowst my love,
and yet when I speak you are cold. Eh, Donna Lucrezia, is your
heart changed?"
"Oh, for a time; yes!" echoed Ferrari, mockingly. "Amica mia, you
have a strange way of speaking to him who adores you. Dio, you
play with me like a child. I love you, and wish you for my wife. You
say 'yes,' and depart for a time. Now return you to me and again
say, 'Stephano, I leave you for a time.'"
"I made no promise to be your wife," said Mrs. Belswin, angrily, "nor
will I do so unless you help me now."
"Help you! and in what way? Has the little daughter been cruel? You
wish me to speak as father to her."
"I wish you to do nothing of the sort. My daughter is quite well, and
I was perfectly happy with her."
"And without me," cried Ferrari, jealously; upon which Mrs. Belswin
made a gesture of irritation.
"We can settle that afterwards," she said, drawing off her gloves:
"meanwhile let us talk sense. I shall be up in town for a fortnight."
"At an hotel in the Strand. I'll give you the address before I leave."
"It all depends on whether you will help me in what I wish to do."
"Ebbene! Is it il marito?"
"How pleased you are," said Ferrari, mockingly. "Oh, yes, he will be
so sweet to behold you."
"I must! I must!" cried Mrs. Belswin in despair. "I can't give up my
child after meeting her again. Twenty years, Stephano, and I have
not seen her; now I am beside her every day. She loves me--not as
her mother, but as her friend. I can't give up all this because my
husband is returning."
Signor Ferrari shrugged his shoulders and lighted a cigarette.
"But there is nothing more you can do," he said, spreading out his
hands with a dramatic gesture, "eh, carissima? Think of what is this
affair. Il marito has said to you, 'Good-bye.' The little daughter thinks
you to be dead. If then you come to reveal yourself, il marito--eh,
amica mia! it is a trouble for all."
"Nothing! oh no, certainly! You have beheld the little daughter for a
time. Now you are to me again. I say, Stella 'dora, with me remain
and forget all."
"No, I will not! I will not!" cried Mrs. Belswin, savagely, rising to her
feet. "Cannot you see how I suffer? If you love me as you say, you
must see how I suffer. Give up my child, my life, my happiness! I
cannot do it."
"I can! I must! Do you think I will stay with you while my child calls
me?"
"With me you must stay, my Norma. I love thee. I will not leave you
no more."
Mrs. Belswin felt her helplessness, and clenched her hands with a
savage cry of despair, that seemed to be torn out of her throbbing
heart. Up and down the gaudy room she paced, with her face
convulsed with rage, and her fierce eyes flashing with an unholy fire,
while Ferrari, secure in his position, sat quietly near the window,
smoking leisurely. His self-possession seemed to provoke her, ready
as she was to vent her impotent anger on anything, and, stopping
abruptly she poured forth all her anger.
"Why do you sit there smiling, and smiling, like a fool?" she shrieked,
stamping her foot. "Can you not suggest something? Can you not do
something?"
"Eh, carissima, I would say, 'Be quiet' The people below will hear you
cry out."
"Murder!"
Signor Ferrari let the cigarette drop from his fingers, and jumped up
with a cry of dismay looking pale and unnerved. She saw this, and
lashing him with her tongue, taunted him bitterly.
"Yes, murder, you miserable! I thought you were a brave man; but I
see I made a mistake. You love me! You want to be my husband!
No, no, no! I marry a brave man--yes, a brave man; not a coward!"
He bounded across the room, and seized her roughly by the wrist.
He flung her away from him with a gesture of anger, and began to
walk about the room. Mrs. Belswin remained silent, savagely
disappointed at the failure of her plan, and presently Ferrari began
to talk again in his rapid, impulsive fashion.
"If there was any gain. Yes. But I see not anything. I would work
against myself. You know that, Signora Machiavelli. Ah, yes; I am not
blind, cara mia. While il marito lives, you are mine. He will keep you
from the little daughter. But he dies--eh, and you depart."
"I refuse your swearing. They are false. Forget, il marito--forget the
little daughter! You are mine, mia moglie, and you depart not again."
Mrs. Belswin laughed scornfully, and put on her gloves again with
the utmost deliberation. Then, taking up her umbrella, she moved
quickly towards the door; but not so quickly as to prevent Ferrari
placing himself before her.