Handbook of Machine Learning for Computational Optimization

Demystifying Technologies for Computational Excellence: Moving Towards Society 5.0

Series Editors
Vikram Bali and Vishal Bhatnagar

This series encompasses research work in the fields of Data Science, Edge Computing, Deep Learning, Distributed Ledger Technology, Extended Reality, Quantum Computing, Artificial Intelligence, and various other related areas, such as natural language processing and technologies, high-level computer vision, cognitive robotics, automated reasoning, multiagent systems, symbolic learning theories and practice, knowledge representation and the semantic web, intelligent tutoring systems, AI, and education.
The prime reason for developing this new book series is to focus on the latest technological advancements: their impact on society, the challenges faced in implementation, and the drawbacks or adverse effects that technological innovations can have on society. With these advancements, every individual has personalized access to services, and devices connected with each other communicate among themselves, making our lives simpler and easier. These aspects help us overcome the drawbacks of existing systems and build new systems with the latest technologies that serve society in various ways, establishing Society 5.0 as one of the biggest revolutions of this era.

Industry 4.0, AI, and Data Science: Research Trends and Challenges
Edited by Vikram Bali, Kakoli Banerjee, Narendra Kumar, Sanjay Gour, and Sunil Kumar Chawla

Handbook of Machine Learning for Computational Optimization: Applications and Case Studies
Edited by Vishal Jain, Sapna Juneja, Abhinav Juneja, and Ramani Kannan

Data Science and Innovations for Intelligent Systems: Computational Excellence and Society 5.0
Edited by Kavita Taneja, Harmunish Taneja, Kuldeep Kumar, Arvind Selwal, and Ouh Lieh

Artificial Intelligence, Machine Learning, and Data Science Technologies: Future Impact and Well-Being for Society 5.0
Edited by Neeraj Mohan, Ruchi Singla, Priyanka Kaushal, and Seifedine Kadry

For more information on this series, please visit: https://www.routledge.com/Demystifying-Technologies-for-Computational-Excellence-Moving-Towards-Society-5.0/book-series/CRCDTCEMTS
Handbook of Machine Learning for Computational Optimization: Applications and Case Studies

Edited by
Vishal Jain, Sapna Juneja, Abhinav Juneja, and Ramani Kannan
First edition published 2022
by CRC Press
6000 Broken Sound Parkway NW, Suite 300, Boca Raton, FL 33487-2742

and by CRC Press


2 Park Square, Milton Park, Abingdon, Oxon, OX14 4RN

© 2022 selection and editorial matter, Vishal Jain, Sapna Juneja, Abhinav Juneja, and Ramani
Kannan; individual chapters, the contributors

CRC Press is an imprint of Taylor & Francis Group, LLC

Reasonable efforts have been made to publish reliable data and information, but the author and publisher cannot assume responsibility for the validity of all materials or the consequences of their use. The authors and publishers have attempted to trace the copyright holders of all material reproduced in this publication and apologize to copyright holders if permission to publish in this form has not been obtained. If any copyright material has not been acknowledged please write and let us know so we may rectify in any future reprint.

Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced, transmitted, or utilized in any form by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying, microfilming, and recording, or in any information storage or retrieval system, without written permission from the publishers.

For permission to photocopy or use material electronically from this work, access www.copyright.com or contact the Copyright Clearance Center, Inc. (CCC), 222 Rosewood Drive, Danvers, MA 01923, 978-750-8400. For works that are not available on CCC please contact mpkbookspermissions@tandf.co.uk

Trademark notice: Product or corporate names may be trademarks or registered trademarks and are used only for identification and explanation without intent to infringe.

Library of Congress Cataloging-in-Publication Data

Names: Jain, Vishal, 1983- editor.
Title: Handbook of machine learning for computational optimization : applications and case studies / Vishal Jain, Sapna Juneja, Abhinav Juneja, Ramani Kannan.
Description: Boca Raton : CRC Press, 2021. | Series: Demystifying technologies for computational excellence | Includes bibliographical references and index.
Identifiers: LCCN 2021017098 (print) | LCCN 2021017099 (ebook) | ISBN 9780367685423 (hardback) | ISBN 9780367685454 (paperback) | ISBN 9781003138020 (ebook)
Subjects: LCSH: Machine learning—Industrial applications. | Mathematical optimization—Data processing. | Artificial intelligence.
Classification: LCC Q325.5 .H295 2021 (print) | LCC Q325.5 (ebook) | DDC 006.3/1—dc23
LC record available at https://lccn.loc.gov/2021017098
LC ebook record available at https://lccn.loc.gov/2021017099

ISBN: 978-0-367-68542-3 (hbk)


ISBN: 978-0-367-68545-4 (pbk)
ISBN: 978-1-003-13802-0 (ebk)

DOI: 10.1201/9781003138020

Typeset in Times
by codeMantra
Contents
Preface...................................................................................................................... vii
Editors....................................................................................................................... xi
Contributors ............................................................................................................xiii

Chapter 1 Random Variables in Machine Learning ............................................. 1
Piratla Srihari

Chapter 2 Analysis of EMG Signals using Extreme Learning Machine with Nature Inspired Feature Selection Techniques .......................... 27
A. Anitha and A. Bakiya

Chapter 3 Detection of Breast Cancer by Using Various Machine Learning and Deep Learning Algorithms ................................................... 51
Yogesh Jadhav and Harsh Mathur

Chapter 4 Assessing the Radial Efficiency Performance of Bus Transport Sector Using Data Envelopment Analysis ...................................... 71
Swati Goyal, Shivi Agarwal, Trilok Mathur, and Nirbhay Mathur

Chapter 5 Weight-Based Codes—A Binary Error Control Coding Scheme—A Machine Learning Approach ....................................... 89
Piratla Srihari

Chapter 6 Massive Data Classification of Brain Tumors Using DNN: Opportunity in Medical Healthcare 4.0 through Sensors ................. 95
Rohit Rastogi, Akshit Rajan Rastogi, D.K. Chaturvedi, Sheelu Sagar, and Neeti Tandon

Chapter 7 Deep Learning Approach for Traffic Sign Recognition on Embedded Systems .......................................................................113
A. Shivankit, Gurminder Kaur, Sapna Juneja, and Abhinav Juneja

Chapter 8 Lung Cancer Risk Stratification Using ML and AI on Sensor-Based IoT: An Increasing Technological Trend for Health of Humanity ....137
Rohit Rastogi, Mukund Rastogi, D.K. Chaturvedi, Sheelu Sagar, and Neeti Tandon

Chapter 9 Statistical Feedback Evaluation System ...........................................153
Alok Kumar and Renu Jain

Chapter 10 Emission of Herbal Woods to Deal with Pollution and Diseases: Pandemic-Based Threats ...................................................................183
Rohit Rastogi, Mamta Saxena, D. K. Chaturvedi, and Sheelu Sagar

Chapter 11 Artificial Neural Networks: A Comprehensive Review .................. 203
Neelam Nehra, Pardeep Sangwan, and Divya Kumar

Chapter 12 A Case Study on Machine Learning to Predict the Students' Result in Higher Education .............................................................. 229
Tejashree U. Sawant and Urmila R. Pol

Chapter 13 Data Analytic Approach for Assessment Status of Awareness of Tuberculosis in Nigeria .................................................................... 243
Ishola Dada Muraina, Rafeeah Rufai Madaki, and Aisha Umar Suleiman

Chapter 14 Active Learning from an Imbalanced Dataset: A Study Conducted on the Depression, Anxiety, and Stress Dataset ..............251
Umme Salma M. and Amala Ann K. A.

Chapter 15 Classification of the Magnetic Resonance Imaging of the Brain Tumor Using the Residual Neural Network Framework ..................267
Tina and Sanjay Kumar Dubey

Index ...................................................................................................................... 279
Preface
Machine learning is a technology that has been trusted for decades and has flourished on a global scale, touching the lives of each one of us. Modern-day decision making and processes all depend on machine learning technology to make mature short-term and long-term decisions. Machine learning is blessed with phenomenal support from the research community and has landmark contributions, which enable it to find new applications every day. The dependency of human processes on machine learning-driven systems encompasses all spheres of current state-of-the-art systems, owing to the level of reliability it offers. There is huge potential in this domain to make the best use of machines in order to ensure optimal prediction, execution, and decision making. Although machine learning is not a new field, it has evolved with the ages, and the research community around the globe has made remarkable contributions to its growth and to the trust placed in the applications that incorporate it. The predictive and futuristic approach associated with machine learning makes it a promising tool for business processes as a sustainable solution. There is ample scope in the technology to propose and devise newer algorithms, more efficient and reliable, that give machine learning an entirely new dimension in discovering certain latent domains of applications it may support. This book addresses the issues that can resolve modern-day computational bottlenecks, which need smarter and optimal machine learning-based intervention to make processes even more efficient. This book presents innovative and improvised machine learning techniques which can complement, enrich, and optimize the existing glossary of machine learning methods. This book also has contributions focusing on application-based innovative optimized machine learning solutions, which will give the readers a vision of how innovation using machine learning may aid in the optimization of human and business processes.
We have tried to knit this book as a read for all, wherein learners and researchers shall get insights about the possible dimensions to explore in their specific areas of interest. The chapter-wise description is as follows:
Chapter 1 explores the basic concepts of random variables (single and multiple), and their role and applications in the specified areas of machine learning.
Chapter 2 demonstrates the Wigner-Ville transformation technique to extract time-frequency domain features from typical and atypical EMG signals: myopathy (muscle disorder) and amyotrophic lateral sclerosis (neuro disorder). Nature-inspired feature selection algorithms, namely the whale optimization algorithm (WOA), genetic algorithm (GA), bat algorithm (BA), firefly optimization (FA), and particle swarm optimization (PSO), are utilized to determine the relevant features from the constructed features.
Chapter 3 presents various machine learning (ML) algorithms that can be used for breast cancer detection. Since these techniques are commonly used in many areas, they are also used for making decisions regarding diagnosis and clinical studies.


Chapter 4 measures the efficiency and thoroughly explores the scope for optimal utilization of the input resources owned by depots of the RSRTC. The new slack model (NSM) of DEA is used, as it enumerates the slacks for input and output variables. The model satisfies the radial properties, unit invariance, and translation invariance. This study enables policy-makers to evaluate inputs for consistent output up to the optimum level and improve the performance of the inefficient depots.
Chapter 5 presents a binary error control coding scheme using weight-based codes. This method is widely used for classification and employs the K-nearest-neighbour algorithm. The chapter also discusses the role of the distance matrix in Hamming code evaluation.
Chapter 6 uses MRI images of the brain to create deep neural network models that can discriminate between different types of brain tumors. To perform this task, deep learning is used: a form of machine learning in which multiple levels of lower-level features are combined to produce higher-level definitions.
Chapter 7 focuses on creating an affordable and effective warning system for drivers that is able to detect the warning sign boards and speed limits in front of the moving vehicle, and prompt the driver to slow to safer speeds if required. The software internally works on a modern deep learning-based neural network, YOLO (You Only Look Once), with certain modifications, which allows it to detect road signs quickly and accurately on low-powered ARM CPUs.
Chapter 8 presents an approach for the classification of lung cancer based on the associated risk levels (high risk, low risk). The study was conducted using a lung cancer classification scheme, by studying micrographs and classifying them with a deep neural network using a machine learning (ML) framework.
Chapter 9 presents a statistical feedback evaluation system that allows one to design an effective questionnaire using statistical knowledge of the text. In this questionnaire, the questions and their weights are not pre-decided. It is established that questionnaire-based feedback systems are traditional and quite straightforward, but these systems are very static and restrictive. The proposed statistical feedback evaluation system is helpful to users and manufacturers in finding the appropriate item as per their choices.
Chapter 10 presents experimental work based on data collected on various parameters with the Air Veda instrument, a scientific analytical measuring tool, and IoT-based sensors that capture humidity and temperature data from atmospheric air at certain intervals of time, in order to identify the patterns of pollution increase or decrease in the atmosphere of the nearby area.
Chapter 11 concerns neural network representations and the definition of suitable problems for neural network learning. It covers numerous alternative designs for the primitive units making up an artificial neural network, such as perceptron units, sigmoid units, and linear units. This chapter also covers the learning algorithms for training single units. The backpropagation algorithm for multilayer perceptron training is described in detail. General issues, such as the representational capabilities of ANNs, overfitting problems, and alternatives to the backpropagation algorithm, are also explained.

Chapter 12 proposes a system which makes use of the machine learning approach to predict a student's performance. Based on the student's current performance and some measurable past attributes, the end result can be predicted, to classify students among good or bad performers. The predictive models will make the students who are likely to struggle during the final examinations aware in advance.
Chapter 13 presents a study that assists in assessing the awareness status of people regarding TB, towards its mitigation, and serves as a contribution to the field of health informatics. Indeed, the majority of participants claimed that they had low awareness of TB and its associated issues in their communities. The participants were from Kano state, a strategic location in the northern part of Nigeria, which means that the result of the experiment can represent the prevailing opinions of northern residents.
Chapter 14 deals with psychological data related to depression, anxiety, and stress, to study how classification and analysis are carried out on imbalanced data. The proposed work not only provides practical information about balancing techniques like SMOTE, but also reveals strategies for dealing with the behaviour of many existing classification algorithms, such as SVM, Random Forest, and XGBoost, on imbalanced datasets.
Chapter 15 proposes the construction of a segmented mask of an MRI (magnetic resonance image) using a CNN approach with an implementation of the ResNet framework. The understanding of the ResNet framework using a layered approach will provide extensive anatomical information from higher-dimensional images for precise clinical analysis, towards the curative treatment of patients.
Editors
Vishal Jain is an Associate Professor in the Department of CSE at Sharda University, Greater Noida, India. He earlier worked with Bharati Vidyapeeth's Institute of Computer Applications and Management (BVICAM), New Delhi, India (affiliated with Guru Gobind Singh Indraprastha University, and accredited by the All India Council for Technical Education), which he first joined as an Assistant Professor. Before that, he worked for several years at the Guru Premsukh Memorial College of Engineering, Delhi, India. He has more than 350 research citations on Google Scholar (h-index 9 and i10-index 9). He has authored more than 70 research papers in reputed conferences and journals, including Web of Science and Scopus. He has authored and edited more than 10 books with various reputed publishers, including Springer, Apple Academic Press, Scrivener, Emerald, and IGI-Global. His research areas include information retrieval, semantic web, ontology engineering, data mining, ad hoc networks, and sensor networks. He was the recipient of a Young Active Member Award for the year 2012-2013 from the Computer Society of India, the Best Faculty Award for the year 2017, and the Best Researcher Award for the year 2019 from BVICAM, New Delhi.

Sapna Juneja is a Professor at IMS, Ghaziabad, India. Earlier, she worked as a Professor in the Department of CSE at IITM Group of Institutions and at BMIET, Sonepat. She has more than 16 years of teaching experience. She completed her doctorate and master's in Computer Science and Engineering from M.D. University, Rohtak, in 2018 and 2010, respectively. Her broad area of research is software reliability of embedded systems. Her areas of interest include Software Engineering, Computer Networks, Operating Systems, Database Management Systems, and Artificial Intelligence. She has guided several research theses of UG and PG students in Computer Science and Engineering. She is editing a book on recent technological developments.

Abhinav Juneja is currently working as a Professor in the Department of IT at KIET Group of Institutions, Delhi-NCR, Ghaziabad, India. Earlier, he worked as an Associate Director and a Professor in the Department of CSE at BMIET, Sonepat. He has more than 19 years of experience teaching postgraduate and undergraduate engineering students. He completed his doctorate in Computer Science and Engineering from M.D. University, Rohtak, in 2018, and his master's in Information Technology from GGSIPU, Delhi. He has research interests in the fields of software reliability, IoT, machine learning, and soft computing. He has published several papers in reputed national and international journals. He has been a reviewer for several journals of repute and has served on various committees of international conferences.

Ramani Kannan is currently working as a Senior Lecturer at the Center for Smart Grid Energy Research, Institute of Autonomous Systems, Universiti Teknologi PETRONAS (UTP), Malaysia. Dr. Kannan completed his Ph.D. (Power Electronics and Drives) at Anna University, India, in 2012; M.E. (Power Electronics and Drives) at Anna University, India, in 2006; and B.E. (Electronics and Communication) at Bharathiyar University, India, in 2004. He has more than 15 years of experience in prestigious educational institutes. Dr. Kannan has published more than 130 papers in various reputed national and international journals and conferences. He is the editor, co-editor, guest editor, and reviewer of various books, including for Springer Nature, Elsevier, etc. He received the best presenter award at the IEEE Conference on Energy Conversion (CENCON 2019), Indonesia.
Contributors

Shivi Agarwal
Department of Mathematics
BITS Pilani
Pilani, Rajasthan, India

A. Anitha
D.G. Vaishnav College
Chennai, India

A. Bakiya
MIT Campus, Anna University
Chennai, India

Amala Ann K. A.
Data Science Department
CHRIST (Deemed to be University)
Bangalore, India

D. K. Chaturvedi
Department of Electrical Engineering
DEI, Agra, India

Sanjay Kumar Dubey
Department of Computer Science and Engineering
Amity University
Noida, Uttar Pradesh, India

Ayushi Ghosh
Maulana Abul Kalam Azad University of Technology
Kolkata, West Bengal, India

Swati Goyal
Department of Mathematics
BITS Pilani
Pilani, Rajasthan, India

Yogesh Jadhav
Research Scholar
Madhyanchal Professional University
Bhopal, Madhya Pradesh, India

Renu Jain
University Institute of Engineering and Technology
CSJM University
Kanpur, Uttar Pradesh, India

Abhinav Juneja
KIET Group of Institutions
Ghaziabad, Uttar Pradesh, India

Sapna Juneja
Department of Computer Science
IMS Engineering College
Ghaziabad, Uttar Pradesh, India

Gurminder Kaur
Department of Computer Science and Engineering
BM Institute of Engineering & Technology
Sonepat, India

Alok Kumar
University Institute of Engineering and Technology
CSJM University
Kanpur, Uttar Pradesh, India

Divya Kumar
Department of ECE
IFTMU
Moradabad, Uttar Pradesh, India

Rafeeah Rufai Madaki
Department of Computer Science
Yusuf Maitama Sule University (Formerly, Northwest University)
Kano, Nigeria

Harsh Mathur
Department of Computer Science
Madhyanchal Professional University
Bhopal, Madhya Pradesh, India

Trilok Mathur
Department of Mathematics
BITS Pilani
Pilani, Rajasthan, India

Nirbhay Mathur
Department of Electrical & Electronics
Universiti Teknologi PETRONAS
Perak, Malaysia

Ishola D. Muraina
Department of Computer Science
Yusuf Maitama Sule University (Formerly, Northwest University)
Kano, Nigeria

Neelam Nehra
Department of ECE
MSIT
Delhi, India

Urmila R. Pol
Department of Computer Science
Shivaji University
Kolhapur, Maharashtra, India

Akshit Rajan Rastogi
Department of CSE
ABES Engg. College
Ghaziabad, Uttar Pradesh, India

Mukund Rastogi
Department of CSE
ABES Engg. College
Ghaziabad, Uttar Pradesh, India

Rohit Rastogi
Department of CSE
ABES Engg. College
Ghaziabad, Uttar Pradesh, India

Sheelu Sagar
Amity International Business School
Noida, Uttar Pradesh, India

Umme Salma M.
Department of Computer Science
CHRIST (Deemed to be University)
Bangalore, India

Pardeep Sangwan
Department of ECE
MSIT
Delhi, India

Tejashree U. Sawant
Department of Computer Science
Shivaji University
Kolhapur, Maharashtra, India

Mamta Saxena
Ministry of Statistics
Govt. of India
Delhi, India

A. Shivankit
Department of CSE
BM Institute of Engineering & Technology
Sonepat, India

Piratla Srihari
Department of ECE
Geethanjali College of Engineering and Technology
Hyderabad, Telangana, India

Aisha Umar Suleiman
Department of ECE
Yusuf Maitama Sule University (Formerly, Northwest University)
Kano, Nigeria

Neeti Tandon
Research Scholar
Vikram University
Ujjain, Madhya Pradesh, India

Tina
Department of Computer Science Engineering
Amity University
Noida, Uttar Pradesh, India
1 Random Variables in Machine Learning

Piratla Srihari
Geethanjali College of Engineering and Technology
CONTENTS

1.1 Introduction ......................................................................................................2
1.2 Random Variable ..............................................................................................3
    1.2.1 Definition and Classification .................................................................3
          1.2.1.1 Applications in Machine Learning ........................................4
    1.2.2 Describing a Random Variable in Terms of Probabilities ....................4
          1.2.2.1 Ambiguity with Reference to Continuous Random Variable ...5
    1.2.3 Probability Density Function ................................................................6
          1.2.3.1 Properties of pdf ....................................................................6
          1.2.3.2 Applications in Machine Learning ........................................7
1.3 Various Random Variables Used in Machine Learning ...................................7
    1.3.1 Continuous Random Variables ..............................................................7
          1.3.1.1 Uniform Random Variable .....................................................7
          1.3.1.2 Gaussian (Normal) Random Variable ....................................8
    1.3.2 Discrete Random Variables ................................................................ 10
          1.3.2.1 Bernoulli Random Variable ................................................. 10
          1.3.2.2 Binomial Random Variable ................................................. 11
          1.3.2.3 Poisson Random Variable .................................................... 12
1.4 Moments of Random Variable ........................................................................ 13
    1.4.1 Moments about Origin ........................................................................ 13
          1.4.1.1 Applications in Machine Learning ...................................... 13
    1.4.2 Moments about Mean ......................................................................... 14
          1.4.2.1 Applications in Machine Learning ...................................... 14
1.5 Standardized Random Variable ...................................................................... 15
    1.5.1 Applications in Machine Learning ..................................................... 15
1.6 Multiple Random Variables ............................................................................ 16
    1.6.1 Joint Random Variables ...................................................................... 17
          1.6.1.1 Joint Cumulative Distribution Function (Joint CDF) ........... 17
          1.6.1.2 Joint Probability Density Function (Joint pdf) .................... 17
          1.6.1.3 Statistically Independent Random Variables ....................... 18
          1.6.1.4 Density of Sum of Independent Random Variables ............. 18
          1.6.1.5 Central Limit Theorem ........................................................ 19
          1.6.1.6 Joint Moments of Random Variables ................................... 19
          1.6.1.7 Conditional Probability and Conditional Density Function of Random Variables ............................................ 22
1.7 Transformation of Random Variables ............................................................. 23
    1.7.1 Applications in Machine Learning ..................................................... 23
1.8 Conclusion ......................................................................................................24
References ................................................................................................................24

DOI: 10.1201/9781003138020-1

1.1 INTRODUCTION
Predicting the future using knowledge about the past is the fundamental objective of machine learning.
In a digital communication system, a binary data generation scheme referred to as differential pulse code modulation (DPCM) works on a similar principle: based on the past behaviour of the signal, its future value is predicted, using a predictor. A tapped delay line filter serves the purpose. The higher the order of the predictor, the better the prediction, i.e. the smaller the prediction error.[1]
Thus machine learning, even though it was not referred to by this name earlier, was and is an integral part of the technical world.
The same prediction error with reference to a DPCM system is now being addressed as the confidence interval in connection with machine learning. A smaller prediction error implies a better prediction, and as far as machine learning is concerned, the probability of the predicted value lying within the tolerable limits of error (which is the confidence interval) should be large; this probability is a metric for the accuracy of prediction.
The machine learning methodology involves the process of building a statistical model for a particular task, based on knowledge of past data. This collected past data with reference to a task is referred to as the data set.
This way of developing models to predict 'what is going to happen', based on what has 'happened', is predictive modelling.
In detective analysis also, the 'happened', i.e. past data, is used, but there is no necessity of predicting 'what is going to happen'.
For example, 30-35 years back, the Reynolds-045 pen ruled the market for a long time, specifically in South India. Presently, its sales are not that significant. If it is required to study the journey of the pen from past to present, detective analysis is to be performed, since there is no necessity of predicting its future sales. Similarly, a study of 'Why did the sales of a particular model of an automobile come down?' also belongs to the same category.
The data set referred to above is used by the machine to learn, and hence it is also referred to as the training data set. After learning, the machine faces the test data. Using the knowledge the machine gained through learning, it should act on the test data to resolve the task assigned.
In predictive modelling, if the learning mechanism of the machine is supervised by somebody, then the mode of learning is referred to as supervised learning. That supervising 'somebody' is the training data set, also referred to as labelled training data, where each labelled data element D_i is mapped to a data element D_0. Many such pairs of elements are the learning resources for the machine, and are used to build the model, using which the machine predicts, i.e. this knowledge about the mapped pairs helps the machine to map the test data pairs (input-output pairs).
It can be inferred that in supervised learning there is a target variable which is to be predicted.
Example: Based on the symptoms of a patient, it is to be predicted whether he/she is suffering from a particular disease. To enable this prediction, the past history or statistics are used, i.e. which patients with what (similar) symptoms were categorized under what disease. This past data (both symptoms and categorization) is the training data set that supervises the machine in its process of prediction. Here, the target variable is the disease of the patient, which is to be predicted.
In the unsupervised learning mechanism, the training data is considered to be unlabelled, i.e. only D_i. The major functionality of unsupervised learning is pattern identification.
Some of the tasks under unsupervised learning are:

Clustering: Group all the people wearing white (near-white) shirts.
Density Estimation: If points are randomly distributed along an axis, the regions along the axis with the minimum/moderate/maximum number of points need to be estimated.

It can be inferred that in unsupervised learning there is no target variable to be predicted.
With reference to the previous example of the patient with ill-health, all the people with a particular symptom of ill-health may need to be grouped; however, the disease of the patient need not be predicted, whereas it is the target variable with reference to supervised learning.

1.2 RANDOM VARIABLE

1.2.1 DEFINITION AND CLASSIFICATION
For an experiment E to be performed, let S be the set of all possible outcomes of the experiment (the sample space) and ξ be an outcome defined on S. The domain of X = f(ξ) is S.
The range of the function depends on the mapping between the outcomes of the experiment and numerical values, specifically real numbers.
This X is referred to as a random variable; thus, a random variable is a real-valued function defined on S.
Example: For the experiment E = 'simultaneous throw of two dice',
S = {(1,1), (1,2), ..., (1,6), (2,1), (2,2), ..., (2,6), ..., (6,6)}, where each number of each pair of this set indicates the face shown by the particular die.
Each element of S is mapped to a real value by the function

X = f(ξ) = sum of the two faces

Table 1.1 gives the pairs of all possible outcomes with the corresponding real values mapped.

TABLE 1.1
Sample Space and Mapped Values

Pairs in the Sample Space                        Real Value
(1,1)                                            2
(1,2), (2,1)                                     3
(1,3), (2,2), (3,1)                              4
(1,4), (2,3), (3,2), (4,1)                       5
(1,5), (2,4), (3,3), (4,2), (5,1)                6
(1,6), (2,5), (3,4), (4,3), (5,2), (6,1)         7
(2,6), (3,5), (4,4), (5,3), (6,2)                8
(3,6), (4,5), (5,4), (6,3)                       9
(4,6), (5,5), (6,4)                              10
(5,6), (6,5)                                     11
(6,6)                                            12

This X is referred to as a random variable taking all the real values as mentioned. Thus, a random variable can be considered as a rule by which a real value is assigned to each outcome of the experiment.
If X = f(ξ) possesses a countably infinite range (the points in the range are large in number, but can be counted), X is referred to as a discrete random variable (categorical, with reference to machine learning).
On the other hand, if the range of the function is uncountably infinite (large in number, and cannot be counted), X is referred to as a continuous random variable[2,3] (similar terminology is used with reference to machine learning also).
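As a quick illustration of this mapping (a minimal Python sketch, not part of the original text), the probabilities of the dice-sum random variable can be computed by enumerating the sample space:

```python
# Enumerate S for the two-dice experiment and apply X = f(xi) = sum of faces.
from itertools import product
from collections import Counter

sample_space = list(product(range(1, 7), repeat=2))   # all 36 outcomes
counts = Counter(a + b for a, b in sample_space)      # mapped value -> count

for value in sorted(counts):
    # P(X = value); e.g. P(X = 7) = 6/36, matching Table 1.1
    print(value, counts[value] / len(sample_space))
```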

1.2.1.1 Applications in Machine Learning
In the process of prediction, which is the major functionality for which machine learning is used, the variables involved are the predictor variables and the target variable. The very fundamental task of a machine learning methodology is to identify these variables with reference to the given task.
A predictor is an independent variable, which is used for prediction; the target variable is the one being predicted, and is dependent.
In machine learning terminology, variables are categorical (discrete in nature, e.g. the number of people who survived an accident) or continuous (can have an infinite number of values between the maximum and the minimum, e.g. the age of a person). These variables are nothing but the discrete and continuous random variables, and are task specific. Identification of these variables is the primary stage of machine learning.[4]
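A minimal sketch of this identification step follows; the data frame and its column names are hypothetical, chosen only to illustrate splitting a task's variables into the target, the categorical predictors, and the continuous predictors:

```python
import pandas as pd

# Hypothetical data set: predict survival from the other two columns.
df = pd.DataFrame({
    "age": [25, 40, 31, 58],            # continuous predictor
    "gender": ["F", "M", "M", "F"],     # categorical predictor
    "survived": [1, 0, 1, 0],           # target variable (to be predicted)
})

target = df["survived"]
predictors = df.drop(columns=["survived"])
categorical = predictors.select_dtypes(include="object").columns.tolist()
continuous = predictors.select_dtypes(include="number").columns.tolist()
print(categorical, continuous)          # ['gender'] ['age']
```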

1.2.2 DESCRIBING A RANDOM VARIABLE IN TERMS OF PROBABILITIES
With reference to the above example, it can be stated that the value mapped (which is considered as the value taken by the random variable X) is not always 2 or 3 or 4, etc., but there is some certainty associated with the mapping, i.e. X will take the value 7 only under certain outcomes, with a certainty of 7/36.

Thus, a definition of a random variable is not complete simply by specifying its values; it requires a probabilistic description that deals with the probabilities that X takes on a specific value or values.
This description is done by the probability mass function (pmf), which assigns a probability to each value of X.
The probability for X = x (a value taken by X) is P(X = x) and is assigned by the corresponding pmf.[2]
Example: Consider the case of tossing an unbiased coin, and let this process of tossing be repeated till a head occurs for the first time. X = number of times the coin is tossed is the random variable. The corresponding sample space can be described as follows:
Since the coin is a fair coin, the events of getting a head (H) or a tail (T) in each flip are equally likely, i.e. P(H) = P(T) = 1/2.
In the first flip, if a head occurs, there will be no second toss. Then X = 1 with probability 1/2. On the other hand, if it is a tail, the user goes to the second flip. If the outcome is a head in the second flip, there will be no third toss. Now X = 2 with probability 1/4. This continues recursively.
Table 1.2 expresses the various values taken by X, with the corresponding probabilities.

TABLE 1.2
Probability Distribution

x_i (value taken by X)    1      2      3      4       ...
Probability               1/2    1/4    1/8    1/16    ...

The function that assigns the probability to each value taken by X, i.e. the pmf, is

P(X = n) = (1/2)^n,  n = 1, 2, 3, ...

The properties possessed by a pmf are:

(i) 0 ≤ P(X = n) ≤ 1    (ii) Σ_n P(X = n) = 1
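A short numerical check of this pmf (an illustrative sketch; the truncation at n = 60 is arbitrary and accurate to machine precision):

```python
# P(X = n) = (1/2)**n for n = 1, 2, 3, ...: each term lies in [0, 1]
# and the terms sum to 1, as required of a valid pmf.
pmf = [0.5 ** n for n in range(1, 61)]
assert all(0.0 <= p <= 1.0 for p in pmf)
print(sum(pmf))   # ~1.0
```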

1.2.2.1 Ambiguity with Reference to Continuous Random Variable
Let a discrete random variable X take values from a set of N values, i.e. {0, 1, 2, 3, ..., N − 1}, where the values are equally likely. The corresponding pmf is P(X = k) = 1/N, k = 0, 1, 2, ..., (N − 1), and lim_{N→∞} P(X = k) = lim_{N→∞} 1/N = 0.
In this limit, the random variable X is considered to be continuous, and P(X = k) (where k is any one of the N values) is found to be zero.
Thus, it can be concluded that the probability of a continuous random variable taking a specified value is typically zero.[2]
Under such a condition, the pmf is not suitable for describing a random variable.


1.2.2.1.1 Cumulative Distribution Function
Instead of X taking a specific value, consider the case of X lying in a range, i.e. X ≤ k; the corresponding probability P(X ≤ k) is referred to as the cumulative distribution function (CDF).[1]
Thus, the CDF of a random variable X is F_X(x) = P(X ≤ x), where x is the value, and instead of the pmf, the CDF is used to describe a continuous random variable.
Since the CDF is also a probability, it is bounded as 0 ≤ F_X(x) ≤ 1.
Properties of the CDF are:

• F_X(−∞) = 0, since the event X ≤ −∞ will never happen.
• F_X(∞) = 1, since the event X ≤ ∞ is always true.
• F_X(x₁) ≤ F_X(x₂) if x₁ < x₂, since X ≤ x₂ is a superset of X ≤ x₁; thus, the CDF is a nondecreasing function of x.
• P(x₁ < X ≤ x₂) = F_X(x₂) − F_X(x₁)

1.2.3 PROBABILITY DENSITY FUNCTION
Even though the CDF is a substitute for the pmf for describing a continuous random variable, it may not be in a closed form for all random variables.
Example: For a Gaussian random variable Y, the CDF is P(Y ≤ k) = 1 − Q((k − m)/σ), and the function Q(·) cannot be expressed in a closed form. Here, m and σ are, respectively, the mean and standard deviation of Y.
Under such circumstances, the probability density function (pdf) is an alternative tool to describe a random variable statistically. It is defined for a random variable X at x as

f_X(x) = lim_{δ→0} P(x ≤ X < x + δ)/δ.

The certainty or probability with which X lies in the interval (x, x + δ) is P(x ≤ X < x + δ), and the denominator δ is the width (length) of the interval. Thus, f_X(x) is the probability normalized by the width of the interval and can be interpreted as probability divided by width.
From the properties of the CDF, P(x ≤ X < x + δ) = F_X(x + δ) − F_X(x).
Hence,

f_X(x) = lim_{δ→0} [F_X(x + δ) − F_X(x)]/δ = (d/dx) F_X(x),

i.e. the pdf is the rate of change of the CDF.[2]

1.2.3.1 Properties of pdf

• f(x) ≥ 0
• ∫_{−∞}^{∞} f(x) dx = 1
• F_X(x) = ∫_{−∞}^{x} f(α) dα
• ∫_{a}^{b} f(x) dx = P(a < X < b)

1.2.3.2 Applications in Machine Learning

• A confidence interval is an estimate of a parameter computed based on statistical observations of it.
  • It specifies a range of reasonable values for the quantity being estimated, and the accuracy of the estimation is expressed in terms of a confidence level.
  • With a given confidence interval, if the observations of a parameter p are p₁, p₂, ..., p_n, with confidence level ε, it can be interpreted that the estimated value of p lies in the given confidence interval with probability ε.
  • This is expressed as P(a < X < b), where X is the parameter being estimated, (a, b) is the confidence interval and P(a < X < b) is the confidence level, which can be computed from the probability density of the parameter being estimated.
  • The confidence level P(α < X < β) can be computed as ∫_{α}^{β} f(x) dx.
• In evaluating a model based on predicted probabilities (in connection with binary classification), one of the evaluation metrics is the area under curve-receiver operating characteristic (AUC-ROC) metric.
  • The ROC is also referred to as the false-positive rate versus true-positive rate curve, which can be obtained from the predicted probabilities, with reference to binary classification.
  • Once this curve is known, the area under it, i.e. ∫ f(x) dx, is a measure of the accuracy of the prediction.
  • Ideally, the area is assumed to be 1 (which is the area enclosed by any valid density function).
  • The closer the measured area is to 1, the better the performance of the prediction model[4,5] (see the sketch after this list).
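The sketch below illustrates both ideas numerically, assuming a standard normal density for the confidence-level computation and hypothetical labels and predicted probabilities for the AUC-ROC part:

```python
from scipy.stats import norm
from sklearn.metrics import roc_auc_score

# Confidence level P(a < X < b) = integral of f(x) over (a, b),
# evaluated via the CDF for X ~ N(0, 1).
a, b = -1.96, 1.96
print(norm.cdf(b) - norm.cdf(a))        # ~0.95

# AUC-ROC from predicted probabilities (hypothetical values).
y_true = [0, 0, 1, 1]
y_score = [0.10, 0.40, 0.35, 0.80]
print(roc_auc_score(y_true, y_score))   # closer to 1 => better model
```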

1.3 VARIOUS RANDOM VARIABLES USED IN MACHINE LEARNING

1.3.1 CONTINUOUS RANDOM VARIABLES

1.3.1.1 Uniform Random Variable

• A continuous random variable X that is uniform over (α, β), i.e. with constant density within its specified limits, has the density function

  f(x) = 1/(β − α) for α < x < β, and f(x) = 0 elsewhere.

• A uniform discrete X takes all the possible values between (α, β) with equal likelihood.

The physical significance of the uniform density is that X can lie in any interval of a given width within the limits (α, β) with the same probability: for any confidence interval (k, k + δ), where α < k < β, the confidence level is P(k < X < k + δ) = δ/(β − α).

• Any confidence level P(α₁ < X < β₁) for the given confidence interval (α₁, β₁) can be obtained as ∫_{α₁}^{β₁} 1/(β − α) dx (see the sketch after this list).
• The CDF of the uniformly distributed continuous random variable X is

  F_X(x) = 0 for x < α; (x − α)/(β − α) for α < x ≤ β; 1 for x > β.

• A uniform random variable is said to be symmetric about its mean, (α + β)/2.[6]
1.3.1.1.1 Applications in Machine Learning

• When there is no prior realization of the distribution of the variable being predicted, the variable is considered to lie anywhere in the interval under consideration with the same probability, i.e. the variable under consideration is treated as uniform.
• In classification, decision tree learning is the most widely used algorithm. The objective of a decision tree is to have pure nodes. The purity of a node and the information gain are related as follows (see the sketch after this list):
  • The more impure the node, the more information is required to describe it.
  • The more the information gain, the more homogeneous or pure the node.
  • Information gain = 1 − entropy.
  • The more the entropy, the less pure the node, and vice versa.
  • A uniform random variable has the maximum entropy.
  • For example, in a decision tree, if a node contains two equiprobable classes, the corresponding entropy is maximum, which is the indication of the most impure node.[5,6]
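A minimal sketch of the entropy computation referred to above (Shannon entropy, in bits, of the class proportions in a node):

```python
import math

def node_entropy(proportions):
    # H = -sum(p * log2(p)); terms with p = 0 contribute nothing.
    return -sum(p * math.log2(p) for p in proportions if p > 0)

print(node_entropy([0.5, 0.5]))  # 1.0 bit: two equiprobable classes, most impure
print(node_entropy([0.9, 0.1]))  # ~0.47 bits: more homogeneous node
print(node_entropy([1.0]))       # 0.0 bits: pure node
```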

1.3.1.2 Gaussian (Normal) Random Variable

• The density of a normally distributed X, denoted as N(m, σ²), is

  f(x) = (1/√(2πσ²)) exp(−(x − m)²/(2σ²)),

  where m and σ² are the mean and variance of X, respectively.
• This density is a bell-shaped curve, with its point of symmetry at x = m, where it attains its maximum value (Figure 1.1).

FIGURE 1.1 Gaussian density function.

• The CDF of the normal variable X is F_X(x) = 1 − Q((x − m)/σ), where

  Q(k) = (1/√(2π)) ∫_{k}^{∞} exp(−x²/2) dx,

  which does not have a closed-form solution (see the sketch after this list).
• The density curve is symmetrical about its mean, i.e. the values are equally distributed about the mean.
• If the distribution is more oriented to the right of its mean, it is said to be right-skewed.
• Similarly, a left-skewed distribution can also be identified.
• Generally, it is preferred to have a zero coefficient of skewness (a measure of the symmetry of the given density function).[1]
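A numerical sketch of the CDF computation mentioned in the list above: Q(·) has no closed form, but it is available numerically as the standard normal survival function (the values of m, σ, and x are arbitrary illustrative choices):

```python
from scipy.stats import norm

m, sigma, x = 5.0, 2.0, 7.0
Q = norm.sf((x - m) / sigma)            # Q((x - m)/sigma), the tail integral
print(1 - Q)                            # F_X(x) = 1 - Q((x - m)/sigma)
print(norm.cdf(x, loc=m, scale=sigma))  # same value, computed directly
```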

1.3.1.2.1 Applications in Machine Learning

• Experimental observations in many case studies can be fitted with a Gaussian density, for example:
  • The marks of all the students in a class in a particular subject
  • Variations in the share price of a particular company
• With reference to machine learning, to make better predictions, more data is to be added to the model. The following are ways of adding data:
  • Add external data
  • Use the existing data more effectively
• Feature engineering refers to the technique of generating new features using existing features. No new data is added.
• Feature pre-processing is one of the primary steps in feature engineering. It involves updating or transforming the existing features. This is referred to as feature transformation.
• Feature transformation involves replacing a variable by some mathematical function of it, such as 'log', 'square', 'square root', 'cube', 'cube root', etc.
• If a distribution is right-skewed or left-skewed, it is made normally distributed using the nth root or log, and the nth power or exponential, respectively (see the sketch below).
• As per the central limit theorem, the density of the sum of n independent random variables approaches a Gaussian density. This is the basis for assuming that the channel noise is normally distributed with reference to a communication system, which facilitates the study of the noise performance of a communication system.[7]
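The following sketch shows the log transform at work on a synthetic right-skewed feature (a log-normal sample, so the assumption that the log restores symmetry holds exactly here):

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(0)
feature = rng.lognormal(mean=0.0, sigma=1.0, size=10_000)  # right-skewed

print(skew(feature))          # strongly positive coefficient of skewness
print(skew(np.log(feature)))  # ~0, i.e. approximately symmetric (normal)
```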

1.3.2 DISCRETE RANDOM VARIABLES

1.3.2.1 Bernoulli Random Variable
When an experiment with only two outcomes is repeated independently a multiple number of times, such repetitions are referred to as Bernoulli trials.
Example: The experiment can be

• A single flip of a coin, where the possible outcomes are head and tail
• Identifying whether a person survived an accident: survived or not

The pmf of the Bernoulli random variable is P(X = m) = p^m · q^(1−m), where m assumes only two values, 0 or 1, and can assume only one value at a time; p and q (= 1 − p) are the probabilities of success and failure, respectively. The 'success' is the outcome whose probability is to be computed.
For example, when an unbiased coin is tossed, if it is required to compute the probability of getting a head (represented as m = 1), then P(X = 1) = (1/2)¹ (1/2)⁰ = 1/2. Here, the success is getting a head.
The CDF of the Bernoulli random variable is

F_X(x) = 0 for x < 0; q for 0 ≤ x < 1; p + q = 1 for x ≥ 1.[6]
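A small sketch of this pmf and CDF for the fair-coin case, using scipy's built-in Bernoulli distribution:

```python
from scipy.stats import bernoulli

X = bernoulli(0.5)          # p = q = 1/2 for an unbiased coin
print(X.pmf(1), X.pmf(0))   # P(X = 1) = P(X = 0) = 0.5
print(X.cdf(0.5))           # F_X(x) = q for 0 <= x < 1
```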

1.3.2.1.1 Applications in Machine Learning

• Classification is used to predict categorical variables through machine learning algorithms.
• The test data is to be assigned to a category based on some classification criteria.
• When the variable to be predicted is binary valued, i.e. the test data is to be categorized into one of two available classes, such classification is binary classification, and the performance of such algorithms can be analysed using a Bernoulli process.[5]

1.3.2.2 Binomial Random Variable
When multiple independent Bernoulli trials are repeated, such a sequence of trials is referred to as a Bernoulli process. The output of a Bernoulli process is a binomial random variable/distribution.
Example: Let a fair coin be thrown. Since the outcome can be either head or tail, this is a Bernoulli trial. When the experiment is repeated a number of times, the resulting sequence of trials is a Bernoulli process.
When the experiment is performed r times (each experiment being a Bernoulli trial), the probability of getting the success k times is given as

P(X = k) = rC_k · p^k · q^(r−k),

where p and q are the probabilities of success and failure, respectively, such that p + q = 1. Here, X is the number of times the success occurs in the experiment.
This X is the binomial random variable, since the above probability is the coefficient of the kth term in the binomial expansion of (p + q)^r.
If it is required to find the probability of a tail occurring four times when a fair coin is tossed ten times, then that probability is P(X = 4) = 10C₄ (1/2)⁴ (1/2)⁶.
The probability of having the success k times in r trials of the experiment, in a random order, is

P(X = k) = rC_k · p^k · q^(r−k) for k = 0, 1, 2, ..., r, and 0 otherwise,

and this is the pmf of X. The corresponding CDF is

F_X(m) = P(X ≤ m) = Σ_{k=0}^{m} rC_k · p^k · q^(r−k).

The binomial distribution summarizes the number of successes in a series of Bernoulli experiments with success probability p.
The Bernoulli distribution is the binomial distribution with a single trial.[8]
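A numerical sketch of the worked example above (four tails in ten tosses of a fair coin), evaluated both directly and with scipy:

```python
from math import comb
from scipy.stats import binom

print(comb(10, 4) * 0.5 ** 10)       # 10C4 * (1/2)^4 * (1/2)^6 = 0.205078125
print(binom.pmf(k=4, n=10, p=0.5))   # same value from the binomial pmf
print(binom.cdf(4, n=10, p=0.5))     # F_X(4) = P(X <= 4)
```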

1.3.2.2.1 Multinoulli Distribution
Similar to the Bernoulli distribution used for binary classification, the multinoulli distribution also deals with categorical variables. It is a generalization of the Bernoulli distribution (binary classification) to multiclass classification, where k is the number of classes.
Example: In the case of throwing a die, the sample space is S = {1, 2, 3, 4, 5, 6}. Thus, the number of classes can be considered as 6.
Thus, the Bernoulli distribution is the multinoulli distribution with the number of classes = 2.[8]

1.3.2.2.2 Multinomial Distribution
Multiple independent multinoulli trials (e.g. throwing a die multiple times) follow the multinomial distribution, which is the generalized binomial distribution for discrete (categorical) variables/experiments, where each experiment has k outcomes. Here, in n trials of an experiment, each experiment has k outcomes, which occur with probabilities p₁, p₂, ..., p_k.[8]

1.3.2.2.3 Applications in Machine Learning
Not all machine learning algorithms are able to deal with categorical variables. In feature pre-processing for categorical variables, to enable this handling, the categorical variables are converted to numerical values, and this process of conversion is referred to as variable encoding.
Example: Consider a supermarket having a chain of outlets. Different outlets in a city are of different sizes and are graded as small, medium, and big. A machine learning algorithm cannot deal directly with such categorical values, i.e. small, medium, and big. In this example, it is a case of multiclass classification with k = 3.
Table 1.3 represents the conversion of the above categorical values into numerical values.

TABLE 1.3
Variable Encoding

Outlet    Size      Small    Medium    Big
1         Big       0        0         1
2         Big       0        0         1
3         Medium    0        1         0
4         Small     1        0         0
5         Medium    0        1         0

This process of converting the categorical variables into numeric values is referred to as one-hot encoding and is an example of the multinoulli distribution.[6]
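A sketch of the encoding in Table 1.3 using pandas (get_dummies performs exactly this one-hot conversion):

```python
import pandas as pd

outlets = pd.DataFrame({"size": ["big", "big", "medium", "small", "medium"]})
encoded = pd.get_dummies(outlets["size"], dtype=int)
print(encoded)   # columns 'big', 'medium', 'small' with 0/1 indicators
```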

1.3.2.3 Poisson Random Variable


When a Bernoulli trial (with a binary outcome, i.e. success or failure) is repeated independently multiple times (n), the probability of getting the success (p) a defined number (m) of times (with no restriction on the sequence/order of getting the success) is dealt with by the binomial distribution.
In the limit n → ∞ and p → 0, i.e. when the probability of success is infinitesimal, the binomial random variable can be approximated as the Poisson random variable.
Example: In digital data transmission, when a large number of data bits are being transmitted, the computation of the probability of bit error is dealt with by this random variable.

TABLE 1.3
Variable Encoding
Outlet   Size     Small   Medium   Big
1        Big      0       0        1
2        Big      0       0        1
3        Medium   0       1        0
4        Small    1       0        0
5        Medium   0       1        0

Its pmf is $p(X = m) = \frac{e^{-\lambda} \lambda^{m}}{m!}$, where $\lambda = np$ is a constant, and the corresponding CDF is $F_X(x) = p(X \le x) = \sum_{k=0}^{x} \frac{e^{-\lambda} \lambda^{k}}{k!}$.[3]
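The approximation can be checked numerically. The following is a minimal sketch, assuming SciPy is available; the bit-error probability used is made up for illustration.

    # Minimal sketch (assumes SciPy): Poisson approximation to the binomial
    # for large n and small p, e.g. bit errors in digital data transmission.
    from scipy.stats import binom, poisson

    n, p = 100_000, 1e-5           # many bits, tiny bit-error probability
    lam = n * p                    # lambda = np
    for m in range(4):             # P(exactly m bit errors)
        print(m, binom.pmf(m, n, p), poisson.pmf(m, lam))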

1.4 MOMENTS OF RANDOM VARIABLE


Moments of a random variable are also referred to as its statistical averages. A random variable can have two types of moments: moments about origin and moments about mean, or central moments.

1.4.1 MOMENTS ABOUT ORIGIN


The expected value [E(X)], or mean (m), or average value, or expectation of a random variable is referred to as its first moment about origin and is given as $E(X) = \sum_i x_i\, p(x_i)$ and $E(X) = \int_{-\infty}^{\infty} x f(x)\, dx$, respectively, for the discrete (categorical) and continuous cases, where $p(x_i)$ is the certainty with which the random variable takes the value $X = x_i$.
For a random variable, the second moment about origin is its mean square value $E[X^2]$.
Its nth moment about origin is $E[X^n]$.[1]

1.4.1.1 Applications in Machine Learning


• Only the averages are used in the study of a random variable, as the certainty with which it takes different values is not unique.
• Linear models are used in regression when the dependent and independent random variables are linearly related.
• The preliminary modelling is referred to as the benchmark model, where the mean of the variable will be the solution for the prediction problem.
• Example: prediction of the relation between the experience of a person and the salary.
  • The initial linear modelling will be the benchmark model, taking the mean, i.e. the average of all salaries (the dependent variable), as the solution for the model.
  • This may not be the accepted one, since people with different experiences may have the same salary.
  • The model can be improved by introducing curves of the form Y = mX + C, i.e. salary = m(Experience) + C, which is a linear model.
  • The values of ‘m’ and ‘C’ for which the model gives the best prediction can be obtained from the cost function, which is given as Cost Function = Mean Square Error (MSE) = $\frac{1}{n}\sum_{i=1}^{n} (\hat{s}_i - s_i)^2$, where $\hat{s}_i$ and $s_i$ are the predicted and actual ith value, respectively (see the sketch after this list).

• This can be referred to as the second moment of the variable $\hat{s}_i - s_i$.
• A better model results in a lower value of MSE.[7,9]
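As referenced above, the following is a minimal sketch using only NumPy that compares the benchmark (mean) model with a fitted linear model; the salary data are made up.

    # Minimal sketch (NumPy only): benchmark mean model vs. linear model MSE.
    import numpy as np

    experience = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    salary     = np.array([30.0, 35.0, 45.0, 48.0, 60.0])

    def mse(pred):
        return np.mean((pred - salary) ** 2)

    print("benchmark MSE:", mse(np.full_like(salary, salary.mean())))

    m, C = np.polyfit(experience, salary, deg=1)   # least-squares fit of Y = mX + C
    print("linear model MSE:", mse(m * experience + C))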

1.4.2 MOMENTS ABOUT MEAN


Let p(X = 2) = p(X = 6) = 0.5, where X is the random variable. Its first moment about the origin, i.e. mean, is $m = \sum_i x_i\, p(x_i) = 2 \cdot \frac{1}{2} + 6 \cdot \frac{1}{2} = 4$.
To find the average amount by which the values taken by X differ from its mean (the answer is 2), the first moment about the mean, i.e. the first central moment $E[(X - m)] = \sum_i (x_i - m)\, p(x_i)$, is defined. But its value is $(2 - 4) \cdot \frac{1}{2} + (6 - 4) \cdot \frac{1}{2} = 0$. Thus, the very purpose of defining the first central moment is not served.
To find the same for X, the second central moment $E[(X - m)^2] = \sum_i (x_i - m)^2\, p(x_i)$ is defined. Its value for the above X is $(2 - 4)^2 \cdot \frac{1}{2} + (6 - 4)^2 \cdot \frac{1}{2} = 4$. Its positive square root is 2, which is the value required.
Thus, $E[(X - m)^2]$ is an indication of the average amount of variation of the values taken by the random variable with reference to its mean; hence it is the variance (σ²), and the standard deviation (σ) is its positive square root.
The third central moment of X is $E[(X - m)^3]$ and is referred to as its skew. The normalized skew, i.e. coefficient of skewness, is given as $\frac{E[(X - m)^3]}{\sigma^3}$ and is a measure of the symmetry of the density of X.
A random variable with symmetric density about its mean will have zero coefficient of skewness.
If more values taken by the random variable are to the right of its mean, the corresponding density function is said to be right-skewed and the respective coefficient of skewness will be positive (>0). Similarly, the left-skewed density function can also be specified, and the respective coefficient of skewness will be negative (<0).[1]
** Identically distributed random variables will have identical moments.
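The central moments of the two-point example above, and the coefficient of skewness of a right-skewed sample, can be computed as follows; this is a minimal sketch assuming SciPy is available.

    # Minimal sketch (assumes SciPy): central moments and coefficient of skewness.
    import numpy as np
    from scipy.stats import skew

    values, probs = np.array([2.0, 6.0]), np.array([0.5, 0.5])
    m   = np.sum(values * probs)               # first moment about origin = 4
    var = np.sum((values - m) ** 2 * probs)    # second central moment = 4
    print(m, var, np.sqrt(var))                # mean, variance, standard deviation

    sample = np.random.default_rng(0).exponential(size=10_000)
    print(skew(sample))                        # > 0: right-skewed density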

1.4.2.1 Applications in Machine Learning


• Variance is used to address the concepts of underfitting and overfitting of a machine learning model. When the proposed model is trained using different data sets that differ significantly, at the time of performing on the test data, the model may not achieve the required accuracy. This can be referred to as the error due to variance and is due to the significant difference among the various training data models.
• Standard deviation can be used to measure the spread of the data, i.e. it gives the average distance of a point in a data set from the mean of the data set.

• Learning through decision trees is a widely adopted algorithm in classification problems of machine learning (a sketch of the variance-based split criterion follows this list).
  • In a decision tree, the root node represents the entire data.
  • Nodes of the tree are divided into subnodes using the process of splitting.
  • Leaf nodes cannot be split further.
  • The best split among the available splits is the one that results in the most homogeneous subnodes.
  • A more homogeneous node is said to have higher purity.
  • To decide the best split, variance is used as a metric.
  • In this tree, the variance of each child node is computed.
  • The weighted average variance of the child nodes is the variance of the split.
  • The split with less variance is considered to be the best split.
• In feature pre-processing of feature engineering, feature transformation is applied to a variable through some mathematical functions (as mentioned in Section 1.3.1.2.1). This transformation is used to make the net distribution of the variable symmetric about its mean, thereby aiming to get a zero coefficient of skewness.[5,7]
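As referenced in the list above, the following is a minimal sketch using only NumPy of the variance-based split criterion; the feature, target, and candidate thresholds are made up.

    # Minimal sketch (NumPy only): the split with the lowest weighted
    # child-node variance is taken as the best split.
    import numpy as np

    def split_variance(target, mask):
        # Weighted average variance of the two child nodes induced by mask.
        left, right = target[mask], target[~mask]
        w_left, w_right = len(left) / len(target), len(right) / len(target)
        return w_left * left.var() + w_right * right.var()

    feature = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
    target  = np.array([10.0, 11.0, 10.5, 30.0, 31.0, 29.5])

    for threshold in (2.5, 3.5, 4.5):
        print(threshold, split_variance(target, feature <= threshold))
    # The threshold with the lowest weighted variance (here 3.5) is chosen.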

1.5 STANDARDIZED RANDOM VARIABLE


Let X be a random variable with mean m and variance σ². Let $X' = \frac{X - m}{\sigma}$ be the new random variable. Irrespective of the nature of X, X′ will always be of zero mean and unity variance. This X′ is referred to as the standardized random variable associated with X.[3]

1.5.1 APPLICATIONS IN MACHINE LEARNING


• Distance-based machine learning algorithms require all the variables (data) to be on the same scale.[8,10]
  Table 1.4 gives an example where the data used are not all on the same scale.

TABLE 1.4
Data in Nonuniform Scaling
Loan amount (Rs. in lakhs)   EMI (Rs. in thousands)   Income (Rs. in hundreds)
30                           50                       1600
40                           40                       1200
50                           30                        800

TABLE 1.5
Statistical Parameters of the Data
Variable      Mean (m)   Standard deviation (σ) (amount by which a value taken by the variable differs from its mean)
Loan amount   40         10
EMI           40         10
Income        1200       400

TABLE 1.6
Scaled Data
Loan amount (Rs. in lakhs)   EMI (Rs. in thousands)   Income (Rs. in hundreds)
(30 − 40)/10 = −1            (50 − 40)/10 = 1         (1600 − 1200)/400 = 1
(40 − 40)/10 = 0             (40 − 40)/10 = 0         (1200 − 1200)/400 = 0
(50 − 40)/10 = 1             (30 − 40)/10 = −1        (800 − 1200)/400 = −1

• All these data are subjected to scaling to make them be on the same scale, such that they are comparable. Thus, feature pre-processing requires scaling of the variables.
• Standard scaling is a scaling method where the scaling of variables is done as per the formula $X' = \frac{X - m}{\sigma}$ (a sketch follows this list).
• Table 1.5 gives the computations of mean and standard deviation for the different variables of Table 1.4.
• Table 1.6 represents the above data subjected to scaling.
• All the scaled variables now appear on the same reference scale.
• Thus, the concept of the standardized random variable is used in feature scaling.
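As referenced above, the scaling of Tables 1.4–1.6 can be reproduced with a few lines of NumPy; this is a minimal sketch, with σ computed as in Table 1.5 (sample standard deviation).

    # Minimal sketch (NumPy only): standard scaling X' = (X - m) / sigma.
    import numpy as np

    data = np.array([[30.0, 50.0, 1600.0],
                     [40.0, 40.0, 1200.0],
                     [50.0, 30.0,  800.0]])   # loan amount, EMI, income

    m     = data.mean(axis=0)                 # column means: 40, 40, 1200
    sigma = data.std(axis=0, ddof=1)          # sample std:   10, 10, 400
    print((data - m) / sigma)                 # every column now on the same scale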

1.6 MULTIPLE RANDOM VARIABLES


Data exploration and variable identification is one of the constituent stages of the machine learning life cycle.
In data exploration, the process of exploring only one variable at a time and its study is referred to as univariate analysis.
In bivariate analysis, two variables are studied together. In such contexts, the concept of multiple random variables finds its place.

1.6.1 JOINT RANDOM VARIABLES


Let E be an experiment and S be the corresponding sample space. Let X and Y be two variables defined as real functions of S. Then, this pair of variables is referred to as a two-dimensional random variable or joint random variables.

1.6.1.1 Joint Cumulative Distribution Function (Joint CDF)


$F_{XY}(x, y) = P(X \le x, Y \le y)$ is the joint CDF of the random variables X and Y, where x and y are the values taken by them, respectively.[1,11]

1.6.1.1.1 Properties

1. $F_{XY}(-\infty, -\infty) = 0$
2. $F_{XY}(\infty, \infty) = 1$
3. $F_{XY}(-\infty, y) = 0$
4. $F_{XY}(x, -\infty) = 0$
5. $F_{XY}(x, \infty) = F_X(x)$, which is the marginal CDF of X
6. $F_{XY}(\infty, y) = F_Y(y)$, which is the marginal CDF of Y
7. $P(x_1 < X \le x_2, y_1 < Y \le y_2) = F_{XY}(x_2, y_2) - F_{XY}(x_1, y_2) - F_{XY}(x_2, y_1) + F_{XY}(x_1, y_1)$
8. $0 \le F_{XY}(x, y) \le 1$

1.6.1.2 Joint Probability Density Function (Joint pdf)


The joint pdf of two random variables X and Y is given as $f_{XY}(x, y) = \frac{\partial^2}{\partial x\, \partial y} F_{XY}(x, y)$.
1.6.1.2.1 Properties

1. $\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{XY}(x, y)\, dx\, dy = 1$
2. $\int_{y=-\infty}^{\infty} f_{XY}(x, y)\, dy = f(x)$, which is the marginal density of X
3. $\int_{x=-\infty}^{\infty} f_{XY}(x, y)\, dx = f(y)$, which is the marginal density of Y
4. $\int_{-\infty}^{x}\int_{-\infty}^{y} f_{XY}(x, y)\, dx\, dy = F_{XY}(x, y)$
5. $P(x_1 < X < x_2, y_1 < Y < y_2) = \int_{x_1}^{x_2}\int_{y_1}^{y_2} f_{XY}(x, y)\, dx\, dy$

6. $F_X(x) = \int_{-\infty}^{x}\int_{y=-\infty}^{\infty} f_{XY}(x, y)\, dx\, dy$
7. $F_Y(y) = \int_{x=-\infty}^{\infty}\int_{-\infty}^{y} f_{XY}(x, y)\, dx\, dy$ [12]

1.6.1.2.2 Joint Occurrence of Random Variables


The joint occurrence of two discrete random variables X and Y can be represented by a matrix referred to as the joint probability matrix P(X, Y), which is given as
$$P(X, Y) = \begin{bmatrix} p(x_1, y_1) & p(x_1, y_2) & \cdots & p(x_1, y_n) \\ p(x_2, y_1) & p(x_2, y_2) & \cdots & p(x_2, y_n) \\ \vdots & \vdots & \ddots & \vdots \\ p(x_n, y_1) & p(x_n, y_2) & \cdots & p(x_n, y_n) \end{bmatrix}$$
where $p(x_i, y_j)$ is the joint probability of occurrence of the pair $(X = x_i, Y = y_j)$.[1,3]

1.6.1.2.2.1 Properties of Joint Probability Matrix

1. Each element of the matrix is non-negative
2. $\sum_i \sum_j p(x_i, y_j) = 1$
3. $\sum_i p(x_i, y_j) = p(y_j)$
4. $\sum_j p(x_i, y_j) = p(x_i)$
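The matrix properties above can be verified numerically. The following is a minimal sketch using only NumPy; the 2 × 2 joint probabilities are made up.

    # Minimal sketch (NumPy only): a joint probability matrix and its marginals.
    import numpy as np

    P = np.array([[0.10, 0.20],    # rows: values of X, columns: values of Y
                  [0.30, 0.40]])

    print(P.sum())          # property 2: all entries sum to 1
    print(P.sum(axis=0))    # property 3: column sums give p(y_j)
    print(P.sum(axis=1))    # property 4: row sums give p(x_i)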

1.6.1.3 Statistically Independent Random Variables


For such X and Y:
• $F_{XY}(x, y) = F_X(x)\, F_Y(y)$
• $f_{XY}(x, y) = f_X(x)\, f_Y(y)$
• $P(x_1 < X < x_2, y_1 < Y < y_2) = P(x_1 < X < x_2)\, P(y_1 < Y < y_2)$

1.6.1.4 Density of Sum of Independent Random Variables


Let Z = X + Y, where X and Y are random variables with individual density functions $f_X(x)$ and $f_Y(y)$, respectively, and are independent.[3] Then,

$f_Z(z) = f_X(x) * f_Y(y) = \int_{-\infty}^{\infty} f_X(x)\, f_Y(z - x)\, dx = \int_{-\infty}^{\infty} f_X(z - y)\, f_Y(y)\, dy$, which is the convolution of their individual density functions.
This principle can be extended to multiple independent random variables also.

1.6.1.5 Central Limit Theorem


This theorem deals with the density of the sum of independent random variables.
The central limit theorem can be stated as: ‘the density of the sum of n independent, identically distributed random variables approaches the Gaussian density in the limit n → ∞’.
This is true in the case of distinct distributions also.[3]
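The theorem is easy to see by simulation. The following is a minimal sketch using only NumPy: the standardized sum of n i.i.d. uniform variables has a tail probability that approaches the Gaussian value as n grows.

    # Minimal sketch (NumPy only): sums of n i.i.d. uniform variables
    # look increasingly Gaussian as n grows.
    import numpy as np

    rng = np.random.default_rng(0)
    for n in (1, 2, 30):
        z = rng.uniform(size=(100_000, n)).sum(axis=1)   # Z = X1 + ... + Xn
        z = (z - z.mean()) / z.std()                     # standardize
        print(n, np.mean(z > 1.0))   # approaches the Gaussian tail ≈ 0.1587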

1.6.1.6 Joint Moments of Random Variables

1.6.1.6.1 Joint Moments about Origin


For two random variables X and Y, $m_{nk} = E[X^n Y^k]$ is the (n + k)th-order joint moment about origin.
For discrete X and Y, $m_{nk} = \sum_i \sum_j x_i^n y_j^k \, p(x_i, y_j)$, and for continuous X and Y, $m_{nk} = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x^n y^k f(x, y)\, dx\, dy$.
For two random variables X and Y, there will be three second-order joint moments. They are
$m_{20} = E[X^2]$, the mean square value of X; $m_{02} = E[Y^2]$, the mean square value of Y; and $m_{11} = E[XY] = R_{XY}$, the correlation between X and Y.
For two independent random variables, $E[X^n Y^k] = E[X^n]\, E[Y^k]$.
If the correlation $R_{XY} = 0$, X and Y are said to be orthogonal.
Independent X and Y with zero individual means are orthogonal.

1.6.1.6.2 Joint Central Moments


$m_{nk} = E[(X - m_x)^n (Y - m_y)^k]$ is referred to as the (n + k)th-order joint central moment, or joint moment about mean, of two random variables X and Y.
For two random variables X and Y, there will be three second-order joint central moments. They are
$m_{20} = E[(X - m_x)^2] = \sigma_x^2$, the variance of X; $m_{02} = E[(Y - m_y)^2] = \sigma_y^2$, the variance of Y; and $m_{11} = E[(X - m_x)(Y - m_y)] = \sigma_{XY}$, the covariance of X and Y, Cov(X, Y), which is zero for two independent random variables.

Let X′ and Y′ be the two standardized variables associated with X and Y.
The second-order joint central moment $m_{11}$ of X′ and Y′ is referred to as the correlation coefficient ($\rho_{XY}$) of X and Y:
$$\rho_{XY} = E[X'Y'] = E\left[\left(\frac{X - m_x}{\sigma_x}\right)\left(\frac{Y - m_y}{\sigma_y}\right)\right] = \frac{E[(X - m_x)(Y - m_y)]}{\sigma_x \sigma_y} = \frac{\sigma_{XY}}{\sigma_x \sigma_y}$$
It is an indication of the similarity between random variables and is bounded as $-1 \le \rho \le 1$.
If $X = +\beta Y$ (β being a constant), irrespective of the value of β, $\rho_{XY} = +1$, which indicates that X and Y are similar.
If $X = -\beta Y$, irrespective of the value of β, $\rho_{XY} = -1$, which indicates that X and Y are dissimilar.
A value of $\rho_{XY} = +1$ indicates the maximum similarity between X and Y, while $\rho_{XY} = -1$ is an indication of maximum dissimilarity.
The correlation coefficient is used to quantify the relation between two random variables.
A value of $\rho_{XY} = 0$ is an indication of the uncorrelated nature of X and Y.
For two jointly Gaussian random variables, independence implies uncorrelatedness, and vice versa.[3]

1.6.1.6.3 Applications in Machine Learning


• Covariance is the simultaneous variation of two random variables.
  • Linear regression involves only one predictor, where a linear relationship is assumed between the dependent variable and the predictor (independent) variable.
  • In linear regression models, the slope-intercept format Y = p + qX is used.
  • Here, X, Y, p, and q are the independent (predictor) variable, the dependent (predicted) variable, the intercept, and the slope of the linear model, respectively.
  • The test data regarding X and Y is fed to the model, and the corresponding p and q that relate X and Y are to be identified.
  • The error in identifying the values of p and q is the marginal or residual error ε.
  • This modifies the slope-intercept format as Y = p + qX + ε.
  • The ordinary least-squares technique is used to estimate the straight line (linear model) that minimizes the difference between the actual and predicted values of Y, which is the error.
  • For each predicted value, this error can be calculated, and the sum of the squares of these errors (SSE) is $\sum_i \varepsilon_i^2$, which is found to be least for $q = \frac{\mathrm{Cov}(X, Y)}{\mathrm{Var}(X)}$ (see the sketch after this list).
  • From this value of q, the corresponding value of p can be computed.
• The linear relationship and the direction of that relation between two random variables can be measured using the correlation coefficient.

  • In a passenger ticketing system, the variables can be the age of a person and the ticket fare.
  • To develop a model for the relation between these two variables, their correlation coefficient is a metric used, with age being the independent variable and the fare of the ticket being the dependent variable.
  • The possibilities in the linear modelling can be
    – Fare = K(Age) → the fare of the ticket increases with age
    – Fare = −K(Age) → the fare of the ticket decreases with age
    – Fare vs. age variation is constant → the fare is independent of age
  • Similar inference can be made from the correlation coefficient between these variables (Figure 1.2).

FIGURE 1.2 Correlation coefficient between the variables.



  • Thus, the correlation coefficient can be used as the metric for the measure of the linear relation between the variables.
• Variance and standard deviation are measures of the spread of the data set around its mean and are one-dimensional measures. When dealing with data sets with two dimensions, the relation between the two variables in the data set (e.g. the number of hours spent by a tailor in stitching and the number of shirts stitched can be the variables), i.e. the statistical analysis between the two variables, is studied using covariance. The covariance of one random variable with itself is nothing but its variance. In the case of n variables, the covariance matrix [n × n] is used for the statistical analysis of all the possible pairs of the variables.[9,13,14]
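As referenced in the list above, the following is a minimal sketch using only NumPy of the least-squares slope q = Cov(X, Y)/Var(X), the intercept p, and the correlation coefficient; the age and fare data are made up.

    # Minimal sketch (NumPy only): OLS slope from covariance, plus rho_XY.
    import numpy as np

    age  = np.array([20.0, 30.0, 40.0, 50.0, 60.0])   # predictor X
    fare = np.array([12.0, 15.0, 21.0, 24.0, 33.0])   # dependent Y

    cov = np.cov(age, fare)             # 2 x 2 covariance matrix
    q   = cov[0, 1] / cov[0, 0]         # slope q = Cov(X, Y) / Var(X)
    p   = fare.mean() - q * age.mean()  # intercept
    print(p, q)
    print(np.corrcoef(age, fare)[0, 1])  # rho_XY, bounded in [-1, 1]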

1.6.1.7 Conditional Probability and Conditional Density Function of Random Variables

During the study of bivariate random variables, conditional probability is also of good importance.
This deals with the conditional occurrence of an event, with the occurrence of the other event being the condition, and is denoted as $p(A \mid B) = \frac{p(A \cap B)}{p(B)} = \frac{p(A, B)}{p(B)}$, which is the probability of the event A under the occurrence of the event B.
For categorical variables X (taking values $x_1, x_2, \ldots, x_n$) and Y (taking values $y_1, y_2, \ldots, y_n$), the probability $p(X = x_i \mid Y = y_j) = \frac{p(X = x_i,\, Y = y_j)}{p(Y = y_j)}$.
On similar lines, for continuous X and Y, the density function is used in place of the probabilities.

1.6.1.7.1 Properties of Conditional Density Function

1. $f_{X \mid Y}(x \mid y)$ is always non-negative
2. $\int_{-\infty}^{\infty} f_{X \mid Y}(x \mid y)\, dx = 1$

Similar properties hold good for discrete variables also, but defined under discrete summation.[15]

1.6.1.7.2 Applications in Machine Learning

• Bayes' theorem is stated as $p(A \mid B) = \frac{p(B \mid A)\, p(A)}{p(B)}$.
• Let the data set consist of various symptoms leading to corona/malaria/typhoid.

• The prior knowledge about the certainty (probability) of such various hypotheses is referred to as the prior, and such probability is the a priori probability.
• Based on the data set, the conclusion about the patient is to be made.
• Using such hypotheses, based on the symptoms of the patient, conclusions are to be drawn on whether the patient is suffering from corona/malaria/typhoid etc., i.e. which of these hypotheses is applicable for the patient and with what probability. Such probability is the a posteriori probability.
• Then, the probability $p(\text{having a specific disease} \mid \text{Symptom } S)$, i.e. the probability of suffering from a specific disease given a specific symptom S, is the conditional probability.
• If all the hypotheses are of equal a priori probability, then the above conditional probability can be obtained from the probability of having those symptoms knowing the disease, i.e. $p(\text{Symptom } S \mid \text{having the specific disease})$. This probability is referred to as the maximum likelihood (ML) of the specific hypothesis.[7,9]
• Then, the required conditional probability is
$$p(\text{having a specific disease} \mid \text{Symptom } S) = \frac{p(\text{Symptom } S \mid \text{having the specific disease}) \cdot p(\text{having the specific disease})}{p(\text{Symptom } S)}$$
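The posterior above can be computed directly. The following is a minimal sketch in plain Python; all the priors and likelihoods are made-up illustrations, not medical figures.

    # Minimal sketch (plain Python): Bayes' theorem for one symptom S and
    # three candidate diseases.
    priors      = {"corona": 0.2, "malaria": 0.3, "typhoid": 0.5}   # p(disease)
    likelihoods = {"corona": 0.9, "malaria": 0.4, "typhoid": 0.1}   # p(S | disease)

    evidence  = sum(priors[d] * likelihoods[d] for d in priors)     # p(S)
    posterior = {d: priors[d] * likelihoods[d] / evidence for d in priors}
    print(posterior)   # a posteriori p(disease | S); the values sum to 1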

1.7 TRANSFORMATION OF RANDOM VARIABLES


Any system, depending on its performance, transforms the input variable X into the output variable Y, with the transformation being the system's performance, i.e. Y = g(X), where g(·) is the transformation.
This transformation can be (1) monotonic or (2) non-monotonic.
The unknown density of the resulting variable Y is given as $f(y) = f(x)\big|_{x = x_1} \cdot \frac{dx_1}{dy}$, where $x_1$ is the real solution for x from the transformation.
If the transformation results in multiple real solutions for x, then $f(y) = \sum_i f(x)\big|_{x = x_i} \cdot \frac{dx_i}{dy}$.[3]

1.7.1 APPLICATIONS IN MACHINE LEARNING


• Regression models are used to identify a functional relation between the dependent (target) variable and the predictor variable.
• Based on this relation, future values of the target variable are to be predicted as a function of the predictor variable.
• In linear regression, this functional relation is assumed to be linear and of the form Y = p + qX.

TABLE 1.7
Linear Regression Transformation of Variables[10]
Nonlinear Relation     Reduced to Linear Law
y = p xⁿ               log(y) = log(p) + n·log(x), i.e. Y = nX + C, with Y = log(y), X = log(x), C = log(p)
y = m xⁿ + C           Y = mX + C, with X = xⁿ, Y = y
y = p xⁿ + q·log(x)    Y = aX + b, with Y = y/log(x), X = xⁿ/log(x), a = p, b = q
y = p e^(qx)           Y = mx + c, with Y = log(y), m = q·log(e), c = log(p)

• If the existing relationship is not linear, the dependent variable is subjected to some transformations to convert the existing nonlinear relation to a linear one.[8]
• Table 1.7 specifies some of these transformations (a worked sketch of the first row follows).
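As referenced above, the following is a minimal sketch using only NumPy of the first transformation in Table 1.7: a power law y = p·xⁿ is fitted by regressing log(y) on log(x); the data are made up.

    # Minimal sketch (NumPy only): linearizing y = p * x**n via logarithms.
    import numpy as np

    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = 3.0 * x ** 2.0                               # true p = 3, n = 2

    n, C = np.polyfit(np.log(x), np.log(y), deg=1)   # Y = nX + C
    print(n, np.exp(C))                              # recovers n = 2, p = 3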

1.8 CONCLUSION
Thus, random variables play a vital role in the fields of machine learning and artificial intelligence. Since prediction of the future values of a variable involves some amount of uncertainty, the theory of probability and random variables are essential constituent building blocks of the algorithms used to teach a machine to perform certain tasks that deal with the principles of learning based on experience. These random variables are very much specific in the theory of signal estimation too.[11]

REFERENCES
1. Bhagwandas P. Lathi and Zhi Ding – Modern Digital and Analog Communication
Systems, Oxford University Press, New York, International Fourth Edition, 2010.
2. Scott L. Miller and Donald G. Childers – Probability and Random Processes with
Applications to Signal Processing and Communications, Academic Press, Elsevier
Inc., Boston, MA, 2004.
3. Henry Stark and John W. Woods – Probability and Random Processes with Applications
to Signal Processing, Pearson, Upper Saddle River, NJ, Third Edition, 2002.
4. Kevin P. Murphy – Machine Learning: A Probabilistic Perspective, The MIT Press,
Cambridge, MA, 2012.
5. Jose Unpingco – Python for Probability, Statistics, and Machine Learning, Springer,
Cham, 2016.
6. Steven M. Kay – Intuitive Probability and Random Processes using MATLAB, Springer,
New York, 2006.
7. Peter D. Hoff – A First Course in Bayesian Statistical Methods, Springer, New York,
2009.
8. Shai Shalev-Shwartz and Shai Ben-David – Understanding Machine Learning: From
Theory to Algorithms, Cambridge University Press, New York, 2014.
9. Bernard C. Levy – Principles of Signal Detection and Parameter Estimation, Springer,
Cham, 2008.

10. Michael Paluszek and Stephanie Thomas – MATLAB Machine Learning, Apress,
New York, 2017.
11. Robert M. Gray and Lee D. Davisson – An Introduction to Statistical Signal Processing, Cambridge University Press, Cambridge, 2004.
12. Friedrich Liese and Klaus-J. Miescke – Statistical Decision Theory: Estimation, Testing and Selection, Springer Series in Statistics, Springer, New York, 2008.
13. James O. Berger – Statistical Decision Theory and Bayesian Analysis, Springer-Verlag New York Inc., New York, Second Edition, 2013.
14. Ruisi He and Zhiguo Ding (Eds.) – Applications of Machine Learning in Wireless Communications, IET Telecommunication Series 81, The Institution of Engineering and Technology, London, 2019.
15. Robert M. Fano – Transmission of Information: A Statistical Theory of Communications,
The MIT Press, Cambridge, MA, 1961.
Exploring the Variety of Random
Documents with Different Content
"I am," remarked Toby, as he lifted his glass, "a prophet in a small
way. Old boy, your hand. To the health of our double marriage--and
no heeltaps."

Archie finished his glass.

CHAPTER XII.

ARS AMORIS.

'Tis very easy to make love;


A smile--a pressure of the hand.
A reference to the stars above,
A "fly with me to some far land,"
A sigh as soft as coo of dove,
A kiss--the rest she'll understand.

Mr. Gelthrip, thinking no one but himself knew anything, had


contradicted his clerical superior on some point connected with the
introduction of printing into England, and the vicar in great wrath
had carried off his dogmatic curate to the library in order to prove
his case. The two elder ladies were talking about Sir Rupert as Mrs.
Valpy had met him a few months previously, and Mrs. Belswin was
trying to find out all about her quondam husband, in order to
strengthen her position as much as possible. At present she knew
that she was entirely at the mercy of Sir Rupert, so if she could
discover something detrimental to his character it might serve as a
weapon against him. The scheme which she hoped to carry through
with the assistance of Ferrari, was a dangerous one; and moreover,
she was doubtful if the Italian would consent to aid her; therefore
she was anxious to try all other methods of coercing her husband
before resorting to the last and most terrible expedient. She was a
clever woman, was Mrs. Belswin, and the instinct for discovery,
which she inherited from her savage grandparents, made her
wonderfully acute in cross-examining simple Mrs. Valpy, who not
comprehending the subtlety of her companion, told all she knew
about the baronet in the most open manner. The result was not
gratifying to Mrs. Belswin; for with all her dexterity in twisting, and
turning and questioning, and hinting, she discovered nothing likely
to compromise Sir Rupert in any way.

"It's no use," she thought, with a feeling of despair in her heart,


"Rupert has it all his own way, and I can do nothing--nothing except-
---"

She smiled significantly, and simple Mrs. Valpy, seeing that the
companion was looking at Toby and her daughter, who were
amusing themselves at the piano, misinterpreted the smile, and
therefore spoke according to her misinterpretation.

"They'll make a very happy couple, won't they, Mrs. Belswin?"

Mrs. Belswin, thus being appealed to, started, smiled politely, and
assented with much outward show of interest to the remark of the
old lady.

"It's so nice for Toby to have his home here," pursued Mrs. Valpy,
with much satisfaction; "because, you know, our place is not far
from the vicarage, so I shall not be parted from my daughter."

The other woman started, and laid her hand on her breast, as if to
still the beating of her heart.

"Yes; it would be a terrible thing to part with your only child," she
said in a low voice. "I know what the pain of such a separation is."
"You have parted from your child, then?" said Mrs. Valpy,
sympathetically.

Mrs. Belswin clutched her throat, and gave an hysterical laugh.

"Well, no; not exactly;" she said, still in the same low voice; "but--
but my little daughter--my little daughter died many years ago."

It was very hard for her to lie like this when her daughter was only a
few yards away, chatting to Maxwell at the window; but Mrs. Belswin
looked upon such necessary denial as punishment for her sins, and
accepted it accordingly.

"I'm very sorry," observed Mrs. Valpy, with well-bred condolence.


"Still, time brings consolation."

"Not to all people."

"Oh, yes, I think so. Besides, now you have that dear girl, Kaituna,
and she seems very fond of you."

"Yes."

She could say no more. The strangeness of the situation excited her
to laughter, to that laughter which is very near tears, and she was
afraid to speak lest she should break down.

"And then Sir Rupert will be so glad to find his daughter has such a
good friend."

The mention of the hated name restored Mrs. Belswin to her usual
self, and with a supercilious glance at the blundering woman who
had so unconsciously wounded her, she answered in her ordinary
manner--

"I hope so! But I'm afraid I shall not have an opportunity of seeing
Sir Rupert at once, as I go to town shortly, on business."
"But you will return?"

"Oh, yes! of course I shall return, unless some unforeseen


circumstances should arise. We are never certain of anything in the
future, you know, Mrs. Valpy."

"No, perhaps not! At all events I think you will like Sir Rupert."

Mrs. Belswin sneered.

"Oh, do you think so?"

"I'm certain. Such a gentlemanly man. Quite young for his age. I
wonder he does not marry again."

"Perhaps he had enough of matrimony with his first wife," said Mrs.
Belswin, coolly.

"Oh, he was devotedly attached to her."

"Was he, indeed?"

"Yes! Simply worshipped her. She died in New Zealand when Kaituna
was a baby, I believe, and Sir Rupert told me how this loss had
overshadowed his life."

"Hypocrite!" murmured Mrs. Belswin, between her clenched teeth.

The conversation was becoming a little difficult for her to carry on,
as she dare not disclose herself yet, and did not care about
exchanging complimentary remarks on the subject of a man she
detested so heartily.

At this moment Toby struck a chord on the piano, and Tommy burst
out laughing, so, with ready wit, Mrs. Belswin made this interruption
serve as an excuse to break off the conversation.
"The young people seem to be merry," she said to Mrs. Valpy, and
rising to her feet, "I must go over and see what the joke is about."

Mrs. Valpy nodded sleepily, feeling somewhat drowsy after her


dinner, so Mrs. Belswin, seeing she did not mind being left to her
own devices, walked across to the piano and interrupted the two
lovers, for which interruption, however, they did not feel profoundly
grateful.

"Won't you sing something?" asked the companion, addressing Toby,


"or you, Miss Valpy?"

"Oh, my songs are too much of the orthodox drawing-room' type,"


replied Miss Valpy, disparagingly. "Now Toby is original in his ditties.
Come, let's have a little chin-music, Toby!"

"Wherever do you learn such slang?" said Mrs. Belswin, with a smile.

"Toby."

"I! Oh, how can you? I speak the Queen's English."

"Do you really?" said Tommy, laughing. "Well, I at present speak the
President's American, so go right along, stranger, and look slippy
with the barrel organ."

"If your mother hears you," remonstrated Mrs. Belswin, "she will----"

"Yes, I know she will," retorted Tommy, imperturbably; "but she's


asleep and I'm awake, very much so. I say, Mrs. Belswin, where's
Kaituna?"

"I think she's walking on the lawn with Mr. Maxwell."

"As a chaperon you should hunt them out," said Miss Valpy,
mischievously.
"Suppose I give the same advice to your mother," replied Mrs.
Belswin, dryly.

"Don't," said Toby, in mock horror; "as you are strong be merciful."

"Certainly, if you sing something."

"What shall I sing?"

"Anything," said Tommy, sitting down, "except that new style of


song, all chords and no tune."

Toby laughed mischievously and began to sing--

"If I mashed her would she kiss me?


No! no! no!
If I bolted would she miss me?
No! no! no!
She knows I haven't got a rap;
Besides, there is the other chap--
At him, not me, she sets her cap;
No! no! no!"

"Mr. Clendon," said Tommy, in a tone of dignified rebuke, "we don't


want any music-hall songs. If you can't sing something refined, don't
sing at all."

"I must collect my ideas first," replied Toby, running his fingers over
the piano. "Wait till the spirit moves me."

Mrs. Belswin had resumed her seat near the sleeping form of Mrs.
Valpy, and was thinking deeply, though her thoughts, judging from
the savage expression in her fierce eyes, did not seem to be very
agreeable ones, while Tommy leaned over the piano watching Toby's
face as he tried to seek inspiration from her smiles.
Outside on the short dry grass of the lawn, Kaituna was strolling,
accompanied by Archie Maxwell. The grass extended for some
distance in a gentle slope, and was encircled by tall trees, their
heavy foliage drooping over the beds of flowers below. Beyond, the
warm blue of the sky, sparkling with stars, and just over the
trembling tree-tops the golden round of the moon. A gentle wind
was blowing through the rustling leaves, bearing on its faint wings
the rich odours of the flowers, and the lawn was strewn with aerial
shadows that trembled with the trembling of the trees. Then the
white walls of the vicarage, the sloping roof neutral tinted in the
moonlight, the glimmer of the cold shine on the glass of the upstair
windows, and below, the yellow warm light streaming out of the
drawing-room casements on the gravelled walk, the lawn beyond,
and the figures of the two lovers moving like black shadows through
the magical light. A nightingale began to sing deliciously, hidden in
the warm dusk of the leaves, then another bird in the distance
answered the first. The hoot of an owl sounded faintly through the
air, the sharp whirr of a cricket replied, and all the night seemed full
of sweet sounds.

Kaituna sat down on a bench placed under the drawing-room


windows, and Archie, standing beside her, lighted a cigarette after
asking and obtaining the requisite permission. The voices of the
vicar and his curate sounded in high dispute from the adjacent
library; there was a murmur of conversation from within, where Mrs.
Belswin was talking to the other lovers, and at intervals the sharp
notes of the piano struck abruptly through the voices, the songs of
the nightingale, and the charm of the night.

"What I miss very much in the sky here," said Kaituna, looking up at
the stars, "is the Southern Cross."

"Yes; I have seen it myself," replied Archie, removing his cigarette.


"You know I have travelled a great deal."

"And intend to travel still more!"


"Perhaps."

"You don't seem very sure, Mr. Maxwell. What about South
America?"

"I thought I had told you that I had changed my mind about South
America."

Kaituna flushed a little at the significance of his words, and cast


down her eyes.

"I believe you said something about putting off your journey till the
end of the year."

"I'll put it off altogether, if a certain event takes place."

"And that certain event?"

"Cannot you guess?"

Duplicity on the part of the woman, who knew perfectly well the
event to which the young man referred.

"No, I am afraid I can't."

"Miss Pethram--Kaituna, I----"

"Hush! Mr. Clendon is singing."

It was only to gain time for reflection, as she knew that a declaration
of love trembled on his lips, but with feminine coquetry could not
help blowing hot to his cold.

And Toby was singing a bold martial song, with a curious


accompaniment like the trotting of a horse--a song which thrilled
through the listeners, with its fierce exultation and savage passion.
On God and his prophet I seven times called me;
I opened the Koran--the omen appalled me;
I read it--thou wast to be bride to another;
I knew my betrayer, 'twas him I called brother,
Zulema! Zulema!

I sprang on my steed as he waited beside me,


Then rode through the desert with Allah to guide me;
Fierce blew the sirocco, its terrors were idle;
I galloped till dawn to be first at your bridal
Zulema! Zulema!

I rode to the tent-door, your father's tribe knew me;


They dreamed of the glory they'd gain if they slew me;
I dashed through the cowards--I met my betrayer,
He fell from his saddle, and I was his slayer,
Zulema! Zulema!

You ran from your dwelling--your father's spears missed me;


You sprang to my saddle with fervour to kiss me;
We broke through the press of your kinsfolk, my foemen;
I won thee, Zulema, so false was the omen;
Zulema! Zulema!

"Ah!" said Archie, with a long breath, when the fierce cry had rung
out for the last time, "that is the way to win a bride."

Kaituna thought so too, although she did not make any remark, but
the shrill savagery of the song had stirred her hereditary instincts
profoundly, and even in the dim moonlight Archie could see the
distension of her nostrils, and the flash of excitement that sparkled
in her eyes. It gave him an idea, and throwing himself on his knees,
he began to woo her as fiercely and as freely as ever her dusky
ancestors had been wooed in the virgin recesses of New Zealand
woods.

"Kaituna, I love you! I love you. You must have seen it; you must
know it. This is no time for timid protestations, for doubtful sighing.
Give me your hands." He seized them in his strong grasp. "I am a
man, and I must woo like a man. I love you! I love you! I wish you
to be my wife. I am poor, but I am young, and with you beside me, I
can do great things. Say that you will marry me."

"But my father!"

He sprang to his feet, still holding her hands, and drew her forcibly
towards him.

"Your father may consent--he may refuse. I do not care for his
consent or his refusal. Say you will be my wife, and no human being
shall come between us. I have no money. I will gain a fortune for
you. I have no home--I will make one for you. Youth, love, and God
are on our side, and we are made the one for the other. You must
not say no! You shall not say no. You are the woman needed to
complete my life; and God has given you to me. Lay aside your
coquetry, your hesitations, your fears. Speak boldly to me as I do to
you. Let no false modesty--no false pride--no maidenly dread come
between us. I love you, Kaituna. Will you be my wife?"

There was something in this akin to the fierce wooing of primeval


man. All the artificial restraints of civilisation were laid aside. The
doubts, the fears, the looks, the shrinkings, all these safeguards and
shields of nervous natures had vanished before this whirlwind of
passion, which bore down such feeble barriers set between man and
woman. As his eyes ardent with love, passionate with longing,
flashed into her own she felt her bosom thrill, her blood rush rapidly
through her veins, and, with an inarticulate cry, wherein all the
instincts she had inherited from her Maori ancestors broke forth, she
flung herself on his heaving breast.

"Kaituna!"

"Yes! yes! take me I take me! I am yours, and yours only."


CHAPTER XIII.

EXIT MRS. BELSWIN.

She smiles she laughs! she talks of this and that--


To all appearances a very woman.
Ah! but that phrase bears deep interpretation--
"A very woman" is a treacherous thing;
Her smile's a lie--a lie to hide the truth,
For when the time is ripe for all her schemes
"A very woman" slips her smiling mask,
And lo! behold, a look which means, "You die."

One who has been in strange lands, and ventured his life in far
countries, is by no means anxious to court again the dangers he has
so happily escaped. The traveller, telling his tales by his lately gained
fireside, shudders as he remembers the perils he has dared, the
risks he has encountered, and is thankful for his present safety, so
thankful indeed that he is unwilling to place his life for the second
time at the disposal of chance.

It was somewhat after this fashion that Mrs. Belswin viewed her
present security in contrast to her past jeopardy. She had been a
free-lance, and adventuress, an unprotected woman at the mercy of
the world, so hard and pitiless to such unfortunates; but now she
had found a home, a refuge, a daughter's love, a bright oasis in the
desert of affliction, and she dreaded to be driven out of this peaceful
paradise, which held all that made her life worth having, into a
stormy world once more. Through perils more deadly than those of
savage lands, through storms more terrible than those of the ocean,
she had passed into a haven of tranquillity; but now that she was
tasting of the pleasures of hope and repose, it seemed as though
she would once more be driven forth to battle with her fellow-
creatures.

Her quondam husband held her fate in his hand. He had right and
might on his side, and she knew that she could expect no mercy
from one whom she had so deeply wronged. Had the positions been
reversed she felt that she would not have scrupled to enforce the
powers she possessed, and, therefore, never for a moment dreamed
that her husband would act otherwise. All she knew was that she
was now in Paradise, that she enjoyed her daughter's affection,
ignorant as that daughter was of the mother's identity, and that the
husband of her youth, and the father of her dearly-loved child would
expel her from this hardly won Paradise as soon as he discovered
her therein.

This being the case, she did not waste time in asking for a mercy not
likely to be granted, but set herself to work to find out some means
of retaining her position in defiance of her husband's enmity and
hatred. After her conversation with Mrs. Valpy, she saw that Rupert
Pethram had glossed over the affair of the divorce in order to avoid
all suspicion of scandal against himself and the mother of his child,
for he was unwilling that the child should suffer for the sin of her
parent. This was certainly a point in her favour, as by threatening to
denounce the whole affair if she was not allowed to retain her
position she could force him to acquiesce in her demand, in order to
avoid scandal.

But then if he, though keeping the terrible affair secret from the
outside world, told Kaituna all about her mother's disgrace, thus
destroying the love which the girl had for the memory of one whom
she thought was dead--it would be too terrible, as she could urge
nothing in extenuation of her sin, and would be forced to blush
before her own child. No, nothing could be done in that way. Should
she throw herself on the mercy of the man she had wronged? Alas!
she knew his stern nature well enough to be aware of the hopeless
folly of such an attempt. Looking at the whole affair in whatever way
that suggested itself to her fertile brain, she saw no means of
retaining her position, her child or her newly-found respectability,
except by enlisting the sympathy of Ferrari and----

But it was too terrible. It was a crime. Guilty as she was, to do this
would render her still more guilty. Even if she succeeded in getting
her husband out of the way, and it was not discovered by the law,
there was still Ferrari to be reckoned with. It would give him a
strong hold over her, which he would use to force her into marriage,
and then she would be still separated from her child, so that the
crime she contemplated would be useless.

To see this woman raging up and down her bedroom was a pitiful
sight. Flinging herself on her knees she would pray to God to soften
the heart of her husband, then, realising how futile was the hope,
she would start to her feet and think again of the crime she
contemplated committing with the assistance of her Italian lover. She
raged, she wept, she sighed, she implored. Her mood changed with
every tick of the clock; from hope she fell into despair; from despair
she changed once more to hope--tears imprecations, prayers,
threats, she tried them all in their turn, and the result was always
the same--absolute failure. She was dashing herself in vain against
an adamantine wall, for in her calmer moments she saw how
helpless she was against the position held by her husband--a
position approved of by law, approved of by the world. She could do
nothing, and she knew it.

Still, Ferrari!

Yes, she would go up and see him, for perhaps he could solve the
riddle which thus perplexed her so terribly. He would demand his
price, she knew him well enough for that. Well, she would pay it in
order to still retain possession of her child. Let her accomplish her
present desire and the future would take care of itself. So, Mrs.
Belswin, summoning all her philosophy to her aid, composed her
features, and told Kaituna that she was going up to London on
business.

"But papa will be here next week," said the girl in dismay.

"Yes; I'm sorry to go at such a time, dear," replied Mrs. Belswin, with
an immovable countenance, "but it is a very important matter that
takes me away."

"You will be back again soon?"

"In a fortnight at the least."

"Oh, I'm glad of that," said Kaituna, with a flush; "you know I want
you to help me gain papa's consent to my marriage with Archie."

Mrs. Belswin smiled bitterly as she kissed her daughter, knowing how
weak was the reed upon which the girl leaned. She ask Rupert
Pethram to consent to the marriage--she dare to demand a favour of
the man she had wronged for the child she had forsaken! She
almost laughed as she thought of the terrible irony of the situation,
but, restraining herself with her usual self-command, bade the girl
hope for the best.

"Your father must like Mr. Maxwell, he is such a charming young


fellow," she said encouragingly, "and as you love him so dearly, Sir
Rupert, for the sake of your happiness, may perhaps overlook his
want of money."

"But you will speak to papa, Mrs. Belswin?"

"Yes; if I see your father on my return I will certainly speak to him."

"How strangely you talk," said Kaituna, rather puzzled; "if you come
back in a fortnight you will be sure to see papa."
"Of course, dear! of course. I was only thinking that some
unforeseen accident----"

"Oh, no, no!"

"Kaituna, you love your father very dearly."

"Very, very dearly. He is all I have in the world."

It required all Mrs. Belswin's self-restraint to prevent her then and


there throwing herself into the girl's arms and telling her all. Such a
course, however, would have been worse than madness, so she was
forced to crush down her maternal feelings.

After this interview with Kaituna, she departed for London--departed


for the possible commission of a crime, and as the carriage left
Thornstream she looked back with a sigh to the girl standing on the
terrace.

"Perhaps I shall never see her again," she said, with a groan,
throwing herself back in her seat. "But no; that will never happen;
even if Rupert does turn me out of the house he will not tell Kaituna
anything to destroy her belief in her mother, so I shall some day
meet her with her husband."

Her lips curled as she said this, knowing well that Sir Rupert would
never give his consent to the marriage, and then she clenched her
hands with a frown.

"He must consent to the marriage--Kaituna's heart is set on it. He


can destroy my happiness, but I'll kill him before he destroys that of
my child."

And with this firm determination she left her husband's house--the
house in which she should have reigned a happy mistress and
mother, and the house into which she had crept like a disguised
thief, the house which she, in the mad instinct of her savage nature,
intended to deprive of its master.

While waiting on the railway platform for the London train, she saw
Samson Belk.

The relations between these two were peculiar. Ever since he had
seen her at his mother's cottage, Belk had followed her everywhere
like her shadow, much to Mrs. Belswin's astonishment, for, candid in
all things to herself, she could not conceive how a handsome young
man could leave younger women for one verging on middle age. Yet
such was the case. This bucolic man had fallen passionately in love,
and adored her with all the sullen ardour of his obstinate nature. He
was slow-witted, dull-headed, and it took a long time for an idea to
penetrate into his brain, but once the idea was there, nothing could
get it out again. This woman, so different from all he had known,
who spoke in a commanding way, who flashed her eyes fiercely on
all, as if they were her slaves, had, without a word, without a sign,
brought to his knees this uncultured man, who knew nothing of the
deference due to the sex, and whose only attributes were great
physical strength and a handsome exterior. Formerly, owing to these
advantages, he had gained admiration from all women, and in return
had treated them with brutal indifference, or scarcely veiled
contempt; but now the positions were reversed, and he was the
abject slave of this imperious queen, who looked down at him with
disdain. It was a case of Samson like wax in the hands of Delilah--of
Hercules subjugated by Omphale; and Samson Belk, with all his
virile strength, his handsome face, his stalwart figure, was crouching
like a dog at the feet of Mrs. Belswin.

He looked somewhat haggard as he came towards her and took off


his hat, Mrs. Belswin nodding coldly to him in return.

"Well, Mr. Belk," she said, indifferently, "what are you doing here?"

"I heard you were going to town, madam."


"Yes? How can that possibly concern you?" Belk stood twisting his
hat round and round in a sheepish manner.

"I thought I might be of service to you," he stammered, looking at


her portmanteau.

"Thank you, but there is no need. The porters will attend to all that,"
replied the lady, graciously. "But you don't look very well, Mr. Belk. I
suppose you've been drinking."

Candour was Mrs. Belswin's strong point, and looking at Belk as an


inferior animal, she treated him accordingly, but he seemed in
nowise displeased at her bluntness.

"No; I haven't been drinking, madam."

"That's just as well. You know Sir Rupert returns next week, and if
he found you to be dissipated, he'd dismiss you on the spot."

"Would he?" said Belk, sullenly. "Let him if he likes. You seem to
know Sir Rupert, madam."

"I? No; but I have heard about him."

"He's a hard man, what I've seen of him."

Mrs. Belswin was not going to discuss this subject with a servant like
Belk, so she turned indifferently away as the train came into the
station, and left him standing there, looking in sullen admiration at
her graceful form in the dark garments she now affected.

When she was safely installed in a first-class carriage, her rustic


admirer, who had seen personally after her luggage, appeared at the
window with some newspapers.

"You'll want them to read, madam," he said awkwardly, as she


thanked him. "I hope you'll have a pleasant journey."
"Thank you, Mr. Belk, I hope I shall."

"You'll be coming back soon I hope?"

He blurted out this question with a deep flush, and Mrs. Belswin
stared at him with undisguised astonishment She could not
understand the reason of this man's deference, for she judged it
impossible that he could be so deeply in love with her as all his
actions seemed to denote. Good-natured, however, when not
crossed in any way, she replied politely, as the train moved off--

"I shall return in a fortnight."

"If you don't," muttered Belk, as the long line of carriages


disappeared, "I'll follow you up to London."

"Good heavens!" said Mrs. Belswin, throwing herself back in her


seat, "what on earth can the man see in me to admire? I'm not a
vain woman. I never was a vain woman, and why that handsome
young fellow should leave youth to run after age is more than I can
understand. It's flattering; very much so; but," continued the lady,
struck by a sudden thought, "if Ferrari met my new admirer, I'm
afraid there would be trouble."

She laughed at the idea, and taking up the Telegraph began to read,
but suddenly laid it down with a nervous start.

"Ferrari loves me! Belk loves me! I love neither, but only my child.
Rupert stands between me and my happiness. Which of these men
will remove him out of my path? Ferrari--a subtle Italian, Belk--a
brutal Saxon. Humph! The fox and the lion over again--craft and
strength! I can depend on them both, and Rupert----"

She struck her hands together with a triumphant laugh.

"Rupert Pethram, you are marching blindfolded into a trap."


CHAPTER XIV.

SIGNOR FERRARI DECLINES.

"Number One is the greater number; if I assisted Number Two it would become the
lesser."

Signor Ferrari was a gentleman who knew how to make himself


thoroughly comfortable; and, in order to do so, squandered his
earnings in a most spendthrift fashion. At present he was receiving a
very handsome salary for his singing in Sultana Fatima, therefore he
denied himself nothing in the way of luxury. He was a true Bohemian
in every action of his life, and accepted his fluctuating fortunes with
the utmost equanimity. If he fared badly on dry bread and water one
day, he was hopeful of oysters and champagne the next; and when
the feast of Dives was before him, made the most of it in eating and
drinking, so as to recompense himself for all future deprivations,
which would be the lot of poverty-stricken Lazarus.

While his voice lasted he was well aware that he could command an
excellent income which satisfied him completely; for when he grew
old and songless he was quite prepared to return to Italy, and live
there the happy-go-lucky life of his youth on polenta and sour wine.
In his impulsive southern fashion he loved Mrs. Belswin madly; but,
strangely enough, it never for a moment occurred to him to save
money against his possible marriage with her. If he starved, she
would starve; if he made money, she would share it; and if she
objected to such a chequered existence, Signor Ferrari was quite
confident enough in his own powers of will and persuasion to be
satisfied that he could force her to accept his view of the matter.
This was the Ferrari philosophy, and no bad one either as times go,
seeing that a singer's livelihood depends entirely upon the caprice of
the public. As long as he could get enough to eat, be the food rich
or plain, a smoke, and plenty of sleep, the world could go hang for
all he cared. He lived in the present, never thought about the past,
and let the future take care of itself; so altogether managed to
scramble through life in a leisurely, selfish manner eminently
egotistical in fashion.

At present, being in the heyday of life, he was dining with Dives,


which was happiness enough in itself; but, in order that nothing
should be wanting to complete his felicity, he had received a letter
from Mrs. Belswin, telling him of her contemplated arrival. Under
these circumstances he had nothing left to wish for, and lounging on
the sofa in his sitting-room in a state of blissful contentment awaited
the coming of his fair friend.

"Buõno," said the signor, with smiling satisfaction, folding up the


letter and putting it in his pocket, "the singing-bird returns to its
nest. This time I will clip its wings, so that it flies not again. Per
Bacco, the kind heart of Stephano surprises himself, for who would
let his bird fly as he has done? But I fear not the jealousy, offspring
of suspicion. Ecco! she loves but me, and comes again to the nest.
And what a nest! Cospetto! My Lucrezia will be hard to please if she
likes not this palazzo del amor."

It was a very pretty nest indeed, from a lodging-house point of view,


although its incongruity of colouring and furnishing would have
driven an artist out of his mind; but then the signor was not exacting
in the way of harmonious effect, and, provided his dwelling was
fairly comfortable, felt completely satisfied. Lying on the sofa, he
looked complacently at the furniture, covered with painfully bright
blue satin, at the scarlet curtains, the green wall-paper, and at all
the wax flowers, Berlin wool mats, and gimcrack ornaments with
which the room was adorned. Ferrari had added to this splendid
furnishing an excellent piano for professional purposes, and
numerous photographs, principally feminine, of his artistic friends;
so that he conceived himself to be housed in a princely fashion.

It was three o'clock by the incorrect French timepiece on the tawdry


mantelpiece, and Ferrari was getting somewhat impatient, as Mrs.
Belswin had mentioned two o'clock as the time of her arrival; but
with his accustomed philosophy he manifested no anger at the
delay.

"La Donna é mobile," he hummed, shrugging his shoulders, as he


strolled towards the piano. "Women are always late; it is one of their
charming follies. Ah! EH! EE! Diavolo! my voice is bad this day.
These English fogs are down my throat Ah! Eh! EE! Dio! What a
note! Voce del oca.

"Ask not the stars the fate they deal.


Read in my eyes the love I feel."

"That's a good song, that serenade to Fatima. It shows off my voice.


I'll sing it to exercise my high notes."

He did so, and was just in the middle of the first verse when Mrs.
Belswin made her appearance, upon which he stopped abruptly, and
came forward to greet her with theatrical effusion.

"Stella dora! once more you shine," he cried, seizing her hands, with
a passionate look in his dark eyes. "Oh, my life! how dear it is to see
thee again."

"You missed me then, Stephano?" said Mrs. Belswin, sinking wearily


into a chair.

"Missed thee, carissima!" exclaimed the Italian, throwing himself on


his knees before her and kissing her hand; "by this, and this, and
this again, I swear that all has been dark to me without the light of
thine eyes. But you will not leave me again, angela mia. Thou hast
come back for ever to be my wife."

Mrs. Belswin drew her hand away sharply and frowned, for in her
present irritable state of mind the exaggerated manner of Ferrari
jarred on her nerves.

"Do be sensible, Stephano," she said in a vexed tone. "You are


always acting."

"How can that be acting, cruel one, which is the truth?" replied
Ferrari, reproachfully, rising from his knees. "Thou knowst my love,
and yet when I speak you are cold. Eh, Donna Lucrezia, is your
heart changed?"

"My heart remains as It always was, my friend; but I've come up to


see you on business----"

"Oh, business!" interrupted Stephano, suspiciously. "Cospetto! You


want once more to leave me."

"For a time; yes."

"Oh, for a time; yes!" echoed Ferrari, mockingly. "Amica mia, you
have a strange way of speaking to him who adores you. Dio, you
play with me like a child. I love you, and wish you for my wife. You
say 'yes,' and depart for a time. Now return you to me and again
say, 'Stephano, I leave you for a time.'"

"I made no promise to be your wife," said Mrs. Belswin, angrily, "nor
will I do so unless you help me now."

"Help you! and in what way? Has the little daughter been cruel? You
wish me to speak as father to her."

"I wish you to do nothing of the sort. My daughter is quite well, and
I was perfectly happy with her."

"And without me," cried Ferrari, jealously; upon which Mrs. Belswin
made a gesture of irritation.

"We can settle that afterwards," she said, drawing off her gloves:
"meanwhile let us talk sense. I shall be up in town for a fortnight."

"And you stay, cara?"

"At an hotel in the Strand. I'll give you the address before I leave."

"Bene! I will then have you to myself for two weeks."

"It all depends on whether you will help me in what I wish to do."

"Ebbene! Is it il marito?"

Mrs. Belswin nodded, and the Italian burst out laughing.

"Povero diavolo. He has then come again."

"No! but he arrives next week."

"How pleased you are," said Ferrari, mockingly. "Oh, yes, he will be
so sweet to behold you."

"That's the very question! I don't want him to see me."

"Then return not to the little daughter."

"I must! I must!" cried Mrs. Belswin in despair. "I can't give up my
child after meeting her again. Twenty years, Stephano, and I have
not seen her; now I am beside her every day. She loves me--not as
her mother, but as her friend. I can't give up all this because my
husband is returning."

Signor Ferrari shrugged his shoulders and lighted a cigarette.

"But there is nothing more you can do," he said, spreading out his
hands with a dramatic gesture, "eh, carissima? Think of what is this
affair. Il marito has said to you, 'Good-bye.' The little daughter thinks
you to be dead. If then you come to reveal yourself, il marito--eh,
amica mia! it is a trouble for all."

"What can I do?"

"Nothing! oh no, certainly! You have beheld the little daughter for a
time. Now you are to me again. I say, Stella 'dora, with me remain
and forget all."

"No, I will not! I will not!" cried Mrs. Belswin, savagely, rising to her
feet. "Cannot you see how I suffer? If you love me as you say, you
must see how I suffer. Give up my child, my life, my happiness! I
cannot do it."

"Dio! you cannot make the miracles."

"I can! I must! Do you think I will stay with you while my child calls
me?"

"With me you must stay, my Norma. I love thee. I will not leave you
no more."

"You can't stop me."

"Ebbene," said Ferrari, conscious that he held the advantage. "Go,


then, and see how il marito will behold you."

Mrs. Belswin felt her helplessness, and clenched her hands with a
savage cry of despair, that seemed to be torn out of her throbbing
heart. Up and down the gaudy room she paced, with her face
convulsed with rage, and her fierce eyes flashing with an unholy fire,
while Ferrari, secure in his position, sat quietly near the window,
smoking leisurely. His self-possession seemed to provoke her, ready
as she was to vent her impotent anger on anything, and, stopping
abruptly, she poured forth all her anger.

"Why do you sit there smiling, and smiling, like a fool?" she shrieked,
stamping her foot. "Can you not suggest something? Can you not do
something?"

"Eh, carissima, I would say, 'Be quiet' The people below will hear you
cry out."

"Let them! What do I care? I am a desperate woman, Ferrari, and I


am determined to keep my position beside my child. I will stop at
nothing--nothing--not even murder!"

"Murder!"

Signor Ferrari let the cigarette drop from his fingers, and jumped up
with a cry of dismay, looking pale and unnerved. She saw this, and,
lashing him with her tongue, taunted him bitterly.

"Yes, murder, you miserable! I thought you were a brave man; but I
see I made a mistake. You love me! You want to be my husband!
No, no, no! I marry a brave man--yes, a brave man; not a coward!"

Ferrari winced, with an angry glitter in his eyes.

"Eh, Lucrezia. You think I am a brave man if I go to assassin il


marito. Cospetto! I am an Italian; but the Italians are not fools. If
another man loved you, and would take you away, I would kill him--
yes! But il marito--eh, that is not quite the same. I kill him and you
return to the little daughter for always. What gain to me, carissima?
I kill him, and your law gives me the rope. What gain to me? No,
Donna Lucrezia. Do what you love. Stab him with a stiletto, or give
the poison, I say nothing; but as for me to obey--Dio, the life is not
trouble to me yet."

"You are afraid."

He bounded across the room, and seized her roughly by the wrist.

"Devil-woman, I have no fear! You lie to speak so I You lie, figlia


inferna."

"Then why do you refuse to help me?"

"Per Bacco, I am no assassin. Il marito is not an enemy to me. To


you he is hateful. Revenge yourself as it pleases; but I--cospetto.
You ask too much."

He flung her away from him with a gesture of anger, and began to
walk about the room. Mrs. Belswin remained silent, savagely
disappointed at the failure of her plan, and presently Ferrari began
to talk again in his rapid, impulsive fashion.

"If there was any gain. Yes. But I see not anything. I would work
against myself. You know that, Signora Machiavelli. Ah, yes; I am not
blind, cara mia. While il marito lives, you are mine. He will keep you
from the little daughter. But he dies--eh, and you depart."

"No, no! I swear----"

"I refuse your swearing. They are false. Forget, il marito--forget the
little daughter! You are mine, mia moglie, and you depart not again."

Mrs. Belswin laughed scornfully, and put on her gloves again with
the utmost deliberation. Then, taking up her umbrella, she moved
quickly towards the door; but not so quickly as to prevent Ferrari
placing himself before her.

"Where go you?" demanded the Italian, between his clenched teeth.

"To find a braver man than Stephano Ferrari."

"No; you will find no one."
