MATLAB Machine Learning Recipes
A Problem-Solution Approach

Third Edition

Michael Paluszek
Stephanie Thomas
MATLAB Machine Learning Recipes: A Problem-Solution Approach

Michael Paluszek
Plainsboro, NJ, USA

Stephanie Thomas
Plainsboro, NJ, USA

ISBN-13 (pbk): 978-1-4842-9845-9
ISBN-13 (electronic): 978-1-4842-9846-6


https://doi.org/10.1007/978-1-4842-9846-6

Copyright © 2024 by Michael Paluszek and Stephanie Thomas

This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material
is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting,
reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval,
electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed.
Trademarked names, logos, and images may appear in this book. Rather than use a trademark symbol with every
occurrence of a trademarked name, logo, or image, we use the names, logos, and images only in an editorial fashion
and to the benefit of the trademark owner, with no intention of infringement of the trademark.
The use in this publication of trade names, trademarks, service marks, and similar terms, even if they are not
identified as such, is not to be taken as an expression of opinion as to whether or not they are subject to proprietary
rights.
While the advice and information in this book are believed to be true and accurate at the date of publication, neither
the authors nor the editors nor the publisher can accept any legal responsibility for any errors or omissions that may
be made. The publisher makes no warranty, express or implied, with respect to the material contained herein.
Managing Director, Apress Media LLC: Welmoed Spahr
Acquisitions Editor: Celestin Suresh John
Development Editor: Laura Berendson
Coordinating Editor: Mark Powers

Cover designed by eStudioCalamar


Cover image by Mohamed [email protected]
Distributed to the book trade worldwide by Apress Media, LLC, 1 New York Plaza, New York, NY 10004, U.S.A.
Phone 1-800-SPRINGER, fax (201) 348-4505, e-mail [email protected], or visit
www.springeronline.com. Apress Media, LLC is a California LLC and the sole member (owner) is Springer Science
+ Business Media Finance Inc (SSBM Finance Inc). SSBM Finance Inc is a Delaware corporation.
For information on translations, please e-mail [email protected]; for reprint, paperback, or
audio rights, please e-mail [email protected].
Apress titles may be purchased in bulk for academic, corporate, or promotional use. eBook versions and licenses are
also available for most titles. For more information, reference our Print and eBook Bulk Sales web page at http://
www.apress.com/bulk-sales.
Any source code or other supplementary material referenced by the author in this book is available to readers
on GitHub (https://github.com/Apress). For more detailed information, please visit https://www.apress.com/gp/
services/source-code.

Paper in this product is recyclable


To our families.
Contents

About the Authors XVII

About the Technical Reviewer XIX

Introduction XXI

1 An Overview of Machine Learning 1


1.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 Elements of Machine Learning . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.2.1 Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.2.2 Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.2.3 Training . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.3 The Learning Machine . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.4 Taxonomy of Machine Learning . . . . . . . . . . . . . . . . . . . . . . . . 6
1.5 Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
1.5.1 Kalman Filters . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
1.5.2 Adaptive Control . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
1.6 Autonomous Learning Methods . . . . . . . . . . . . . . . . . . . . . . . . 10
1.6.1 Regression . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
1.6.2 Decision Trees . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
1.6.3 Neural Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
1.6.4 Support Vector Machines (SVMs) . . . . . . . . . . . . . . . . . . 16
1.7 Artificial Intelligence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
1.7.1 What Is Artificial Intelligence? . . . . . . . . . . . . . . . . . . . . 17
1.7.2 Intelligent Cars . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
1.7.3 Expert Systems . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
1.8 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19

2 Data for Machine Learning in MATLAB 21


2.1 Introduction to MATLAB Data Types . . . . . . . . . . . . . . . . . . . . . 21
2.1.1 Matrices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
2.1.2 Cell Arrays . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
2.1.3 Data Structures . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
2.1.4 Numerics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
2.1.5 Images . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25


2.1.6 Datastore . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
2.1.7 Tall Arrays . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
2.1.8 Sparse Matrices . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
2.1.9 Tables and Categoricals . . . . . . . . . . . . . . . . . . . . . . . . 30
2.1.10 Large MAT-Files . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
2.2 Initializing a Data Structure . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
2.2.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
2.2.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
2.2.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
2.3 mapreduce on an Image Datastore . . . . . . . . . . . . . . . . . . . . . . . 36
2.3.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
2.3.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
2.3.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
2.4 Processing Table Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
2.4.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
2.4.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
2.4.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
2.5 String Concatenation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
2.5.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
2.5.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
2.5.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
2.6 Arrays of Strings . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
2.6.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
2.6.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
2.6.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
2.7 Substrings . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
2.7.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
2.7.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
2.7.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
2.8 Reading an Excel Spreadsheet into a Table . . . . . . . . . . . . . . . . . . . 44
2.8.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
2.8.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
2.8.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
2.9 Accessing ChatGPT . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
2.9.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
2.9.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
2.9.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
2.10 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48


3 MATLAB Graphics 49
3.1 2D Line Plots . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
3.1.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
3.1.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
3.1.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
3.2 General 2D Graphics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
3.2.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
3.2.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 52
3.2.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
3.3 Custom Two-Dimensional Diagrams . . . . . . . . . . . . . . . . . . . . . . 54
3.3.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
3.3.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
3.3.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
3.4 Three-Dimensional Box . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
3.4.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
3.4.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
3.4.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
3.5 Draw a 3D Object with a Texture . . . . . . . . . . . . . . . . . . . . . . . . 59
3.5.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
3.5.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
3.5.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
3.6 General 3D Graphics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
3.6.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
3.6.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
3.6.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
3.7 Building a GUI . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
3.7.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
3.7.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
3.7.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
3.8 Animating a Bar Chart . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
3.8.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
3.8.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
3.8.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
3.9 Drawing a Robot . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
3.9.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
3.9.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
3.9.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
3.10 Importing a Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
3.10.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
3.10.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
3.10.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
3.11 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83


4 Kalman Filters 85
4.1 Gaussian Distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
4.2 A State Estimator Using a Linear Kalman Filter . . . . . . . . . . . . . . . . 87
4.2.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
4.2.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
4.2.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
4.3 Using the Extended Kalman Filter for State Estimation . . . . . . . . . . . . 106
4.3.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 106
4.3.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
4.3.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
4.4 Using the UKF for State Estimation . . . . . . . . . . . . . . . . . . . . . . 111
4.4.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111
4.4.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112
4.4.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
4.5 Using the UKF for Parameter Estimation . . . . . . . . . . . . . . . . . . . . 117
4.5.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117
4.5.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 118
4.5.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 118
4.6 Range to a Car . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122
4.6.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122
4.6.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122
4.6.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122
4.7 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 125

5 Adaptive Control 127


5.1 Self-Tuning: Tuning an Oscillator . . . . . . . . . . . . . . . . . . . . . . . 128
5.1.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 129
5.1.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 130
5.1.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 130
5.2 Implement MRAC . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 136
5.2.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 136
5.2.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 136
5.2.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 136
5.3 Generating a Square Wave Input . . . . . . . . . . . . . . . . . . . . . . . . 140
5.3.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 140
5.3.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 140
5.3.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 140
5.4 Demonstrate MRAC for a Rotor . . . . . . . . . . . . . . . . . . . . . . . . 142
5.4.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 142
5.4.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 142
5.4.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 142


5.5 Ship Steering: Implement Gain Scheduling for Steering Control of a Ship . . 145
5.5.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145
5.5.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145
5.5.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145
5.6 Spacecraft Pointing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 148
5.6.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 148
5.6.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 148
5.6.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 148
5.7 Direct Adaptive Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 153
5.7.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 153
5.7.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 153
5.7.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 153
5.8 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 155

6 Fuzzy Logic 157


6.1 Building Fuzzy Logic Systems . . . . . . . . . . . . . . . . . . . . . . . . . 158
6.1.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 158
6.1.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 158
6.1.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 158
6.2 Implement Fuzzy Logic . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163
6.2.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163
6.2.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163
6.2.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163
6.3 Window Wiper Fuzzy Controller . . . . . . . . . . . . . . . . . . . . . . . . 169
6.3.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 169
6.3.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 169
6.3.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 169
6.4 Simple Discrete HVAC Fuzzy Controller . . . . . . . . . . . . . . . . . . . . 174
6.4.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 174
6.4.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 174
6.4.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 174
6.5 Variable HVAC Fuzzy Controller . . . . . . . . . . . . . . . . . . . . . . . . 180
6.5.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 180
6.5.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 181
6.5.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 181
6.6 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 189

7 Neural Aircraft Control 191


7.1 Longitudinal Motion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 191
7.1.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 193
7.1.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 193
7.1.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 193


7.2 Numerically Finding Equilibrium . . . . . . . . . . . . . . . . . . . . . . . . 198


7.2.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 198
7.2.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 199
7.2.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 199
7.3 Numerical Simulation of the Aircraft . . . . . . . . . . . . . . . . . . . . . . 200
7.3.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 200
7.3.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 200
7.3.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 201
7.4 Activation Function . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 202
7.4.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 202
7.4.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 203
7.4.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 203
7.5 Neural Net for Learning Control . . . . . . . . . . . . . . . . . . . . . . . . 204
7.5.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 204
7.5.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 204
7.5.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 204
7.6 Enumeration of All Sets of Inputs . . . . . . . . . . . . . . . . . . . . . . . 208
7.6.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 208
7.6.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 208
7.6.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 208
7.7 Write a Sigma-Pi Neural Net Function . . . . . . . . . . . . . . . . . . . . . 210
7.7.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 210
7.7.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 210
7.7.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 210
7.8 Implement PID Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 213
7.8.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 213
7.8.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 213
7.8.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 213
7.9 PID Control of Pitch . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 218
7.9.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 218
7.9.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 218
7.9.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 218
7.10 Neural Net for Pitch Dynamics . . . . . . . . . . . . . . . . . . . . . . . . . 223
7.10.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 223
7.10.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 223
7.10.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 224
7.11 Nonlinear Simulation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 226
7.11.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 226
7.11.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 226
7.11.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 226
7.12 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 228


8 Introduction to Neural Nets 229


8.1 Daylight Detector . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 229
8.1.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 229
8.1.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 229
8.1.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 230
8.2 Modeling a Pendulum . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 231
8.2.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 231
8.2.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 232
8.2.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 232
8.3 Single Neuron Angle Estimator . . . . . . . . . . . . . . . . . . . . . . . . . 235
8.3.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 235
8.3.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 236
8.3.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 236
8.4 Designing a Neural Net for the Pendulum . . . . . . . . . . . . . . . . . . . 240
8.4.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 240
8.4.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 240
8.4.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 240
8.5 XOR Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 244
8.6 Training . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 253
8.7 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 254

9 Classification of Numbers Using Neural Networks 257


9.1 Generate Test Images with Defects . . . . . . . . . . . . . . . . . . . . . . . 257
9.1.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 257
9.1.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 258
9.1.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 258
9.2 Create the Neural Net Functions . . . . . . . . . . . . . . . . . . . . . . . . 262
9.2.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 262
9.2.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 263
9.2.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 263
9.3 Train a Network with One Output Node . . . . . . . . . . . . . . . . . . . . 267
9.3.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 267
9.3.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 267
9.3.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 269
9.4 Testing the Neural Network . . . . . . . . . . . . . . . . . . . . . . . . . . . 272
9.4.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 272
9.4.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 272
9.4.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 273
9.5 Train a Network with Many Outputs . . . . . . . . . . . . . . . . . . . . . . 273
9.5.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 273
9.5.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 273
9.5.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 274
9.6 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 277

10 Data Classification with Decision Trees 279


10.1 Generate Test Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 280
10.1.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 280
10.1.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 280
10.1.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 280
10.2 Drawing Trees . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 283
10.2.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 283
10.2.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 283
10.2.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 283
10.3 Implementation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 287
10.3.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 287
10.3.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 287
10.3.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 288
10.4 Creating a Tree . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 291
10.4.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 291
10.4.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 291
10.4.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 291
10.5 Handmade Tree . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 295
10.5.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 295
10.5.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 295
10.5.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 295
10.6 Training and Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 298
10.6.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 298
10.6.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 298
10.6.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 298
10.7 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 301

11 Pattern Recognition with Deep Learning 303


11.1 Obtain Data Online for Training a Neural Net . . . . . . . . . . . . . . . . . 305
11.1.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 305
11.1.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 305
11.1.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 305
11.2 Generating Training Images of Cats . . . . . . . . . . . . . . . . . . . . . . 305
11.2.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 305
11.2.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 305
11.2.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 306
11.3 Matrix Convolution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 308
11.3.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 308
11.3.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 309
11.3.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 309
11.4 Convolution Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 311
11.4.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 311


11.4.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 311


11.4.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 311
11.5 Pooling to Outputs of a Layer . . . . . . . . . . . . . . . . . . . . . . . . . . 312
11.5.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 312
11.5.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 312
11.5.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 313
11.6 Fully Connected Layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 314
11.6.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 314
11.6.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 314
11.6.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 314
11.7 Determining the Probability . . . . . . . . . . . . . . . . . . . . . . . . . . . 316
11.7.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 316
11.7.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 316
11.7.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 317
11.8 Test the Neural Network . . . . . . . . . . . . . . . . . . . . . . . . . . . . 318
11.8.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 318
11.8.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 318
11.8.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 318
11.9 Recognizing an Image . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 320
11.9.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 320
11.9.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 320
11.9.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 320
11.10 Using AlexNet . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 322
11.10.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 322
11.10.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 322
11.10.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 322
Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 326

12 Multiple Hypothesis Testing 327


12.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 327
12.2 Theory . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 329
12.2.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 329
12.2.2 Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 331
12.2.3 Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 331
12.2.4 Measurement Assignment and Tracks . . . . . . . . . . . . . . . . 333
12.2.5 Hypothesis Formation . . . . . . . . . . . . . . . . . . . . . . . . . 334
12.2.6 Track Pruning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 335
12.3 Billiard Ball Kalman Filter . . . . . . . . . . . . . . . . . . . . . . . . . . . 335
12.3.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 335
12.3.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 336
12.3.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 336
12.4 Billiard Ball MHT . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 342


12.4.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 342


12.4.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 342
12.4.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 342
12.5 One-Dimensional Motion . . . . . . . . . . . . . . . . . . . . . . . . . . . . 345
12.5.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 345
12.5.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 346
12.5.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 347
12.6 One-Dimensional MHT . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 349
12.6.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 349
12.6.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 349
12.6.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 349
12.7 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 351

13 Autonomous Driving with MHT 355


13.1 Automobile Dynamics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 356
13.1.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 356
13.1.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 356
13.1.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 356
13.2 Automobile Radar . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 359
13.2.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 359
13.2.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 359
13.2.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 359
13.3 Passing Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 362
13.3.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 362
13.3.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 362
13.3.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 362
13.4 Automobile Animation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 363
13.4.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 363
13.4.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 364
13.4.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 364
13.4.4 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 364
13.5 Automobile Simulation and the Kalman Filter . . . . . . . . . . . . . . . . . 367
13.5.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 367
13.5.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 368
13.5.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 368
13.6 Automobile Target Tracking . . . . . . . . . . . . . . . . . . . . . . . . . . 371
13.6.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 371
13.6.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 371
13.6.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 371
13.7 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 374


14 Spacecraft Attitude Determination 377


14.1 Star Catalog . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 377
14.1.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 377
14.1.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 378
14.1.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 378
14.2 Camera Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 381
14.2.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 381
14.2.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 381
14.2.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 381
14.3 Celestial Sphere . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 383
14.3.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 383
14.3.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 383
14.3.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 383
14.4 Attitude Simulation of Camera Views . . . . . . . . . . . . . . . . . . . . . 384
14.4.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 384
14.4.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 384
14.4.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 384
14.5 Yaw Angle Rotation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 387
14.5.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 387
14.5.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 387
14.5.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 387
14.6 Yaw Images . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 388
14.6.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 388
14.6.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 388
14.6.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 388
14.7 Attitude Determination . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 391
14.7.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 391
14.7.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 391
14.7.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 391
14.8 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 399

15 Case-Based Expert Systems 401


15.1 Building Expert Systems . . . . . . . . . . . . . . . . . . . . . . . . . . . . 404
15.1.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 404
15.1.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 404
15.1.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 404
15.2 Running an Expert System . . . . . . . . . . . . . . . . . . . . . . . . . . . 406
15.2.1 Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 406
15.2.2 Solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 407
15.2.3 How It Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 407
15.3 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 410


A A Brief History 411

B Software for Machine Learning 419

Bibliography 431

Index 435

About the Authors

Michael Paluszek is President of Princeton Satellite Systems, Inc. (PSS) in Plainsboro, New Jersey. Michael founded PSS
in 1992 to provide aerospace consulting services. He used
MATLAB to develop the control system and simulations for the
Indostar-1 geosynchronous communications satellite. This led to
the launch of Princeton Satellite Systems’ first commercial MAT-
LAB toolbox, the Spacecraft Control Toolbox, in 1995. Since
then, he has developed toolboxes and software packages for air-
craft, submarines, robotics, and nuclear fusion propulsion, result-
ing in Princeton Satellite Systems’ current extensive product line.
He is working with the Princeton Plasma Physics Laboratory on a compact nuclear fusion re-
actor for energy generation and space propulsion. He is also leading the development of new
power electronics for fusion power systems and working on heat engine–based auxiliary power
systems for spacecraft. Michael is a lecturer at the Massachusetts Institute of Technology.
Prior to founding PSS, Michael was an engineer at GE Astro Space in East Windsor, NJ.
At GE, he designed the Global Geospace Science Polar despun platform control system and led
the design of the GPS IIR attitude control system, the Inmarsat-3 attitude control systems, and
the Mars Observer delta-V control system, leveraging MATLAB for control design. Michael
also worked on the attitude determination system for the DMSP meteorological satellites. He
flew communication satellites on over 12 satellite launches, including the GSTAR III recovery,
the first transfer of a satellite to an operational orbit using electric thrusters.
At Draper Laboratory, Michael worked on the Space Shuttle, Space Station, and submarine
navigation. His Space Station work included designing Control Moment Gyro–based control
systems for attitude control.
Michael received his bachelor’s degree in Electrical Engineering and master’s and engi-
neer’s degrees in Aeronautics and Astronautics from the Massachusetts Institute of Technology.
He is the author of numerous papers and has over a dozen US patents. Michael is the author
of MATLAB Recipes, MATLAB Machine Learning, and Practical MATLAB Deep Learning:
A Projects-Based Approach, Second Edition, all published by Apress, as well as ADCS: Spacecraft
Attitude Determination and Control, published by Elsevier.


Stephanie Thomas is Vice President of Princeton Satellite Systems, Inc. in Plainsboro, New Jersey. She received her bachelor's and master's degrees in Aeronautics and Astronautics from
the Massachusetts Institute of Technology in 1999 and 2001.
Stephanie was introduced to the PSS Spacecraft Control Tool-
box for MATLAB during a summer internship in 1996 and has
been using MATLAB for aerospace analysis ever since. In her
over 20 years of MATLAB experience, she has developed many
software tools, including the Solar Sail Module for the Spacecraft
Control Toolbox, a proximity satellite operations toolbox for the
Air Force, collision monitoring Simulink blocks for the Prisma
satellite mission, and launch vehicle analysis tools in MATLAB and Java. She has developed
novel methods for space situation assessment, such as a numeric approach to assessing the
general rendezvous problem between any two satellites implemented in both MATLAB and
C++. Stephanie has contributed to PSS’ Spacecraft Attitude and Orbit Control textbook, featur-
ing examples using the Spacecraft Control Toolbox, and written many software User’s Guides.
She has conducted SCT training for engineers from diverse locales such as Australia, Canada,
Brazil, and Thailand and has performed MATLAB consulting for NASA, the Air Force, and
the European Space Agency. Stephanie is the author of MATLAB Recipes, MATLAB Machine
Learning, and Practical MATLAB Deep Learning published by Apress. In 2016, she was named
a NASA NIAC Fellow for the project “Fusion-Enabled Pluto Orbiter and Lander.” Stephanie
is an Associate Fellow of the American Institute of Aeronautics and Astronautics (AIAA) and
Vice Chair of the AIAA Nuclear and Future Flight Propulsion committee. Her ResearchGate
profile can be found at https://www.researchgate.net/profile/Stephanie-Thomas-2.

About the Technical Reviewer

Joseph Mueller took a new position as Principal Astrodynamics Engineer at Millennium Space Systems in 2023, where groundbreaking capabilities in space are being built, one small satellite
at a time, and he is honored to be a part of it.
From 2014 to 2023, he was a senior researcher at Smart In-
formation Flow Technologies, better known as SIFT. At SIFT, he
worked alongside amazing people, playing in the sandbox of in-
credibly interesting technical problems. His research projects at
SIFT included navigation and control for autonomous vehicles,
satellite formation flying, space situational awareness, and robotic
swarms.
Joseph is married and is a father of three, living in Champlin, MN.
His Google Scholar profile can be found at https://scholar.google.com/citations?hl=en&user=breRtVUAAAAJ and his ResearchGate profile at www.researchgate.net/profile/Joseph-Mueller-2.

Introduction

Machine Learning is becoming important in every engineering discipline. For example:

1. Autonomous cars: Machine learning is used in almost every aspect of car control systems.

2. Plasma physicists use machine learning to help guide experiments on fusion reactors.
TAE Technologies has used it with great success in guiding fusion experiments. The
Princeton Plasma Physics Laboratory (PPPL) has used it for the National Spherical Torus
Experiment to study a promising candidate for a nuclear fusion power plant.

3. It is used in finance for predicting the stock market.

4. Medical professionals use it for diagnoses.

5. Law enforcement and others use it for facial recognition. Several crimes have been solved
using facial recognition!

6. An expert system was used on NASA’s Deep Space 1 spacecraft.

7. Adaptive control systems steer oil tankers.

There are many, many other examples.


While many excellent packages are available from commercial sources and open source
repositories, it is valuable to understand how these algorithms work. Writing your own al-
gorithms is valuable both because it gives you insight into the commercial and open source
packages and also because it gives you the background to write your custom Machine Learning
software specialized for your application.
MATLAB had its origins in that very need. Scientists who needed to do operations
on matrices used numerical software written in FORTRAN. At the time, using compiled lan-
guages required the user to go through the write-compile-link-execute process, which was time-
consuming and error-prone. MATLAB presented the user with a scripting language that allowed
many problems to be solved with a few lines of a script that executed immediately. MAT-
LAB also had built-in visualization tools that helped the user better understand the results. Writing
MATLAB was a lot more productive and fun than writing FORTRAN.
The goal of MATLAB Machine Learning Recipes: A Problem-Solution Approach is to help
all users harness the power of MATLAB to solve a wide range of learning problems. The book
has something for everyone interested in Machine Learning. It also has material that will allow
people with an interest in other technology areas to see how Machine Learning, and MATLAB,
can help them solve problems in their areas of expertise.


Using the Included Software


This textbook includes a MATLAB toolbox that implements the examples. The toolbox
consists of

1. MATLAB functions

2. MATLAB scripts

3. HTML help

The MATLAB scripts implement all of the examples in this book. The functions encapsulate the
algorithms. Many functions have built-in demos. Just type the function name in the command
window, and it will execute the demo. The demo is usually encapsulated in a subfunction. You
can copy out this code for your demos and paste it into a script. For example, type the function
name PlotSet into the command window, and the plot in Figure 1 will appear.
>> PlotSet

Figure 1: Example plot from the function PlotSet.m, showing two stacked plots titled "cos" and "sin" with legend entries A and B


If you open the function, you will see the demo:


%%% PlotSet>Demo
function Demo

x = linspace(1,1000);
y = [sin(0.01*x);cos(0.01*x);cos(0.03*x)];
disp('PlotSet: One x and two y rows')
PlotSet( x, y, 'figure title', 'PlotSet Demo',...
  'plot set',{[2 3], 1},'legend',{{'A' 'B'},{}},'plot title',{'cos','sin'});

You can use these demos to start your scripts. Some functions, like right-hand-side functions
for numerical integration, don’t have demos. If you type a function name at the command line
that doesn’t have a built-in demo, you will get an error as in the code snippet below.
>> RHSAutomobileXY
Error using RHSAutomobileXY (line 17)
A built-in demo is not available.

The toolbox is organized according to the chapters in this book. The folder names are
Chapter_01, Chapter_02, etc. There is also a General folder with functions that support
the rest of the toolbox. Finally, you will need the open source package GLPK (GNU Linear
Programming Kit) to run some of the code. Nicolo Giorgetti has written a MATLAB mex
interface to GLPK that is available on SourceForge and included with this toolbox. The interface
consists of

1. glpk.m

2. glpkcc.mexmaci64, or glpkcc.mexw64, etc.

3. GLPKTest.m

which are available from https://sourceforge.net/projects/glpkmex/. The second item is the
mex file of glpkcc.cpp compiled for your machine, such as Mac or Windows. Go to www.gnu.
org/software/glpk/ to get the GLPK library and install it on your system. If needed, download
the GLPKMEX source code as well and compile it for your machine, or else try another of the
available compiled builds.

CHAPTER 1

An Overview of Machine Learning

1.1 Introduction
Machine Learning is a field in computer science where data is used to predict, or respond to,
future data. It is closely related to the fields of pattern recognition, computational statistics, and
artificial intelligence. The data may be historical or updated in real time. Machine learning is
important in areas like facial recognition, spam filtering, content generation, and other areas
where it is not feasible, or even possible, to write algorithms to perform a task.
For example, early attempts at filtering junk emails had the user write rules to determine
what was junk or spam. Your success depended on your ability to correctly identify the attributes
of the message that would categorize an email as junk, such as a sender address or words in the
subject, and the time you were willing to spend to tweak your rules. This was only moderately
successful as junk mail generators had little difficulty anticipating people’s handmade rules.
Modern systems use machine learning techniques with much greater success. Most of us are
now familiar with the concept of simply marking a given message as “junk” or “not junk” and
take for granted that the email system can quickly learn which features of these emails identify
them as junk and prevent them from appearing in our inbox. This could now be any combination
of IP or email addresses and words and phrases in the subject or body of the email, with a variety
of matching criteria. Note how the machine learning in this example is data driven, autonomous,
and continuously updating itself as you receive emails and flag them. However, even today, these
systems are not completely successful since they do not yet understand the “meaning” of the
text that they are processing.
Content generation is an evolving area. By training engines over massive data sets, the
engines can generate content such as music scores, computer code, and news articles. This has
the potential to revolutionize many areas that have been exclusively handled by people.
In a more general sense, what does machine learning mean? Machine learning can mean
using machines (computers and software) to gain meaning from data. It can also mean giving
machines the ability to learn from their environment. Machines have been used to assist humans
for thousands of years. Consider a simple lever, which can be fashioned using a rock and a length
of wood, or an inclined plane. Both of these machines perform useful work and assist people,
but neither can learn. Both are limited by how they are built. Once built, they cannot adapt to
changing needs without human interaction.

© The Author(s), under exclusive license to APress Media, LLC, part of Springer Nature 2024
M. Paluszek, S. Thomas, MATLAB Machine Learning Recipes,
https://doi.org/10.1007/978-1-4842-9846-6_1

Machine learning involves using data to create a model that can be used to solve a problem.
The model can be explicit, in which case the machine learning algorithm adjusts the model’s
parameters, or the data can form the model. The data can be collected once and used to train a
machine learning algorithm, which can then be applied. For example, ChatGPT scrapes textual
data from the Internet to allow it to generate text based on queries. An adaptive control system
measures inputs and command responses to those inputs to update parameters for the control
algorithm.
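
To make the explicit-model case concrete, here is a minimal sketch (our own illustration, not a recipe from the book's toolbox) of a learning step that adjusts the parameters of a model y = a*x + b from measured data:

```matlab
% Illustrative only: learn the parameters of an explicit model y = a*x + b
% from noisy measurements. The backslash operator solves the least-squares
% problem A*p = y for the parameter vector p.
x = linspace(0,10,50)';
y = 2*x + 1 + 0.1*randn(50,1);   % data from a system with a = 2, b = 1
A = [x ones(size(x))];           % regressor matrix
p = A\y;                         % p(1) estimates a, p(2) estimates b
fprintf('a = %.2f, b = %.2f\n', p(1), p(2));
```

The same pattern extends to any model that is linear in its parameters; more elaborate algorithms differ mainly in how the parameter update is computed.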
In the context of the software we will be writing in this book, machine learning refers to
the process by which an algorithm converts the input data into parameters it can use when
interpreting future data. Many of the processes used to mechanize this learning derive from
optimization techniques and, in turn, are related to the classic field of automatic control. In
the remainder of this chapter, we will introduce the nomenclature and taxonomy of machine
learning systems.

1.2 Elements of Machine Learning


This section introduces key nomenclature for the field of machine learning.

1.2.1 Data
All learning methods are data driven. Sets of data are used to train the system. These sets may
be collected and edited by humans or gathered autonomously by other software tools. Control
systems may collect data from sensors as the systems operate and use that data to identify pa-
rameters or train the system. Content generation systems scour the Internet for information. The
data sets may be very large, and it is the explosion of data storage infrastructure and available
databases that is largely driving the growth in machine learning software today. It is still true that
a machine learning tool is only as good as the data used to create it, and the selection of training
data is practically a field in itself. Selection of data for many systems is highly automated.

NOTE When collecting data for training, one must be careful to ensure that the time
variation of the system is understood. If the structure of a system changes with time, it may be
necessary to discard old data before training the system. In automatic control, this is sometimes
called a forgetting factor in an estimator.
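
A one-parameter sketch of how a forgetting factor works in a recursive estimator (the code and variable names here are our own, for illustration only):

```matlab
% Recursive least squares with forgetting factor lambda. Data older than
% roughly 1/(1-lambda) samples has little influence on the estimate, so
% the estimator can track a system that changes slowly with time.
lambda = 0.98;
theta  = 0;  P = 1e3;                 % initial estimate and covariance
for k = 1:200
  u = randn;  y = 2*u + 0.05*randn;   % scalar system y = 2*u + noise
  K = P*u/(lambda + u*P*u);           % estimator gain
  theta = theta + K*(y - u*theta);    % parameter update
  P = (P - K*u*P)/lambda;             % covariance update with forgetting
end
fprintf('theta = %.2f (true value 2)\n', theta);
```

Setting lambda = 1 recovers ordinary recursive least squares, which weights all past data equally.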

1.2.2 Models
Models are often used in learning systems. A model provides a mathematical framework for
learning and is typically human-derived, based on human observations and experiences. For
example, a model of a car, seen from above, might be that it is rectangular with dimensions that
fit within a standard parking spot. However, some forms of machine learning develop their
models without a human-derived structure.


1.2.3 Training
A system which maps an input to an output needs training to do this in a useful way. Just as
people need to be trained to perform tasks, machine learning systems need to be trained. Train-
ing is accomplished by giving the system an input and the corresponding output and modifying
the structure (models or data) in the learning machine so that mapping is learned. In some ways,
this is like curve fitting or regression. If we have enough training pairs, then the system should
be able to produce correct outputs when new inputs are introduced. For example, if we give
a face recognition system thousands of cat images and tell it that those are cats, we hope that
when it is given new cat images it will also recognize them as cats. Problems can arise when you
don’t give it enough training sets, or the training data is not sufficiently diverse, for instance,
identifying a long-haired cat or hairless cat when the training data is only of short-haired cats.
A diversity of training data is required for a functioning algorithm.
Supervised Learning
Supervised learning means that specific training sets of data are applied to the system. The
learning is supervised in that the “training sets” are human-derived. It does not necessarily
mean that humans are actively validating the results. The process of classifying the systems’
outputs for a given set of inputs is called “labeling.” That is, you explicitly say which results are
correct or which outputs are expected for each set of inputs.
The process of generating training sets can be time-consuming. Great care must be taken
to ensure that the training sets will provide sufficient training so that when real-world data is
collected, the system will produce correct results. They must cover the full range of expected
inputs and desired outputs. The training is followed by test sets to validate the results. If the
results aren’t good, then the test sets are cycled into the training sets, and the process is repeated.
A human example would be a ballet dancer trained exclusively in classical ballet technique.
If they were then asked to dance a modern dance, the results might not be as good as required
because the dancer did not have the appropriate training sets; their training sets were not suffi-
ciently diverse.
Unsupervised Learning
Unsupervised learning does not utilize training sets. It is often used to discover patterns in data
for which there is no “right” answer. For example, if you used unsupervised learning to train
a face identification system, the system might cluster the data in sets, some of which might be
faces. Clustering algorithms are generally examples of unsupervised learning. The advantage
of unsupervised learning is that you can learn things about the data that you might not know in
advance. It is a way of finding hidden structures in data.
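
As a sketch of the clustering idea, here is a hand-rolled two-cluster k-means on unlabeled data (our own illustration, not a function from the book's toolbox; it uses implicit expansion, available in MATLAB R2016b and later):

```matlab
% Unsupervised learning sketch: group unlabeled 1-D data into two clusters.
x = [randn(50,1); randn(50,1)+5];          % two groups, no labels given
c = [min(x); max(x)];                      % initial guesses for the centers
for iter = 1:10
  [~,idx] = min(abs(x - c'),[],2);         % assign points to nearest center
  c = [mean(x(idx==1)); mean(x(idx==2))];  % move centers to the cluster means
end
disp(c')                                   % centers settle near 0 and 5
```

No "right answer" was supplied; the structure (two groups) emerges from the data itself.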


Semi-supervised Learning
With this approach, some of the data are in the form of labeled training sets, and other data are
not [12]. Typically, only a small amount of the input data is labeled, while most are not, as the
labeling may be an intensive process requiring a skilled human. The small set of labeled data is
leveraged to interpret the unlabeled data.
Online Learning
The system is continually updated with new data [12]. This is called “online” because many of
the learning systems use data collected while the system is operating. It could also be called
recursive learning. It can be beneficial to periodically “batch” process data used up to a given
time and then return to the online learning mode. The spam filtering systems collect data from
emails and update their spam filter. Generative deep learning systems like ChatGPT use massive
online learning.

1.3 The Learning Machine


Figure 1.1 shows the concept of a learning machine. The machine absorbs information from
the environment and adapts. The inputs may be separated into those that produce an immediate
response and those that lead to learning. In some cases, they are completely separate. For ex-
ample, in an aircraft, a measurement of altitude is not usually used directly for control. Instead,
it is used to help select parameters for the actual control laws. The data required for learning
and regular operation may be the same, but in some cases, separate measurements or data will
be needed for learning to take place. Measurements do not necessarily mean data collected by
a sensor such as radar or a camera. It could be data collected by polls, stock market prices, data
in accounting ledgers, or any other means. Machine learning is then the process by which the
measurements are transformed into parameters for future operation.

Figure 1.1: A learning machine that senses the environment and stores data in memory. In the diagram, measurements for learning feed a Learning block that supplies parameters to the Machine; the Machine's actions act on the Environment, with a copy passed back to the Learning block, and measurements for immediate use flow directly to the Machine.


Note that the machine produces output in the form of actions. A copy of the actions may
be passed to the learning system so that it can separate the effects of the machine’s actions
from those of the environment. This is akin to a feedforward control system, which can result
in improved performance.
A few examples will clarify the diagram. We will discuss a medical example, a security
system, and spacecraft maneuvering.
A doctor might want to diagnose diseases more quickly. They would collect data from tests
on patients and then collate the results. Patient data might include age, height, weight, historical
data like blood pressure readings and medications prescribed, and exhibited symptoms. The
machine learning algorithm would detect patterns so that when new tests were performed on a
patient, the machine learning algorithm would be able to suggest diagnoses or additional tests to
narrow down the possibilities. As the machine learning algorithm was used, it would, hopefully,
get better with each success or failure. Of course, the definition of success or failure is fuzzy. In
this case, the environment would be the patients themselves. The machine would use the data
to generate actions, which would be new diagnoses. This system could be built in two ways.
In the supervised learning process, test data and known correct diagnoses are used to train the
machine. In an unsupervised learning process, the data would be used to generate patterns that
might not have been known before, and these could lead to diagnosing conditions that would
normally not be associated with those symptoms.
A security system might be put into place to identify faces. The measurements are camera
images of people. The system would be trained with a wide range of face images taken from
multiple angles. The system would then be tested with these known persons and its success rate
validated. Those that are in the database memory should be readily identified, and those that are
not should be flagged as unknown. If the success rate was not acceptable, more training might
be needed, or the algorithm itself might need to be tuned. This type of face recognition is now
common, used in Mac OS X’s “Faces” feature in Photos, face identification on the new iPhone
X, and Facebook when “tagging” friends in photos.
For precision maneuvering of a spacecraft, the inertia of the spacecraft needs to be known.
If the spacecraft has an inertial measurement unit that can measure angular rates, the inertia
matrix can be identified. This is where machine learning is tricky. The torque applied to the
spacecraft, whether by thrusters or momentum exchange devices, is only known to a certain
degree of accuracy. Thus, the system identification must sort out, if it can, the torque scaling
factor from the inertia. The inertia can only be identified if torques are applied. This leads to
the issue of stimulation. A learning system cannot learn if the system to be studied does not
have known inputs, and those inputs must be sufficiently diverse to stimulate the system so that
the learning can be accomplished. Training a face recognition system with one picture will not
work.
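The need for diverse, known inputs can be illustrated with a minimal single-axis sketch. The inertia value, torque profile, and time step below are hypothetical, chosen only to show how a least-squares fit recovers the inertia once a torque is actually applied; this is not the method prescribed later in the book.

```matlab
% Hypothetical single-axis example: recover the inertia I from measured
% angular rates and a known applied torque. Since I*omegaDot = tau, the
% slope of omegaDot versus tau is 1/I.
dt    = 0.1;                 % time step (s), illustrative
n     = 100;                 % number of samples
iTrue = 2.5;                 % "unknown" inertia (kg-m^2), illustrative
tau   = sin(0.3*dt*(1:n));   % applied torque; without any torque, I is unobservable
omega = zeros(1,n);
for k = 1:n-1
  omega(k+1) = omega(k) + dt*tau(k)/iTrue;   % simulated rate measurements
end
omegaDot = diff(omega)/dt;                   % numerical derivative of the rate
slope    = tau(1:n-1).'\omegaDot.';          % least-squares slope (= 1/I)
iEst     = 1/slope;                          % estimated inertia
```

With zero torque the regression above has nothing to fit, which is the stimulation issue in miniature.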


1.4 Taxonomy of Machine Learning


In this book, we take a larger view of machine learning than is usual. Machine learning, as
described earlier, is collecting data, finding patterns, and doing useful things based on those
patterns. We expand machine learning to include adaptive and learning control. These fields
started independently but now are adapting technology and methods from machine learning.
Figure 1.2 shows how we organize the technology of machine learning into a consistent
taxonomy. You will notice that we created a title that encompasses three branches of learning;
we call the whole subject area “Autonomous Learning.” That means learning without human
intervention during the learning process. This book is not solely about “traditional” machine
learning. Other, more specialized books focus on any one of the machine learning topics. Op-
timization is part of the taxonomy because the results of optimization can be discoveries, such
as a new type of spacecraft or aircraft trajectory. Optimization is also often a part of learning
systems.

[Diagram: Autonomous Learning branches into Control (State Estimation, Adaptive Control, System Identification, Optimal Control), Machine Learning (Inductive Learning, Pattern Recognition, Expert Systems, Data Mining, Fuzzy Logic), and Optimization.]
Figure 1.2: Taxonomy of machine learning. The dotted lines show connections between branches


There are three categories under Autonomous Learning. The first is Control. Feedback con-
trol is used to compensate for uncertainty in a system or to make a system behave differently
than it would normally behave. If there were no uncertainty, you wouldn’t need feedback. For
example, if you are a quarterback throwing a football at a running player, assume for a moment
that you know everything about the upcoming play. You know exactly where the player should
be at a given time, so you can close your eyes, count, and just throw the ball to that spot. As-
suming the player has good hands, you would have a 100% reception rate! More realistically,
you watch the player, estimate the player’s speed, and throw the ball. You are applying feedback
to the problem. As stated, this is not a learning system. However, if now you practice the same
play repeatedly, look at your success rate, and modify the mechanics and timing of your throw
using that information, you would have an adaptive control system, the second box from the top
of the control list. Learning in control takes place in adaptive control systems and also in the
general area of system identification.
System identification is learning about a system. By system, we mean the data that rep-
resents anything and the relationships between elements of that data. For example, a particle
moving in a straight line is a system defined by its mass, the force on that mass, its velocity,
and its position. The position is obtained by integrating the velocity, and the velocity by
integrating the acceleration, which is the force divided by the mass.
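As a minimal sketch, the particle system just described can be written out directly; the mass, force, and time span below are illustrative numbers, not values from the text.

```matlab
% A particle moving in a straight line: the system is defined by its
% mass, the force on it, its velocity, and its position.
m = 2;              % mass (kg), illustrative
f = 4;              % constant force (N), illustrative
t = 0:0.01:1;       % time (s)
a = f/m;            % acceleration: force divided by mass
v = a*t;            % velocity from integrating the acceleration
x = 0.5*a*t.^2;     % position from integrating the velocity
```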
Optimal control may not involve any learning. For example, what is known as full-state
feedback produces an optimal control signal but does not involve learning. In full-state feed-
back, the combination of model and data tells us everything we need to know about the system.
However, in more complex systems, we can’t measure all the states and don’t know the param-
eters perfectly, so some form of learning is needed to produce “optimal” or the best possible re-
sults. In a learning system, optimal control would need to be redefined as the system learns. For
example, an optimal space trajectory assumes thruster characteristics. As a mission progresses,
the thruster performance may change, requiring recomputation of the “optimal” trajectory.
System identification is the process of identifying the characteristics of a system. A sys-
tem can, to a first approximation, be defined by a set of dynamical states and parameters. For
example, in a linear time-invariant system, the dynamical equation is

ẋ = Ax + Bu    (1.1)

where A and B are matrices of parameters, u is an input vector, and x is the state vector. System
identification would find A and B. In a real system, A and B are not necessarily time invariant,
and most systems are only linear to a first approximation.
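As a hedged sketch of what "finding A and B" can look like, the discrete-time analog x(k+1) = A x(k) + B u(k) can be identified by least squares from recorded states and inputs. The matrices and the noise-free data below are illustrative assumptions, not a procedure from the text.

```matlab
% Identify A and B of x(k+1) = A*x(k) + B*u(k) by least squares.
aTrue = [0.9 0.1; 0 0.8];   % illustrative "unknown" system matrices
bTrue = [0; 1];
n = 50;
u = randn(1,n);             % diverse input to stimulate the system
x = zeros(2,n);
for k = 1:n-1
  x(:,k+1) = aTrue*x(:,k) + bTrue*u(k);
end
% Stack the data: x(:,2:n) = [A B]*[x(:,1:n-1); u(1:n-1)]
theta = x(:,2:n)/[x(:,1:n-1); u(1:n-1)];   % least squares via mrdivide
aEst  = theta(:,1:2);       % estimated A
bEst  = theta(:,3);         % estimated B
```

The same stacked-regression idea underlies many batch system identification methods; for a time-varying system, the fit would have to be repeated or made recursive.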
The second category is what many people consider true Machine Learning. This is mak-
ing use of data to produce behavior that solves problems. Much of its background comes from
statistics and optimization. The learning process may be done once in a batch process or con-
tinually in a recursive process. For example, in a stock buying package, a developer might have
processed stock data for several years, say before 2008, and used that to decide which stocks
to buy. That software might not have worked well during the financial crash. A recursive pro-
gram would continuously incorporate new data. Pattern recognition and data mining fall into
this category. Pattern recognition is looking for patterns in images. For example, the early AI
