Jason Brownlee

Deep Learning With Python


Develop Deep Learning Models On Theano And TensorFlow Using
Keras

Deep Learning With Python


Copyright 2016 Jason Brownlee. All Rights Reserved.

Edition: v1.7
Contents

Preface iii

I Introduction 1
1 Welcome 2
1.1 Deep Learning The Wrong Way . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.2 Deep Learning With Python . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.3 Book Organization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.4 Requirements For This Book . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
1.5 Your Outcomes From Reading This Book . . . . . . . . . . . . . . . . . . . . . . 7
1.6 What This Book is Not . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
1.7 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9

II Background 10
2 Introduction to Theano 11
2.1 What is Theano? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
2.2 How to Install Theano . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.3 Simple Theano Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.4 Extensions and Wrappers for Theano . . . . . . . . . . . . . . . . . . . . . . . . 13
2.5 More Theano Resources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
2.6 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14

3 Introduction to TensorFlow 15
3.1 What is TensorFlow? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
3.2 How to Install TensorFlow . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
3.3 Your First Examples in TensorFlow . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.4 Simple TensorFlow Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.5 More Deep Learning Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
3.6 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17

4 Introduction to Keras 19
4.1 What is Keras? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
4.2 How to Install Keras . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
4.3 Theano and TensorFlow Backends for Keras . . . . . . . . . . . . . . . . . . . . 20


4.4 Build Deep Learning Models with Keras . . . . . . . . . . . . . . . . . . . . . . 21


4.5 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22

5 Project: Develop Large Models on GPUs Cheaply In the Cloud 23


5.1 Project Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
5.2 Setup Your AWS Account . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
5.3 Launch Your Server Instance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
5.4 Login, Configure and Run . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
5.5 Build and Run Models on AWS . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
5.6 Close Your EC2 Instance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
5.7 Tips and Tricks for Using Keras on AWS . . . . . . . . . . . . . . . . . . . . . . 34
5.8 More Resources For Deep Learning on AWS . . . . . . . . . . . . . . . . . . . . 34
5.9 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34

III Multilayer Perceptrons 36


6 Crash Course In Multilayer Perceptrons 37
6.1 Crash Course Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
6.2 Multilayer Perceptrons . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
6.3 Neurons . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
6.4 Networks of Neurons . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
6.5 Training Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
6.6 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42

7 Develop Your First Neural Network With Keras 43


7.1 Tutorial Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
7.2 Pima Indians Onset of Diabetes Dataset . . . . . . . . . . . . . . . . . . . . . . 44
7.3 Load Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
7.4 Define Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
7.5 Compile Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
7.6 Fit Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
7.7 Evaluate Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
7.8 Tie It All Together . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
7.9 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49

8 Evaluate The Performance of Deep Learning Models 51


8.1 Empirically Evaluate Network Configurations . . . . . . . . . . . . . . . . . . . 51
8.2 Data Splitting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
8.3 Manual k-Fold Cross Validation . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
8.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55

9 Use Keras Models With Scikit-Learn For General Machine Learning 57


9.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
9.2 Evaluate Models with Cross Validation . . . . . . . . . . . . . . . . . . . . . . . 58
9.3 Grid Search Deep Learning Model Parameters . . . . . . . . . . . . . . . . . . . 59
9.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61

10 Project: Multiclass Classification Of Flower Species 62


10.1 Iris Flowers Classification Dataset . . . . . . . . . . . . . . . . . . . . . . . . . . 62
10.2 Import Classes and Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
10.3 Initialize Random Number Generator . . . . . . . . . . . . . . . . . . . . . . . . 63
10.4 Load The Dataset . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
10.5 Encode The Output Variable . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
10.6 Define The Neural Network Model . . . . . . . . . . . . . . . . . . . . . . . . . 65
10.7 Evaluate The Model with k-Fold Cross Validation . . . . . . . . . . . . . . . . . 65
10.8 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67

11 Project: Binary Classification Of Sonar Returns 68


11.1 Sonar Object Classification Dataset . . . . . . . . . . . . . . . . . . . . . . . . . 68
11.2 Baseline Neural Network Model Performance . . . . . . . . . . . . . . . . . . . . 69
11.3 Improve Performance With Data Preparation . . . . . . . . . . . . . . . . . . . 71
11.4 Tuning Layers and Neurons in The Model . . . . . . . . . . . . . . . . . . . . . 73
11.5 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75

12 Project: Regression Of Boston House Prices 77


12.1 Boston House Price Dataset . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
12.2 Develop a Baseline Neural Network Model . . . . . . . . . . . . . . . . . . . . . 78
12.3 Lift Performance By Standardizing The Dataset . . . . . . . . . . . . . . . . . . 81
12.4 Tune The Neural Network Topology . . . . . . . . . . . . . . . . . . . . . . . . . 82
12.5 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84

IV Advanced Multilayer Perceptrons and Keras 86


13 Save Your Models For Later With Serialization 87
13.1 Tutorial Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
13.2 Save Your Neural Network Model to JSON . . . . . . . . . . . . . . . . . . . . . 88
13.3 Save Your Neural Network Model to YAML . . . . . . . . . . . . . . . . . . . . 90
13.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92

14 Keep The Best Models During Training With Checkpointing 93


14.1 Checkpointing Neural Network Models . . . . . . . . . . . . . . . . . . . . . . . 93
14.2 Checkpoint Neural Network Model Improvements . . . . . . . . . . . . . . . . . 94
14.3 Checkpoint Best Neural Network Model Only . . . . . . . . . . . . . . . . . . . 95
14.4 Loading a Saved Neural Network Model . . . . . . . . . . . . . . . . . . . . . . 96
14.5 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97

15 Understand Model Behavior During Training By Plotting History 98


15.1 Access Model Training History in Keras . . . . . . . . . . . . . . . . . . . . . . 98
15.2 Visualize Model Training History in Keras . . . . . . . . . . . . . . . . . . . . . 99
15.3 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101

16 Reduce Overfitting With Dropout Regularization 102


16.1 Dropout Regularization For Neural Networks . . . . . . . . . . . . . . . . . . . . 102
16.2 Dropout Regularization in Keras . . . . . . . . . . . . . . . . . . . . . . . . . . 103
16.3 Using Dropout on the Visible Layer . . . . . . . . . . . . . . . . . . . . . . . . . 104
16.4 Using Dropout on Hidden Layers . . . . . . . . . . . . . . . . . . . . . . . . . . 105
16.5 Tips For Using Dropout . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
16.6 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107

17 Lift Performance With Learning Rate Schedules 108


17.1 Learning Rate Schedule For Training Models . . . . . . . . . . . . . . . . . . . . 108
17.2 Ionosphere Classification Dataset . . . . . . . . . . . . . . . . . . . . . . . . . . 109
17.3 Time-Based Learning Rate Schedule . . . . . . . . . . . . . . . . . . . . . . . . 109
17.4 Drop-Based Learning Rate Schedule . . . . . . . . . . . . . . . . . . . . . . . . . 112
17.5 Tips for Using Learning Rate Schedules . . . . . . . . . . . . . . . . . . . . . . . 114
17.6 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 114

V Convolutional Neural Networks 115


18 Crash Course In Convolutional Neural Networks 116
18.1 The Case for Convolutional Neural Networks . . . . . . . . . . . . . . . . . . . . 116
18.2 Building Blocks of Convolutional Neural Networks . . . . . . . . . . . . . . . . . 117
18.3 Convolutional Layers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117
18.4 Pooling Layers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 117
18.5 Fully Connected Layers . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 118
18.6 Worked Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 118
18.7 Convolutional Neural Networks Best Practices . . . . . . . . . . . . . . . . . . . 119
18.8 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120

19 Project: Handwritten Digit Recognition 121


19.1 Handwritten Digit Recognition Dataset . . . . . . . . . . . . . . . . . . . . . . . 121
19.2 Loading the MNIST dataset in Keras . . . . . . . . . . . . . . . . . . . . . . . . 122
19.3 Baseline Model with Multilayer Perceptrons . . . . . . . . . . . . . . . . . . . . 123
19.4 Simple Convolutional Neural Network for MNIST . . . . . . . . . . . . . . . . . 127
19.5 Larger Convolutional Neural Network for MNIST . . . . . . . . . . . . . . . . . 131
19.6 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 134

20 Improve Model Performance With Image Augmentation 135


20.1 Keras Image Augmentation API . . . . . . . . . . . . . . . . . . . . . . . . . . . 135
20.2 Point of Comparison for Image Augmentation . . . . . . . . . . . . . . . . . . . 136
20.3 Feature Standardization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 137
20.4 ZCA Whitening . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 139
20.5 Random Rotations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 141
20.6 Random Shifts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 142
20.7 Random Flips . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 144
20.8 Saving Augmented Images to File . . . . . . . . . . . . . . . . . . . . . . . . . . 145
20.9 Tips For Augmenting Image Data with Keras . . . . . . . . . . . . . . . . . . . 147

20.10 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 147

21 Project: Object Recognition in Photographs 148


21.1 Photograph Object Recognition Dataset . . . . . . . . . . . . . . . . . . . . . . 148
21.2 Loading The CIFAR-10 Dataset in Keras . . . . . . . . . . . . . . . . . . . . . . 149
21.3 Simple CNN for CIFAR-10 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 150
21.4 Larger CNN for CIFAR-10 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 154
21.5 Extensions To Improve Model Performance . . . . . . . . . . . . . . . . . . . . . 157
21.6 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 158

22 Project: Predict Sentiment From Movie Reviews 159


22.1 Movie Review Sentiment Classification Dataset . . . . . . . . . . . . . . . . . . 159
22.2 Load the IMDB Dataset With Keras . . . . . . . . . . . . . . . . . . . . . . . . 160
22.3 Word Embeddings . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 162
22.4 Simple Multilayer Perceptron Model . . . . . . . . . . . . . . . . . . . . . . . . 163
22.5 One-Dimensional Convolutional Neural Network . . . . . . . . . . . . . . . . . . 165
22.6 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 168

VI Recurrent Neural Networks 169


23 Crash Course In Recurrent Neural Networks 170
23.1 Support For Sequences in Neural Networks . . . . . . . . . . . . . . . . . . . . . 170
23.2 Recurrent Neural Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 171
23.3 Long Short-Term Memory Networks . . . . . . . . . . . . . . . . . . . . . . . . . 172
23.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 172

24 Time Series Prediction with Multilayer Perceptrons 174


24.1 Problem Description: Time Series Prediction . . . . . . . . . . . . . . . . . . . . 174
24.2 Multilayer Perceptron Regression . . . . . . . . . . . . . . . . . . . . . . . . . . 176
24.3 Multilayer Perceptron Using the Window Method . . . . . . . . . . . . . . . . . 181
24.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 183

25 Time Series Prediction with LSTM Recurrent Neural Networks 185


25.1 LSTM Network For Regression . . . . . . . . . . . . . . . . . . . . . . . . . . . 185
25.2 LSTM For Regression Using the Window Method . . . . . . . . . . . . . . . . . 189
25.3 LSTM For Regression with Time Steps . . . . . . . . . . . . . . . . . . . . . . . 191
25.4 LSTM With Memory Between Batches . . . . . . . . . . . . . . . . . . . . . . . 194
25.5 Stacked LSTMs With Memory Between Batches . . . . . . . . . . . . . . . . . . 197
25.6 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 200

26 Project: Sequence Classification of Movie Reviews 201


26.1 Simple LSTM for Sequence Classification . . . . . . . . . . . . . . . . . . . . . . 201
26.2 LSTM For Sequence Classification With Dropout . . . . . . . . . . . . . . . . . 203
26.3 LSTM and CNN For Sequence Classification . . . . . . . . . . . . . . . . . . . . 206
26.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 207

27 Understanding Stateful LSTM Recurrent Neural Networks 209


27.1 Problem Description: Learn the Alphabet . . . . . . . . . . . . . . . . . . . . . 209
27.2 LSTM for Learning One-Char to One-Char Mapping . . . . . . . . . . . . . . . 211
27.3 LSTM for a Feature Window to One-Char Mapping . . . . . . . . . . . . . . . . 214
27.4 LSTM for a Time Step Window to One-Char Mapping . . . . . . . . . . . . . . 216
27.5 LSTM State Maintained Between Samples Within A Batch . . . . . . . . . . . . 218
27.6 Stateful LSTM for a One-Char to One-Char Mapping . . . . . . . . . . . . . . . 221
27.7 LSTM with Variable Length Input to One-Char Output . . . . . . . . . . . . . . 224
27.8 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 227

28 Project: Text Generation With Alice in Wonderland 228


28.1 Problem Description: Text Generation . . . . . . . . . . . . . . . . . . . . . . . 228
28.2 Develop a Small LSTM Recurrent Neural Network . . . . . . . . . . . . . . . . . 229
28.3 Generating Text with an LSTM Network . . . . . . . . . . . . . . . . . . . . . . 234
28.4 Larger LSTM Recurrent Neural Network . . . . . . . . . . . . . . . . . . . . . . 237
28.5 Extension Ideas to Improve the Model . . . . . . . . . . . . . . . . . . . . . . . 240
28.6 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 241

VII Conclusions 242


29 How Far You Have Come 243

30 Getting More Help 244


30.1 Artificial Neural Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 244
30.2 Deep Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 244
30.3 Python Machine Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 245
30.4 Keras Library . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 245
Preface

Deep learning is a fascinating field. Artificial neural networks have been around for a long time,
but something special has happened in recent years. The mixture of new, faster hardware, new
techniques and highly optimized open source libraries allows very large networks to be created
with frightening ease.
This new wave of much larger and much deeper neural networks is also impressively skillful
on a range of problems. I have watched over recent years as they have tackled, and handily become
state-of-the-art in, a range of difficult problem domains: not least object recognition, speech
recognition, sentiment classification, translation and more.
When a technique comes along that does so well on such a broad set of problems, you have
to pay attention. The problem is where do you start with deep learning? I created this book
because I thought that there was no gentle way for Python machine learning practitioners to
quickly get started developing deep learning models.
In developing the lessons in this book, I chose the best-of-breed Python deep learning library
called Keras, which abstracts away all of the complexity, ruthlessly leaving you an API containing
only what you need to know to efficiently develop and evaluate neural network models.
This is the guide that I wish I had when I started applying deep learning to machine learning
problems. I hope that you find it useful on your own projects and have as much fun applying
deep learning as I did in creating this book for you.

Jason Brownlee
Melbourne, Australia
2016

Part I

Introduction

Chapter 1

Welcome

Welcome to Deep Learning With Python. This book is your guide to deep learning in Python.
You will discover the Keras Python library for deep learning and how to use it to develop and
evaluate deep learning models. In this book you will discover the techniques, recipes and skills
in deep learning that you can then bring to your own machine learning projects.
Deep learning does have a lot of fascinating math under the covers, but you do not need
to know it to be able to pick it up as a tool and wield it on important projects and deliver
real value. From the applied perspective, deep learning is quite a shallow field and a motivated
developer can quickly pick it up and start making very real and impactful contributions. This is
my goal for you and this book is your ticket to that outcome.

1.1 Deep Learning The Wrong Way


If you ask a deep learning practitioner how to get started with neural networks and deep learning,
what do they say? They say things like:

- You must have a strong foundation in linear algebra.
- You must have a deep knowledge of traditional neural network techniques.
- You really must know about probability and statistics.
- You should really have a deep knowledge of machine learning.
- You probably need to be a PhD in computer science.
- You probably need 10 years of experience as a machine learning developer.

You can see that the “common sense” advice means that it is not until after you have
completed years of study and experience that you are ready to actually start developing and
evaluating machine learning models for your machine learning projects.
I think this advice is dead wrong.


1.2 Deep Learning With Python


The approach taken with this book and with all of Machine Learning Mastery is to flip the
traditional approach. If you are interested in deep learning, start by developing and evaluating
deep learning models. Then, if you discover you really like it or have a knack for it, you can later
step deeper and deeper into the background and theory, as you need it in order to develop
better and more valuable results. This book is your ticket to jumping in and
making a ruckus with deep learning.
I have used many of the top deep learning platforms and libraries and I chose what I think
is the best-of-breed platform for getting started and very quickly developing powerful and even
state-of-the-art deep learning models in the Keras deep learning library for Python. Unlike R,
Python is a fully featured programming language allowing you to use the same libraries and
code for model development as you can use in production. Unlike Java, Python has the SciPy
stack for scientific computing and scikit-learn, which is a professional-grade machine learning library.
There are two top numerical platforms for developing deep learning models: Theano,
developed by the University of Montreal, and TensorFlow, developed at Google. Both were
developed for use in Python and both can be leveraged by the simple-to-use Keras library.
Keras wraps the numerical computing complexity of Theano and TensorFlow providing a concise
API that we will use to develop our own neural network and deep learning models.
You will develop your own and perhaps your first neural network and deep learning models
while working through this book, and you will have the skills to bring this amazing new
technology to your own projects. It is going to be a fun journey and I can’t wait to start.

1.3 Book Organization


This book is broken down into three parts.

- Lessons, where you learn about specific features of neural network models and/or how to
use specific aspects of the Keras API.

- Projects, where you will pull together multiple lessons into an end-to-end project and
deliver a result, providing a template for your own projects.

- Recipes, where you can copy and paste the standalone code into your own project,
including all of the code presented in this book.

1.3.1 Lessons and Projects


Lessons are discrete and are focused on one topic, designed for you to complete in one sitting.
You can take as long as you need, from 20 minutes if you are racing through, to hours if you
want to experiment with the code or ideas and improve upon the presented results. Your lessons
are divided into five parts:

- Background.
- Multilayer Perceptrons.
- Advanced Multilayer Perceptrons and Keras.
- Convolutional Neural Networks.
- Recurrent Neural Networks.

1.3.2 Part 2: Background


In this part you will learn about the Theano, TensorFlow and Keras libraries that lay the
foundation for your deep learning journey and about how you can leverage very cheap Amazon
Web Service computing in order to develop and evaluate your own large models in the cloud.
This part of the book includes the following lessons:
- Introduction to the Theano Numerical Library.
- Introduction to the TensorFlow Numerical Library.
- Introduction to the Keras Deep Learning Library.
The lessons will introduce you to the important foundational libraries that you need to
install and use on your workstation. This is taken one step further in a project that shows how
you can cheaply harness GPU cloud computing to develop and evaluate very large deep learning
models.
- Project: Develop Large Models on GPUs Cheaply In the Cloud.
At the end of this part you will be ready to start developing models in Keras on your
workstation or in the cloud.

1.3.3 Part 3: Multilayer Perceptrons


In this part you will learn about feedforward neural networks that may be deep or not and how
to expertly develop your own networks and evaluate them efficiently using Keras. This part of
the book includes the following lessons:
- Crash Course In Multilayer Perceptrons.
- Develop Your First Neural Network With Keras.
- Evaluate The Performance of Deep Learning Models.
- Use Keras Models With Scikit-Learn For General Machine Learning.
These important lessons are tied together with three foundation projects. These projects
demonstrate how you can quickly and efficiently develop neural network models for tabular
data and provide project templates that you can use on your own regression and classification
machine learning problems. These projects include:
- Project: Multiclass Classification Problem.
- Project: Binary Classification Problem.
- Project: Regression Problem.
At the end of this part you will be ready to discover the finer points of deep learning using
the Keras API.

1.3.4 Part 4: Advanced Multilayer Perceptrons


In this part you will learn about some of the finer points of the Keras library and API for
practical machine learning projects and some of the more important developments in applied
neural networks that you need to know in order to deliver world-class results. This part of the
book includes the following lessons:

- Save Your Models For Later With Network Serialization.
- Keep The Best Models During Training With Checkpointing.
- Understand Model Behavior During Training By Plotting History.
- Reduce Overfitting With Dropout Regularization.
- Lift Performance With Learning Rate Schedules.

At the end of this part you will know how to confidently wield Keras on your own machine
learning projects, with a focus on the finer points of investigating model performance, persisting
models for later use and gaining lifts in performance over baseline models.

1.3.5 Part 5: Convolutional Neural Networks


In this part you will receive a crash course in the dominant model for computer vision machine
learning problems and some natural language problems and how you can best exploit the
capabilities of the Keras API for your own projects. This part of the book includes the following
lessons:

- Crash Course In Convolutional Neural Networks.
- Improve Model Performance With Image Augmentation.

The best way to learn about this impressive type of neural network model is to apply it.
You will work through three larger projects and apply CNN to image data for object recognition
and text data for sentiment classification.

- Project: Handwritten Digit Recognition.
- Project: Object Recognition in Photographs.
- Project: Movie Review Sentiment Classification.

After completing the lessons and projects in this part you will have the skills and the
confidence of complete and working templates and recipes to tackle your own deep learning
projects using convolutional neural networks.

1.3.6 Part 6: Recurrent Neural Networks


In this part you will receive a crash course in the dominant model for data with a sequence or
time component and how you can best exploit the capabilities of the Keras API for your own
projects. This part of the book includes the following lessons:

- Crash Course In Recurrent Neural Networks.
- Multilayer Perceptron Models for Time Series Problems.
- LSTM Models for Time Series Problems.
- Understanding State in LSTM Models for Sequence Prediction.

The best way to learn about this complex type of neural network model is to apply it.
You will work through two larger projects and apply RNN to sequence classification and text
generation.

- Project: Sequence Classification of Movie Reviews.
- Project: Text Generation With Alice in Wonderland.

After completing the lessons and projects in this part you will have the skills and the
confidence of complete and working templates and recipes to tackle your own deep learning
projects using recurrent neural networks.

1.3.7 Conclusions
The book concludes with some resources that you can use to learn more information about a
specific topic or find help if you need it as you start to develop and evaluate your own deep
learning models.

1.3.8 Recipes
Building up a catalog of code recipes is an important part of your deep learning journey. Each
time you learn about a new technique or new problem type, you should write up a short code
recipe that demonstrates it. This will give you a starting point to use on your next deep learning
or machine learning project.
As part of this book you will receive a catalog of deep learning recipes. This includes recipes
for all of the lessons presented in this book, as well as the complete code for all of the projects.
You are strongly encouraged to add to and build upon this catalog of recipes as you expand
your use and knowledge of deep learning in Python.

1.4 Requirements For This Book


1.4.1 Python and SciPy
You do not need to be a Python expert, but it would be helpful if you knew how to install and
set up Python and SciPy. The lessons and projects assume that you have a Python and SciPy
environment available. This may be on your workstation or laptop, it may be in a VM or a
Docker instance that you run, or it may be a server instance that you configure in the cloud
as taught in Part II of this book.
Technical Requirements: The technical requirements for the code and tutorials in this
book are as follows:

- Python version 2 or 3 installed. This book was developed using Python version 2.7.11.

- SciPy and NumPy installed. This book was developed with SciPy version 0.17.0 and
NumPy version 1.11.0.

- Matplotlib installed. This book was developed with Matplotlib version 1.5.1.

- Pandas installed. This book was developed with Pandas version 0.18.0.

- scikit-learn installed. This book was developed with scikit-learn 0.17.1.

You do not need to match the versions exactly, but if you are having problems running a
specific code example, please ensure that you update to the same or a higher version of the library
specified. You will be guided as to how to install the deep learning libraries Theano, TensorFlow
and Keras in Part II of the book.
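
If you are ever unsure which versions you have installed, a short script along the following
lines (a minimal sketch of my own, assuming the libraries above are importable) will print them:

# Check the versions of the SciPy stack libraries used in this book
import scipy
import numpy
import matplotlib
import pandas
import sklearn
print('scipy: %s' % scipy.__version__)
print('numpy: %s' % numpy.__version__)
print('matplotlib: %s' % matplotlib.__version__)
print('pandas: %s' % pandas.__version__)
print('scikit-learn: %s' % sklearn.__version__)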

1.4.2 Machine Learning


You do not need to be a machine learning expert, but it would be helpful if you knew how to
navigate a small machine learning problem using scikit-learn. Basic concepts like cross validation
and one hot encoding used in lessons and projects are described, but only briefly. There are
resources to go into these topics in more detail at the end of the book, but some knowledge of
these areas might make things easier for you.

1.4.3 Deep Learning


You do not need to know the math and theory of deep learning algorithms, but it would be
helpful to have some basic idea of the field. You will get a crash course in neural network
terminology and models, but we will not go into much detail. Again, there will be resources for
more information at the end of the book, but it might be helpful if you can start with some
idea about neural networks.
Note: All tutorials can be completed on standard workstation hardware with a CPU. A
GPU is not required. Some tutorials later in the book can be sped up significantly by running
on the GPU and a suggestion is provided to consider using GPU hardware at the beginning of
those sections. You can access GPU hardware easily and cheaply in the cloud and a step-by-step
procedure is taught on how to do this in Chapter 5.

1.5 Your Outcomes From Reading This Book


This book will lead you from being a developer who is interested in deep learning with Python
to a developer who has the resources and capabilities to work through a new dataset end-to-end
using Python and develop accurate deep learning models. Specifically, you will know:

- How to develop and evaluate neural network models end-to-end.

- How to use more advanced techniques required for developing state-of-the-art deep learning
models.

- How to build larger models for image and text data.

- How to use advanced image augmentation techniques in order to lift model performance.

- How to get help with deep learning in Python.

From here you can start to dive into the specifics of the functions, techniques and algorithms
used with the goal of learning how to use them better in order to deliver more accurate predictive
models, more reliably in less time. There are a few ways you can read this book. You can dip
into the lessons and projects as your need or interests motivate you. Alternatively, you can
work through the book end-to-end and take advantage of how the lessons and projects build in
complexity and range. I recommend the latter approach.
To get the very most from this book, I recommend taking each lesson and project and building
upon them. Attempt to improve the results, apply the method to a similar but different problem,
and so on. Write up what you tried or learned and share it on your blog, social media or send
me an email at [email protected]. This book is really what you make of it
and by putting in a little extra, you can quickly become a true force in applied deep learning.

1.6 What This Book is Not


This book solves a specific problem of getting you, a developer, up to speed applying deep
learning to your own machine learning projects in Python. As such, this book was not intended
to be everything to everyone and it is very important to calibrate your expectations. Specifically:

- This is not a deep learning textbook. We will not be getting into the basic theory
of artificial neural networks or deep learning algorithms. You are also expected to have
some familiarity with machine learning basics, or be able to pick them up yourself.

- This is not an algorithm book. We will not be working through the details of how
specific deep learning algorithms work. You are expected to have some basic knowledge of
deep learning algorithms or how to pick up this knowledge yourself.

- This is not a Python programming book. We will not be spending a lot of time on
Python syntax and programming (e.g. basic programming tasks in Python). You are
expected to already be familiar with Python or be a developer who can pick up a new C-like
language relatively quickly.

You can still get a lot out of this book if you are weak in one or two of these areas, but you
may struggle picking up the language or require some more explanation of the techniques. If
this is the case, see the Getting More Help chapter at the end of the book and seek out a good
companion reference text.

1.7 Summary
It is a special time right now. The tools for applied deep learning have never been so good.
The pace of change with neural networks and deep learning feels like it has never been so fast,
spurred by the amazing results that the methods are showing in such a broad range of fields.
This is the start of your journey into deep learning and I am excited for you. Take your time,
have fun and I’m so excited to see where you can take this amazing new technology.

1.7.1 Next
Let’s dive in. Next up is Part II where you will take a whirlwind tour of the foundation libraries
for deep learning in Python, namely the numerical libraries Theano and TensorFlow, and the
library you will be using throughout this book, Keras.
Part II

Background

Chapter 2

Introduction to Theano

Theano is a Python library for fast numerical computation that can be run on the CPU or GPU.
It is a key foundational library for deep learning in Python that you can use directly to create
deep learning models. After completing this lesson, you will know:

- About the Theano library for Python.

- How a very simple symbolic expression can be defined, compiled and calculated.

- Where you can learn more about Theano.

Let’s get started.

2.1 What is Theano?


Theano is an open source project released under the BSD license and was developed by the LISA
(now MILA [1]) group at the University of Montreal, Quebec, Canada (home of Yoshua Bengio).
It is named after a Greek mathematician. At its heart, Theano is a compiler for mathematical
expressions in Python. It knows how to take your structures and turn them into very efficient
code that uses NumPy, efficient native libraries like BLAS and native code to run as fast as
possible on CPUs or GPUs.
It uses a host of clever code optimizations to squeeze as much performance as possible from
your hardware. If you are into the nitty-gritty of mathematical optimizations in code, check out
this interesting list [2]. The actual syntax of Theano expressions is symbolic, which can be off-putting
to beginners. Specifically, expressions are defined in the abstract sense, compiled and
later actually used to make calculations.
Theano was specifically designed to handle the types of computation required for large
neural network algorithms used in deep learning. It was one of the first libraries of its kind
(development started in 2007) and is considered an industry standard for deep learning research
and development.
[1] http://mila.umontreal.ca/
[2] http://deeplearning.net/software/theano/optimizations.html#optimizations


2.2 How to Install Theano


Theano provides extensive installation instructions for the major operating systems: Windows,
OS X and Linux. Read the Installing Theano guide for your platform [3]. Theano assumes a
working Python 2 or Python 3 environment with SciPy. There are ways to make the installation
easier, such as using Anaconda [4] to quickly set up Python and SciPy on your machine, as well
as using Docker images. With a working Python and SciPy environment, it is relatively
straightforward to install Theano using pip, for example:
sudo pip install Theano

Listing 2.1: Install Theano with pip.


New releases of Theano may be announced and you will want to update to get any bug fixes
and efficiency improvements. You can upgrade Theano using pip as follows:
sudo pip install --upgrade --no-deps theano

Listing 2.2: Upgrade Theano with pip.


You may want to use the bleeding edge version of Theano checked directly out of GitHub.
This may be required for some wrapper libraries that make use of bleeding edge API changes.
You can install Theano directly from a GitHub checkout as follows:
sudo pip install --upgrade --no-deps git+git://github.com/Theano/Theano.git

Listing 2.3: Upgrade Theano with pip from GitHub.


You are now ready to run Theano on your CPU, which is just fine for the development of
small models. Large models may run slowly on the CPU. If you have a Nvidia GPU, you may
want to look into configuring Theano to use your GPU. There is a wealth of documentation on
the Theano homepage for further configuring the library.
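
For example, one way to request the GPU is through a .theanorc configuration file in your
home directory. The sketch below is an illustrative assumption on my part (the device=gpu
naming matches the Theano releases of this era and requires a CUDA-capable Nvidia card);
consult the configuration documentation for the options that apply to your setup:

[global]
device = gpu
floatX = float32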

Theano v0.8.2 is the latest version at the time of writing and is used in this book.

[3] http://deeplearning.net/software/theano/install.html
[4] https://www.continuum.io/downloads

2.3 Simple Theano Example


In this section we demonstrate a simple Python script that gives you a flavor of Theano. In this
example we define two symbolic floating point variables a and b. We define an expression that
uses these variables (c = a + b). We then compile this symbolic expression into a function using
Theano that we can use later. Finally, we use our compiled expression by plugging in some real
values and performing the calculation using efficient compiled Theano code under the covers.
# Example of Theano library
import theano
from theano import tensor
# declare two symbolic floating-point scalars
a = tensor.dscalar()
b = tensor.dscalar()
# create a simple symbolic expression
c = a + b
# convert the expression into a callable object that takes (a,b) and computes c
f = theano.function([a,b], c)
# bind 1.5 to a, 2.5 to b, and evaluate c
result = f(1.5, 2.5)
print(result)

Listing 2.4: Example of Symbolic Arithmetic with Theano.


Running the example prints the output 4.0, which matches our expectation that 1.5 + 2.5 = 4.0.
This is a useful example as it gives you a flavor for how a symbolic expression can be defined,
compiled and used. Although we have only performed a basic introduction to adding two numbers,
you can see how pre-defining computation to be compiled for efficiency may be scaled up to the
large vector and matrix operations required for deep learning.
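
To give a flavor of that scaling, the sketch below is my own illustration (not taken from the
Theano documentation) of the same define-compile-evaluate pattern applied to symbolic vectors:

# Example of symbolic vector arithmetic with Theano
import numpy
import theano
from theano import tensor
# declare two symbolic double-precision vectors
v = tensor.dvector()
w = tensor.dvector()
# create a symbolic element-wise sum of the two vectors
s = v + w
# compile the expression into a callable function
add_vectors = theano.function([v, w], s)
# evaluate with real NumPy arrays
result = add_vectors(numpy.array([1.0, 2.0]), numpy.array([3.0, 4.0]))
print(result)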

2.4 Extensions and Wrappers for Theano


If you are new to deep learning you do not have to use Theano directly. In fact, you are highly
encouraged to use one of many popular Python projects that make Theano a lot easier to use
for deep learning. These projects provide data structures and behaviors in Python, specifically
designed to quickly and reliably create deep learning models whilst ensuring that fast and
efficient models are created and executed by Theano under the covers. The amount of Theano
syntax exposed by the libraries varies.
Keras is a wrapper library that hides Theano completely and provides a very simple API to
work with to create deep learning models. It hides Theano so well, that it can in fact run as a
wrapper for another popular foundation framework called TensorFlow (discussed next).

2.5 More Theano Resources


Looking for some more resources on Theano? Take a look at some of the following.

- Theano Official Homepage: http://deeplearning.net/software/theano/

- Theano GitHub Repository: https://github.com/Theano/Theano/

- Theano: A CPU and GPU Math Compiler in Python (2010):
http://www.iro.umontreal.ca/~lisa/pointeurs/theano_scipy2010.pdf

- List of Libraries Built on Theano: https://github.com/Theano/Theano/wiki/Related-projects

- List of Theano configuration options: http://deeplearning.net/software/theano/library/config.html

2.6 Summary
In this lesson you discovered the Theano Python library for efficient numerical computation.
You learned:

- Theano is a foundation library used for deep learning research and development.

- Deep learning models can be developed directly in Theano if desired.

- The development and evaluation of deep learning models is easier with wrapper libraries
like Keras.

2.6.1 Next
You now know about the Theano library for numerical computation in Python. In the next
lesson you will discover the TensorFlow library, released by Google, that attempts to offer the
same capabilities.
Chapter 3

Introduction to TensorFlow

TensorFlow is a Python library for fast numerical computing created and released by Google.
It is a foundation library that can be used to create deep learning models directly or by using
wrapper libraries that simplify the process built on top of TensorFlow. After completing this
lesson you will know:

- About the TensorFlow library for Python.

- How to define, compile and evaluate a simple symbolic expression in TensorFlow.

- Where to go to get more information on the library.

Let’s get started.


Note: TensorFlow is not easily supported on Windows at the time of writing. It may be
possible to get TensorFlow working on Windows with Docker. TensorFlow is not required to
complete the rest of this book, and if you are on the Windows platform you can skip this lesson.

3.1 What is TensorFlow?


TensorFlow is an open source library for fast numerical computing. It was created and is
maintained by Google and released under the Apache 2.0 open source license. The API is
nominally for the Python programming language, although there is access to the underlying
C++ API. Unlike other numerical libraries intended for use in deep learning, like Theano,
TensorFlow was designed for use both in research and development and in production systems,
not least RankBrain in Google search [1] and the fun DeepDream project [2]. It can run on
single-CPU systems and GPUs, as well as mobile devices and large-scale distributed systems of
hundreds of machines.

[1] https://en.wikipedia.org/wiki/RankBrain
[2] https://en.wikipedia.org/wiki/DeepDream

3.2 How to Install TensorFlow


Installation of TensorFlow is straightforward if you already have a Python SciPy environment.
TensorFlow works with Python 2.7 and Python 3.3+. With a working Python and SciPy
environment, it is relatively straightforward to install TensorFlow using pip. There are a number
of different distributions of TensorFlow, customized for different environments; therefore, to
install TensorFlow you can follow the Download and Setup instructions [3] on the TensorFlow
website.

TensorFlow v0.10.0 is the latest version at the time of writing and is used in this book.

[3] https://www.tensorflow.org/versions/r0.9/get_started/os_setup.html

3.3 Your First Examples in TensorFlow


Computation is described in terms of data flow and operations in the structure of a directed
graph.

- Nodes: Nodes perform computation and have zero or more inputs and outputs. Data that
moves between nodes are known as tensors, which are multi-dimensional arrays of real
values.

- Edges: The graph defines the flow of data, branching, looping and updates to state.
Special edges can be used to synchronize behavior within the graph, for example waiting
for computation on a number of inputs to complete.

- Operation: An operation is a named abstract computation which can take input attributes
and produce output attributes. For example, you could define an add or multiply operation.

3.4 Simple TensorFlow Example


In this section we demonstrate a simple Python script that gives you a flavor of TensorFlow. In
this example we define two symbolic floating point variables a and b. We define an expression
that uses these variables (c = a + b). This is the same example used in the previous chapter that
introduced Theano. We then compile this symbolic expression into a function using TensorFlow
that we can use later. Finally, we use our compiled expression by plugging in some real values
and performing the calculation using efficient compiled TensorFlow code under the covers.
# Example of TensorFlow library
import tensorflow as tf
# declare two symbolic floating-point scalars
a = tf.placeholder(tf.float32)
b = tf.placeholder(tf.float32)
# create a simple symbolic expression using the add function
add = tf.add(a, b)
# bind 1.5 to a, 2.5 to b, and evaluate c
sess = tf.Session()
binding = {a: 1.5, b: 2.5}
c = sess.run(add, feed_dict=binding)
print(c)

Listing 3.1: Example of Symbolic Arithmetic with TensorFlow.



Running the example prints the output 4.0, which matches our expectation that 1.5 + 2.5 = 4.0.
This is a useful example as it gives you a flavor for how a symbolic expression can be defined,
compiled and used. Although we have only performed a basic introduction to adding two numbers,
you can see how pre-defining computation to be compiled for efficiency may be scaled up to the
large vector and matrix operations required for deep learning.
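
As a sketch of that scaling (my own illustration, reusing the placeholder-and-session API
from the listing above), the following multiplies two symbolic matrices:

# Example of symbolic matrix multiplication with TensorFlow
import tensorflow as tf
# declare two symbolic 2x2 floating-point matrices
x = tf.placeholder(tf.float32, shape=(2, 2))
y = tf.placeholder(tf.float32, shape=(2, 2))
# create a symbolic matrix product
product = tf.matmul(x, y)
# evaluate the graph by binding real values to the placeholders
sess = tf.Session()
binding = {x: [[1.0, 2.0], [3.0, 4.0]], y: [[5.0, 6.0], [7.0, 8.0]]}
print(sess.run(product, feed_dict=binding))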

3.5 More Deep Learning Models


Your TensorFlow installation comes with a number of Deep Learning models that you can use
and experiment with directly. Firstly, you need to find out where TensorFlow was installed on
your system. For example, you can use the following command:
python -c 'import os; import inspect; import tensorflow; print(os.path.dirname(inspect.getfile(tensorflow)))'

Listing 3.2: Print Install Directory for TensorFlow.


Change to this directory and take note of the models/ subdirectory. Included are a number
of deep learning models with tutorial-like comments, such as:

- Multi-threaded word2vec mini-batched skip-gram model.

- Multi-threaded word2vec unbatched skip-gram model.

- CNN for the CIFAR-10 network.

- Simple, end-to-end, LeNet-5-like convolutional MNIST model example.

- Sequence-to-sequence model with an attention mechanism.

Also check the examples directory as it contains an example using the MNIST dataset.
There is also an excellent list of tutorials on the main TensorFlow website [4]. They show how
to use different network types, different datasets and how to use the framework in various
different ways. Finally, there is the TensorFlow playground [5] where you can experiment with
small networks right in your web browser.

[4] https://www.tensorflow.org/versions/r0.9/tutorials/
[5] http://playground.tensorflow.org/

3.6 Summary
In this lesson you discovered the TensorFlow Python library for deep learning. You learned:

- TensorFlow is another efficient numerical library like Theano.

- Like Theano, deep learning models can be developed directly in TensorFlow if desired.

- Also like Theano, TensorFlow may be better leveraged by a wrapper library that abstracts
the complexity and lower-level details.

3.6.1 Next
You now know about the Theano and TensorFlow libraries for efficient numerical computation
in Python. In the next lesson you will discover the Keras library that wraps both libraries and
gives you a clean and simple API for developing and evaluating deep learning models.
Chapter 4

Introduction to Keras

Two of the top numerical platforms in Python that provide the basis for deep learning research
and development are Theano and TensorFlow. Both are very powerful libraries, but both can
be difficult to use directly for creating deep learning models. In this lesson you will discover
the Keras Python library that provides a clean and convenient way to create a range of deep
learning models on top of Theano or TensorFlow. After completing this lesson you will know:

- About the Keras Python library for deep learning.

- How to configure Keras for Theano or TensorFlow.

- The standard idiom for creating models with Keras.

Let’s get started.

4.1 What is Keras?


Keras is a minimalist Python library for deep learning that can run on top of Theano or
TensorFlow. It was developed to make developing deep learning models as fast and easy as
possible for research and development. It runs on Python 2.7 or 3.5 and can seamlessly execute
on GPUs and CPUs given the underlying frameworks. It is released under the permissive MIT
license. Keras was developed and is maintained by François Chollet, a Google engineer, using
four guiding principles:

- Modularity: A model can be understood as a sequence or a graph alone. All the concerns
of a deep learning model are discrete components that can be combined in arbitrary ways.

- Minimalism: The library provides just enough to achieve an outcome, no frills, maximizing
readability.

- Extensibility: New components are intentionally easy to add and use within the framework,
intended for developers to trial and explore new ideas.

- Python: No separate model files with custom file formats. Everything is native Python.
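
To make the standard idiom concrete ahead of the lessons that follow, here is a minimal
sketch of the define-compile-fit-evaluate life-cycle. The layer sizes and the random data are
placeholders of my own, and the nb_epoch argument reflects the Keras 1.x API of this era; the
pattern is developed properly from Chapter 7 onwards:

# Minimal sketch of the Keras model life-cycle: define, compile, fit, evaluate
import numpy
from keras.models import Sequential
from keras.layers import Dense
# placeholder data: 100 samples, 8 input features, binary labels
X = numpy.random.rand(100, 8)
y = numpy.random.randint(2, size=100)
# 1. define the network as a sequence of fully connected layers
model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
# 2. compile the network with a loss function and optimizer
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
# 3. fit the network on the data
model.fit(X, y, nb_epoch=10, batch_size=10, verbose=0)
# 4. evaluate the network (returns loss and accuracy)
scores = model.evaluate(X, y, verbose=0)
print(scores)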

19
Other documents randomly have
different content