Brett Koonce

Convolutional Neural Networks with Swift for Tensorflow
Image Recognition and Dataset Categorization
1st ed.
Brett Koonce
Jefferson, MO, USA

Any source code or other supplementary material referenced by the author in this book is available to readers on GitHub via the book's product page, located at www.apress.com/978-1-4842-6167-5. For more detailed information, please visit http://www.apress.com/source-code.

ISBN 978-1-4842-6167-5 e-ISBN 978-1-4842-6168-2


https://doi.org/10.1007/978-1-4842-6168-2

Apress standard
© Brett Koonce 2021

This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed.

The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use.

The publisher, the authors and the editors are safe to assume that the
advice and information in this book are believed to be true and accurate
at the date of publication. Neither the publisher nor the authors or the
editors give a warranty, expressed or implied, with respect to the
material contained herein or for any errors or omissions that may have
been made. The publisher remains neutral with regard to jurisdictional
claims in published maps and institutional affiliations.

Distributed to the book trade worldwide by Springer Science+Business Media New York, 1 NY Plaza, New York, NY 10014. Phone 1-800-SPRINGER, fax (201) 348-4505, e-mail [email protected], or visit www.springeronline.com. Apress Media, LLC is a California LLC and the sole member (owner) is Springer Science + Business Media Finance Inc (SSBM Finance Inc). SSBM Finance Inc is a Delaware corporation.
Introduction
In this book, we are going to learn convolutional neural networks by
focusing on the specific problem of image recognition, using Swift for
Tensorflow and a command-line Unix approach. If you are new to this
field, then I would suggest you read the first few chapters and get a
working system bootstrapped and then spend your time going through
the basics with MNIST and CIFAR repeatedly, in particular familiarizing
yourself with how neural networks work. If you feel comfortable with
the core concepts already, then feel free to skip ahead to the middle
where we explore some more powerful convolutional neural networks.

Why Swift
The short version is that I believe Swift is a modern, open source,
beginner-friendly language that has proven itself by solving real
problems for iOS developers daily. By integrating automatic
differentiation into the programming language, a number of interesting
compiler techniques to address the limitations of current machine
learning software and hardware become possible in the long term. This
is in my opinion where the world is headed, one way or another.

Why image recognition
Image recognition is one of the oldest, most well-understood uses of
neural networks. As a result, we can introduce the basics and then build
up to advanced state-of-the-art approaches in a logically consistent
manner. With this foundation, you will be able to branch out to tackle
other image-related tasks (e.g., object detection and segmentation)
easily. The deep learning techniques needed to build large-scale
convolutional neural networks translate easily to reinforcement
learning and generative adversarial networks (GANs), two important
areas of modern research. In addition, I believe this foundation will
make it easy to make the transition to time sequence models such as
recurrent neural networks (RNNs) and long short-term memory
(LSTM) once you have mastered CNNs.
Why CLI
Broadly speaking, this book is going to focus on a command-line
interface (CLI)-based approach using both a local machine on your home network and remote virtual machines in Google Cloud. This is in my opinion the best approach because:
We can control costs very effectively. In the worst-case scenario, you
can perform the majority of your work using a local machine built for
under a thousand dollars, and your only remaining cost will be
electricity and time.
We can scale easily from anywhere in the world. Using cloud
instances full time can quickly become expensive, and so many
people avoid learning cloud workflows. But using on-demand cloud-
based resources periodically to augment your local workflow means
you can learn the cloud in a very practical and efficient way.
Eventually, you will be able to prototype and build solutions on your
primary machine, then quickly scale them up in the cloud to
parallelize computation and access more powerful hardware when
needed or available.
We can get the best of both worlds. While minimizing costs is
certainly important, I have found that focusing on how much money
you are spending tends to produce a mindset where you are afraid to
try new things and experiment in general. Building your own
machine puts you into the mindset of putting in more cycles to
reduce your costs, which is in my opinion the key to success.
So, toward this end, we will utilize a command-line workflow with
the following goals:
We will use a local terminal interface to log in to all of our
machines, so that there is literally no difference between our
approaches on the desktop and in the cloud.
We will utilize the same operating system and software locally
and in the cloud so that we do not have to learn about differences
between platforms. Then, by definition, any workflow you can do on
your computer, you will be able to do in the cloud, and vice versa.
Ultimately, by blurring the line between your personal computer
and the cloud, my goal is for you to understand that there is
fundamentally no difference between doing things locally or remotely.
The real limiting factor then is your imagination, not resources.
Doing things this way will be more work at first, I will admit. But
once you have mastered this workflow, it will be much easier for you to
scale in the future. If you are willing to put in the time now, this
approach will make your skills much more flexible and powerful in the
future. What you do with them is up to you.
How this book is organized
This book is organized as follows.

Basics
We will explore the basic building blocks of neural networks and how
to combine them with convolutions to perform simple image
recognition tasks.
Neural networks (1D MLP/multilayer perceptron) and MNIST
Convolutional neural networks (2D CNN) and MNIST
Color, CNN stacks, and CIFAR

Advanced
We will build upon the above to produce actual state-of-the-art
approaches in this field.
VGG16
ResNet 34
ResNet 50

Mobile
We will look at some different approaches for mobile devices, which
require us to utilize our computing resources carefully.
SqueezeNet
MobileNet v1
MobileNet v2

State of the art
We will look at the work that leads up to EfficientNet, the current state of the art for image recognition. Then we will look at how people are finding ways to produce similar results by combining ideas from many different papers.
EfficientNet
MobileNetV3
Bag of tricks/reading papers

Future
We will zoom out a bit and look at why I am excited about Swift for Tensorflow as a whole and give you my vision of what the future of machine learning looks like.
MNIST revisited
You are here

Appendices
Here’s some information that didn’t quite fit in with the above but I still
feel is important:
A: Cloud Setup
B: Hardware Prerequisites, Software Installation Guidelines, and Unix
Quickstart
C: Additional Resources
Table of Contents
Chapter 1: MNIST: 1D Neural Network
Dataset overview
Dataset handler
Code: Multilayer perceptron + MNIST
Results
Demo breakdown (high level)
Imports (1)
Model breakdown (2)
Global variables (3)
Training loop: Updates (4)
Training loop: Accuracy (5)
Demo breakdown (low level)
Fully connected neural network layers
How the optimizer works
Optimizers + neural networks
Swift for Tensorflow
Side quests
Recap
Chapter 2: MNIST: 2D Neural Network
Convolutions
3x3 additive blur example
3x3 Gaussian blur example
Combined 3x3 convolutions – Sobel filter example
3x3 striding
Padding
Maxpool
2D MNIST model
Code
Side quest
Recap
Chapter 3: CIFAR: 2D Neural Network with Blocks
CIFAR dataset
Color
Breakdown
Code
Results
Side quest
Recap
Chapter 4: VGG Network
Background: ImageNet
Getting ImageNet
Imagenette dataset
Data augmentation
VGG
Code
Results
Memory usage
Model refactoring
VGG16 with subblocks
Side quests
Recap
Chapter 5: ResNet 34
Skip connections
Noise
Batch normalization
Code
Results
Side quest
Recap
Chapter 6: ResNet 50
Bottleneck blocks
Code
Results
Side Quest: ImageNet
Recap
Chapter 7: SqueezeNet
SqueezeNet
Fire modules
Deep compression
Model pruning
Model quantization
Size metric
Difference between SqueezeNet 1.0 and 1.1
Code
Training loop
Results
Side quest
Recap
Chapter 8: MobileNet v1
MobileNet (v1)
Spatial separable convolutions
Depthwise convolutions
Pointwise convolutions
ReLU 6
Example of the reduction in MACs with this approach
Code
Results
Recap
Chapter 9: MobileNet v2
Inverted residual blocks
Inverted skip connections
Linear bottleneck layers
Code
Results
Recap
Chapter 10: EfficientNet
Swish
SE (Squeeze + Excitation) block
Code
Results
EfficientNet variants
EfficientNet [B1-8]
RandAugment
Noisy Student
EfficientDet
Recap
Chapter 11: MobileNetV3
Hard swish and hard sigmoid
Remove the Squeeze and Excitation (SE) block logic for half the network
Custom head
Hyperparameters
Performance
Code
Results
Recap
Chapter 12: Bag of Tricks
Bag of tricks
What to learn from this
Reading papers
Stay behind the curve
How I read papers
Recap
Chapter 13: MNIST Revisited
Next steps
Pain points
TPU case study
Tensorflow 1 + Pytorch
Enter functional programming
Swift + TPU demo
Results
Recap
Chapter 14: You Are Here
A (short and opinionated) history of computing
History of GPUs
Cloud computing
Crossing the chasm
Computer vision
Direct applications
Indirect applications
Natural language processing
Reinforcement learning and GANs
Simulations in general
To infinity and beyond
Why Swift
Why LLVM
Why MLIR
Why ML is the most important field
Why now
Why you
Appendix A: Cloud Setup
Outline
Google Cloud with CPU instances
How to sign up for Google Cloud
Creating your first few instances
Google Cloud with preconfigured GPU instance
Google Cloud nits
Cattle, not pets
Basic Google Cloud nomenclature
Cleaning up
Recap
Appendix B: Hardware Prerequisites, Software Installation Guidelines, and Unix Quickstart
Hardware
Don't go alone!
GPU
Multiple GPUs
CPU
RAM
SSD
Recommendations
Hardware recap
Installing Ubuntu
General prep
OS install
Ubuntu recap
Installing Swift for Tensorflow
Installing graphics card drivers and Swift for Tensorflow
Swift for Tensorflow recap
Installing s4tf from scratch
There be dragons here
Installing s4tf from scratch recap
Client setup process + Unix quickstart
Setting up your client computer / crash course in Unix
General config
Configuring your network for remote access
Crash course in tmux
Appendix C: Additional Resources
Python --> Swift transition guide
Python 3
REPL
Python --> Swift bridge
Python --> C bridge
Python libraries
Self-study guide
Things to study
System monitoring/utilities
Index
About the Author
Brett Koonce
is the CTO of QuarkWorks, a mobile consulting agency. He’s a developer
with 5 years of experience creating apps for iOS and Android. His team
has worked on dozens of apps that are used by millions of people
around the world. Brett knows the pitfalls of development and can help
you avoid them. Whether you want to build something from scratch,
port your app from iOS to Android (or vice versa), or accelerate your
velocity, Brett can help.
About the Technical Reviewer
Vishwesh Ravi Shrimali
graduated from BITS Pilani in 2018, where he studied mechanical
engineering. Since then, he has worked with Big Vision LLC on deep
learning and computer vision and was involved in creating official
OpenCV AI courses. Currently, he is working at Mercedes-Benz
Research and Development India Pvt. Ltd. He has a keen interest in
programming and AI and has applied that interest in mechanical
engineering projects. He has also written multiple blogs on OpenCV and
deep learning on LearnOpenCV, a leading blog on computer vision. He
has also coauthored Machine Learning for OpenCV 4 (Second Edition)
by Packt. When he is not writing blogs or working on projects, he likes
to go on long walks or play his acoustic guitar.
© Brett Koonce 2021
B. Koonce, Convolutional Neural Networks with Swift for Tensorflow
https://doi.org/10.1007/978-1-4842-6168-2_1

1. MNIST: 1D Neural Network


Brett Koonce1
(1) Jefferson, MO, USA

In this chapter, we will look at a simple image recognition dataset called MNIST and build a basic one-dimensional neural network, often called a multilayer perceptron, to classify our digits and categorize black and white images.

Dataset overview
MNIST (Modified National Institute of Standards and Technology) is a
dataset put together in 1999 that is an extremely important testbed for
computer vision problems. You will see it everywhere in academic
papers in this field, and it is considered the computer vision equivalent
of hello world. It is a collection of preprocessed grayscale images of
hand-drawn digits of the numbers 0–9. Each image is 28 by 28 pixels, for a total of 784 pixels. For each pixel, there is a corresponding 8-bit grayscale value, a number from 0 (white) to 255 (completely black).
At first, we’re not even going to treat this as actual image data. We’re
going to unroll it – we’re going to take the top row and pull off each row
at a time, until we have a really long string of numbers. We can imagine
expanding this concept across the 28 by 28 pixels to produce a long row
of input values, a vector that’s 784 pixels long and 1 pixel wide, each
with a corresponding value from 0 to 255.
The dataset has been cleaned so that there’s not a lot of non-digit
noise (e.g., off-white backgrounds). This will make our job simpler. If you
download the actual dataset, you will usually get it in the form of a
comma-separated file, with each row corresponding to an entry. We can
convert this into an image by literally assigning the values one at a time in
reverse. The actual dataset is 60000 hand-drawn **training** digits with
corresponding **labels** (the actual number), and 10000 **test** digits
with corresponding **labels**. The dataset proper is usually distributed
as a python pickle (a simple way of storing a dictionary) file (you don’t
need to know this, just in case you run across this online).
So, our goal is to learn how to correctly guess what number we are
looking at in the **test** dataset, based on our **model** that we have
learned from the **training** dataset. This is called a **supervised
learning** task since our goal is to emulate what another human (or
model) has done. We will simply take individual rows and try to guess
the corresponding digit using a simple version of a neural network
called a **multilayer perceptron**. This is often shortened to **MLP**.
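
To make the unrolling concrete, here is a minimal sketch of it in plain Swift, using a made-up 3x3 grid rather than a real 28x28 image:

```
// Flatten a 2D grid of pixel values into one long vector by pulling
// off each row in turn. The values here are arbitrary.
let image: [[UInt8]] = [
    [0, 128, 255],
    [64, 0, 32],
    [255, 16, 0],
]

let flattened = image.flatMap { $0 }
print(flattened)        // [0, 128, 255, 64, 0, 32, 255, 16, 0]
print(flattened.count)  // 9 here; 784 for a real 28x28 MNIST image
```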

Dataset handler
We can use the dataset loader from “swift-models,” part of the Swift for
Tensorflow project, to make dealing with the preceding sample simpler.
In order for the following code to work, you will need to use the
following swift package manager import to automatically add the
datasets to your code.
BASIC: If you are new to swift programming and just want to get
started, simply use the swift-models checkout you got working in the
chapter where we set up Swift for Tensorflow and place the following
code (MLP demo) into the “main.swift” file in the LeNet-MNIST example
and run “swift run LeNet-MNIST”.
ADVANCED: If you are a swift programmer already, here is the base
swift-models import file we will be using:

```
// swift-tools-version:5.3
// The swift-tools-version declares the minimum version of Swift
// required to build this package.

import PackageDescription

let package = Package(
    name: "ConvolutionalNeuralNetworksWithSwiftForTensorFlow",
    platforms: [
        .macOS(.v10_13),
    ],
    dependencies: [
        .package(
            name: "swift-models",
            url: "https://github.com/tensorflow/swift-models.git",
            .branch("master")
        ),
    ],
    targets: [
        .target(
            name: "MNIST-1D",
            dependencies: [.product(name: "Datasets", package: "swift-models")],
            path: "MNIST-1D"),
    ]
)
```
Hopefully, the preceding code is not too confusing. Importing this
code library will make our lives much easier. Now, let’s build our first
neural network!

Code: Multilayer perceptron + MNIST
Let's look at a very simple demo. Put this code into a "main.swift" file with the proper imports, and we'll run it:

```
// 1
import Datasets
import TensorFlow

// 2
struct MLP: Layer {
    var flatten = Flatten<Float>()
    var inputLayer = Dense<Float>(inputSize: 784, outputSize: 512, activation: relu)
    var hiddenLayer = Dense<Float>(inputSize: 512, outputSize: 512, activation: relu)
    var outputLayer = Dense<Float>(inputSize: 512, outputSize: 10)

    @differentiable
    public func forward(_ input: Tensor<Float>) -> Tensor<Float> {
        return input.sequenced(through: flatten, inputLayer, hiddenLayer, outputLayer)
    }
}

// 3
let batchSize = 128
let epochCount = 12
var model = MLP()
let optimizer = SGD(for: model, learningRate: 0.1)
let dataset = MNIST(batchSize: batchSize)

print("Starting training...")

for (epoch, epochBatches) in dataset.training.prefix(epochCount).enumerated() {
    // 4
    Context.local.learningPhase = .training
    for batch in epochBatches {
        let (images, labels) = (batch.data, batch.label)
        let (_, gradients) = valueWithGradient(at: model) { model -> Tensor<Float> in
            let logits = model(images)
            return softmaxCrossEntropy(logits: logits, labels: labels)
        }
        optimizer.update(&model, along: gradients)
    }

    // 5
    Context.local.learningPhase = .inference
    var testLossSum: Float = 0
    var testBatchCount = 0
    var correctGuessCount = 0
    var totalGuessCount = 0
    for batch in dataset.validation {
        let (images, labels) = (batch.data, batch.label)
        let logits = model(images)
        testLossSum += softmaxCrossEntropy(logits: logits, labels: labels).scalarized()
        testBatchCount += 1

        let correctPredictions = logits.argmax(squeezingAxis: 1) .== labels
        correctGuessCount += Int(Tensor<Int32>(correctPredictions).sum().scalarized())
        totalGuessCount = totalGuessCount + batch.data.shape[0]
    }

    let accuracy = Float(correctGuessCount) / Float(totalGuessCount)
    print(
        """
        [Epoch \(epoch + 1)] \
        Accuracy: \(correctGuessCount)/\(totalGuessCount) (\(accuracy)) \
        Loss: \(testLossSum / Float(testBatchCount))
        """
    )
}
```
Results
When you run the preceding code, you should get an output that looks
like this:

```
Loading resource: train-images-idx3-ubyte
Loading resource: train-labels-idx1-ubyte
Loading resource: t10k-images-idx3-ubyte
Loading resource: t10k-labels-idx1-ubyte
Starting training...
[Epoch 1] Accuracy: 9364/10000 (0.9364) Loss: 0.21411717
[Epoch 2] Accuracy: 9547/10000 (0.9547) Loss: 0.15427242
[Epoch 3] Accuracy: 9630/10000 (0.963) Loss: 0.12323072
[Epoch 4] Accuracy: 9645/10000 (0.9645) Loss: 0.11413358
[Epoch 5] Accuracy: 9700/10000 (0.97) Loss: 0.094898805
[Epoch 6] Accuracy: 9747/10000 (0.9747) Loss: 0.0849531
[Epoch 7] Accuracy: 9757/10000 (0.9757) Loss: 0.076825164
[Epoch 8] Accuracy: 9735/10000 (0.9735) Loss: 0.082270846
[Epoch 9] Accuracy: 9782/10000 (0.9782) Loss: 0.07173009
[Epoch 10] Accuracy: 9782/10000 (0.9782) Loss: 0.06860765
[Epoch 11] Accuracy: 9779/10000 (0.9779) Loss: 0.06677916
[Epoch 12] Accuracy: 9794/10000 (0.9794) Loss: 0.063436724
```

Congratulations, you've done machine learning! This demo is only a few lines long, but a lot is actually happening under the hood. Let's break down what's going on.

Demo breakdown (high level)
We will look at all of the preceding code, going through section by section using the number in the comments (e.g., //1, //2, etc.). We will first do a pass to try and explain what is going on at a high level and then do a second pass where we explain the nitty-gritty details.

Imports (1)
Our first few lines are pretty simple; we’re importing the swift-models
MNIST dataset handler and then the TensorFlow library.

Model breakdown (2)
Next, we build our actual neural network, an MLP model:

```
// 2
struct MLP: Layer {
    var flatten = Flatten<Float>()
    var inputLayer = Dense<Float>(inputSize: 784, outputSize: 512, activation: relu)
    var hiddenLayer = Dense<Float>(inputSize: 512, outputSize: 512, activation: relu)
    var outputLayer = Dense<Float>(inputSize: 512, outputSize: 10)

    @differentiable
    public func forward(_ input: Tensor<Float>) -> Tensor<Float> {
        return input.sequenced(through: flatten, inputLayer, hiddenLayer, outputLayer)
    }
}
```
What's in this data structure? Our first line just defines a new struct called MLP, which conforms to **Layer**, a protocol in Swift for Tensorflow. To conform, s4tf requires that we implement the function **forward** (formerly **callAsFunction**), which takes an **input** and maps it to an **output**. Our middle lines then actually define the layers of our perceptron:

```
var flatten = Flatten<Float>()
var inputLayer = Dense<Float>(inputSize: 784, outputSize: 512, activation: relu)
var hiddenLayer = Dense<Float>(inputSize: 512, outputSize: 512, activation: relu)
var outputLayer = Dense<Float>(inputSize: 512, outputSize: 10)
```

We have four internal layers:

1) A flatten operation: This just takes the input and reduces it to a single row of input numbers (a vector). Our dataset is internally giving us a picture of 28x28 pixels, and this just converts it into a row of numbers, 784 pixels long. Next, we have three **dense** layers, which are a special type of neural network layer called **fully connected** layers. The first goes from our initial input (e.g., the flattened 784x1 vector) to 512 nodes, like so.

2) A dense layer: 784 (the preceding input) to 512 nodes.

3) Another dense layer: 512 nodes to 512 nodes again.

4) An output layer: 512 nodes to 10 nodes (the number of digits, 0–9).

And then, finally, a forward function, which is where our neural network logic magic happens. We literally take the input and run it through the flatten, inputLayer, hiddenLayer, and outputLayer layers to produce our result.
And so our

return input.sequenced(through: flatten, inputLayer, hiddenLayer, outputLayer)

is then the call that actually takes the input and maps it through these four layers. We will look at the actual training loop next to understand how all of that actually happens, but a very large part of the magic of Swift for Tensorflow is on these few lines. We'll talk a little bit more about what is happening here in a second, but conceptually this function is nothing more than applying the preceding four layers in a sequence.

Global variables (3)
These lines are just setting up some different tools we're going to use:

```
let batchSize = 128
let epochCount = 12
var model = MLP()
let optimizer = SGD(for: model, learningRate: 0.1)
let dataset = MNIST(batchSize: batchSize)
```

The first two lines set a couple of global variables: our batchSize
(how many MNIST examples we are going to look at each pass) and
epochCount (number of passes over the dataset we’re going to do).
The next line initializes our model, which we talked about earlier.
The fourth line initializes our optimizer, which we’re going to talk
about more in a second.
The last line sets up our dataset handler.
The next line starts our actual training process by looping over our
data:

```
for (epoch, epochBatches) in dataset.training.prefix(epochCount).enumerated() {
```
Now we can get into the actual training loop!

Training loop: Updates (4)
Here's what the actual core of our training loop looks like. Conceptually, we're going to be taking a set of pictures, or **batch**, and showing each individual picture to the first input set of dense nodes, which will **fire** and go to the next hidden set of dense nodes, which will **fire** and go to the final output set of dense nodes. Then, we will take all of the outputs of the final layer of our network, select the largest one, and look at it. If this node is the same number as the original input we gave it, then we will give the network a **reward** and tell it to increase its confidence in the results. If this answer is the wrong one, then we will give the network a **negative reward** and tell it to decrease its confidence in its results. By repeating this process using thousands of samples, our network can learn to accurately predict inputs it has never seen before.

```
Context.local.learningPhase = .training
for batch in epochBatches {
    let (images, labels) = (batch.data, batch.label)
    let (_, gradients) = valueWithGradient(at: model) { model -> Tensor<Float> in
        let logits = model(images)
        return softmaxCrossEntropy(logits: logits, labels: labels)
    }
    optimizer.update(&model, along: gradients)
}
```

How does this work under the hood? A little bit of calculus mixed together with all of our data. For each training example, we get the raw pixel values (image data) and then the corresponding label (the actual number for the picture). Then, we determine the **gradient** for the **model** by calculating the values that the model will predict for X and then seeing how our prediction compares with the actual value y using a function called softmaxCrossEntropy. Conceptually, softmax just takes a collection of inputs and then normalizes their results across the set as a percentage. This can be a bit complex mathematically, so converting the numbers to use the natural log e and then dividing by the sum of the exponents has the useful dual properties of being consistent across arbitrary inputs and easy to evaluate on a computer. Then, we update our **model** slightly in the direction that corrects how far it differs from where it should be (more in the right direction if it's correct, away if it's not). Our learning rate determines how far we should go each pass (e.g., since our rate is .1, we're only going to go 10% of the direction the network thinks is the right one each time). In the for loop that calls all of this, we will repeat this process across all of our data (one pass) for multiple rounds, or **epochs**.
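
If the softmax description is hard to picture, here is a small standalone sketch of the math in plain Swift, using ordinary arrays rather than Tensors (this is an illustration, not the library's own implementation):

```
import Foundation

// Softmax: exponentiate each input, then divide by the sum of the
// exponents, so the outputs are positive and sum to 1 (percentages).
func softmax(_ logits: [Double]) -> [Double] {
    let maxLogit = logits.max() ?? 0  // subtracting the max avoids overflow
    let exps = logits.map { exp($0 - maxLogit) }
    let sum = exps.reduce(0, +)
    return exps.map { $0 / sum }
}

print(softmax([2.0, 1.0, 0.1]))  // roughly [0.659, 0.242, 0.099]
```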

Training loop: Accuracy (5)
Next, we run our model on our test data and calculate how often it was correct on images it hasn't seen yet (but that we know the right answers to). So then, what does accuracy mean, and how do we calculate it? Our code looks like this:

```
Context.local.learningPhase = .inference
var testLossSum: Float = 0
var testBatchCount = 0
var correctGuessCount = 0
var totalGuessCount = 0
for batch in dataset.validation {
    let (images, labels) = (batch.data, batch.label)
    let logits = model(images)
    testLossSum += softmaxCrossEntropy(logits: logits, labels: labels).scalarized()
    testBatchCount += 1
    let correctPredictions = logits.argmax(squeezingAxis: 1) .== labels
    correctGuessCount += Int(Tensor<Int32>(correctPredictions).sum().scalarized())
    totalGuessCount = totalGuessCount + batch.data.shape[0]
}

let accuracy = Float(correctGuessCount) / Float(totalGuessCount)
print(
    """
    [Epoch \(epoch + 1)] \
    Accuracy: \(correctGuessCount)/\(totalGuessCount) (\(accuracy)) \
    Loss: \(testLossSum / Float(testBatchCount))
    """
)
```
In a similar process to our training dataset, we simply take our test
input images, run them through our model, and then compare our
results to what we know the right answer to be. Then we literally
calculate the number of correct answers divided by the total number of
images to produce our accuracy percentage. Our final few lines just print
out various numbers each pass through the dataset, or **epoch**, so we
can see if our loss is decreasing (e.g., the network is getting more
accurate with each pass).

Demo breakdown (low level)
Okay, we've walked through our MNIST example at a high level. Now let's go through some of these functions we're calling and explore our simple training loop more deeply.

Fully connected neural network layers
Fully connected layers form the backbone of our network, so it's worth taking some time to understand them. At a high level, each set of nodes from the input dataset is mapped to the output dataset. Then each edge of the network has a weight that is updated by our training function. The math then for each node is literally [weight] * [input] + [bias], with the value of the output node being the result of this math function. **Weight** is how much value we're going to place on the input to this node, and then **bias** is a constant amount of value assigned to the node regardless of what happens. The values for both of these will be learned by our training. We use matrix math to represent our variables, so that is why each value is in [brackets].
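
As a concrete (and deliberately tiny) illustration of [weight] * [input] + [bias], here is a single node computed by hand in plain Swift; all the numbers are made up for the example:

```
// One node of a fully connected layer: a weighted sum of the inputs
// plus a bias, passed through the ReLU activation.
let inputs: [Float] = [0.5, -1.0, 2.0]    // values from the previous layer
let weights: [Float] = [0.8, 0.3, -0.5]   // learned during training
let bias: Float = 0.1                     // also learned during training

let weightedSum = zip(weights, inputs).map { $0 * $1 }.reduce(0, +) + bias
let output = max(0, weightedSum)  // relu(x) = max(0, x)
print(output)                     // 0.0: the sum is -0.8, clipped by ReLU
```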
For a single node, the preceding math is simple enough to
understand, but the real magic of neural networks comes from many of
these nodes firing together. Loosely each neuron learns one part or
**feature** of the input, and then by working with the other neurons,
they collectively learn the set of weights needed to produce the result
we are looking for. The second element of how all this works is that we
are combining multiple layers together. The nodes are not learning their
values independently, they are learning from other nodes which are
updating as well. What this means is that by combining with the idea of
working together to figure out when to fire, the neurons are working
together to find the most efficient way of representing the input data.
Please note that we are using the word learn very loosely here. The
preceding math all works correctly, but people often attribute far more
intelligence to this process than actually exists. I believe the best way to
think about it is simply to think of your input data as a collection of
semi-related samples (e.g., a distribution), and then the neural network
is a way of reducing that distribution into an extremely small
representation. We will keep on exploring different ways of
understanding this key concept.
ReLU is a simple enough function to explain mathematically: relu(x)
= max(0, x). All this means is that we return the original value, and then
for all values below zero, we just return a zero. There are other choices
here (which we will discuss in a future chapter), notably sigmoid
functions, but since ReLU produces good results and is so easy to
evaluate (and by extension fast), it has become the de facto standard
activation function you will find in practice.

How the optimizer works
To continue with the preceding ideas, our goal then is to try to find a set
of neurons that will fire together to represent our data. So at a high level,
we will show our network our data, then calculate how far our model is
from our theoretical result, and then try to move our network slightly
closer to being more correct the next time around. This process then is
what our optimizer does. If our network guesses correctly and is moving in the right direction, then we tell it to keep on going. If our network guesses wrong and is moving in the wrong direction, then we tell it to reverse and go in the opposite direction.
The easiest way of representing this is to consider trying to find the
minimum of a curve like y = x^2. We can literally take any random point
on the curve and calculate the result at another point nearby (a step
away, so to speak). Then either one of two possibilities will happen:
either we are getting further away from the base (e.g., moving in the
wrong direction) or we are getting closer. Then for our next step, we can
either keep on going in the same direction or reverse our course. Either
way, we will eventually end up near the bottom of the curve.
To continue the preceding ideas, there are a few problems with our
approach. The first is when our step size is too large. Further away from
the bottom, this will converge faster, but as we get near the bottom, we
will eventually end up in a state where we are jumping from side to side.
The flip side of this is choosing too small of a step size and taking a long
time to get to the minima, but that isn’t too much of a problem normally
(if it gets there, it gets there). The next trick then is to add what is called
momentum (or second-order gradients). The basic idea is that we don’t
completely change velocity each step but rather keep our previous
motion (e.g., we only add say 10% of the step’s change in direction each
step).
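
To make the curve analogy concrete, here is a minimal sketch of gradient descent on y = x^2 in plain Swift, with the momentum idea from the last paragraph included; the starting point, step size, and momentum values are arbitrary:

```
// Minimizing y = x^2 by gradient descent. The gradient is dy/dx = 2x,
// so at every step we move a fraction of the way "downhill".
var x = 5.0             // arbitrary starting point on the curve
var velocity = 0.0      // running motion, for momentum
let learningRate = 0.1  // how far we step each time
let momentum = 0.9      // how much of our previous motion we keep

for step in 1...50 {
    let gradient = 2 * x  // slope of x^2 at our current position
    velocity = momentum * velocity - learningRate * gradient
    x += velocity
    if step % 10 == 0 { print("step \(step): x = \(x)") }
}
// x jumps from side to side but spirals in toward 0, the minimum.
```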

Optimizers + neural networks
The preceding idea is what is called convex optimization. When dealing
with neural networks, though, things are a little more tricky. The first is
that by definition we are updating an optimization function for
**every** neuron, and so the problem explodes to dealing with many
different functions in hyperdimensional space. To the computer, this is
nothing more than a very large math problem, but to humans there’s no
longer a good way to visualize what is going on. This is a large open area
of math called nonconvex optimization.
The second problem is simpler: for our math problem, it’s easy for us
to calculate whether or not we’re moving in the right direction because
we know what the right answer is. A very large problem in neural
networks (especially for more advanced areas) is finding the right goal
function for our problem. For what we’ll be doing in this book, we’ll
mostly be using softmax cross-entropy loss. For the problem of image
recognition, this is easily represented by comparing our answers with
the known results (e.g., we’re just grading things right or not). But
constructing custom loss functions is an interesting problem in more
advanced uses of neural networks you should be aware of.

Swift for Tensorflow
The preceding text covers the neural network piece. Now, let's look at where Swift for Tensorflow comes in. The approach just mentioned is hopefully reasonably simple to understand from a mathematical perspective. The problem is applying it to our neural network problem in a way that scales to larger problems. The largest issue is that real-world networks require keeping track of all of our gradients in memory, which makes updating them much simpler and significantly faster. The second is that when building these models by hand, it is easy to introduce subtle bugs that will create problems down the road. Swift for Tensorflow uses Swift's type system to enforce the Layer protocol, as we saw earlier. The basic idea then is simply that we make sure each model conforms to this protocol. Then we can add new pieces to the model, and as long as they extend this protocol, then in theory any arbitrary combination of said pieces will work as well. Enforcing this layer protocol forces us, the programmers, to keep our chain of functions correct and by extension allows the compiler to model our gradients in whatever manner it so desires. By extension, then, the compiler can output code for whatever hardware device we have on hand. This is why we are using Swift for Tensorflow: to get compile-time checking of our networks as well as the ability to run our models on many different hardware back ends using platform-specific optimizations.
Side quests
Here are a couple of simple tweaks you can make to your code in order
to understand what is happening:
Try making the dense layers smaller or larger (e.g., change the 512 in
the inputLayer, hiddenLayer, and outputLayer to 128 or 1024), and
run things again to see how that affects results.
Try increasing the number of epochs to 30 and reducing the learning
rate to .001 to see how smaller step sizes will still converge to the
same result.

Recap
We’ve looked at how to interact with a simple dataset called MNIST,
which is composed of grayscale hand-drawn digits from 0 to 9, ten
categories in total. We’ve built a simple, one-dimensional neural
network (called a **multilayer perceptron**) to classify MNIST digits
using swift for tensorflow. We’ve looked at how we can use a statistical
technique called **stochastic gradient descent** to update our neural
network each time it sees a new image to produce better and better
results. We’ve built a basic but functional training loop that goes through
the dataset multiple times, or **epochs**, to train our neural network
from an initial random state (where it was essentially guessing) to
eventually be able to recognize more than 90% of the digits it is shown.
This is the hardest chapter of the book conceptually. Literally everything we are going to be doing going forward is simply taking this same basic approach and improving it more and more. Spend some time getting everything mentioned down before moving forward. Next, we'll
add some convolutions to the neural network we built to produce our
first convolutional neural network.
© Brett Koonce 2021
B. Koonce, Convolutional Neural Networks with Swift for Tensorflow
https://doi.org/10.1007/978-1-4842-6168-2_2

2. MNIST: 2D Neural Network


Brett Koonce1
(1) Jefferson, MO, USA

In this chapter, we will modify our one-dimensional neural network by adding convolutions to produce our first actual convolutional (2D) neural network and use it to categorize black and white (e.g., MNIST) images again.

Convolutions
Convolutions are a deep area of computer vision theory. At a high level
we might think of taking an input image and producing another output
image:

[cat] --> [magic black box] --> [dog]

Broadly, for any input image there’s a way to convert it to the target
image. At the simplest level we might destroy the source image (e.g.,
multiply by zero) and then insert our target image (e.g., add its pixels
in):

[cat] --> 0 * [cat] + [dog] --> [dog]

Then, we can model our middle step using simple math:

```
a[X] + b
```

This piece of math is called a kernel. This is a convolution, albeit not a terribly useful one.
Broadly speaking, for every image in the universe, we can come up
with a kernel to convert it into anything else we desire. By extension,
there’s a kernel for **anything** that you can imagine.
This is a very, very deep area of research in computer vision in
general, and there are many different things that can be done here.

3x3 additive blur example
Next, let’s look at a slightly more complicated example, a 3x3 additive
blur. The actual kernel looks like this:

[ 1, 1, 1 ]
[ 1, 1, 1 ]
[ 1, 1, 1 ]

What this convolution will do is produce a simple blur of an input image. It does so by literally creating an output pixel for each block of 3x3 pixels in the input image that is the sum of the 9 pixels we are looking at. By then stepping over the rows of the input image using a 1 step stride, we end up with a final image that is blurred, because each output pixel has information from not only the original corresponding pixel but also its neighbors. All of our outputs are larger numbers than we started with, so we apply a final simple step to **normalize** the result by dividing all the values by 9 to produce values similar to the original image.
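
Here is a hand-rolled sketch of this blur in plain Swift, sliding a 3x3 window over a small made-up image; this is only to show the mechanics, since later we will let the framework do the work for us:

```
// Apply a 3x3 kernel to a grayscale image by sliding it one pixel at a
// time and stopping when the kernel hits the image edge ("valid" style).
func convolve(_ image: [[Float]], _ kernel: [[Float]]) -> [[Float]] {
    let rows = image.count - 2
    let cols = image[0].count - 2
    var output = [[Float]](repeating: [Float](repeating: 0, count: cols), count: rows)
    for r in 0..<rows {
        for c in 0..<cols {
            var sum: Float = 0
            for kr in 0..<3 {
                for kc in 0..<3 {
                    sum += kernel[kr][kc] * image[r + kr][c + kc]
                }
            }
            output[r][c] = sum
        }
    }
    return output
}

let blurKernel: [[Float]] = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
// A made-up 5x5 image: a bright vertical stripe on a black background.
let image: [[Float]] = [
    [0, 0, 255, 0, 0],
    [0, 0, 255, 0, 0],
    [0, 0, 255, 0, 0],
    [0, 0, 255, 0, 0],
    [0, 0, 255, 0, 0],
]
// Normalize by 9 so the output stays in the same range as the input.
let blurred = convolve(image, blurKernel).map { row in row.map { $0 / 9 } }
print(blurred)  // every 3x3 window mixes in one stripe column: 85.0 everywhere
```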

3x3 Gaussian blur example
This next bit you don't need to understand 100%; we're just trying to build upon the concepts.
We can change the 3x3 data and keep the same operation to
produce something more complicated. Here’s a slightly different
multiplicative kernel we can use:

[1/16, 1/8, 1/16]
[1/8, 1/4, 1/8]
[1/16, 1/8, 1/16]
And we can then produce different results by using our same basic
method as earlier. Here, we’re taking advantage of matrix multiplication
to keep more of our center pixel and less from the ones further away. At
a 3x3 size, it’s a bit difficult to see the difference between this and our
first example, but if you can imagine building larger versions of the
above matrix, this is the math that produces larger Gaussian blurs in
image editing programs such as Photoshop.

Combined 3x3 convolutions – Sobel filter example
For an even more advanced example of what can be done with convolutions, let's look at combining two of these kernel operations together to produce what is called the Sobel filter. Once again, you don't need to understand this 100%.
Our first kernel looks like this:

[1, 0, -1]
[2, 0, -2]
[1, 0, -1]

And our second kernel looks like this:

[1, 2, 1]
[0, 0, 0]
[-1, -2, -1]

And then we combine them together with our input image like so,
one after the other:

[A] x [B] = [C]

The result is interesting; what happens is that pixels that are similar
get multiplied to zero (e.g., black), but sets of pixels that have
significant differences get multiplied to infinity (e.g., white). So with a
couple of basic convolutional kernels we have produced an edge
detector! Let’s avoid going deeper down the rabbit hole of convolutions
for now. Just know that this is a deep, deep field and many things are
possible.
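
Reusing the `convolve` helper and the striped `image` from the blur sketch above, one common way to apply the two Sobel kernels and merge their responses looks like this; again, this is a hand-rolled illustration rather than the book's own code:

```
let sobelX: [[Float]] = [[1, 0, -1], [2, 0, -2], [1, 0, -1]]
let sobelY: [[Float]] = [[1, 2, 1], [0, 0, 0], [-1, -2, -1]]

let gx = convolve(image, sobelX)  // responds to horizontal intensity changes
let gy = convolve(image, sobelY)  // responds to vertical intensity changes

// Merge the two responses into one edge strength per pixel: smooth
// regions stay near zero, sharp transitions become large values.
let edges = zip(gx, gy).map { rowX, rowY in
    zip(rowX, rowY).map { x, y in (x * x + y * y).squareRoot() }
}
print(edges)  // large values along the edges of the vertical stripe
```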
3x3 striding
Very broadly, we're not actually going to be building our own convolutions. Instead, we're going to have the neural network learn them! For this, we really only need to focus on one key concept, which is the process of going over our image in these 3x3 blocks. This is called striding, and it's an extremely important concept to understand. Basically, the neural network will learn to make its own convolutions on the fly and will use them to better understand our input data, updating them slightly at each step to improve its results. Don't worry, it's a bit mind-bendy at first. Let's have the network learn some, and then we can look at how they work on a real-world example.

Padding
“Same” padding and “valid” padding are the two forms of padding you
will encounter with convolutions. We will be using the “same” padding
for our first few chapters, but “valid” is the default of the 2D
convolution operator in swift for tensorflow, and so you will need to
understand both.
Valid is perhaps easier to understand. Each stride advances until the
far edge of the convolution hits the edge of the input image and then
stops. This means that this convolutional type will by definition
produce a smaller output than the input image (except for the special
case of 1x1 filters). “Same” padding extends the edge of the input data
to continue working on the input image until the leading edge of the
stride hits the limits of the input image.
This means that “same” padding (when using a stride size of 1) will
produce an output image that is the same size as the input image. We’re
going to use this same padding to jump to some more complicated
models in the next few chapters, so focus on understanding that for
now.
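
A quick way to check your understanding of the two modes is to compute output sizes by hand. Here is a small sketch of the usual formulas for square inputs and kernels (the helper names are mine, not the library's):

```
// "Valid" padding: slide until the kernel's far edge hits the image
// edge, so we lose (kernel - 1) pixels along each dimension.
func validOutput(input: Int, kernel: Int, stride: Int = 1) -> Int {
    return (input - kernel) / stride + 1
}

// "Same" padding: the edges are extended, so with a stride of 1 the
// output is the same size as the input.
func sameOutput(input: Int, stride: Int = 1) -> Int {
    return (input + stride - 1) / stride
}

print(validOutput(input: 28, kernel: 3))  // 26: smaller than the input
print(sameOutput(input: 28))              // 28: same size as the input
```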

Maxpool
The other key concept you need to understand is maxpooling. All we're going to do is take each group of 4 input pixels, stepping across our image in strides of two, and convert it to a single output by selecting the largest value. For each region, we're simply going to find the largest pixel and make that our output.
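
Here is a hand-rolled sketch of that operation in plain Swift, a 2x2 maxpool with a stride of 2 over a made-up 4x4 input:

```
// 2x2 max pooling with a stride of 2: each non-overlapping 2x2 block
// of the input becomes the single largest value in that block.
func maxPool2x2(_ image: [[Float]]) -> [[Float]] {
    let rows = image.count / 2
    let cols = image[0].count / 2
    var output = [[Float]](repeating: [Float](repeating: 0, count: cols), count: rows)
    for r in 0..<rows {
        for c in 0..<cols {
            let top = max(image[2 * r][2 * c], image[2 * r][2 * c + 1])
            let bottom = max(image[2 * r + 1][2 * c], image[2 * r + 1][2 * c + 1])
            output[r][c] = max(top, bottom)
        }
    }
    return output
}

let input: [[Float]] = [
    [1, 3, 2, 4],
    [5, 6, 7, 8],
    [9, 2, 1, 0],
    [3, 4, 5, 6],
]
print(maxPool2x2(input))  // [[6.0, 8.0], [9.0, 6.0]]
```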

2D MNIST model
If we take these two concepts together and revisit the MNIST problem,
we can actually significantly improve our quality just by changing how
we’re modeling our data. We’re going to take our same 784, but we’ll
treat it as an actual image, so it’ll be 28x28 pixels now. We’ll run it
through two layers of 3x3 convolutions, a maxpool operation, and then
we’ll keep our same densely connected layers and output of ten
categories.

Code
Here’s what the actual swift code for this looks like. I’ve taken the
example from before and added a stack of convolutions on top. Then,
we take our input, run it through our convolutional layer, and then send
it to our same output and densely connected layers as before. This will
run a bit, and eventually we’ll get up to about 98% accuracy on the
MNIST dataset. So by simply changing how we modeled the input data
to use convolutions instead, we’re able to cut our error rate in half on
this toy problem. In addition, convolutions are much easier to evaluate
than our dense layers, so as our datasets start getting larger, we’ll still
be able to continue using this approach.

```
import Datasets
import TensorFlow

struct CNN: Layer {
    var conv1a = Conv2D<Float>(filterShape: (3, 3, 1, 32), padding: .same, activation: relu)
    var conv1b = Conv2D<Float>(filterShape: (3, 3, 32, 32), padding: .same, activation: relu)
    var pool1 = MaxPool2D<Float>(poolSize: (2, 2), strides: (2, 2))

    // The rest of this listing was cut off in this copy. The layers below
    // are reconstructed from the chapter's description: flatten, then the
    // same dense stack as the MLP. After one 2x2 maxpool, the 28x28x32
    // activations become 14x14x32 = 6272 inputs to the first dense layer.
    var flatten = Flatten<Float>()
    var inputLayer = Dense<Float>(inputSize: 14 * 14 * 32, outputSize: 512, activation: relu)
    var hiddenLayer = Dense<Float>(inputSize: 512, outputSize: 512, activation: relu)
    var outputLayer = Dense<Float>(inputSize: 512, outputSize: 10)

    @differentiable
    public func forward(_ input: Tensor<Float>) -> Tensor<Float> {
        let convolved = input.sequenced(through: conv1a, conv1b, pool1)
        return convolved.sequenced(through: flatten, inputLayer, hiddenLayer, outputLayer)
    }
}
```
Another Random Document on
Scribd Without Any Related Topics
CHAPTER VII

I N the autumn of 1912 the family went to Skernevizi, their Polish estate, in
order to indulge the Emperor’s love for big-game hunting. In the vast
forests surrounding the estate all kinds of game were preserved and the
sport of hunting there was said to be very exciting. During the war these
woods and all the game were destroyed by the Germans, but until after 1914
Skernevizi was a favorite retreat of the Emperor. I had returned to my house
in Tsarskoe Selo but I was not allowed long to remain there. A telegram from
the Empress conveyed the disquieting news that Alexei, in jumping into a
boat, had injured himself and was now in a serious condition. The child had
been removed from Skernevizi to Spala, a smaller Polish estate near Warsaw,
and to Warsaw I accordingly traveled. Here I was met by one of the Imperial
carriages and was driven to Spala. Driving for nearly an hour through deep
woods and over a heavy, sandy road I reached my destination, a small
wooden house, something like a country inn, in which the suite was lodged.
Two rooms had been set apart for me and my maid, and here I found Olga
and Tatiana waiting to help me get settled. Their mother, they said, was
expecting me, and without any loss of time I went with them to the palace.
I found the Empress greatly agitated. The boy was temporarily improved
but was still too delicate to be taken back to Tsarskoe Selo. Meanwhile the
family lived in one of the dampest, gloomiest palaces I have ever seen. It
was really a large wooden villa very badly planned as far as light and
sunshine were concerned. The large dining room on the ground floor was so
dark that the electric lights had to be kept on all day. Upstairs to the right of
a long corridor were the rooms of the Emperor and Empress, her sitting
room in bright English chintzes being one of the few cheerful spots in the
house. Here we usually spent our evenings. The bedrooms and dressing
rooms were too dark for comfort, but the Emperor’s study, also on the right
of the corridor, was fairly bright.
As long as the health of little Alexei continued fairly satisfactory the
Emperor and his suite went stag hunting daily in the forests of the estate.
Every evening after dinner the slain stags were brought to the front of the
palace and laid out for inspection on the grass. The huntsmen with their
flaring torches and winding horns standing over the day’s bag made, I was
told, a very picturesque spectacle. The Emperor and his suite and most of the
household used to enjoy going out after dinner to enjoy this fine sight. I
never went myself, having a foolish love of animals which prevents
enjoyment of the royal sport of hunting. I even failed to appreciate, as the
head of the estate, kind Count Velepolsky, thought I should, the many
trophies of the chase with which the corridors and apartments of the palace
were adorned.
What I did enjoy was the beautiful park which surrounded the palace, and
the rapid little river Pilitsa that flowed through it. There was one leafy path
through which I often walked in the mornings with the Emperor. This was
called the Road of Mushrooms because it ended in a wonderful mushroom
bench. The whole place was so remote and peaceful that I deeply
sympathized with their Majesties’ irritation that even there they could never
stir abroad without being haunted by the police guard.
Although Alexei’s illness was believed to have taken a favorable turn and
he was even beginning to walk a little about the house and gardens, I found
him pale and decidedly out of condition. He occasionally complained of
pain, but the doctors were unable to discover any actual injury. One day the
Empress took the child for a drive and before we had gone very far we saw
that indeed he was very ill. He cried out with pain in his back and stomach,
and the Empress, terribly frightened, gave the order to return to the palace.
That return drive stands out in my mind as an experience of horror. Every
movement of the carriage, every rough place in the road, caused the child the
most exquisite torture, and by the time we reached home he was almost
unconscious with pain. The next weeks were endless torment to the boy and
to all of us who had to listen to his constant cries of pain. For fully eleven
days these dreadful sounds filled the corridors outside his room, and those of
us who were obliged to approach had often to stop our ears with our hands in
order to go about our duties. During the entire time the Empress never
undressed, never went to bed, rarely even lay down for an hour’s rest. Hour
after hour she sat beside the bed where the half-conscious child lay huddled
on one side, his left leg drawn up so sharply that for nearly a year afterwards
he could not straighten it out. His face was absolutely bloodless, drawn and
seamed with suffering, while his almost expressionless eyes rolled back in
his head. Once when the Emperor came into the room, seeing his boy in this
agony and hearing his faint screams of pain, the poor father’s courage
completely gave way and he rushed, weeping bitterly, to his study. Both
parents believed the child dying, and Alexei himself, in one of his rare
moments of consciousness, said to his mother: “When I am dead build me a
little monument of stones in the wood.”
The family’s most trusted physicians, Dr. Rauchfuss and Professor
Fedoroff and his assistant Dr. Derevanko, were in charge of the case and
after the first consultations declared the Tsarevitch’s condition hopeless. The
hemorrhage of the stomach from which he was suffering seemed liable to
turn into an abscess which could at any moment prove fatal. We had two
terrible moments in which this complication threatened. One day at luncheon
a note was brought from the Empress to the Emperor who, pale but
collected, made a sign for the physicians to leave the table. Alexei, the
Empress had written, was suffering so terribly that she feared the worst was
about to happen. This crisis, however, was averted. On the second occasion,
on an evening after dinner when we were sitting very quietly in the
Empress’s boudoir, Princess Henry of Prussia, who had come to be with her
sister in her trouble, appeared in the doorway very white and agitated and
begged the members of the suite to retire as the child’s condition was
desperate. At eleven o’clock the Emperor and Empress entered the room,
despair written on their faces. Still the Empress declared that she could not
believe that God had abandoned them and she asked me to telegraph
Rasputine for his prayers. His reply came quickly. “The little one will not
die,” it said. “Do not allow the doctors to bother him too much.” As a matter
of fact the turning point came a few days later, the pain subsided, and the
boy lay wasted and utterly spent, but alive.
Curiously enough there was no church on this Polish estate, but during
the illness of the Tsarevitch a chapel was installed in a large green tent in the
garden. A new confessor, Father Alexander, celebrated mass and after the
first celebration he walked in solemn procession from the altar to the
sickroom bearing with him holy communion for the sick boy. The Emperor
and Empress were very much impressed with Father Alexander and from
that time on they retained him in their private chapel at Tsarskoe Selo. He
was a good man but not a brave one, for when the Revolution came, and the
Emperor and the Empress sent for him to come to them, he confessed
himself afraid to go. Poor man! His caution, after all, did not save him. He
was shot by the Bolsheviki a year or two afterwards, on what pretext I do not
know.
The convalescence of Alexei was slow and wearisome. His nurse, Marie
Vechniakoff, had grown so hysterical with fatigue that she had to be
relieved, while the Empress was so exhausted that she could hardly move
from room to room. The young Grand Duchesses were tireless in their
devotion to the poor invalid, as was also M. Gilliard, who read to him and
diverted him for hours on end. Gradually the distracted household assumed a
more normal aspect. The Emperor, in Cossack uniform, began once more to
entertain the officers of his Varsovie Lancers, commanded by a splendid
soldier, General Mannerheim, of whom the world has heard much. As
Alexei’s health continued to improve there was even a little shooting, and a
great deal of tennis which the girls, after their long confinement to the house,
greatly enjoyed. All of us began to be happy again, but one day the Emperor
called me into his study and showed me a telegram from his brother, Grand
Duke Michail, in which the latter announced his morganatic marriage to the
Countess Brassoff, of whom the Emperor strongly disapproved. It was not
the marriage itself that so strongly disturbed the Emperor, but that Michail
had solemnly given his word of honor that it would never take place. “He
broke his word—his word of honor,” the Emperor repeated again and again.
Another blow which the Emperor received at this time was the suicide of
Admiral Chagin, commandant of the Standert and one of the closest friends
of the family. The Admiral shot himself on account of an unhappy love
affair, and deeply as the Emperor mourned his death he was even more
indignant at the manner of it. Russians, I know, are inclined to morbidity,
and suicide with them is not an uncommon thing. But Nicholas II always
regarded it as an act of dishonor. “Running away from the field of battle,”
was his characterization of such an act, and when he heard of Chagin’s
suicide he gave way to a terrible mood of anger and grief. Speaking of both
Michail and Chagin he said bitterly: “How, in the midst of the boy’s illness
and all our trouble, how could they have done such things?” The poor
Emperor, to whom every failure of those he loved and trusted came as an
utterly unexpected blow, how near was his hour of complete and final
disillusionment of nearly all earthly loyalties.
We had a few weeks of peaceful enjoyment before leaving Spala that
autumn. The girls, bright and happy once more, rode every morning, the
crisp air and the exercise coloring their cheeks and raising their spirits high.
The Emperor tramped the woods, sometimes with me as his companion, and
on one of these outings we both had a narrow escape from drowning. The
Emperor took me for a row on the river which, as I have said, had a very
rapid current. Intent on keeping the boat well into the current, the Emperor
ran us into a small island, and for a few seconds escape from an ignominious
upset seemed impossible. I was thoroughly frightened, the Emperor not a
little embarrassed, and ardor for water sports was, for a time, rather lessened
in both of us.
On October 21 (Russian Calendar) we celebrated the anniversary of the Emperor’s accession to the throne with high mass and holy communion, and a few days later the doctors
decided that Alexei was well enough to be moved to Tsarskoe Selo. The
Imperial train was made ready and their Majesties decided that I was to
travel on it with the rest of the suite. This was, as a matter of fact, contrary to
strict etiquette, and the announcement created among the ladies in waiting
much consternation, not to say rancor. There is no question that being a
regularly appointed lady in waiting to royalty and having nothing to do when
a mere friend of the exalted one happens to be at hand is a bit irritating, so I
cannot really blame the Empress’s ladies for objecting to me as a traveling
companion. The Imperial train, now used, one hears, by the inner circle of
the Communists, was composed of a number of luxurious carriages, more
like a home than a railway train. In the carriage of the Emperor and Empress
the easy chairs and sofas were upholstered in bright chintz and there were
books, family photographs, and all sorts of familiar trinkets. The Emperor’s
study was in his favorite green leather, and adjoining their dressing rooms
was a large and perfectly equipped bathroom. In this carriage also were
rooms for the personal attendants of their Majesties. The Grand Duchesses
and their maids had a similar carriage, and Alexei’s carriage, which had
compartments for the maids of honor and myself, was furnished with every
imaginable comfort. The last carriage was the dining wagon with a small
anteroom where the inevitable zakouski, the Russian table of hors d’œuvres,
was served. At the long dining table the Emperor sat with his daughters on
either hand, while facing him were Count Fredericks and the ladies in
waiting. Throughout the journey of nearly two days the Empress was served
in her own room or beside the bed where Alexei lay, very weak, but bright
and cheerful once more.
This chapter may well close with one of the opening events of 1913, the
Jubilee of the Romanoffs, celebrating the three hundredth anniversary of
their reign. In February the Court moved from Tsarskoe Selo to the Winter
Palace in Petrograd, a place they disliked because of the vast gloominess of
the building and the fact that the only garden was a tiny space hardly large
enough for the children to play or to exercise in. On reaching Petrograd the
family drove directly across the Neva to Christ’s Chapel, the little church of
Peter the Great, where is, or was, preserved a miraculous picture of the
Christ, very old and highly revered. The public had not been notified that the
Imperial Family would first visit this chapel, but their presence quickly
became known and they drove back to the Winter Palace through excited,
but on the whole undemonstrative, masses of people, a typical Petrograd
crowd.
The actual celebration of the Jubilee began with a solemn service in the
Cathedral of Our Lady of Kazan, which everyone familiar with Petrograd
remembers as one of the most beautiful of Russian churches. The vast
building was packed to its utmost capacity, and that means a much larger
crowd than in ordinary churches, since in Russia the congregation stands or
kneels through the entire service. From my position I had a very good view
of both the Emperor and the Tsarevitch, and I was puzzled to see them raise
their heads and gaze long at the ceiling, but afterwards they told me that two
doves had appeared and had floated for several minutes over their heads. In
the religious exaltation of the hour this appeared to the Emperor a symbol
that the blessing of God, after three centuries, continued to rest on the House of Romanoff.

[Illustration: THE EMPEROR AND EMPRESS IN OLD SLAVONIC DRESS. 1913 JUBILEE.]

[Illustration: THE INVALID EMPRESS ON HER BALCONY AT PETERHOF.]

There followed a long series of functions at the palace, with deputations coming from all over
the Empire, the women appearing at receptions and dinners in the beautiful
national dress, which was also worn by the Empress and her daughters. The
Empress, for all her weariness, was regal in her richly flowing robes and
long-veiled, high kokoshnik, the Russian national headdress, set with
magnificent jewels. She also wore the wide-ribboned order of St. Andrew,
which it was her sole privilege to wear, and at the most formal of the state
dinners she wore the most splendid of all the crown jewels. The young
Grand Duchesses were simply but beautifully gowned on all occasions, and
they wore the order of Catherine the Great, red ribbons with blazing
diamond stars. The crowds were enormous in all the great state rooms, the
Imperial Family standing for hours while the multitudes filed past with
sweeping curtsies and low bows. So long and fatiguing were these
ceremonies that at the end the Empress was literally too exhausted to force a
smile. Poor little Alexei also, after being carried through the rooms and
obliged to acknowledge a thousand greetings, was taken back to his room in
a condition of utter exhaustion.
There were state performances at the theater and the opera, Glinka’s “Life
for the Tsar” being sung to the usual tumult of applause and adulation, but
for all that I felt that there was in the brilliant audience little real enthusiasm,
little real loyalty. I saw a cloud over the whole celebration in Petrograd, and
this impression, I am almost sure, was shared by the Empress. She told me
that she could never feel happy in Petrograd. Everything in the Winter
Palace reminded her of earlier years when she and her husband used to go
happily to the theater together and returning would have supper in their
dressing gowns before the fire talking over the events of the day and
evening. “I was so happy then,” she said plaintively, “so well and strong.
Now I am a wreck.”
Much as both she and the Emperor desired to shorten their stay in
Petrograd, they were obliged to remain several weeks after the close of the
official celebration because Tatiana, who unwisely had drunk the infected
water of the capital, fell ill of typhoid and could not for some time be
moved. With her lovely brown hair cut short, she finally went back with us to Tsarskoe Selo, where she made good progress back to health.
In the spring began the celebration of the Jubilee throughout the Empire.
The visit to the Volga, especially to Kostrama, the home of the first
Romanoff monarch, Michail Feodorovnitch, was a magnificent success, the
people actually wading waist deep in the river in order to get nearer the
Imperial boat. It was the same through all the surrounding governments,
crowds, cheers, acclamations, prayers, and great choruses singing the
national hymn, every evidence of love and loyalty. I particularly remember
when the cortège reached the town of Pereyaslovl, in the Vladimir
Government, because it was from there that my father’s family originated,
and some of his relatives took part in the day’s celebration. The Empress, to
my regret, was not present, being confined to her bed on the Imperial train,
ill and fatigued, yet under obligation to be ready for special ceremonies in
Moscow. It would need a more eloquent pen than mine adequately to
describe those days in Moscow, the Holy City of Russia. The weather was
perfect, and under the clear sunshine the floating flags and banners, the
flower-trimmed buildings, and the numberless decorations made up a
spectacle of unforgettable beauty. Leaving his car at some distance from the
Kremlin, the Emperor entered the great gate on foot, preceded by chanting
priests with waving censers and holy images. Behind the Emperor and his
suite came the Empress and Alexei in an open car through crowds that
pressed hard against the police lines, while overhead all the bells of Moscow
pealed welcome to the Sovereigns. Every day it was the same,
demonstrations of love and fealty which it seemed no time or circumstance
could ever alter.
CHAPTER VIII

NINETEEN-FOURTEEN, that year of fate for all the world, but more
than all for my poor country, began its course in Russia, as elsewhere, in
apparent peace and tranquillity. With us, as with other civilized people,
the tragedy of Sarajevo came as a thrill of horror and surmise. I do not know
exactly what we expected to follow that desperate act committed in a distant
province of Austria, but certainly not the cataclysm of a World War and the
ruin of three of the proudest empires of earth. Very shortly after the
assassination of the Austrian heir and his wife the Emperor had gone to
Kronstadt, headquarters of the Baltic fleet, to meet French and British
squadrons then on cruise in Russian waters.[2] From Kronstadt he proceeded
to Krasnoe, near Petrograd, the great central summer review grounds of the old
Russian Army where the usual military maneuvers were in progress.
Returning to Peterhof, the Emperor ordered a hasty departure to Finland
because, he said, the political horizon was darkening and he needed a few
days of rest and distraction. We sailed on July 6 (Russian Calendar) and had
a quiet cruise, the last one we were ever destined to enjoy. Not that we
intended it to be our last, for returning to Peterhof, from whence the
Emperor hurried again to the reviews, we left nearly all our luggage on the
yacht. The Empress, however, in one of her fits of melancholy, told me that
she felt that we would never again be together on the Standert.
The political skies were indeed darkening. The Serbian murders and the
unaccountably arrogant attitude of Austria grew in importance every
succeeding day, and for many hours every day the Emperor was closeted in
his study with Grand Duke Nicholas, Foreign Minister Sazonoff and other
Ministers, all of whom urged on the Emperor the imperative duty of standing
by Serbia. During the short intervals of the day when we saw the Emperor he
seemed half dazed by the momentous decision he was called upon to make.
A few days before mobilization I went to lunch at Krasnoe with a friend
whose husband was on the Russian General Staff. In the middle of luncheon
this officer, Count Nosstiz, burst into the room exclaiming: “Do you know
what the Emperor has done? Can you guess what they have made him do?
He has promoted the young men of the Military Academy to be officers, and
he has sent the regiments back to their casernes to await orders. All the
military attachés are telegraphing their Governments to ask what it means.
What can it mean except war?”
From my friend’s house I went almost at once back to Peterhof and
informed the Empress what I had heard. Her amazement was unbounded,
and over and over she repeated that she did not understand, that she could
not imagine under what influence the Emperor had acted. He was still at the
maneuvers, and although I remained late with the Empress I did not see him
that night. The days that followed were full of suspense and anxiety. I spent
most of my time playing tennis—very badly—with the girls, but from my
occasional contacts with the Empress I knew that she was arguing and
pleading against the war which apparently the Emperor felt to be inevitable.
In one short talk I had with him on the subject he seemed to find a certain
comfort in the thought that war always strengthened national feeling, and in his belief that Russia would emerge from a truly righteous war stronger and
better than ever. At this time a telegram arrived from Rasputine in Siberia,
which plainly irritated the Emperor. Rasputine strongly opposed the war, and
predicted that it would result in the destruction of the Empire. But the
Emperor refused to believe it and resented what was really an almost
unprecedented interference in affairs of state on the part of Rasputine.
I think I have spoken of the Emperor’s aversion to the telephone. Up to
this time none of his studies were ever fitted with telephones, but now he
had wires and instruments installed and spent a great deal of time in
conversations with Ministers and members of the military staff. Then came
the day of mobilization, the same kind of day of wild excitement, waving
street crowds, weeping women and children, heartrending scenes of parting,
that all the warring countries saw and ever will remember. After watching
hours of these dreadful scenes in the streets of Peterhof I went to my evening
duties with the Empress only to find that she had remained in absolute
ignorance of what had been taking place. Mobilization! It was not true, she
exclaimed. Certainly armies were moving, but only on the Austrian frontiers.
She hurried from the room and I heard her enter the Emperor’s study. For
half an hour the sound of their excited voices reached my ears. Returning,
the Empress dropped on her couch as one overcome by desperate tidings.
“War!” she murmured breathlessly. “And I knew nothing of it. This is the
end of everything.” I could say nothing. I understood as little as she the
incomprehensible silence of the Emperor at such an hour, and as always,
whatever hurt her hurt me. We sat in silence until eleven when, as usual, the
Emperor came in to tea, but he was distraught and gloomy and the tea hour
also passed in almost complete silence.
The whole world has read the telegrams sent to Nicholas II by ex-
Emperor William in those beginning days of the war. Their purport seemed
to be sincere and intimate, begging his old friend and relative to stop
mobilization, offering to meet the Emperor for a conference which yet might
keep the peace. Historians of the future will have to decide whether those
tenders were made in good faith or whether they were part of the sinister
diplomacy of that wicked war. Nicholas II did not believe in their good faith,
for he replied that he had no right to stop mobilization in Russia when
German mobilization was already a matter of fact and that at any hour his
frontiers might be crossed by German troops. After this interval the Emperor
seemed to be in better spirits. War had come indeed, but even war was better
than the threat and the uncertainty of the preceding weeks. The extreme
depression of the Empress, however, continued unrelieved. Up to the last
moment she hoped against hope, and when the German formal declaration of
war was announced she gave way to a perfect passion of weeping, repeating
to me through her tears: “This is the end of everything.” The state visit of
their Majesties to Petrograd soon after the declaration really seemed to
justify the Emperor’s belief that the war would arouse the national spirit, so
long latent, in the Russian people. Never again do I expect to behold such a
sight as the streets of Petrograd presented on that day. To say that the streets
were crowded, thronged, massed, does not half express it. I do not believe
that one single able-bodied person in the whole city remained at home
during the hours spent in the capital by the Sovereigns. The streets were
almost literally impassable, and the slow progress of the Imperial motor cars from quay to palace through that frenzied sea of people, cheering, singing the national hymn, calling down blessings on the Emperor, was something that will live forever in the memories of all who witnessed it. The
Imperial cortège was able, thanks to the police, to reach the Winter Palace at
last, but many of the suite were halted by the crowds at the entrance to the
great square in front of the palace and had to enter at a side door opening
from the small garden to the west.
Inside the palace the crowd was relatively as great as that on the outside.
Apparently every man and woman who had the right to appear at Court was
massed in the corridors, the staircases, and the state apartments. Slowly their
Majesties made their way to the great Salle de Nicholas, the largest hall in
the palace, and there for several hours they stood receiving the most
extraordinary tokens of homage from thousands of officials, ministers, and
members of the noblesse, both men and women. Te Deums were sung,
cheers and acclamations arose, and as the Emperor and Empress moved
slowly through the crowds men and women threw themselves on their knees,
kissing the hands of their Sovereigns with tears and fervent expressions of
loyalty. Standing with others of the suite in the Salle de Concert, I watched
this remarkable scene, and I listened to the historic speech of the Emperor
which ended with the assurance that never would there be an end to Russian
military effort until the last German was expelled from the beloved soil.
From the Salle de Nicholas the Sovereigns passed to a balcony overlooking
the great square. There with the Tsarevitch at their side they faced the wildly
exulting people who with one accord dropped to their knees with mute
gestures of love and obedience. Then as countless flags waved and dipped
there arose from the lips and hearts of that vast assembly the moving strains
of our great hymn: “God Save the Tsar.”
Thus in a passion of renewed love and patriotism began in Russia the war
of 1914. That same day the family returned to Peterhof, the Emperor almost
immediately leaving for the casernes to bid farewell to regiments leaving for
the front. As for the Empress, she became overnight a changed being. Every
bodily ill and weakness forgotten, she began at once an extensive plan for a
system of hospitals and sanitary trains for the dreadful roll of wounded
which she knew must begin with the first battle. Her projected chain of
hospitals and sanitary centers reached from Petrograd and Moscow to
Charkoff and Odessa in the extreme south of Russia. The center of her
personal activity was fixed in a large group of evacuation hospitals in and
around Tsarskoe Selo, and there, after bidding farewell to my only brother,
who immediately left for the southern front, I joined the Empress. Already
her plans were so far matured that ten sanitary trains, bearing her name and
the children’s, were in active service, and something like eighty-five
hospitals were open, or preparing to open, in Tsarskoe Selo, Peterhof,
Pavlovsk, Louga, Sablino, and neighboring towns. The Empress, her two
older daughters, and myself immediately enrolled under a competent woman
surgeon, Dr. Gedroiz, as student nurses, spending two hours of every
afternoon under theoretical instruction, and the whole of the morning
in ward work in the hospitals. For the benefit of those who imagine that the
work of a royal nurse is more or less in the nature of play I will describe the
average routine of one of those mornings in which I was privileged to assist
the Empress Alexandra Feodorovna and the Grand Duchesses Olga and
Tatiana, the two last-named girls of nineteen and seventeen. Please
remember that we were then only nurses in training. Arriving at the hospital
shortly after nine in the morning we went directly to the receiving wards
where the men were brought in after having first-aid treatment in the
trenches and field hospitals. They had traveled far and were usually
disgustingly dirty as well as blood-stained and suffering. Our hands
scrubbed in antiseptic solutions, we began the work of washing, cleaning,
and bandaging maimed bodies, mangled faces, blinded eyes, all the
indescribable mutilations of what is called civilized warfare. These we did
under the orders and the direction of trained nurses who had the skill to do
the things our lack of experience prevented us from doing. As we became
accustomed to the work, and as both the Empress and Tatiana had
extraordinary ability as nurses, we were given more important work. I speak
of the Empress and Tatiana especially because Olga within two months was
almost too exhausted and too unnerved to continue, and my abilities proved
to be more in the executive and organizing than in the nursing end of
hospital work. I have seen the Empress of Russia in the operating room of a
hospital holding ether cones, handling sterilized instruments, assisting in the
most difficult operations, taking from the hands of the busy surgeons
amputated legs and arms, removing bloody and even vermin-infected
dressings, enduring all the sights and smells and agonies of that most
dreadful of all places, a military hospital in the midst of war. She did her
work with the humility and the gentle tirelessness of one dedicated by God
to a life of ministration. Tatiana was almost as skillful and quite as devoted
as her mother, and complained only that on account of her youth she was
spared some of the more trying cases. The Empress was spared nothing, nor
did she wish to be. I think I never saw her happier than on the day when, at the end of our two months’ intensive training, she marched at the head of the
procession of nurses to receive the red cross and the diploma of a certificated
war nurse.
From that time on our days were literally devoted to toil. We rose at seven
in the morning and very often it was an hour or two after midnight before we
sought our beds. The Empress, after a morning in the operating room of one
hospital, snatched a hasty luncheon and spent the rest of the day in a round
of inspection of other hospitals. Every morning early I met her in the little
Church of Our Lady of Znamenie, where we went for prayers, driving
afterwards to the hospitals. On the days when the sanitary trains arrived with
their ghastly loads of wounded we often worked from nine until three
without stopping for food or rest. The Empress literally shirked nothing.
Sometimes when an unfortunate soldier was told by the surgeons that he
must suffer an amputation or undergo an operation which might be fatal, he
turned in his bed calling out her name in anguished appeal. “Tsaritsa! Stand
near me. Hold my hand that I may have courage.” Were the man an officer
or a simple peasant boy she always answered the appeal. With her arm under
his head she would speak words of comfort and encouragement, praying
with him while preparations for the operation were in progress, her own
hands assisting in the merciful work of anesthesia. The men idolized her,
watched for her coming, reached out bandaged hands to touch her as she
passed, smiling happily as she bent over their pillows. Even the dying smiled
as she knelt beside their beds murmuring last words of prayer and
consolation.
In the last days of November, 1914, the Empress left Tsarskoe Selo for an
informal inspection of hospitals within the radius of her especially chosen
district. Dressed in the gray uniform of a nursing sister, accompanied by her
older daughters, myself, and a small suite, she went to towns surrounding
Tsarskoe Selo and southward as far as Pskoff, staff headquarters, where the
younger Grand Duchess Marie Pavlovna was a hospital nurse. From there
she proceeded to Vilna, Kovno, and Grodno, in which city she met the
Emperor and with him went on to Dvinsk. The enthusiasm and affection
with which the Empress was met in all these places and in stations along the
route beggars description. A hundred incidents of the journey crowd my
memory, each one worth the telling had I space to include them in this
narrative. I remember, for example, the remarkable scene in the big fortress
of Kovno, where acres of hospital beds were assembled and where the tall
figure of the Empress, moving through those interminable aisles, was
greeted like the visit of an angel. I never recall that journey without
remembering the hospital at Grodno, where a gallant young officer lay dying
of his wounds. Hearing that the Empress was on her way to the hospital, he
rallied unexpectedly and declared to his nurses that he was determined to
live until she came. Sheer will power kept life in the man’s body until the
Empress arrived, and when, at the door of the hospital, she was told of his
dying wish to see her she hurried first to his bedside, kneeling beside it and
receiving his last smile, his last gasping words of greeting and farewell.
After one very fatiguing day our train passed a sanitary train of the Union
of Zemstvos moving south. The Empress, who should have been resting in
bed at the time, ordered her train stopped that she might visit, to the surprise
and delight of the doctors, this splendidly equipped rolling hospital. Another
surprise visit was to the estate of Prince Tichkevitch, whose family
supported on their own lands a very efficient hospital unit. It was impossible
to avoid noticing how in the towns visited by the Empress, dressed as a
simple sister of mercy, the love of the people was most manifest. In Grodno,
Dvinsk, and other cities where she appeared with the Emperor there was
plenty of enthusiasm, but on those occasions etiquette obliged her to lay
aside her uniform and to dress as the wife of the Emperor. Much better the
people loved her when she went among them in her nurse’s dress, their
devoted friend and sister. Etiquette forgotten, they crowded around her,
talked to her freely, claimed her as their own.
Soon after returning from this visit of inspection the Empress, accompanied by Grand Duchesses Olga and Tatiana, General Racine,
Commander of the Palace Guards, a maid of honor and myself, set off on a
journey to Moscow, where to my extreme sorrow and dismay I perceived for
the first time unmistakable evidences of a spreading intrigue against the
Imperial Family. At the station in Moscow the Empress was met by her
sister, the Grand Duchess Serge and the latter’s intimate friend and the
executive of her convent, Mme. Gardieve. Welcome from the people there
was none, as General Djounkovsky, Governor of Moscow, had announced,
without any authority whatsoever, that the Empress was in the city incognito
and did not wish to meet anyone. In consequence of this order we drove to
the Kremlin through almost empty streets. Nevertheless the Empress began
at once the inspection of hospitals, accompanied by General Racine and her
maid of honor, Baroness Boukshoevden, daughter of the Russian
Ambassador in Denmark. During our stay in Moscow I was not as constantly
with the Empress as usual, our rooms in the Kremlin being far apart.
However, General Odoevsky, the fine old Governor of the Kremlin, installed
a telephone between our rooms, and on her free evenings the Empress often
summoned me to sit with her in her dressing room, hung with light blue
draperies and looking out over the river and the ancient roofs of Moscow. I
lunched and dined with others of the suite in an old part of the immense
palace known as the Granovita Palata, and here occurred one night a
disagreeable scene in which General Racine, in the presence of the whole
company, administered a stinging rebuke to General Djounkovsky, Governor
of Moscow, for his responsibility for the cold welcome accorded her
Majesty. The Governor turned very pale but made no answer to the
accusation of General Racine. Already my mind was in a tumult of trouble,
more and more conscious of the atmosphere of intrigue, plots, and
conspiracies, the end of which I could not see. In the coldness of the Grand
Duchess Serge, in my childhood such a friend to me and to my family, in her chilly refusal to listen to her sister’s denial of preposterous tales of the political influence exerted by Rasputine, and in the general animosity towards
myself, I began dimly to realize that there was a plot to strike at her Majesty
through Rasputine and myself. There was absolutely nothing I could do, and
I had to watch with tearless grief the breach between the sisters grow wider
and deeper until their association was robbed of most of its old intimacy. I
knew well enough, or I was convinced that I knew, that the dismissed maid
of honor, Mlle. Tutcheff, was at the bottom of the whole affair, her family
being among the most prominent in Moscow. But I could say nothing, do
nothing.
With great relief we saw our train leave Moscow for a round of visits in
surrounding territory, and here again the enthusiasm with which the people
welcomed the Empress was unbounded. In the town of Toula, for example,
and a little farther on in Orel, the people were so tumultuous in their
greeting, they crowded so closely around their adored Empress, that our
party could scarcely make our way to church and hospital. Once, following
the Empress out of a church, carrying in my hands an ikon which had been
presented to her, I was fairly overthrown by the crowding multitude and fell
halfway down the high flight of steps before friendly hands could get me to
my feet. I did not mind this, being only too rejoiced at evidences of love and
devotion which the simple people of Russia felt for their Empress. In one
town where there were no modern carriages she was dragged along in an old
coach of state such as a medieval bishop might have used, the coach being
quite covered with flowers and branches. In the town of Charkoff hundreds
of students met the train bearing aloft portraits of her Majesty. In the small
town of Belgorod, where the Empress wished to stop in order to visit a very
sacred monastery, I shall never forget the joy with which the sleepy
ischvostiks hurried through the darkness of the night to drive us the three or
four versts from the railway to the monastery. Nor can I forget the arrival at
the monastery, the sudden flare of lights as the monks hastened out to meet
and greet their Sovereign Empress. These were the people, the plain people
of Russia, and the difference between them and the plotting officials we had
left behind in Moscow was a sad and a terrible contrast.
On December 6 (Russian Calendar), the birthday of the Emperor, we met
his train at Voronezh, where our parties joined in visits to Tambov, Riasan,
and other towns where the people gave their Majesties wonderful greetings.
In Tambov the Emperor and Empress visited and had tea with a charming
woman of advanced age, Mme. Alexandra Narishkin, friend of Alexander III
and of many distinguished men of her time. Mme. Narishkin, horrible to
relate, was afterwards murdered by the Bolsheviki, neither her liberal mind
nor her long services to her country, and especially to her humble friends in
Tambov, sparing her from the blood lust of the destroyers of Russia.
The journey of their Majesties terminated at Moscow, where the younger
children of the family awaited them. I can still see the slim, erect figure of
Alexei standing at salute on the station platform, and the rosy, eager faces of
Marie and Anastasie welcoming their parents after their long separation. The
united family drove to the Kremlin, this time not quite so inhospitably
received. In the days following, the Moscow hospitals and military organizations were visited in turn, and we included in these visits the out-of-town activities of the Moscow Zemstvo (county council), canteens, etc. In
one of these centers our host was Prince Lvoff, afterwards active in
demanding the abdication of the Tsar, and I remember with what deference
he received their Majesties, and the especial attention he paid to the
Tsarevitch, whose autograph he begged for the visitors’ book. Before we left
Moscow the Empress paid two visits, one to the old Countess Apraxin, sister
of the former first lady in waiting, Princess Galatzine, and the other, with the Emperor,
to the Metropolitan Makari, a good man, but mercilessly persecuted during
the Revolution.
There was one small but significant incident which happened after our
return to Tsarskoe Selo, near the end of the year 1914. It failed of its
intended effect, but had it not failed it might have had a far-reaching
influence on world events at that time. Looking back on it now, I sometimes
wonder exactly what lay back of the plot, and who was responsible for its
inception. One evening late in the year I received a visit from two war nurses
lately released from a German prison where they had been taken with a
portion of a captured Russian regiment. In much perturbation of spirit these
nurses told me of a third nurse who had been captured and imprisoned with
them. This woman they had come to distrust as she had been accorded many
special favors by the Germans. She had been given good food and even
champagne, and when the nurses were released she alone was conveyed to
the frontier in a motor car, the others going on foot. While in prison this
woman had boasted that she expected to be received by the Emperor, to
whom she proposed to present the flag of the captured regiment. The other
nurses declared that in their opinion his Majesty should be warned of the
woman’s dubious character.
Hardly knowing what to think of such an extraordinary story, I thought it
my duty to lay the matter before General Voyeikoff, Chief Commander of
the Palace Guards, and when I learned from him that the Emperor had
consented to receive the nurse I begged that the woman be investigated
before being allowed to enter the palace. The Emperor showed some
vexation, but he consented. When General Voyeikoff examined the woman
she made a display of great frankness, handing him a revolver which she
said it had been necessary for her to carry at the front. General Voyeikoff,
thinking it strange that the weapon had not been taken away from her by the
Germans, immediately ordered a search of her effects. In the handbag which
she would certainly have carried with her to the palace were found two more
loaded revolvers. The woman was, of course, arrested, and although I cannot
explain why, her arrest caused great indignation among certain members of
the aristocracy who previously had received her at their homes. The whole
onus of her arrest was placed on me, although the Emperor declared his
belief that she was a German spy sent to assassinate him. That she was a spy
I have never doubted, but in my own mind I have never even tried to guess
from whence she came.
CHAPTER IX

A VERY few days after the events chronicled in the last chapter I became
the victim of a railroad accident which brought me to the threshold of
death and for many months made it impossible for me to follow the
events of the war, or the growing conspiracy against the Sovereigns. At a
little past five o’clock of the afternoon of January 2, 1915, I took the train at
Tsarskoe for a short visit to my parents in Petrograd. With me in my carriage
was Mme. Shiff, a sister of a distinguished officer of Cuirassiers. We sat
talking the usual commonplaces of travel when suddenly, without a
moment’s notice, there came a tremendous shock and a deafening crash, and
I felt myself thrown violently forward, my head towards the roof of the
carriage, and both legs held as in a vise in the coils of the steam-heating
apparatus. The overturned carriage lurched and broke in two like an eggshell
and I felt the bones of my left leg snap sharply. So intense was the pain that I
momentarily lost consciousness. Too soon my senses returned to me and I
found myself firmly wedged in the wreckage of wood and iron, a great bar
of steel crushing my face, and my mouth so choked with blood that I could
not utter a sound. All I could do in my agony was silently to pray that God
would give me the relief of a quick death, for I could not believe that any
human being could endure such pain and live.
After what seemed to me an interminable length of time I felt the pressure
on my face removed and a kind voice asked: “Who lies here?” As I managed
to breathe my name the rescuers exclaimed in astonishment and alarm, and
immediately began to endeavor to extricate me from my agonizing position.
By means of ropes passed under my arms and using great care and
gentleness they ultimately got me free and laid me on the grass. In a
moment’s flash I recognized one as a Cossack of the Emperor’s special
guard, an excellent man named Lichatchieff, and the other as a soldier of the
railway battalion. Then I fainted. Ripping loose one of the doors of the
railway carriage, the men placed me on it and carried me to a near-by hut
already crowded with wounded and dying. Regaining consciousness for a
moment, I begged in whispers that Lichatchieff would telephone my parents
in Petrograd and their Majesties at the palace. This the good fellow did
without delay, and he also brought to my corner one of the surgeons
summoned to the wreck. The man gave me a rapid examination and said
briefly: “Do not disturb her. She is dying.” He left to attend to more hopeful
cases, but the faithful soldiers still knelt beside me, straightening my crushed
and broken legs and wiping the blood from my lips. In about two hours
another doctor, this time the surgeon Gedroiz, under whom the Empress, her
daughters, and myself had taken our nurses’ training, approached the corner
where I lay. I looked with a kind of terror into the face of this woman, for I
knew her to be no friend of mine. Simply giving my wounded head a
superficial examination she said carelessly that I was a hopeless case, and
left me without the slightest attempt to soothe my pain. Not until ten o’clock
that night, four hours after the collision which had wrecked two trains, did
any help reach me. At that hour arrived General Racine from the palace with
orders from their Majesties to do everything possible in my behalf. At his
imperative commands I was again placed on a stretcher and carried to a
relief train made up of cattle cars. At that moment my poor father and mother arrived from Petrograd, and the last things I remember were their sobs and a
teaspoonful of brandy mercifully poured down my throat.
At the end of the journey to Tsarskoe Selo I dimly recognized the
Empress and the four Grand Duchesses who had come to the station to meet
the train. Their faces were full of sympathy and grief, and as they bent over
me I found strength to whisper to them: “I am dying.” I believed it because
the doctors had said so, and because my pain was so great. Then came the
ordeal of being lifted into the ambulance and the half-consciousness that the
Empress was there too, holding my head on her knees and begging me to
have courage. After that came an interval of darkness out of which I awoke
in bed and almost free from pain. The Empress who, with my parents,
remained near me, asked me if I would like to see the Emperor. Of course I
replied that I would, and when he came I pressed the hand he gave me. Dr.
Gedroiz, who was in charge of the ward, told everyone coldly to take leave
of me as I could not possibly live until morning. “Is it so hopeless?” asked
the Emperor. “She still has some strength in her hand.”
Later on, I do not know exactly when, I opened my eyes quite clearly, and
saw standing beside my bed the tall, gaunt form of Rasputine. He looked at
me fixedly and said in a calm voice: “She will live, but will always be a
cripple.” A prediction which was literally fulfilled, for to this day I can walk
only slowly and with the aid of a stout stick. I have been told that Rasputine
recalled me from unconsciousness, but of his words I know only what I have
recorded.
The next morning I was operated on and for the six weeks following I
suppose I suffered as greatly as one can and live. My left leg, which had sustained a double fracture, troubled me less than my back and my right leg, which had been horribly wrenched and lacerated. My head wounds were also
intensely painful and for a time I suffered from inflammation of the brain.
My parents, the Empress, and the children came every day to see me, but
despite their presence the neglect and unkindness of Dr. Gedroiz continued.
The suggestion of the Empress that her trusted physician, Dr. Federoff, be
brought into consultation was rudely repulsed by this woman, of whom I
may finally say that she is now in high favor with the Bolsheviki whose
ranks she joined in the autumn of 1917. Waited upon by none but the most
inexperienced nurses, I do not know what might have become of me had not
my mother brought to the hospital an old family nurse whom she absolutely
insisted should take charge of me. Things went a little better after this, but
happy was I when at the end of the sixth week, against the will of Dr.
Gedroiz, I left that wretched hospital and was removed to my own home.
There in the peace and security of my comfortable bedroom I enjoyed for the
first time since my accident quiet and refreshing sleep.
It seems strange that the hostile and envious Court circle had deeply
resented the daily visits of the Emperor and Empress to my bedside. To
placate the gossipers the Emperor, before visiting me, used to make the
rounds of all the wards. In spite of it all I had many visitors and many daily
inquiries from the Empress Dowager and others. Very soon after my arrival
home I was examined by skillful surgeons, among them Drs. Federoff and
Gagentorn, who pronounced my crushed right leg to be in a very bad
condition and placed it in a plaster cast, where it remained for two months.
The Empress visited me daily, but the Emperor I seldom saw because, as I
learned indirectly, the War was going very badly on the Russian front, and
the Emperor was almost constantly with the armies. In the last week before
Lent he came to my bedside with the Empress, in accordance with an old
Russian custom, before confession, to beg my forgiveness for possible
wrongs done me during the year past. Their pious humility and also the
white and careworn face of the Emperor filled me with emotion which later
events served only to increase, for very momentous and trying hours were
even then crowding the destiny of Nicholas II, Tsar of all the Russias.
A soldier of the sanitary corps, a man named Jouk, had been assigned to
duty at my house, and as soon as I was able to leave my bed he took me
daily in a wheeled chair to church, and to the palace. This was the summer
of 1915, a time of great tribulation for the Russian Army, as every student of
the World War is aware. Grand Duke Nicholai Nicholaievitch was pursuing a
policy which rightly disturbed the Emperor, who constantly complained that
the commander in chief of his armies sent the men forward without proper
ammunition, without artillery support, and with no adequate preparations for
safe retreat. Disaster after disaster confirmed the Emperor’s fears. Fortress
after fortress fell to the Germans. Kovno fell. Novogeorgiesk fell, and finally
Warsaw itself fell. It was a terrible day when the Emperor, white and
trembling, brought this news to the Empress as we sat at tea on her balcony
in the warm autumn air. The Emperor was fairly overcome with grief and
humiliation as he finished his tale. “It cannot go on any longer like this,” he
exclaimed bitterly, and then he went on to declare that in spite of ministerial
opposition he was determined to take personal command of the army
himself. Only that day Krivosheim, Minister of Agriculture, had addressed
him on the impossible condition of Russian internal affairs. Nicholai
Nicholaievitch, not content with military supremacy, had assumed almost
complete authority over all the business of the Empire. There were in fact
two governments in Russia, orders being constantly issued from military
headquarters without the knowledge, much less the consent, of the Emperor.
Very soon after the fall of Warsaw it became clear to the Emperor that if
he were to retain any dignity whatever he would have to depose Nicholai
Nicholaievitch, and I wish here to state, without any reservation whatever,
that this decision was reached by the Emperor without advice from
Rasputine, myself, or any other person. Even the Empress, although she
approved her husband’s resolution, had no part in forming it. M. Gilliard has
written that the Emperor was forced to his action by bad advisers, especially
the Empress and Rasputine, but in this he is absolutely mistaken. M. Gilliard
writes that the Emperor was told that Grand Duke Nicholai Nicholaievitch
was plotting to confine his Sovereign in a monastery. I do not believe for a
moment that Rasputine ever made such a statement, but he did, in my
presence, warn the Emperor to watch Nicholai Nicholaievitch and his wife
who, he alleged, were at their old practices of table-tipping and spiritism,
which he thought to be a highly dangerous way to conduct a war against the
Germans. As for me, I repeat that never once did I say or do anything to
influence the Emperor in state affairs. I wish I could here reproduce a letter
written to my father by the Emperor in which all the reasons for taking the
step he did were explained. The letter, alas! was taken from me by the
Bolsheviki after my father’s death, and I suppose was destroyed.
On the evening when the Emperor met his ministers to announce his great
decision I dined at the palace, and I was deeply impressed with the firmness
of the Emperor’s decision not to be overborne by arguments or vain fears on
the part of timid statesmen. As he arose to go to the council chamber the
Emperor begged us to pray for him that his resolution should not falter. “You
do not know how hard it has been for me to refrain from taking an active
part in the command of my beloved army,” he said at parting. Overcome and
speechless, I pressed into his hand a tiny ikon which I had always worn
around my neck, and during the long council which followed the Empress
and I prayed fervently for the Emperor and for our distracted country.
As the time passed the Empress’s anxiety grew so great that, throwing a
cloak around her shoulders and beckoning me to follow, she went out on the
balcony, one end of which gave on the council room. Through the lace of the
window curtains we could see the Emperor sitting very upright, surrounded
by his ministers, one of whom was on his feet speaking earnestly. Our eleven
o’clock tea was served long before the Emperor, entirely exhausted, returned
from the conference. Throwing himself in an armchair, he stretched himself
out like a man spent after extreme exertion, and I could see that his brow and
hands were wet with perspiration.
“They did not move me,” he said in a low, tense voice. “I listened to all
their long, dull speeches, and when all had finished I said: ‘Gentlemen, in
two days from now I leave for the Stavka.’ ” As he repeated the words his
face lightened, his shoulders straightened, and he appeared like a man whose
strength was suddenly renewed.
Yet one more struggle was before him. The Empress Dowager, whom the
Emperor visited immediately after the ministerial conference, was by this
time thoroughly imbued with the German-spy mania in which the Empress
and Rasputine, not to mention myself, were involved. She believed the
whole preposterous tissue of lies which had been built up and with all her
might she struggled against the Emperor’s decision to assume supreme
command of the army. For over two hours a painful scene was enacted in the
Empress Dowager’s gardens, he trying to show her that utter disaster
threatened the army and the Empire under existing conditions, and she
repeating over and over again the wicked slanders of German plots which
she insisted that he was furthering. In the end the Emperor left, terribly
shaken, but with his resolution as strong as ever.
Before leaving for staff headquarters the Emperor and his family took
communion together at the Feodorovsky Cathedral and at their last meal
together he showed himself calm and collected as he had not been for some
time; in fact, not since the beginning of the last disastrous campaign. From
headquarters the Emperor wrote full accounts of the scenes which took place
when he assumed personal command, and of the furious anger, not only of
the deposed Nicholai Nicholaievitch but of all his staff, “Every one of
whom,” wrote the Emperor, “has the ambition himself to govern Russia.”
I am not attempting to write a military history of those years, and I am
quite aware of the fact that most published accounts of the Russian Army
represent Nicholai Nicholaievitch as the devoted friend of the Allies and the
Emperor as the pliant tool of German influences. It is undeniable, however,
that almost as soon as Nicholai Nicholaievitch had been sent to the Caucasus
and the Emperor took command of the Western Army a marked
improvement in the general morale became apparent. Retreat at various
points was stopped, the whole front strengthened, and a new spirit of loyalty
to the Empire was manifest.
I wish to interpolate here, in connection with the Emperor’s personal
command of the army, a word on the immense service he rendered it at the
beginning of the War in suppressing the manufacture and sale of vodka, the
curse of the Russian peasantry. The Emperor did this entirely on his own
initiative, without advice from his ministers or the Grand Dukes. The
Emperor said at the time: “At least by this I will be remembered,” and he
was, because the condition of the peasants, the town workers, and of course
the army became at once immeasurably better. In the midst of war-time
privations the savings-banks accounts of the people increased enormously,
and in the army there was none of the hideous debauchery which disgraced
Russia in the Russo-Japanese War. As an eminent French correspondent long
afterwards wrote: “It is to the dethroned Emperor Nicholas that we must
accord the honor of having effected the greatest of all internal reforms in
war-time Russia, the suppression of alcoholism.”
In October the Emperor came to Tsarskoe Selo for a brief visit, and on his
return he took with him to the Stavka the young Tsarevitch. This was the first
time he had ever separated the boy from his mother, and the Empress was
never happy except in the few minutes each day when she was reading the
child’s daily letter. At nine o’clock at night she went up to his bedroom
exactly as though he were there and she was listening to his evening prayers.
By day the Empress continued her tireless work in the hospitals from which,
by reason of my accident, I had long been excluded. However, at this time, I
received from the railroad as compensation for my injuries the considerable
sum of eighty thousand rubles, and with the money I established a hospital
for convalescent soldiers in which maimed and wounded men received
training in various useful trades. This, it is needless to say, became a great
source of happiness to me, since I knew as well as the soldiers what it meant
to be crippled and helpless. From the first my hospital training school was a
most gratifying success, and my personal interest in it never ceased until the
Revolution, after which all my efforts at usefulness and service ended in
imprisonment and persecution.
Neither this action of mine, patriotic though it must have appeared, nor any amount of devotion of the Empress to the wounded, sufficed to check the
rapidly growing propaganda which sought to convict the Imperial Family
and all its friends of being German spies. The fact that in England the
Empress’s brother-in-law, Prince Louis of Battenberg, German-born but a
loyal Briton, was forced to resign his command in the British Navy was used
with effect against the Empress Alexandra Feodorovna. She knew and
resented keenly this insane delusion, and she did everything in her power to
overcome it. I remember a day when the Empress received a letter from her
brother Ernest, Grand Duke of Hesse, in which he implored her to do
something to improve the barbarous conditions of German prisoners in
Russia. With streaming tears the Empress owned herself powerless to do
anything at all in behalf of the unhappy captives. She had organized a
committee for the relief of Russian prisoners in Germany, but this had been
fiercely attacked, especially in the columns of Novy Vremya, an influential
organ of the Constitutional Democratic Party. In this newspaper and in
general society the Empress’s committee was accused of being a mere
camouflage gotten up to shield her real purpose of helping the Germans.
Against such attacks the Empress had no defense. Her secretary, Count
Rostovseff, indeed tried to refute the story concerning the Empress’s prison-
camp committee, but the editors of Novy Vremya insolently refused to
publish his letter of explanation.
The German-spy mania was extended from the palace to almost every
Russian who had the misfortune to possess a name that sounded at all German.
