(Ebook) Computational Analysis and Deep Learning for Medical Care: Principles, Methods, and Applications, edited by Amit Kumar Tyagi. ISBN 9781119785729, 1119785723.
Table of Contents
Cover
Title Page
Copyright
Preface
Part 1: Deep Learning and Its Models
1 CNN: A Review of Models, Application of IVD Segmentation
1.1 Introduction
1.2 Various CNN Models
1.3 Application of CNN to IVD Detection
1.4 Comparison With State-of-the-Art Segmentation
Approaches for Spine T2W Images
1.5 Conclusion
References
2 Location-Aware Keyword Query Suggestion Techniques
With Artificial Intelligence Perspective
2.1 Introduction
2.2 Related Work
2.3 Artificial Intelligence Perspective
2.4 Architecture
2.5 Conclusion
References
3 Identification of a Suitable Transfer Learning Architecture
for Classification: A Case Study with Liver Tumors
3.1 Introduction
3.2 Related Works
3.3 Convolutional Neural Networks
3.4 Transfer Learning
3.5 System Model
3.6 Results and Discussions
3.7 Conclusion
References
4 Optimization and Deep Learning-Based Content Retrieval,
Indexing, and Metric Learning Approach for Medical Images
4.1 Introduction
4.2 Related Works
4.3 Proposed Method
4.4 Results and Discussion
4.5 Conclusion
References
Part 2: Applications of Deep Learning
5 Deep Learning for Clinical and Health Informatics
5.1 Introduction
5.2 Related Work
5.3 Motivation
5.4 Scope of the Work in Past, Present, and Future
5.5 Deep Learning Tools, Methods Available for Clinical,
and Health Informatics
5.6 Deep Learning: Not-So-Near Future in Biomedical
Imaging
5.7 Challenges Faced Toward Deep Learning Using in
Biomedical Imaging
5.8 Open Research Issues and Future Research Directions
in Biomedical Imaging (Healthcare Informatics)
5.9 Conclusion
References
6 Biomedical Image Segmentation by Deep Learning Methods
6.1 Introduction
6.2 Overview of Deep Learning Algorithms
6.3 Other Deep Learning Architecture
6.4 Biomedical Image Segmentation
6.5 Conclusion
References
7 Multi-Lingual Handwritten Character Recognition Using
Deep Learning
7.1 Introduction
7.2 Related Works
7.3 Materials and Methods
7.4 Experiments and Results
7.5 Conclusion
References
8 Disease Detection Platform Using Image Processing Through
OpenCV
8.1 Introduction
8.2 Problem Statement
8.3 Conclusion
8.4 Summary
References
9 Computer-Aided Diagnosis of Liver Fibrosis in Hepatitis
Patients Using Convolutional Neural Network
9.1 Introduction
9.2 Overview of System
9.3 Methodology
9.4 Performance and Analysis
9.5 Experimental Results
9.6 Conclusion and Future Scope
References
Part 3: Future Deep Learning Models
10 Lung Cancer Prediction in Deep Learning Perspective
10.1 Introduction
10.2 Machine Learning and Its Application
10.3 Related Work
10.4 Why Deep Learning on Top of Machine Learning?
10.5 How is Deep Learning Used for Prediction of Lung Cancer?
10.6 Conclusion
References
11 Lesion Detection and Classification for Breast Cancer
Diagnosis Based on Deep CNNs from Digital Mammographic
Data
11.1 Introduction
11.2 Background
11.3 Methods
11.4 Application of Deep CNN for Mammography
11.5 System Model and Results
11.6 Research Challenges and Discussion on Future
Directions
11.7 Conclusion
References
12 Health Prediction Analytics Using Deep Learning Methods
and Applications
12.1 Introduction
12.2 Background
12.3 Predictive Analytics
12.4 Deep Learning Predictive Analysis Applications
12.5 Discussion
12.6 Conclusion
References
13 Ambient-Assisted Living of Disabled Elderly in an
Intelligent Home Using Behavior Prediction—A Reliable Deep
Learning Prediction System
13.1 Introduction
13.2 Activities of Daily Living and Behavior Analysis
13.3 Intelligent Home Architecture
13.4 Methodology
13.5 Senior Analytics Care Model
13.6 Results and Discussions
13.7 Conclusion
Nomenclature
References
14 Early Diagnosis Tool for Alzheimer’s Disease Using 3D
Slicer
14.1 Introduction
14.2 Related Work
14.3 Existing System
14.4 Proposed System
14.5 Results and Discussion
14.6 Conclusion
References
Part 4: Deep Learning - Importance and Challenges for Other
Sectors
15 Deep Learning for Medical Healthcare: Issues, Challenges,
and Opportunities
15.1 Introduction
15.2 Related Work
15.3 Development of Personalized Medicine Using Deep
Learning: A New Revolution in Healthcare Industry
15.4 Deep Learning Applications in Precision Medicine
15.5 Deep Learning for Medical Imaging
15.6 Drug Discovery and Development: A Promise
Fulfilled by Deep Learning Technology
15.7 Application Areas of Deep Learning in Healthcare
15.8 Privacy Issues Arising With the Usage of Deep
Learning in Healthcare
15.9 Challenges and Opportunities in Healthcare Using
Deep Learning
15.10 Conclusion and Future Scope
References
16 A Perspective Analysis of Regularization and Optimization
Techniques in Machine Learning
16.1 Introduction
16.2 Regularization in Machine Learning
16.3 Convexity Principles
16.4 Conclusion and Discussion
References
17 Deep Learning-Based Prediction Techniques for Medical
Care: Opportunities and Challenges
17.1 Introduction
17.2 Machine Learning and Deep Learning Framework
17.3 Challenges and Opportunities
17.4 Clinical Databases—Electronic Health Records
17.5 Data Analytics Models—Classifiers and Clusters
17.6 Deep Learning Approaches and Association
Predictions
17.7 Conclusion
17.8 Applications
References
18 Machine Learning and Deep Learning: Open Issues and
Future Research Directions for the Next 10 Years
18.1 Introduction
18.2 Evolution of Machine Learning and Deep Learning
18.3 The Forefront of Machine Learning Technology
18.4 The Challenges Facing Machine Learning and Deep
Learning
18.5 Possibilities With Machine Learning and Deep
Learning
18.6 Potential Limitations of Machine Learning and Deep
Learning
18.7 Conclusion
Acknowledgement
Contribution/Disclosure
References
Index
List of Illustrations
Chapter 1
Figure 1.1 Architecture of LeNet-5.
Figure 1.2 Architecture of AlexNet.
Figure 1.3 Architecture of ZFNet.
Figure 1.4 Architecture of VGG-16.
Figure 1.5 Inception module.
Figure 1.6 Architecture of GoogleNet.
Figure 1.7 (a) A residual block.
Figure 1.8 Architecture of ResNeXt.
Figure 1.9 Architecture of SE-ResNet.
Figure 1.10 Architecture of DenseNet.
Figure 1.11 Architecture of MobileNets.
Chapter 2
Figure 2.1 General architecture of a search engine.
Figure 2.2 The increased mobile users.
Figure 2.3 AI-powered location-based system.
Figure 2.4 Architecture diagram for querying.
Chapter 3
Figure 3.1 Phases of CECT images (1: normal liver; 2: tumor
within liver; 3: sto...
Figure 3.2 Architecture of convolutional neural network.
Figure 3.3 AlexNet architecture.
Figure 3.4 GoogLeNet architecture.
Figure 3.5 Residual learning—building block.
Figure 3.6 Architecture of ResNet-18.
Figure 3.7 System model for case study on liver tumor
diagnosis.
Figure 3.8 Output of bidirectional region growing
segmentation algorithm: (a) in...
Figure 3.9 HA Phase Liver CT images: (a) normal liver; (b)
HCC; (c) hemangioma; ...
Figure 3.10 Training progress for AlexNet.
Figure 3.11 Training progress for GoogLeNet.
Figure 3.12 Training progress for ResNet-18.
Figure 3.13 Training progress for ResNet-50.
Chapter 4
Figure 4.1 Proposed system for image retrieval.
Figure 4.2 Schematic of the deep convolutional neural
networks.
Figure 4.3 Proposed feature extraction system.
Figure 4.4 Proposed model for the localization of the
abnormalities.
Figure 4.5 Graph for the retrieval performance of the metric
learning for VGG19.
Figure 4.6 PR values for state of art ConvNet model for CT
images.
Figure 4.7 PR values for state of art CNN model for CT images.
Figure 4.8 Proposed system—PR values for the CT images.
Figure 4.9 PR values for proposed content-based image
retrieval.
Figure 4.10 Graph for loss function of proposed deep
regression networks for tra...
Figure 4.11 Graph for loss function of proposed deep
regression networks for val...
Chapter 5
Figure 5.1 Different informatics in healthcare [28].
Chapter 6
Figure 6.1 CT image reconstruction (past, present, and future)
[3].
Figure 6.2 (a) Classic machine learning algorithm, (b) Deep
learning algorithm.
Figure 6.3 Traditional neural network.
Figure 6.4 Convolutional Neural Network.
Figure 6.5 Psoriasis images [2].
Figure 6.6 Restricted Boltzmann Machine.
Figure 6.7 Autoencoder architecture with vector and image
inputs [1].
Figure 6.8 Image of chest x-ray [60].
Figure 6.9 Regular thoracic disease identified in chest x-rays
[23].
Figure 6.10 MRI of human brain [4].
Chapter 7
Figure 7.1 Architecture of the proposed approach.
Figure 7.2 Sample Math dataset (including English
characters).
Figure 7.3 Sample Bangla dataset (including Bangla numeric).
Figure 7.4 Sample Devanagari dataset (including Hindi
numeric).
Figure 7.5 Dataset distribution for English dataset.
Figure 7.6 Dataset distribution for Hindi dataset.
Figure 7.7 Dataset distribution for Bangla dataset.
Figure 7.8 Dataset distribution for Math Symbol dataset.
Figure 7.9 Dataset distribution.
Figure 7.10 Precision-recall curve on English dataset.
Figure 7.11 ROC curve on English dataset.
Figure 7.12 Precision-recall curve on Hindi dataset.
Figure 7.13 ROC curve on Hindi dataset.
Figure 7.14 Precision-recall curve on Bangla dataset.
Figure 7.15 ROC curve on Bangla dataset.
Figure 7.16 Precision-recall curve on Math Symbol dataset.
Figure 7.17 ROC curve on Math symbol dataset.
Figure 7.18 Precision-recall curve of the proposed model.
Figure 7.19 ROC curve of the proposed model.
Chapter 8
Figure 8.1 Eye image dissection [34].
Figure 8.2 Cataract algorithm [10].
Figure 8.3 Pre-processing algorithm [48].
Figure 8.4 Pre-processing analysis [39].
Figure 8.5 Morphologically opened [39].
Figure 8.6 Finding circles [40].
Figure 8.7 Iris contour separation [40].
Figure 8.8 Image inversion [41].
Figure 8.9 Iris detection [41].
Figure 8.10 Cataract detection [41].
Figure 8.11 Healthy eye vs. retinoblastoma [33].
Figure 8.12 Unilateral retinoblastoma [18].
Figure 8.13 Bilateral retinoblastoma [19].
Figure 8.14 Classification of stages of skin cancer [20].
Figure 8.15 Eye cancer detection algorithm.
Figure 8.16 Sample test cases.
Figure 8.17 Actual working of the eye cancer detection
algorithm.
Figure 8.18 Melanoma example [27].
Figure 8.19 Melanoma detection algorithm.
Figure 8.20 Asymmetry analysis.
Figure 8.21 Border analysis.
Figure 8.22 Color analysis.
Figure 8.23 Diameter analysis.
Figure 8.24 Completed detailed algorithm.
Chapter 9
Figure 9.1 Basic overview of a proposed computer-aided
system.
Figure 9.2 Block diagram of the proposed system for finding
out liver fibrosis.
Figure 9.3 Block diagram representing different pre-
processing stages in liver f...
Figure 9.4 Flow chart showing student’s t test.
Figure 9.5 Diagram showing SegNet architecture for
convolutional encoder and dec...
Figure 9.6 Basic block diagram of VGG-16 architecture.
Figure 9.7 Flow chart showing SegNet working process for
classifying liver fibro...
Figure 9.8 Overall process of the CNN of the system.
Figure 9.9 The stages in identifying liver fibrosis by using
Conventional Neural...
Figure 9.10 Multi-layer neural network architecture for a CAD
system for diagnos...
Figure 9.11 Graphical representation of Support Vector
Machine.
Figure 9.12 Experimental analysis graph for different
classifier in terms of acc...
Chapter 10
Figure 10.1 Block diagram of machine learning.
Figure 10.2 Machine learning algorithm.
Figure 10.3 Structure of deep learning.
Figure 10.4 Architecture of DNN.
Figure 10.5 Architecture of CNN.
Figure 10.6 System architecture.
Figure 10.7 Image before histogram equalization.
Figure 10.8 Image after histogram equalization.
Figure 10.9 Edge detection.
Figure 10.10 Edge segmented image.
Figure 10.11 Total cases.
Figure 10.12 Result comparison.
Chapter 11
Figure 11.1 Breast cancer incidence rates worldwide (source:
International Agenc...
Figure 11.2 Images from MIAS database showing normal,
benign, malignant mammogra...
Figure 11.3 Image depicting noise in a mammogram.
Figure 11.4 Architecture of CNN.
Figure 11.5 A complete representation of all the operation
that take place at va...
Figure 11.6 An image depicting Pouter, Plesion, and Pbreast in
a mammogram.
Figure 11.7 The figure depicts two images: (a) mammogram
with a malignant mass a...
Figure 11.8 A figure depicting the various components of a
breast as identified ...
Figure 11.9 An illustration of how a mammogram image
having tumor is segmented t...
Figure 11.10 A schematic representation of classification
procedure of CNN.
Figure 11.11 A schematic representation of classification
procedure of CNN durin...
Figure 11.12 Proposed system model.
Figure 11.13 Flowchart for MIAS database and unannotated
labeled images.
Figure 11.14 Image distribution for training model.
Figure 11.15 The graph shows the loss for the trained model
on train and test da...
Figure 11.16 The graph shows the accuracy of the trained
model for both test and...
Figure 11.17 Depiction of the confusion matrix for the trained
CNN model.
Figure 11.18 Receiver operating characteristics of the trained
model.
Figure 11.19 The image shows the summary of the CNN
model.
Figure 11.20 Performance parameters of the trained model.
Figure 11.21 Prediction of one of the image collected from
diagnostic center.
Chapter 12
Figure 12.1 Deep learning [14]. (a) A simple, multilayer deep
neural network tha...
Figure 12.2 Flowchart of the model [25]. The orange icon
indicates the dataset, ...
Figure 12.3 Evaluation result [25].
Figure 12.4 Deep learning techniques evaluation results [25].
Figure 12.5 Deep transfer learning–based screening system
[38].
Figure 12.6 Classification result.
Figure 12.7 Regression result [45].
Figure 12.8 AE model of deep learning [47].
Figure 12.9 DBN for induction motor fault diagnosis [68].
Figure 12.10 CNN model for health monitoring [80].
Figure 12.11 RNN model for health monitoring [87].
Figure 12.12 Deep learning models usage.
Chapter 13
Figure 13.1 Intelligent home layout model.
Figure 13.2 Deep learning model in predicting behavior
analysis.
Figure 13.3 Lifestyle-oriented context aware model.
Figure 13.4 Components for the identification, simulation, and
detection of acti...
Figure 13.5 Prediction stages.
Figure 13.6 Analytics of event.
Figure 13.7 Prediction of activity duration.
Chapter 14
Figure 14.1 Comparison of normal and Alzheimer brain.
Figure 14.2 Proposed AD prediction system.
Figure 14.3 KNN classification.
Figure 14.4 SVM classification.
Figure 14.5 Load data in 3D slicer.
Figure 14.6 3D slicer visualization.
Figure 14.7 Normal patient MRI.
Figure 14.8 Alzheimer patient MRI.
Figure 14.9 Comparison of hippocampus region.
Figure 14.10 Accuracy of algorithms with baseline records.
Figure 14.11 Accuracy of algorithms with current records.
Figure 14.12 Comparison of without and with dice coefficient.
Chapter 15
Figure 15.1 U-Net architecture [19].
Figure 15.2 Architecture of the 3D-DCSRN model [29].
Figure 15.3 SMILES code for Cyclohexane and Acetaminophen
[32].
Figure 15.4 Medical chatbot architecture [36].
Chapter 16
Figure 16.1 A classical perceptron.
Figure 16.2 Forward and backward paths on an ANN
architecture.
Figure 16.3 A DNN architecture.
Figure 16.4 A DNN architecture for digit classification.
Figure 16.5 Underfit and overfit.
Figure 16.6 Functional mapping.
Figure 16.7 A generalized Tikhonov functional.
Figure 16.8 (a) With hidden layers (b) Dropping h2 and h5.
Figure 16.9 Image cropping as one of the features of data
augmentation.
Figure 16.10 Early stopping criteria based on errors.
Figure 16.11 (a) Convex, (b) Non-convex.
Figure 16.12 (a) Affine (b) Convex function.
Figure 16.13 Workflow and an optimizer.
Figure 16.14 (a) Error (cost) function (b) Elliptical: Horizontal
cross section.
Figure 16.15 Contour plot for a quadratic cost function with
elliptical contours...
Figure 16.16 Gradients when steps are varying.
Figure 16.17 Local minima. (When the gradient ∇ of the
partial derivatives is po...
Figure 16.18 Contour plot showing basins of attraction.
Figure 16.19 (a) Saddle point S. (b) Saddle point over a two-
dimensional error s...
Figure 16.20 Local information encoded by the gradient
usually does not support ...
Figure 16.21 Direction of gradient change.
Figure 16.22 Rolling ball and its trajectory.
Chapter 17
Figure 17.1 Artificial Neural Networks vs. Architecture of
Deep Learning Model [...
Figure 17.2 Machine learning and deep learning techniques
[4, 5].
Figure 17.3 Model of reinforcement learning
(https://www.kdnuggets.com).
Figure 17.4 Data analytical model [5].
Figure 17.5 Support Vector Machine—classification approach
[1].
Figure 17.6 Expected output of K-means clustering [1].
Figure 17.7 Output of mean shift clustering [2].
Figure 17.8 Genetic Signature–based Hierarchical Random
Forest Cluster (G-HR Clu...
Figure 17.9 Artificial Neural Networks vs. Deep Learning
Neural Networks.
Figure 17.10 Architecture of Convolution Neural Network.
Figure 17.11 Architecture of the Human Diseases Pattern
Prediction Technique (EC...
Figure 17.12 Comparative analysis: processing time vs.
classifiers.
Figure 17.13 Comparative analysis: memory usage vs.
classifiers.
Figure 17.14 Comparative analysis: classification accuracy vs.
classifiers.
Figure 17.15 Comparative analysis: sensitivity vs. classifiers.
Figure 17.16 Comparative analysis: specificity vs. classifiers.
Figure 17.17 Comparative analysis: FScore vs. classifiers.
Chapter 18
Figure 18.1 Deep Neural Network (DNN).
Figure 18.2 The evolution of machine learning techniques
(year-wise).
List of Tables
Chapter 1
Table 1.1 Various parameters of the layers of LeNet.
Table 1.2 Every column indicates which feature map in S2 are
combined by the uni...
Table 1.3 AlexNet layer details.
Table 1.4 Various parameters of ZFNet.
Table 1.5 Various parameters of VGG-16.
Table 1.6 Various parameters of GoogleNet.
Table 1.7 Various parameters of ResNet.
Table 1.8 Comparison of ResNet-50 and ResNext-50 (32 × 4d).
Table 1.9 Comparison of ResNet-50 and ResNext-50 and SE-
ResNeXt-50 (32 × 4d).
Table 1.10 Comparison of DenseNet.
Table 1.11 Various parameters of MobileNets.
Table 1.12 State-of-art of spine segmentation approaches.
Chapter 2
Table 2.1 History of search engines.
Table 2.2 Three types of user refinement of queries.
Table 2.3 Different approaches for the query suggestion
techniques.
Chapter 3
Table 3.1 Types of liver lesions.
Table 3.2 Dataset count.
Table 3.3 Hyperparameter settings for training.
Table 3.4 Confusion matrix for AlexNet.
Table 3.5 Confusion matrix for GoogLeNet.
Table 3.6 Confusion matrix for ResNet-18.
Table 3.7 Confusion matrix for ResNet-50.
Table 3.8 Comparison of classification accuracies.
Chapter 4
Table 4.1 Retrieval performance of metric learning for VGG19.
Table 4.2 Performance of retrieval techniques of the trained
VGG19 among fine-tu...
Table 4.3 PR values of various models—a comparison for CT
image retrieval.
Table 4.4 Recall vs. precision for proposed content-based
image retrieval.
Table 4.5 Loss function of proposed deep regression networks
for training datase...
Table 4.6 Loss function of proposed deep regression networks
for validation data...
Table 4.7 Land mark details (identification rates vs. distance
error) for the pr...
Table 4.8 Accuracy value of the proposed system.
Table 4.9 Accuracy of the retrieval methods compared with
the metric learning–ba...
Chapter 6
Table 6.1 Definition of the abbreviations.
Chapter 7
Table 7.1 Performance of proposed models on English dataset.
Table 7.2 Performance of proposed model on Bangla dataset.
Table 7.3 Performance of proposed model on Math Symbol
dataset.
Chapter 8
Table 8.1 ABCD factor for TDS value.
Table 8.2 Classify mole according to TDS value.
Chapter 9
Table 9.1 The confusion matrix for different classifier.
Table 9.2 Performance analysis of different classifiers:
Random Forest, SVM, Naï...
Chapter 10
Table 10.1 Result analysis.
Chapter 11
Table 11.1 Comparison of different techniques and tumor.
Chapter 13
Table 13.1 Cognitive functions related with routine activities.
Table 13.2 Situation and design features.
Table 13.3 Accuracy of prediction.
Chapter 14
Table 14.1 Accuracy comparison and mean of algorithms with
baseline records.
Table 14.2 Accuracy comparison and mean of algorithms with
current records.
Chapter 15
Table 15.1 Variances of Convolutional Neural Network (CNN).
Table 15.2 Various issues challenges faced by researchers for
using deep learnin...
Chapter 17
Table 17.1 Comparative analysis: classification accuracy for 10
datasets—analysi...
Chapter 18
Table 18.1 Comparison among data mining, machine learning,
and deep learning.
Scrivener Publishing
100 Cummings Center, Suite 541J
Beverly, MA 01915-6106
Publishers at Scrivener
Martin Scrivener ([email protected])
Phillip Carmical ([email protected])
Computational Analysis and Deep Learning for Medical Care
Abstract
The widespread success of the Convolutional Neural Network (CNN) in domains such as image classification, object recognition, and scene classification has revolutionized machine learning research, especially on medical images. Magnetic Resonance Images (MRIs) suffer from severe noise, weak edges, low contrast, and intensity inhomogeneity. Recent advances in deep learning, yielding networks with fewer connections and parameters, have made training easier. This chapter presents an in-depth review of the various deep architectures as well as their application to segmenting the intervertebral disc (IVD) from 3D spine images, together with an evaluation. The first section deals with the study of traditional deep CNN architectures such as LeNet, AlexNet, ZFNet, GoogLeNet, VGGNet, ResNet, the Inception model, ResNeXt, SENet, MobileNet V1/V2, and DenseNet, and examines the parameters and components associated with each model in detail. The second section discusses the application of these models to segmenting the IVD from spine images. Finally, theoretical analysis and experimental results from the state-of-the-art literature show that a 2.5D multi-scale FCN performs best, with a Dice Similarity Coefficient (DSC) of 90.64%.
Keywords: CNN, deep learning, intervertebral disc degeneration, MRI segmentation
1.1 Introduction
The concept of the Convolutional Neural Network (CNN) was introduced by Fukushima. The principle behind the CNN is that the human visual mechanism is hierarchical in structure. CNNs have been successfully applied in various image domains such as image classification, object recognition, and scene classification. A CNN is defined as a series of convolution and pooling layers. In a convolution layer, the image is convolved with a filter, i.e., the filter slides over the image spatially, computing dot products. A pooling layer produces a smaller feature set.

One major cause of low back pain is disc degeneration. Automated detection of lumbar abnormalities from clinical scans is a burden for radiologists. Researchers focus on automating the segmentation of large sets of MRI data due to the huge size of such images. The success of CNNs in various fields of object detection has enabled researchers to apply various models to the detection of the intervertebral disc (IVD) and, in turn, to the diagnosis of disease.
The remainder of the chapter is structured as follows. The next section deals with the study of the various CNN models. Section 1.3 presents applications of CNN to the detection of the IVD. Section 1.4 compares state-of-the-art segmentation approaches for spine T2W images, and Section 1.5 concludes.
(1.1)
(1.2)
In the first convolutional layer, the number of learning parameters is (5×5 + 1) × 6 = 156, where 6 is the number of filters, 5 × 5 is the filter size, and 1 is the bias; there are 28×28×156 = 122,304 connections. The feature map size is calculated as follows:
(1.3)
(1.4)
(1.5)
(1.6)
Each feature map is 14×14; the number of learning parameters is (coefficient + bias) × no. of filters = (1+1) × 6 = 12, and the number of connections is (2×2 inputs + 1 bias) × 6 × 14×14 = 30×14×14 = 5,880.
Layer 3: In this layer, each of the six feature maps of S2 is connected to only 10 of the 16 feature maps of C3, as shown in Table 1.2. Each unit in C3 is connected to several 5 × 5 receptive fields at identical locations in S2. Total number of trainable parameters = (3×5×5+1)×6 + (4×5×5+1)×9 + (6×5×5+1)×1 = 1,516. Total number of connections = (3×5×5+1)×6×10×10 + (4×5×5+1)×9×10×10 + (6×5×5+1)×10×10 = 151,600. Overall, LeNet-5 has about 60K parameters.
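The layer-by-layer arithmetic above can be checked with a short script. This is a sketch, not part of the chapter; the counting conventions follow the formulas quoted in the text (one bias per filter, connections = parameters × output spatial size):

```python
# Sketch: reproducing the LeNet-5 parameter and connection counts for C1, S2, and C3.

def conv_params(kernel, in_maps, out_maps):
    """Trainable parameters of a conv layer: (k*k*in_maps + 1 bias) per filter."""
    return (kernel * kernel * in_maps + 1) * out_maps

# C1: six 5x5 filters on the single-channel input; output maps are 28x28.
c1_params = conv_params(5, 1, 6)            # (5*5*1 + 1) * 6 = 156
c1_connections = c1_params * 28 * 28        # 122,304

# S2: one trainable coefficient and one bias per map; output maps are 14x14.
# Each pooling unit touches 2x2 inputs plus the bias, i.e., 5 connections per map.
s2_params = (1 + 1) * 6                     # 12
s2_connections = (2 * 2 + 1) * 6 * 14 * 14  # 30 * 14 * 14 = 5,880

# C3: partial connectivity -- 6 maps see 3 S2 maps, 9 maps see 4, 1 map sees all 6.
c3_params = (3*5*5 + 1)*6 + (4*5*5 + 1)*9 + (6*5*5 + 1)*1   # 1,516
c3_connections = c3_params * 10 * 10                        # 151,600 (10x10 output)

print(c1_params, c1_connections, s2_params, s2_connections, c3_params, c3_connections)
```

Running it reproduces exactly the figures quoted in the text: 156, 122,304, 12, 5,880, 1,516, and 151,600.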
1.2.2 AlexNet
Alex Krizhevsky et al. [2] presented a new architecture, "AlexNet", to train on the ImageNet dataset, which consists of 1.2 million high-resolution images in 1,000 different classes. In the original implementation, the layers are divided into two groups and trained on separate GPUs (GTX 580, 3 GB); training takes around 5 to 6 days. The network contains five convolutional layers interleaved with max-pooling layers, followed by three fully connected layers and, finally, a 1,000-way softmax classifier. The network uses the ReLU activation function, data augmentation, dropout, local response normalization, and overlapping pooling. AlexNet has 60M parameters. Figure 1.2 shows the architecture of AlexNet and Table 1.3 shows the various parameters of AlexNet.
Table 1.2 Every column indicates which feature maps in S2 are combined by the units in a particular feature map of C3 [1].

C3 map:  0  1  2  3  4  5  6  7  8  9 10 11 12 13 14 15
S2 0:    X  .  .  .  X  X  X  .  .  X  X  X  X  .  X  X
S2 1:    X  X  .  .  .  X  X  X  .  .  X  X  X  X  .  X
S2 2:    X  X  X  .  .  .  X  X  X  .  .  X  .  X  X  X
S2 3:    .  X  X  X  .  .  X  X  X  X  .  .  X  .  X  X
S2 4:    .  .  X  X  X  .  .  X  X  X  X  .  X  X  .  X
S2 5:    .  .  .  X  X  X  .  .  X  X  X  X  .  X  X  X
Figure 1.2 Architecture of AlexNet.
First Layer: AlexNet accepts a 227 × 227 × 3 RGB image as input, which is fed to the first convolutional layer with 96 kernels (filters) of size 11 × 11 × 3 and a stride of 4; the output is 96 feature maps of size 55 × 55. The next layer is a max-pooling (sub-sampling) layer, which uses a window size of 3 × 3 and a stride of 2 and produces an output of size 27 × 27 × 96.
Second Layer: The second convolutional layer filters the 27 × 27 × 96 output with 256 kernels of size 5 × 5 and a stride of 1 pixel. It is followed by a max-pooling layer with filter size 3 × 3 and a stride of 2, and the output is reduced to 256 feature maps of size 13 × 13.
Third, Fourth, and Fifth Layers: The third, fourth, and fifth convolutional layers use a filter size of 3 × 3 and a stride of one. The third and fourth convolutional layers have 384 feature maps each, and the fifth layer uses 256 filters. These layers are followed by a max-pooling layer with filter size 3 × 3 and a stride of 2, producing 256 feature maps.
Sixth Layer: The 6 × 6 × 256 image is flattened as a fully connected layer with 9,216 neurons (feature maps)
of size 1 × 1.
Seventh and Eighth Layers: The seventh and eighth layers are fully connected layers with 4,096 neurons.
Output Layer: The activation used in the output layer is softmax and consists of 1,000 classes.
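The shape bookkeeping above follows the standard convolution/pooling size formula, floor((n − k + 2p)/s) + 1. A minimal sketch tracing AlexNet's spatial sizes follows; note that the padding values for CONV2 through CONV5 are the usual ones needed to obtain these output shapes, since the text does not state them explicitly:

```python
# Sketch: tracing AlexNet activation shapes with the standard size formula.

def out_size(n, k, stride, pad=0):
    """Spatial output size of a convolution or pooling: floor((n - k + 2p)/s) + 1."""
    return (n - k + 2 * pad) // stride + 1

n = 227
n = out_size(n, 11, 4)         # CONV1, 96 filters   -> 55
n = out_size(n, 3, 2)          # POOL1               -> 27
n = out_size(n, 5, 1, pad=2)   # CONV2, 256 filters  -> 27 (pad assumed)
n = out_size(n, 3, 2)          # POOL2               -> 13
n = out_size(n, 3, 1, pad=1)   # CONV3/4/5           -> 13 (pad assumed)
n = out_size(n, 3, 2)          # POOL3               -> 6
flattened = n * n * 256        # 6*6*256 = 9,216 inputs to the first FC layer
print(n, flattened)
```

This reproduces the 55 → 27 → 13 → 6 progression and the 9,216-neuron flattened layer described above.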
1.2.3 ZFNet
The architecture of ZFNet, introduced by Zeiler [3], is the same as that of AlexNet, but the first convolutional layer uses a reduced kernel size of 7 × 7 with stride 2. This reduction enables the network to retain more features in the early layers at a lower computational cost. The number of filters in the third, fourth, and fifth convolutional layers is increased to 512, 1024, and 512, respectively. A new visualization technique, deconvolution (mapping features back to pixels), is used to analyze the feature maps of the first and second layers.
Table 1.3 AlexNet layer details.

Sl. no.  Layer   Kernel size  Stride  Activation shape  Weights     Bias   # Parameters  Activation  # Connections
1        Input   -            -       (227,227,3)       0           0      0             -           -
2        CONV1   11 × 11      4       (55,55,96)        34,848      96     34,944        relu        105,415,200
3        POOL1   3 × 3        2       (27,27,96)        0           0      0             relu        -
4        CONV2   5 × 5        1       (27,27,256)       614,400     256    614,656       relu        111,974,400
5        POOL2   3 × 3        2       (13,13,256)       0           0      0             relu        -
6        CONV3   3 × 3        1       (13,13,384)       884,736     384    885,120       relu        149,520,384
7        CONV4   3 × 3        1       (13,13,384)       1,327,104   384    1,327,488     relu        112,140,288
8        CONV5   3 × 3        1       (13,13,256)       884,736     256    884,992       relu        74,760,192
9        POOL3   3 × 3        2       (6,6,256)         0           0      0             relu        -
10       FC      -            -       9,216             37,748,736  4,096  37,752,832    relu        37,748,736
11       FC      -            -       4,096             16,777,216  4,096  16,781,312    relu        16,777,216
12       FC      -            -       4,096             4,096,000   1,000  4,097,000     relu        4,096,000
OUTPUT   FC      -            -       1,000             -           -      0             softmax     -
Total                                                                      62,378,344
1.2.4 VGGNet
Simonyan and Zisserman [4] introduced VGGNet for the ImageNet Challenge in 2014. VGGNet-16 consists of 16 layers; it accepts a 224 × 224 × 3 RGB image as input, from which the global mean is subtracted at each pixel. The image is then fed to a series of 13 convolutional layers, which use a small receptive field of 3 × 3 with "same" padding and a stride of 1. Whereas AlexNet and ZFNet use a max-pooling layer after each convolutional layer, VGGNet places no max-pooling between consecutive 3 × 3 convolutional layers: a stack of two such layers has an effective receptive field of 5 × 5, and a stack of three 7 × 7, while using fewer parameters, and as the spatial size decreases, the depth increases. The max-pooling layers use a window of size 2 × 2 pixels and a stride of 2. These are followed by three fully connected layers: the first two with 4,096 neurons, and the third is the output layer with 1,000 neurons, since the ILSVRC classification task has 1,000 classes. The final layer is a softmax layer. Training was carried out on 4 Nvidia Titan Black GPUs for 2 to 3 weeks with the ReLU nonlinearity activation function. The network has 138 million parameters (522 MB). The top-5 test error rate during the competition was 7.1%. Figure 1.4 shows the architecture of VGG-16, and Table 1.5 shows its parameters.
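The parameter savings of stacked 3 × 3 layers can be made concrete. The following is an illustrative sketch, not from the chapter; C = 256 is an arbitrary example channel count, and biases are ignored:

```python
# Sketch: stacked 3x3 convolutions vs. a single larger kernel,
# for C input and C output channels (biases ignored).

def stacked_rf(n_layers, k=3):
    """Effective receptive field of n stacked stride-1 k x k convolutions."""
    return n_layers * (k - 1) + 1

C = 256                                  # illustrative channel count
p_5x5 = 5 * 5 * C * C                    # one 5x5 layer
p_two_3x3 = 2 * (3 * 3 * C * C)          # two 3x3 layers: same 5x5 receptive field
p_7x7 = 7 * 7 * C * C
p_three_3x3 = 3 * (3 * 3 * C * C)        # three 3x3 layers: same 7x7 receptive field

print(stacked_rf(2), stacked_rf(3))      # prints 5 7
print(p_two_3x3 < p_5x5, p_three_3x3 < p_7x7)
```

Two 3 × 3 layers cost 18C² weights against 25C² for one 5 × 5, and three cost 27C² against 49C² for one 7 × 7, with extra nonlinearities in between.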
Table 1.4 Various parameters of ZFNet.

Layer name         Input size  Filter size  Window size  # Filters  Stride  Padding  Output size  # Feature maps  # Parameters
Conv 1             224 × 224   7 × 7        -            96         2       0        110 × 110    96              14,208
Max-pooling 1      110 × 110   -            3 × 3        -          2       0        55 × 55      96              0
Conv 2             55 × 55     5 × 5        -            256        2       0        26 × 26      256             614,656
Max-pooling 2      26 × 26     -            3 × 3        -          2       0        13 × 13      256             0
Conv 3             13 × 13     3 × 3        -            384        1       1        13 × 13      384             885,120
Conv 4             13 × 13     3 × 3        -            384        1       1        13 × 13      384             1,327,488
Conv 5             13 × 13     3 × 3        -            256        1       1        13 × 13      256             884,992
Max-pooling 3      13 × 13     -            3 × 3        -          2       0        6 × 6        256             0
Fully connected 1  4,096 neurons                                                                                  37,752,832
Fully connected 2  4,096 neurons                                                                                  16,781,312
Fully connected 3  1,000 neurons                                                                                  4,097,000
Softmax            1,000 classes                                                                                  62,357,608 (Total)
1.2.5 GoogLeNet
In 2014, Google [5] proposed the Inception network for the detection and classification tasks of the ImageNet Challenge. The basic unit of this model is called an "inception cell": parallel convolutional layers with different filter sizes, which perform a series of convolutions at different scales and concatenate the results; the different filter sizes extract feature maps at different scales. To reduce the computational cost and the input channel depth, 1 × 1 convolutions are used. So that the branches can be concatenated properly, max pooling with "same" padding is used, which preserves the spatial dimensions. In the state of the art, further versions of Inception, such as Inception v2, v3, and v4 and Inception-ResNet, have been defined. Figure 1.5 shows the inception module, and Figure 1.6 shows the architecture of GoogLeNet.
Each image is resized so that the input to the network is a 224 × 224 × 3 image, and the mean is subtracted before the training image is fed to the network. The dataset contains 1,000 categories, with 1.2 million images for training, 100,000 for testing, and 50,000 for validation. GoogLeNet is 22 layers deep, uses nine inception modules, and replaces the fully connected layers with global average pooling to go from 7 × 7 × 1,024 to 1 × 1 × 1,024, which saves a huge number of parameters. It includes auxiliary softmax output units to enforce regularization. It was trained on high-end GPUs within a week and achieved a top-5 error rate of 6.67%. GoogLeNet trains faster than VGG, and a pre-trained GoogLeNet is considerably smaller than VGG.
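Because every branch of an inception cell preserves the spatial size, the output depth is simply the sum of the four branch widths. A minimal sketch (the branch widths for inception 3a, 3b, and 5b below are those of Table 1.6):

```python
def inception_out_channels(n1x1, n3x3, n5x5, pool_proj):
    # The 1x1, 3x3, 5x5, and pooling branches are concatenated along the
    # channel axis; the 3x3/5x5 "reduce" 1x1 layers change only internal width.
    return n1x1 + n3x3 + n5x5 + pool_proj

print(inception_out_channels(64, 128, 32, 32))     # inception (3a) -> 256
print(inception_out_channels(128, 192, 96, 64))    # inception (3b) -> 480
print(inception_out_channels(384, 384, 128, 128))  # inception (5b) -> 1024
```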
Table 1.5 Various parameters of VGG-16.

Layer name       Input size  Filter  Window  #Filters  Stride/Padding  Output size  #Feature maps  #Parameters
Conv 1           224 × 224   3 × 3   -       64        1/1             224 × 224    64             1,792
Conv 2           224 × 224   3 × 3   -       64        1/1             224 × 224    64             36,928
Max-pooling 1    224 × 224   -       2 × 2   -         2/0             112 × 112    64             0
Conv 3           112 × 112   3 × 3   -       128       1/1             112 × 112    128            73,856
Conv 4           112 × 112   3 × 3   -       128       1/1             112 × 112    128            147,584
Max-pooling 2    112 × 112   -       2 × 2   -         2/0             56 × 56      128            0
Conv 5           56 × 56     3 × 3   -       256       1/1             56 × 56      256            295,168
Conv 6           56 × 56     3 × 3   -       256       1/1             56 × 56      256            590,080
Conv 7           56 × 56     3 × 3   -       256       1/1             56 × 56      256            590,080
Max-pooling 3    56 × 56     -       2 × 2   -         2/0             28 × 28      256            0
Conv 8           28 × 28     3 × 3   -       512       1/1             28 × 28      512            1,180,160
Conv 9           28 × 28     3 × 3   -       512       1/1             28 × 28      512            2,359,808
Conv 10          28 × 28     3 × 3   -       512       1/1             28 × 28      512            2,359,808
Max-pooling 4    28 × 28     -       2 × 2   -         2/0             14 × 14      512            0
Conv 11          14 × 14     3 × 3   -       512       1/1             14 × 14      512            2,359,808
Conv 12          14 × 14     3 × 3   -       512       1/1             14 × 14      512            2,359,808
Conv 13          14 × 14     3 × 3   -       512       1/1             14 × 14      512            2,359,808
Max-pooling 5    14 × 14     -       2 × 2   -         2/0             7 × 7        512            0
Fully connected 1    4,096 neurons                                                                102,764,544
Fully connected 2    4,096 neurons                                                                16,781,312
Fully connected 3    1,000 neurons                                                                4,097,000
Softmax              1,000 classes
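The 138 million parameter figure quoted for VGG-16 follows directly from the per-layer counts in Table 1.5. A sketch that sums them (the channel pairs below are the 13 convolutional layers of the table):

```python
def conv_params(k, c_in, c_out):
    """k x k convolution: weights plus one bias per output filter."""
    return k * k * c_in * c_out + c_out

def fc_params(n_in, n_out):
    """Fully connected layer: weights plus one bias per neuron."""
    return n_in * n_out + n_out

# (in_channels, out_channels) for the 13 conv layers of Table 1.5
cfg = [(3, 64), (64, 64), (64, 128), (128, 128), (128, 256), (256, 256),
       (256, 256), (256, 512), (512, 512), (512, 512), (512, 512),
       (512, 512), (512, 512)]
total = sum(conv_params(3, i, o) for i, o in cfg)
total += fc_params(7 * 7 * 512, 4096)  # FC 1 -> 102,764,544
total += fc_params(4096, 4096)         # FC 2 -> 16,781,312
total += fc_params(4096, 1000)         # FC 3 -> 4,097,000
print(total)  # 138357544, i.e., the ~138 million parameters of VGG-16
```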
Figure 1.5 Inception module.
1.2.6 ResNet
Usually, the input feature map is fed through a series of convolutional layers, a non-linear activation function (ReLU), and a pooling layer to provide the output for the next layer, and training is done by the back-propagation algorithm. The accuracy of a network can be improved by increasing its depth, but once the network converges, its accuracy saturates; if more layers are then added, the performance degrades rapidly, which results in higher training error. To address this degradation and the vanishing/exploding gradient problem, ResNet, with a residual learning framework [6], was proposed, in which new layers fit a residual mapping: when a mapping is close to identity, it is easier to push the residual to zero than to fit the mapping directly with a stack of nonlinear layers. The principles of ResNet are residual learning, identity mapping, and skip connections. The idea behind residual learning is that the input of a block is fed forward and added to the output of its convolutional layers, after which the non-linear activation (ReLU) and pooling are performed.
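The residual idea, stated as code: instead of learning a target mapping H(x) directly, a block learns F(x) = H(x) − x and outputs F(x) + x, so driving the residual weights to zero recovers the identity. A minimal NumPy sketch, in which the two 3 × 3 convolutions of a real block are replaced by plain linear maps W1 and W2 for brevity (these names are illustrative):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, W1, W2):
    """out = ReLU(F(x) + x), with residual branch F(x) = W2 @ ReLU(W1 @ x)."""
    f = W2 @ relu(W1 @ x)  # the residual branch F(x)
    return relu(f + x)     # skip connection adds the input back

x = np.array([1.0, -2.0, 3.0])
W_zero = np.zeros((3, 3))
# With all residual weights at zero, the block reduces to the identity
# (followed by the final ReLU):
print(residual_block(x, W_zero, W_zero))  # [1. 0. 3.]
```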
Table 1.6 Various parameters of GoogLeNet.

Layer name      Window/Filter  Stride  Output size      Depth  #1×1  #3×3 reduce  #3×3  #5×5 reduce  #5×5  Pool proj
Convolution     7 × 7          2       112 × 112 × 64   1
Max pool        3 × 3          2       56 × 56 × 64     0
Convolution     3 × 3          1       56 × 56 × 192    2            64           192
Max pool        3 × 3          2       28 × 28 × 192    0
Inception (3a)  -              -       28 × 28 × 256    2      64    96           128   16           32    32
Inception (3b)  -              -       28 × 28 × 480    2      128   128          192   32           96    64
Max pool        3 × 3          2       14 × 14 × 480    0
Inception (4a)  -              -       14 × 14 × 512    2      192   96           208   16           48    64
Inception (4b)  -              -       14 × 14 × 512    2      160   112          224   24           64    64
Inception (4c)  -              -       14 × 14 × 512    2      128   128          256   24           64    64
Inception (4d)  -              -       14 × 14 × 528    2      112   144          288   32           64    64
Inception (4e)  -              -       14 × 14 × 832    2      256   160          320   32           128   128
Max pool        3 × 3          2       7 × 7 × 832      0
Inception (5a)  -              -       7 × 7 × 832      2      256   160          320   32           128   128
Inception (5b)  -              -       7 × 7 × 1,024    2      384   192          384   48           128   128
Avg pool        7 × 7          1       1 × 1 × 1,024    0
Dropout (40%)   -              -       1 × 1 × 1,024    0
Linear          -              -       1 × 1 × 1,000    1
Softmax         -              -       1 × 1 × 1,000    0
The architecture starts from a plain network inspired by VGGNet (consisting of 3 × 3 filters), into which shortcut connections are inserted to form the residual network, as shown in the figure. Figure 1.7(b) shows the 34-layer plain network converted into a residual network, which has lower training error than the 18-layer residual network. As in GoogLeNet, it uses a global average pooling layer followed by the classification layer. ResNets were capable of learning networks with a depth of up to 152 layers. Compared with GoogLeNet and VGGNet, its accuracy is better, and it is computationally more efficient than VGGNet. ResNet-152 achieves 95.51% top-5 accuracy. Figure 1.7(a) shows a residual block, Figure 1.7(b) shows the architecture of ResNet, and Table 1.7 shows the parameters of ResNet.
1.2.7 ResNeXt
The ResNeXt [7] architecture builds on the advantages of ResNet (residual networks) and GoogLeNet (multi-branch architecture) and requires fewer hyperparameters than the traditional ResNet. The "next" refers to the next dimension, "cardinality", an additional dimension on top of the depth and width of ResNet. The input is split channel-wise into groups, and the standard residual block is replaced with a "split-transform-merge" procedure. The architecture stacks a series of residual blocks following two rules: (1) blocks producing spatial maps of the same size share the same hyperparameters; (2) each time the spatial map is downsampled by a factor of two, the block width is doubled. ResNeXt was first runner-up in the ILSVRC classification task and produces better results than ResNet. Figure 1.8 shows the architecture of ResNeXt, and its comparison with ResNet is shown in Table 1.8.
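The parameter saving from cardinality can be seen directly: in a grouped 3 × 3 convolution over C channels with g groups, each filter sees only C/g input channels. A sketch using the 128-wide bottleneck of ResNeXt-50 (32 groups of width 4, the "32 × 4d" template of Table 1.8):

```python
def conv3x3_weights(c_in, c_out, groups=1):
    # With grouping, each output filter connects to only c_in // groups
    # input channels, so the weight count shrinks by the group factor.
    return 3 * 3 * (c_in // groups) * c_out

dense   = conv3x3_weights(128, 128, groups=1)   # ordinary 3x3: 147,456 weights
grouped = conv3x3_weights(128, 128, groups=32)  # 32x4d ResNeXt:   4,608 weights
print(dense, grouped, dense // grouped)  # the grouped form uses 32x fewer weights
```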
Figure 1.7 (a) A residual block.
Table 1.7 Various parameters of ResNet.
1.2.8 SE-ResNet
Hu et al. [8] proposed a Squeeze-and-Excitation Network (SENet) (first position on ILSVRC 2017 category)
with lightweight gating mechanism. This architecture focuses explicitly on model interdependencies
between the channels of convolutional features and to achieve dynamic channel-wise feature recalibration.
In the squeeze phase, SE block uses global average pooling operation and in the excitation phase uses
channel-wise scaling. For an input image of size 224 × 224, the running time of ResNet-50 is 164 ms,
whereas it is 167 ms for SE-ResNet-50. Also, SE-ResNet-50 requires ∼3.87 GFLOPs, which shows a 0.26%
relative increase over the original ResNet-50. The top-5 error is reduced to 2.251%. Figure 1.9 shows the
architecture of SE-ResNet, and Table 1.9 shows ResNet and its comparison with SE-ResNet-50 and SE-
ResNeXt-50.
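The squeeze-and-excitation operation is small enough to write out: global average pooling yields one descriptor per channel, two fully connected layers with a reduction ratio r form the gate, and a sigmoid produces per-channel scales that multiply the feature maps. A NumPy sketch with illustrative weight names (W1, W2) and a toy channel count:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def se_block(x, W1, W2):
    """x: feature maps of shape (C, H, W); returns channel-recalibrated maps."""
    s = x.mean(axis=(1, 2))                    # squeeze: global average pool -> (C,)
    e = sigmoid(W2 @ np.maximum(W1 @ s, 0.0))  # excitation: FC-ReLU-FC-sigmoid
    return x * e[:, None, None]                # scale each channel by its gate

C, r = 8, 4  # toy sizes; the SENet paper's default reduction ratio is r = 16
rng = np.random.default_rng(0)
x = rng.standard_normal((C, 6, 6))
W1 = rng.standard_normal((C // r, C)) * 0.1    # squeeze -> C/r bottleneck
W2 = rng.standard_normal((C, C // r)) * 0.1    # bottleneck -> C gates
y = se_block(x, W1, W2)
print(y.shape)  # (8, 6, 6): same shape as the input, channels rescaled
```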
1.2.9 DenseNet
This architecture, proposed in [9], connects every layer directly with every other layer so as to ensure maximum information (and gradient) flow; thus, a model with L layers has L(L+1)/2 direct connections. A number of dense blocks (groups of layers connected to all previous layers) and transition layers control the complexity of the model. Each layer of a dense block adds a fixed number of feature maps (the growth rate) to the model; the transition layer reduces the number of channels using a 1 × 1 convolutional layer and halves the width and height using average pooling with a stride of 2. Each layer concatenates the output feature maps of all previous layers with the incoming feature maps, i.e., each layer has direct access to the gradients from the loss function and to the original input image. Further, DenseNets need a smaller set of parameters than traditional CNNs and reduce the vanishing gradient problem. Figure 1.10 shows the architecture of DenseNet, and Table 1.10 shows various DenseNet architectures.
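Two pieces of DenseNet arithmetic are worth making concrete: a block of L densely connected layers has L(L+1)/2 direct connections, and the channel count grows linearly with the growth rate k, which is why the 1 × 1 transition layers between blocks are needed. A sketch (the example sizes are illustrative, chosen to match common DenseNet configurations):

```python
def dense_connections(L):
    # Each of the L layers receives the feature maps of all preceding
    # layers (and the block input): 1 + 2 + ... + L = L*(L+1)/2 connections.
    return L * (L + 1) // 2

def channels_after_block(c_in, L, k):
    # Every layer concatenates k new feature maps (the growth rate k)
    # onto everything produced so far.
    return c_in + L * k

print(dense_connections(5))             # 15 direct connections for L = 5
print(channels_after_block(64, 6, 32))  # 64 + 6*32 = 256 channels
```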
Table 1.8 Comparison of ResNet-50 and ResNext-50 (32 × 4d).
1.2.10 MobileNets
MobileNet V1 [10], proposed by Google, uses depthwise separable convolutions instead of normal convolutions, which reduces the model size and complexity. A depthwise separable convolution is a depthwise convolution followed by a pointwise convolution, i.e., a single filter is applied to each input channel, followed by a pointwise (1 × 1) convolution that combines the outputs of the depthwise convolution; after each convolution, batch normalization (BN) and ReLU are applied. The whole architecture consists of 30 layers, built from (1) a convolutional layer with stride 2, (2) a depthwise layer, (3) a pointwise layer, (4) a depthwise layer with stride 2, and (5) a pointwise layer. The advantage of MobileNets is that they require fewer parameters and the model is less complex (a small number of multiplications and additions). Figure 1.11 shows the architecture of MobileNets, and Table 1.11 shows the various parameters of MobileNets.
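The saving from depthwise separable convolution follows from counting multiplications. A standard Dk × Dk convolution with M input and N output channels on a DF × DF feature map costs Dk²·M·N·DF² mult-adds; splitting it into a depthwise stage and a 1 × 1 pointwise stage costs Dk²·M·DF² + M·N·DF², a reduction of roughly 1/N + 1/Dk². A sketch with one illustrative layer shape (the sizes below are assumptions, not taken from Table 1.11):

```python
def standard_cost(dk, m, n, df):
    """Mult-adds for a standard dk x dk convolution."""
    return dk * dk * m * n * df * df

def separable_cost(dk, m, n, df):
    """Mult-adds for the equivalent depthwise separable convolution."""
    depthwise = dk * dk * m * df * df  # one dk x dk filter per input channel
    pointwise = m * n * df * df        # 1x1 conv mixes the channels
    return depthwise + pointwise

dk, m, n, df = 3, 32, 64, 112  # an early MobileNet-like layer shape
s = standard_cost(dk, m, n, df)
p = separable_cost(dk, m, n, df)
print(round(s / p, 2))  # roughly 8x fewer multiplications for a 3x3 kernel
```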
Figure 1.9 Architecture of SE-ResNet.
1.5 Conclusion
In this chapter, we discussed various CNN architectural models and their parameters. In the first phase, architectures such as LeNet, AlexNet, VGGNet, GoogLeNet, ResNet, ResNeXt, SENet, DenseNet, and MobileNet were studied. In the second phase, the application of CNNs to the segmentation of IVD was presented, together with a comparison with state-of-the-art segmentation approaches for spine T2W images. From the experimental results, it is clear that the 2.5D multi-scale FCN outperforms all other models. As future work, current models may be modified to obtain optimized results.
Discovering Diverse Content Through
Random Scribd Documents
“‘Oh, no!’ says Brother Bear. ‘It wouldn’t do to call that wrestling.
That was only playing. I was just showing you the first few capers:
you can’t wrestle until you learn how. I’ll drop by your house to-
morrow morning, bright and early, and give you another whirl.’
“Brother Tiger looked mighty solemn, but he didn’t say anything.
He ambled off home as well as he could in his condition, and got his
old woman to mend his breeches. She wanted to know who he had
been fighting with, but he told her he had just been playing with
Brother Bear. She laughed, and said that when he had played that
way a few more times there wouldn’t be enough of him left, neither
breeches, body, nor bones, to sew up in a bag.
“Well, the next morning, bright and early, Brother Bear rapped at
Brother Tiger’s door, and told him to come out and take some
exercise before breakfast. Brother Tiger didn’t like this invitation at
all. He said he wanted to sleep a little longer; but Brother Bear sent
in word that the night was made for sleeping, while the day was
made for work and play. Now, it so happened that the honey which
Brother Tiger had ate had put a spell on him, and when Brother Bear
asked him out to wrestle he had to come. He pulled on his clothes
with no good heart, for he was still very sore, and came limping out,
trying to put a good face on the affair. Brother Bear laughed, and
told Brother Tiger howdy, but Brother Tiger didn’t make much of a
reply.
“So Brother Bear says, says he, ‘I hope you are not begrudging
your bargain, Brother Tiger, but you made it yourself, and at no
invitation of mine. I had the seven pieces of honey-in-the-comb, and
you had the bad taste in the mouth. I told you how it would be, but
you would have the honey, and now you’ll have to stand to your
bargain: you can’t help yourself now. I told you the plain truth about
it, but you wouldn’t believe it. You’ll find out the truth before you get
the taste of that honey out of your mouth.’
“Then they made a few passes at each other; but Brother Bear
finally grabbed Brother Tiger around his striped waist, squeezed the
breath out of him, dashed him on the ground, cuffed his ears, and
then stood there on his hind legs, waiting to see what Brother Tiger
was going to do. But Brother Tiger didn’t want any more wrestling
for that day. He went into the house and washed his face and hands,
and sat down and licked his bruises the best he could.
“But the next morning he had to come out and wrestle again, and
this happened until he was so weak he could hardly walk. His hide
was split, his ears were swollen, and every stripe on his long body
was crossed by a scar. Wrestling was fine fun for Brother Bear, who
was used to it, but it was no fun for Brother Tiger, who didn’t know
how. Every time he wrestled he got new bruises, and his head
swelled until he could hardly get in the door of his house without
backing his ears.
“Finally, one day he told Brother Bear candidly that he would
rather give up his house and lot than to be tossed around and cuffed
at that rate. Brother Bear said that he would rather wrestle and have
a jolly time than to take Brother Tiger’s house; but Brother Tiger
wouldn’t hear to that. He said he couldn’t stay in that part of the
country and hear the talk of the neighbors. They would pester him
mighty near to death on the week days, and fairly kill him out on
Sunday, when they had nothing to do but sit around and gossip.
“So Brother Tiger moved out, and Brother Bear moved in; and it
has come to pass that Brother Tiger won’t stay in the same country
with Brother Bear for fear that he will have to do some more
wrestling.”
XIX.
“Now, I’ll tell you honestly,” said Little Mr. Thimblefinger, popping
out from under Mr. Rabbit’s big armchair, “I don’t like such stories.
They give me the all-overs. I expect maybe it’s because they are
true.”
“No doubt that’s the trouble with them,” remarked Mr. Rabbit in a
tone unusually solemn. “You don’t think that at my time of life my
tongue is nimble enough for me to sit here and make up stories to
suit the hour and the company? By the bye,” he continued, turning
around so as to catch Little Mr. Thimblefinger’s eye, “what stories
were you talking about?”
“Well, to tell you the truth, I was fast asleep, for the most part,
but I distinctly remember something about Moons and Monkeys.
When I heard that, I just went off to sleep in spite of myself.”
“There’s no accounting for tastes,” said Mr. Rabbit. “There are
some tales that put me to sleep, and I have no complaint to make
when anybody begins to doze over them that I tell.”
“Oh, you tell ’em well enough,” Little Mr. Thimblefinger declared.
“If anything, you make them better than they ought to be. You lift
your ears at the right place, and pat your foot when the time comes.
I don’t know what more could be asked in telling a story.”
“So far so good,” remarked Mrs. Meadows, who had thus far said
nothing. “Suppose you whirl in and tell us the kind of tale that you
really admire.”
“That’s easier said than done,” replied Little Mr. Thimblefinger,
fidgeting about a little. “You have to take the tales as they come.
Sometimes one will pop into your head in spite of yourself. You
remember it just because you didn’t like it when you first heard it.”
“Tell us one, anyway, just to pass away the time,” said Sweetest
Susan.
“If I tell you one,” Little Mr. Thimblefinger replied, “I’ll not promise
it will be one that I like. That would be promising too much. But the
talk about the Moon, that I heard before I dozed off just now,
reminded me of a tale I heard when I was a good deal smaller than
I am now.
“Once upon a time there was a man who had two sons. They
were twins, but they were just as different from each other as they
could possibly be. One was dark, and the other was light
complected. One was slim, and the other was fat. One was good,
and the other was what people call bad. He was lazy, and full of fun
and mischief. They grew up that way until they were nineteen or
twenty years old. The good boy would work hard every day, or
pretend to work hard, and then he’d go back home and tell his
mother and father that his brother hadn’t done a stroke of work. Of
course, this made the old people feel very queer. The mother felt
sorrowful, and the father felt angry. This went on, until finally, one
day, the father became so angry that he concluded to take his bad
son into some foreign country, and bind him out to some person
who could make him work and cure him of his mischievousness. In
those days people sometimes bound out their children to learn
trades and good manners and things of that sort.”
“I wish dey’d do it now,” exclaimed Drusilla. “Kaze den I wouldn’t
hafter be playin’ nuss, an’ be gwine in all kind er quare places whar
you dunner when ner whar you kin git out.”
“Stuff!” cried Buster John. “Why don’t you be quiet and listen to
the story?”
“It go long too slow fer ter suit me,” said Drusilla in a grumbling
tone.
“Well,” remarked Mr. Thimblefinger, turning to Buster John, “you’ve
come mighty close to telling a part of the tale I had in my mind.”
“I don’t see how,” replied Buster John with some surprise.
“You said ‘stuff!’” responded Mr. Thimblefinger, “and that’s a part
of my story. If you listen, you’ll soon find out. As I was saying,
people in old times bound out their sons to some good man, who
taught them a good trade or something of that kind. Well, this man
that I was telling you about took his bad son off to a foreign country,
and tried to find some one to bind him out to. They traveled many
days and nights. They went over mountains and passed through
valleys. They crossed plains, and they went through the wild woods.
“Now, the man who was taking his son into a foreign country was
getting old, and the farther they walked, the more tired he grew. At
last, one day, when they were going through the big woods, he sat
down to rest near a tall poplar-tree, and, turning to his son, said
angrily:—
“‘Stuff! you are not worth all this trouble. But for you I’d be at
home now, enjoying myself and smoking my pipe.’
“The son, who was used to these outbreaks, made no reply, but
stretched himself out on the dead leaves that littered the ground. He
had hardly done so when there was a tremendous noise in the
woods, and then both father and son saw rushing toward them an
old man with a long beard, followed by a small army of fierce-
looking dwarfs armed with clubs and knives and pikes. They rushed
up and surrounded the father and son.
“‘Which of you called my name and abused me?’ cried the old man
with the long beard.
“‘Not I,’ said the bad son.
“‘Not I,’ said the father. ‘I am sure I never saw you or heard of you
before.’
“This made the old man more furious than ever. He fairly trembled
with rage. ‘Didn’t I hear one of you say, “Stuff! but for you I’d be at
home now enjoying myself, and smoking my pipe?”’
“‘I did say something like that,’ replied the father in great
astonishment.
“‘How dare you?’ cried the old man, beside himself with rage.
‘How did I ever harm you? Seize him!’ he said to his army of dwarfs.
‘Seize him, and bind him hard and fast! I’ll show him whether he can
come into my kingdom and abuse me!’
“The father was speechless with astonishment, and made no
attempt to prevent the dwarfs from seizing and binding him. They
had him tied hard and fast before he could say a word, even if he
had had a word to say. But by this time the son had risen to his feet.
“‘Wait!’ he cried, ‘let’s see what the trouble is! Who are you?’ he
inquired, turning to the old man with the long beard.
“‘My name is Stuff,’ he replied, ‘and I am king of this country
which you are passing through. I’m not going to allow any one to
abuse me in my own kingdom. You may go free, but mind you go
straight back the way you came.’
“The son thought the matter over a little while, and then turned
on his heel and went back the way he had come, and, as he walked,
he whistled all the lively tunes he could think of. For a time he was
glad that his father was no longer with him to quarrel and complain;
but finally he grew lonely, and then he began to think how his father
had raised him up from a little child. The more he thought about
this, the sorrier he was that he had given his father any trouble. He
sat down on a log by the side of the road and thought it all over, and
presently he began to cry.
A QUEER-LOOKING LITTLE MAN CAME JOGGING ALONG
THE ROAD
“While he was sitting there with his head between his hands,
crying over the fate of his father, a queer-looking little man came
jogging along the road. He had bushy hair and a beard that grew all
over his face, except right around his eyes and lips and the tip-end
of his nose. His beard was not long, but it was very thick, and it
stood out around his face like the spokes in a buggy-wheel. He
seemed to be in a big hurry, but when he saw the young man sitting
on the log crying, he stopped, and stared at him.
“‘Tut, tut!’ he cried. ‘What’s all this? Who has hurt your feelings?’
“If the young man had not been so sorrowful, he would have been
surprised to see the queer-looking little man standing by him. But,
as it was, he didn’t seem to be surprised at all. He just looked at the
stranger with red eyes.
“‘My name is Mum,’ said the stranger, ‘and I’m the Man in the
Moon. Tell me your troubles. Maybe I can help you. I’m in a great
hurry, because the Moon must change day after to-morrow, and I
must be there to lend a hand; but I’ll not allow my hurry to prevent
me from hearing your troubles and helping you if I can.’
“So then and there the young man told his story, and the Man in
the Moon sighed heavily when he heard it.
“‘I see how it is,’ he said. ‘You are young and thoughtless, and
your father is old and crabbed. You never thought of what you owed
him, and he never made any allowances for your youth. He’s in no
danger. I know old Stuff well. I’ve watched him many a night when
he thought nobody had an eye on him, and he’s a pretty tough and
cunning customer. You must have help if you get your father out of
trouble.’
“‘What am I to do?’ asked the young man.
“‘Well,’ replied the Man in the Moon, ‘in the first place you will
have to go home. Say nothing about the trouble your father is in.
Just tell your mother that he has lost the sole of his shoe, and has
sent you for the awl that is in the big red cupboard, a piece of
leather, a handful of pegs, and a piece of wax.’
“‘What then?’ the young man inquired.
“‘Bring them here,’ said the Man in the Moon. ‘By the time you get
back, I will have another holiday. We’ll put our heads together and
see what can be done.’
“The young man made no delay. He was so anxious about his
father that he started for home at once. It was a long journey, but
he lost no time on the way. He was in rags and tatters when he
reached home, but that made no difference to him. He took no time
to eat, or to sleep, or to rest, but went to his mother at once, and
told her that his father had lost the sole of his shoe, and had sent
for the awl that lay in the big red cupboard, a strong piece of
leather, a handful of shoe-pegs, and a cake of shoemaker’s wax.
“His mother asked him a great many questions, as women will,
but all the answer the son would make was that his father had lost
the sole of his shoe, and had sent for the awl that lay in the big red
cupboard, a strong piece of leather, a handful of shoe-pegs, and a
cake of shoemaker’s wax. Of course, the mother was very much
worried. She finally came to the conclusion that some great calamity
had befallen her husband, and she went about crying and wringing
her hands, and declaring that they were all ruined; that her husband
was dead; and that more than likely he had been murdered by this
bad, bad son of hers, who had no other story to tell except to ask
for the awl that lay in the big red cupboard, a strong piece of
leather, a handful of shoe-pegs, and a cake of shoemaker’s wax.
“Now, the good son heard all this, but he said nothing. He just
folded his hands and fetched a sigh or two, and seemed to be sorry
for everything in general. But while the mother was going about
wringing her hands and weeping, and the good son was heaving and
fetching his sighs, the other son went to the big red cupboard. There
on a shelf he saw the awl sticking in a cake of shoemaker’s wax.
Near it was a strong piece of leather, and close by was a handful of
shoe-pegs. He took these, changed his ragged coat, and started
back on his journey.
“Now, although the good son did nothing but sigh and look sorry,
he had deep ideas of his own. The reason he was called the good
son was because he was so cunning. He thought to himself that now
would be a good time to do a fine stroke of business. He knew that
his brother had something more on his mind than the awl, the
leather, the pegs, and the shoemaker’s wax, and he wanted to find
out about it. So he ran after his brother to ask him what the real
trouble was. He caught up with him a little way beyond the limits of
the village, but no satisfaction could he get. Then he began to abuse
his brother and to accuse him of all sorts of things.
“But the son, who was trying to get his father out of trouble, paid
no attention to this. He went forward on his journey, turning his
head neither to the right nor to the left. The good brother (as he
was called) followed along after the best he could, being determined
to see the end of the business. But somehow it happened that, on
the second day, the brother who was going to meet the Man in the
Moon was so tired and worn out that he was compelled to crawl
under a haystack and go to sleep. In this way the good brother
passed him on the road and went forward on his journey, never
doubting that the other was just ahead of him. Finally, one day, the
good brother grew tired and sat down on a log to rest. He sat there
so long that the brother he thought he was following came up. He
was very much surprised to see his nice and good brother sitting on
a log and nodding in that country. So he woke him up and asked him
what the trouble was.
“‘Stuff!’ cried the other, ‘you know you have made way with our
father!’
“At once there was a roaring noise in the woods and a rustling
sound in the underbrush, and out came an old man with a long
beard, followed by an army of dwarfs.
“‘How dare you abuse me in my own kingdom?’ he cried to the
good brother. ‘How did I ever harm you?’
“The brother, who had seen this game played before, tried to
explain, but King Stuff would listen to no explanation. He
commanded his armed dwarfs to seize and bind the good brother,
and they soon carried him out of sight in spite of his cries.
“Now, the young man who had gone home for the awl and the
axe and the shoemaker’s wax was very much puzzled. He had more
business on his hands than he knew what to do with. He saw that
he must now rescue his brother as well as his father, and he didn’t
know how to go about it. He had the awl and the axe and the
shoemaker’s wax. He also had the shoe-pegs and leather that he
found together. But what was he to do with them? He sat on the log
and thought about it a long time.
“While he was sitting there, and just as he was about to go
forward on his journey, he heard some one coming briskly down the
road singing. He heard enough of the song to be very much
interested in it. It ran thus:—
“‘With the awl and the axe
And the shoemaker’s wax,
And the pegs and the leather
That were found close together
Where the old man had fling’d ’em,
We’ll bore through and roar through;
We’ll cut down, we’ll put down,
This king and his kingdom.’
“Of course, it was the Man in the Moon who was coming along the
road singing the song, and he seemed to be in high good humor. He
caught sight of the solemn face of the young man and began to
laugh.
“‘There you are!’ cried Mum, the Man in the Moon, ‘and I’m glad
to see you; but I’d feel a great deal better if you didn’t look so
lonesome. I don’t know what to do about it. Your face is as long as a
hind quarter of beef.’
“‘I can’t help it,’ replied the young man. ‘I am in deeper trouble
than ever. My brother has been carried off by the same people that
captured my father.’
“‘What of it?’ exclaimed the Man in the Moon. ‘If you knew as
much about that brother of yours as I do, you’d go on about your
business, and let him stay where he is.’
“‘No,’ said the young man. ‘I couldn’t do that. I know he is my
brother, and that is enough. And then there’s my father.’
“The Man in the Moon looked at the young man a long time, and
finally said:—
“‘Since we are to have a sort of holiday together, maybe you won’t
mind telling me your name.’
“‘Why, of course not,’ replied the young man. ‘My name is Smat.’
“The Man in the Moon scratched his head and then laughed. ‘It is
a queer name,’ he said; ‘but I see no objection to it. I suppose it just
happened so.’
“‘Now, I can’t tell you anything about that,’ replied Smat. ‘I was
too young when the name was given to take any part in the
performance. They seized me, and named me at a time when I had
to take any name that they chose to give me. They named me Smat,
and that was the end of it so far as I was concerned. They never
asked me how I liked it, but just slapped the name in my face, as
you may say, and left it there.’
“‘Well,’ said the Man in the Moon, ‘they’ll put another letter in the
name when you get back home. Instead of calling you Smat, they’ll
say you are Smart, and there’s some consolation in that.’
“‘Not much as I can see,’ remarked Smat. ‘It’s all in your mouth,
and what is in your mouth is pretty much all wind and water, if you
try to spit it out. What I want now is to get my father and my
brother out of the trouble that my mischief has plunged them in.
Please help me. They ought to be at home right now. There’s the
corn to grind, and the cows are waiting to be milked, and the grain
is to be gathered. Times are pretty hard at our house when
everybody is away.’
“‘Very well,’ said the Man in the Moon. He had hanging by his side
the horn of the new Moon, and on this he blew a loud blast.
Immediately there was a roaring noise in the woods, and very soon
there swarmed about them a company of little men, all bearing the
tiniest and the prettiest lanterns that were ever seen. It was not
night, but their lanterns were blazing, and as they marched around
the Man in the Moon in regular order, it seemed as though the light
of their lanterns had quenched that of the sun, so that Smat saw the
woods in a different light altogether. He had not moved, but he
seemed to be in another country entirely. The trees had changed,
and the ground itself. He was no longer sitting on a log by the side
of the big road, but was now standing on his feet in a strange
country, as it seemed to him.
“He had risen from his seat on the log when the little men with
their lanterns began marching around, but otherwise he had not
moved. And yet here he was in a country that was new to him. He
rubbed his eyes in a dazed way, and when he opened them again,
another change had taken place. Neither he nor the Man in the
Moon had made any movement away from the big road and the log
that was lying by the side of it, but now they were down in a wide
valley, that stretched as far as the eye could see, between two high
mountain ranges.
“‘Now, then,’ said the Man in the Moon, ‘you must be set up in
business. On the side of the mountain yonder is the palace of King
Stuff, and somewhere not far away you will find your father and your
brother, and perhaps some one else.’
“He then called to the leaders of the little men with the lanterns,
and gave each one a task to do. Their names were Drift and Sift,
Glimmer and Gleam, and Shimmer and Sheen. These six leaders
waved their lanterns about, called their followers about them, and at
once began to build a house.”
“And they so little, too,” remarked Mrs. Meadows sympathetically.
“Why, it was no trouble in the world to them,” said Little Mr.
Thimblefinger. “It didn’t seem as if they were building a house. Did
you ever see a flower open? You look at it one minute, turn your
head away and forget about it, and the next time you look, there it
is open wide. That was the way with this house the little men built.
It just seemed to grow out of the ground. As it grew, the little men
climbed on it, waved their lanterns about, and the house continued
to grow higher and higher, and larger and larger, until it was
finished. Not a nail had been driven, not a board had been rived, not
a plank had been planed, not a sill had been hewn, not a brick had
been burned. And yet there was the house all new and fine, with a
big chimney-stack in the middle.
“‘Now,’ said the Man in the Moon, when everything was done,
‘here is your house, and you may move in with bag and baggage.’
“‘That is quickly done,’ replied Smat. ‘What then?’
“‘Why, you must set up as a shoemaker,’ said the Man in the
Moon.
“‘But I never made a shoe in my life,’ the young man declared.
“‘So much the more reason why you should make ’em before you
die,’ the Man in the Moon remarked. ‘The sooner you begin to make
shoes, the sooner you’ll learn how.’
“‘That’s so true,’ said Smat, ‘that I have no reply to make. I’ll do
as you say, if I can.’
“‘That’s better,’ cried the Man in the Moon. ‘If you do that, you’ll
have small trouble. If you don’t, I wouldn’t like to tell you what will
happen. Now listen! There is in this kingdom a person (I’ll not say
who) that goes about with only one shoe. When you see that
person, no matter when or where,—no matter whether it’s man,
woman, or child,—you must let it be known that you are ready to
make a shoe.’
“Then the Man in the Moon called to the leaders of his army of
lantern bearers, and waved his hands. They, in turn, waved their tiny
lanterns, and in a moment all were out of sight, and Smat was left
alone. For some time afterwards he felt both lonely and uneasy, but
this feeling passed away as soon as he went into his house. He was
so astonished by what he saw in there that he forgot to feel uneasy.
He saw that, although the house was newly built,—if it had been
built,—it was in fact old enough inside to seem like home. Every
room was finely furnished and carpeted, and in one part of the
house, in a sort of shed-room, he found that a shoemaker’s shop
had been fixed up. There he saw the awl and the axe, and the
shoemaker’s wax, with the pegs and the leather that were found
close together.
“He thought to himself that all that was very nice, but he knew,
too, that he was not much of a shoemaker, and this bothered him
not a little. Anyhow, he made himself comfortable and waited to see
what was going to happen.
“One day a head officer of the kingdom chanced to pass that way.
He saw the house and rubbed his eyes. He was so astonished that
he went and told another officer, and this officer told another, and
finally all the officers in the kingdom knew about it. Now, if you’ve
ever noticed, those who hold government offices have less to do and
more time to do it in than any other day laborers. So they went
about and caucussed among themselves, and examined into the
books, and found that no taxes had ever been gathered from the
owner of such a house. There was great commotion among them.
One of them, more meddlesome than the rest, took a big book
under his arm and went to Smat’s house to make inquiries. The first
question he asked was the last.
“Says he, ‘How long have you been living in this precinct?’
“Says Smat, ‘Ever since the house was built and a little while
before.’
“The officer looked at the house and saw that it was a very old
one, and then he tucked his big book under his arm and went off
home. At last the king—the same King Stuff whose name you’ve
heard me mention—heard about the new house that was old, and of
the shoemaker who didn’t know how to make shoes. So he
concluded to look into the matter. He summoned his high and
mighty men, and when they had gathered together they went into a
back room of the palace and shut the door, and had a long talk
together. All this took time; and while the king and his high and
mighty men were confabbing together, other things were happening,
as you shall presently see.
“It seems that in that kingdom there was a beautiful girl who went
wandering about the country. If she had any kinsfolk, nobody knew
anything about it, and, indeed, nobody cared. She had lost one of
her shoes, and she went about from place to place hunting for it.
Some pitied her, and some laughed at her, which is the way of the
world, as you’ll find out; but nobody tried to help her. Some said that
one shoe was better than no shoe, and others said that a new shoe
would do just as well as an old shoe.”
“That’s where they made a big mistake,” said Mrs. Meadows. “I’ve
tried it, and I ought to know. A new shoe is bound to hurt you a little
at first, I don’t care how well it fits.”
“Well, I’m only telling you what they said,” replied little Mr.
Thimblefinger. “From all I can hear, new shoes hurt the ladies a
great deal worse than they do the men. But that’s natural, for their
toes and their heels are a good deal tenderer than those of the men
folks. Anyhow, this beautiful girl had lost one of her shoes, and,
rather than buy another one or a new pair, she went hunting it
everywhere. One day she came by Smat’s house. He, sitting by one
of the windows, and wishing that he could see his father and
brother, paid no attention to the passers-by. But this beautiful girl
saw him at the window and spoke to him.
“HAVE YOU SEEN ANYTHING OF A STRAY SHOE?”
“‘Kind sir,’ she said, ‘have you seen anything of a stray shoe? I
have lost one of mine, and I’m in great trouble about it.’
“Smat looked at the girl, and she was so beautiful that he couldn’t
help but blush. Seeing this, the girl began to blush. And so there
they were, two young things a-blushing at one another, and
wondering what was the matter.
“‘I have seen no stray shoe,’ said Smat; ‘but if you’ll come in and
show me the one you have on, I think I’ll know its fellow when I see
it.’
“The girl went into the house and sat on a chair, and showed Smat
the shoe that she hadn’t lost. She had the smallest and the neatest
foot he had ever seen.
“‘I hope you are no kin to Cinderella,’ said Smat, ‘for then you
couldn’t get a shoe to fit your other foot until some kind fairy made
it.’
“‘I never heard of Cinderella,’ the girl replied. ‘I only know that I
have lost my shoe, and I’m afraid I’ll never get another just like it.’
“Smat scratched his head, and then he thought about the awl and
the axe and the shoemaker’s wax, and the pegs and the leather that
were found close together. So he said to the beautiful girl:—
“‘Just sit here a little while, and I’ll see if I can’t get you a shoe to
fit your foot. But I must have the other shoe as a pattern to work
by.’
“At first the girl didn’t want to trust him with the shoe, but she
saw that he was in earnest, and so she pulled off the only shoe she
had and placed it in Smat’s hands. He saw at once that the leather
he had was a match for that in the shoe, and he set to work with a
light heart,—with a light heart, but his hand was heavy. And yet,
somehow or other, he found that he knew all about making shoes,
although he had never learned how. The leather fitted itself to the
last, and everything went smoothly. But the beautiful girl, instead of
feeling happy that she would soon have a mate to her shoe, began
to grow sad. She sat in a corner with her head between her hands
and her hair hanging down to her feet, and sighed every time Smat
bored a hole in the leather with his awl or drove in a peg. Finally,
when he handed her the shoe entirely finished, she looked at it,
sighed, and let it fall from her hands.
“‘Of course,’ said Smat, ‘I don’t feel bad over a little thing like that.
But you don’t have to pay anything for the shoe, and you don’t have
to wear it unless you want to.’
“‘Oh, it is not that,’ cried the beautiful girl. ‘The shoe will do very
well, but the moment I put it on, your troubles will begin.’
“‘Well,’ replied Smat, ‘we must have troubles of some sort anyhow,
and the sooner they begin, the sooner they’ll be ended. So put on
your shoe.’
“Now, it happened that just as the girl put on the shoe, which
fitted her foot exactly, King Stuff and his councilors came driving up
to the door. King Stuff was not a large man, but he was very fierce-
looking. He called out from his carriage of state and asked what sort
of a person lived in that house that he couldn’t come out and salute
when the king and his councilors went riding by. Smat went to the
door and bowed as politely as he could, and said that he would have
been glad to bow and salute, if he had known his royal highness and
their excellent excellencies intended to honor his poor house even so
much as to pass by it. The king and his councilors looked at one
another and shook their heads.
“‘This man is none of us,’ said the oldest and wisest of the
councilors. ‘We must be careful.’
“‘How long have you lived here?’ asked the king.
“‘Longer than I wanted to,’ replied Smat. ‘My house is so far from
the palace that I have not been able to call and pay my respects to
your majesty.’
“‘I see you are a maker of shoes,’ remarked the king, seeing the
awl in Smat’s hand.
“‘No, your majesty, not a maker of shoes, but simply a shoemaker.
Thus far I have succeeded in making only one shoe.’
“At this the king and his councilors began to shake and tremble.
‘What was the prophecy?’ cried the king to the oldest and wisest.
‘Repeat it!’
“The oldest and the wisest closed his eyes, allowed his head to
drop to one side, and said in solemn tones:—
‘Wherever you go, and whatever you do,
Beware of the man that makes but one shoe;
Beware of the man with the awl and the axe,
With the pegs and the leather and the shoemaker’s wax.
If you’re out of your palace when you meet this man,
You’d better get back as fast as you can.’
“Smat felt very much like laughing at the solemn way in which the
oldest and wisest councilor repeated this prophecy, or whatever it
might be called. ‘Your majesty needn’t be worried about that
prophecy,’ said he. ‘It’s the easiest thing in the world to break the
force of it.’
“‘How?’ asked the king.
“‘Why, having made one shoe, I’ll go to work and make another,’
replied Smat.
“The oldest and wisest of the councilors said that was a pretty
good plan,—anyhow, it was worth trying. Smat promised to make
another shoe, and have it ready in two days. But this was easier said
than done. In the first place, he had used nearly all his leather in
making a shoe for the beautiful girl. In the second place, the awl
point wouldn’t stay in the handle. In the third place, the pegs split
and broke every time he tried to drive them, and the shoemaker’s
wax wouldn’t stick. Everything went wrong at first and grew worse
at last, so that when the king sent his officers for the shoe it was no
nearer done than it had been before Smat began.
“The beautiful girl had not gone very far away, and she came
every day to see how Smat prospered in making the second shoe.
She was watching him when the king’s officers came for the shoe,
and when she saw them she began to weep. But Smat looked as
cheerful as ever, and even began to whistle when the officers
knocked at the door.
“‘We are in a fix,’ said he, ‘but we’ll get out of it. Lend me the shoe
I made for you. I’ll send that to the king and then get it back again.’
“The girl tried to take the shoe from her foot, but nothing would
move it. ‘That is a sign,’ said Smat, ‘that it ought not to come off. I’ll
just go to the king myself and tell him the facts in the case. That is
the best way.’
“So he gathered the awl and the axe and the shoemaker’s wax,
and the scraps of leather, and bundled them together. Then he told
the officers that he would go with them and carry the shoe himself,
so as to be sure that it came safely into the king’s hands. They went
toward the palace, and Smat noticed, as they went along, that it
grew darker and darker as they came nearer to the palace. The
officers seemed to notice it too. By the time they reached the
palace, it was so dark that Smat had great trouble in keeping up
with the officers.
“There was great commotion in the palace. Nobody had ever seen
it so dark before except just at the stroke of midnight, when the
shadows grow thick and heavy and run together and over
everything.
“Now, old King Stuff was a sort of magician himself (as, indeed, he
had to be in those times, in order to manage a kingdom properly),
and as soon as he saw the great darkness coming on at the wrong
time of day, he thought at once of the prophecy in regard to the
man who made but one shoe. So he hustled and bustled around the
palace, calling for the officers he had sent after the shoe. But
nobody had seen them return before the dark began to fall, and
after that it was impossible to see them.
“In the midst of it all, the officers, followed by Smat, stumbled into
the palace and went groping about from room to room hunting for
old King Stuff and his ministers. At last, they heard him grumbling
and growling, and felt their way toward him.
“‘The shoe! the shoe!’ cried King Stuff, when the officers had
made themselves known.
“‘I have something that will answer just as well,’ said Smat.
“‘The shoe! give me the shoe!’ cried the king.
“‘Take this, your majesty,’ said Smat, handing him the bundle.
“No sooner had the king’s hands touched the bundle than there
was a rumbling noise in the air, the building began to shake and
totter and crumble away. In the midst of it all some one cried out in
a loud voice:—
‘Wherever you go, and whatever you do,
Beware of the man that makes but one shoe!’
“In the twinkling of an eye, King Stuff and his army and his palace
had disappeared from sight. At the same time the darkness had
cleared away, and Smat saw his father and his brother standing near,
dazed and frightened, and not far away was the beautiful girl. The
father and the brother were very much astonished when they found
that Smat had been the means of their rescue. They talked about it
until night fell, and then the Man in the Moon, with his tiny lantern-
bearers, came and escorted them to their own country.
“Now it happened that the beautiful girl was a princess, the
daughter of the king. It fell to the lot of Smat to take the princess
home. Not long after that the king gave a great festival, to celebrate
the return of his daughter. Smat’s father and brother got close
enough to the palace to see him standing in a large room, where
there was a large crowd of people and music and flowers. They saw,
too, that he was holding the princess by the hand.
“And so,” said little Mr. Thimblefinger, wiping the perspiration from
his forehead, “the story ended.”
XX.
“Phew!” exclaimed Mr. Rabbit, when he was sure that little Mr.
Thimblefinger had finished. “That beats anything I ever heard.”
“I’m glad you like it,” said Mr. Thimblefinger.
“Oh, hold on there!” protested Mr. Rabbit, “you are going too fast.
I never said I liked it. I said it beat any story I ever heard, and so it
does,—for length. I didn’t know that such a little chap could be so
long-winded. It was such a long story that I’ve forgotten what the
moral ought to be.”
“Why, I thought you said you didn’t believe much in stories that
had morals tacked to them,” remarked Mrs. Meadows.
“No doubt I did,” replied Mr. Rabbit,—“No doubt I did. But this
story was long enough to have a dozen morals cropping out in
different places, like dog fennel in a cow pasture.”
“Well,” said Mr. Thimblefinger, “there was a moral or two in the
story, but I didn’t call attention to them in the telling, and I’ll not
dwell on them now.”
“I thought it was a tolerably fair story,” said Buster John, yet with
a tone of doubt.
“Oh, I thought it was splendid all the way through,” said Sweetest
Susan.
“There are some stories that are hard to tell,” suggested Mrs.
Meadows. “They go in such a rambledy-wambledy way that it’s not
easy to keep the track of them. I remember I once heard Chickamy
Crany Crow trying to repeat a story that she heard the Looking-glass
Children tell. I never found head nor tail to it, but I sat and listened
almost without shutting my eyes.”
“What was the story?” asked Sweetest Susan.
In reply, Mrs. Meadows said she would call Chickamy Crany Crow,
and ask her to tell it. As usual, Chickamy Crany Crow was off at play
with Tickle-My-Toes. They both came when Mrs. Meadows called
them, and Chickamy Crany Crow, after some persuasion, began to
tell the story.
“One day,” she said, brushing her hair behind her ears with her
fingers, “I wanted to see the Looking-glass Children. Tickle-My-Toes
was off playing by himself, and I was lonesome; so I went to the
Looking-glass, whirled it around in its frame, and waited for the
children to come out. But they didn’t come. I called them, but they
made no answer. I went close to the Glass, and looked in. At first, I
couldn’t see anything; but after a while I saw, away off in the Glass,
one of the children,—the one they all say looks like me. I called her;
but she was so far off in the Glass that she couldn’t hear me, and, as
she had her face turned the other way, she couldn’t see me.
“After so long a time, she came up to the frame of the Glass, and
then stepped out and sat down on the ground. I saw she had been
crying.
“Says I, ‘Honey, what in the world is the matter?’ I always call her
Honey when we are by ourselves.
“Says she, ‘There’s enough the matter. I’m e’en about scared to
death, and I expect that all the other children in this Looking-glass
are either captured, or killed, or scared to death.’
“Says I, ‘Why didn’t you holler for help?’
“Says she, ‘What good would that have done? You all could help
us very well on dry land, out here, but how could you have helped
us in the Looking-glass, when you can’t even get in at the door? I’ve
seen you try to follow us, but you’ve always failed. You stop at the
Glass, and you can’t get any farther.’
“Says I, ‘You are right about that; but if we outside folks can’t get
in the Glass to play with you and keep you company, how can
anybody or anything get in there to scare you and hurt you?’
“Says she, ‘The thing that scared us has been in there all the time.
It was born in there, I reckon, but I’ve never seen it before; and I
tell you right now I never want to see it again.’
“Says I, ‘What sort of a thing is it?’
“Says she in a whisper, ‘It’s the Woog!’
“‘The what?’ says I.
“‘The Woog!’ says she.
“Says I, ‘It’s new to me. I never heard of it before.’
“Says she, ‘To hear of it is as close as you want to get to it.’
“Why, I heard of the Woog in my younger days,” remarked Mr.
Thimblefinger. “I thought the thing had gone out of fashion.”
“Don’t you believe a word of it,” said Chickamy Crany Crow. “It’s
just as much in fashion now as ever it was, especially at certain
seasons of the year. The little girl in the Looking-glass—I say little
girl, though she’s about my size and shape—told me all about it; and
as she lives in the same country with the Woog, she ought to know.”
“What did she say about it?” asked Buster John, who had a vague
idea that he might some day be able to organize an expedition to go
in search of the Woog.
A HORRIBLE MONSTER GLARED AT THEM
“Well,” replied Chickamy Crany Crow, “she said this,—she said that
she and the other children were sitting under the shade of a bazzle-
bush in the Looking-glass, telling fairy stories. It had come her turn
to tell a story, and she was trying to remember the one about the
little girl who had a silk dress made out of a muscadine skin, when
all of a sudden there was a roaring noise in the bushes near by.
While they were shaking with fright, a most horrible monster came
rushing out, and glared at them, growling all the while. It wore great
green goggles. Its hair stood out from its head on all sides, except in
the bald place on top, and its ears stuck out as big as the wings of a
buzzard.
“‘Do you know who I am?’ it growled. ‘No, you don’t; but I’ll show
you. I am the Woog. Do you hear that? The Woog! Don’t forget that.
What did I hear you talking about just now? You were talking about
fairies. Don’t say you weren’t, for I heard you.’
“‘Well,’ says one of the Looking-glass Children, ‘what harm is there
in that?’
“‘Harm!’ screamed the Woog. ‘Do you want to defy me? I have
caught and killed and crushed and smoked out all the fairies that
ever lived on the earth, except a few that have hid themselves in
this Looking-glass country. What harm, indeed!—a pretty question to
ask me, when I’ve spent years and years trying to run down and
smother out the whole fairy tribe.’
“The Looking-glass Children,” Chickamy Crany Crow continued,
“told the Woog that they didn’t know there was any harm in the
fairies themselves, or in talking about them. The Woog paid no
attention to their apologies. He just stood and glared at them
through his green goggles, gnashing his teeth and clenching his
hands.
“Says the monster after awhile, ‘How dare any of you wish that
you could see a fairy, or that you had a fairy godmother? What shall
I do with you? I crushed a whole population of fairies between the
lids of this book’ (he held up a big book, opened it, and clapped it
together again so hard that it sounded like some one had fired off a
gun), ‘and I’ve a great mind to smash every one of you good-for-
nothing children the same way.’
“You may be sure that by this time the poor little Looking-glass
Children were very much frightened, especially when they saw that
the Woog was fixing to make an attack on them. He dropped his big
book, and when the children saw him do this they broke and run:
some went one way and some another. The last they saw of him, he
was rushing through the bushes like a blind horse, threshing his
arms about, and doing more damage to himself than to anybody
else. But the children had a terrible scare, and if he hasn’t made way
with some of them it’s not because he is too good to do it.”
“The poor dears!” exclaimed Mrs. Meadows sympathetically.
“Dat ar creetur can’t come out’n dat Lookin’-glass like de yuthers,
kin he?” inquired Drusilla, moving about uneasily: “kaze ef he kin,
I’m gwine ’way fum here. I dun seed so many quare doin’s an’
gwine’s on dat I’ll jump an’ holler ef anybody pints der finger at me.”
“Well, Tar-Baby,” replied Mr. Rabbit with some dignity, “he hasn’t
never come out yet. That’s all that can be said in that line. He may
come out, but if he does you’ll be in no danger at all. The Woog
would never mistake you for a fairy, no matter whether he had his
green goggles on or whether he had them off.”
“No matter ’bout dat,” remarked Drusilla. “I mayn’t look like no
fairy, but I don’t want no Woog fer ter be cuttin’ up no capers ’roun’
me. I tell you dat, an’ I don’t charge nothin’ fer tellin’ it. Black folks
don’t stan’ much chance wid dem what knows ’em, let ’lone dem ar
Woog an’ things what don’t know ’em. Ef you all hear ’im comin’, des
give de word, and I boun’ you’ll say ter yo’se’f dat Drusilla got wings.
Now you min’ dat.”
“What does the Woog want to kill the fairies for?” asked Sweetest
Susan. “He must be very mean and cruel.”
“He’s all of that, and more,” replied Mrs. Meadows. “The fairies
please the children, and give them something beautiful to think
about in the day and to dream about at night, and the Woog doesn’t
like that. He hates the fairies because it pleases the children to hear