Local Approximation Techniques in Signal and Image Processing

V. Katkovnik, K. Egiazarian, J. Astola
Published by SPIE—The International Society for Optical Engineering, Bellingham, Washington USA

Library of Congress Cataloging-in-Publication Data
TK5102.9.K38 2006
621.382'2--dc22
2006042318

The content of this book reflects the work and thought of the author(s).
Every effort has been made to publish reliable and accurate information herein,
but the publisher is not responsible for the validity of the information or for any
outcomes resulting from reliance thereon.
The cover image: Spring in Tampere, Finland – May 2006 by Alessandro Foi.
Contents
Preface xi
Notations and Abbreviations xv
1 Introduction 1
1.1 Linear Local Approximation 2
1.1.1 Windowing 2
1.1.2 Nonparametric estimation 3
1.1.3 Scale 5
1.1.4 Ideal scale 8
1.1.5 Adaptive varying scale 9
1.2 Anisotropy 11
1.2.1 Univariate case 11
1.2.2 Multivariate case 12
1.3 Nonlinear Local Approximation 14
1.3.1 Likelihood and quasi-likelihood 14
1.3.2 Robust M-estimation 16
1.3.3 Adaptive varying scale 16
1.4 Multiresolution Analysis 17
1.5 Imaging Applications 17
1.6 Overview of the Book 18
2 Discrete LPA 21
2.1 Introduction 22
2.1.1 Observation modeling 22
2.1.2 Classes of signals 23
2.1.3 Multi-index notation 26
2.2 Basis of LPA 28
2.2.1 Idea of LPA 28
2.2.2 Windowing and scale 34
2.2.3 Estimate calculation 36
2.2.4 Multivariate estimates 37
2.2.5 Examples 38
2.3 Kernel LPA Estimates 47
2.3.1 Estimate of signal 47
2.3.2 Estimate of derivative 48
2.3.3 Reproducing polynomial kernels 48
4 Integral LPA 91
4.1 Integral Kernel Estimators 91
4.1.1 Integral LPA 91
4.1.2 Multivariate kernels and estimates 96
4.1.3 Limit LPA estimates 97
4.1.4 Frequency domain 97
4.2 Analytical Kernels 100
4.2.1 1D case 100
4.2.2 2D case 104
4.3 Generalized Singular Functions∗ 107
4.3.1 Gaussian smoothing kernels 107
4.3.2 Univariate Dirac delta function 108
4.3.3 Multivariate Dirac delta function 110
4.3.4 Dirac’s delta sequences 111
4.4 Potential Derivative Estimates 114
14 Appendix 467
14.1 Analytical Regular Grid Kernels 467
14.1.1 1D kernels 467
14.1.2 2D kernels 475
14.2 LPA Accuracy 480
14.2.1 Proof of Proposition 4 480
14.2.2 Proof of Proposition 7 481
14.3 ICI Rule 487
14.3.1 Proof of Proposition 10 487
14.4 Cross Validation 490
14.4.1 Basic approach 491
14.4.2 Algorithm 492
14.4.3 Shift-invariant kernels 495
14.5 Directional LPA Accuracy 496
14.5.1 Proof of Proposition 12 496
14.5.2 Convergence rate for signal estimation 502
14.5.3 Convergence rate for derivative estimation 506
14.6 Random Processes 510
14.6.1 Spectrum of random process 510
14.6.2 Minimization in frequency domain 512
14.6.3 Vector random processes 513
14.7 3D inverse 516
14.7.1 Regularized inverse 517
14.7.2 LPA regularized inverse 517
14.7.3 LPA regularized Wiener inverse 519
14.8 Nonlinear Methods 521
14.8.1 Proof of Proposition 13 521
14.8.2 Proof of Proposition 14 524
14.8.3 Proof of Proposition 15 529
References 535
Index 547
Preface
This book deals with a wide class of novel and efficient adaptive signal process-
ing techniques developed to restore signals from noisy and degraded observations.
Imaging provides a clear example of this type of problem. Digital images produced
by a variety of physical units, including still/video cameras, electron microscopes,
radar, x-rays, ultrasound devices, etc., are used for various purposes, including
entertainment, medical, business, industrial, military, civil, security, and scientific
applications. In today’s computer networks, the number of circulating digital images
is enormous and continues to increase rapidly. Digital images can become distorted
for many reasons, such as blurring and changes in illumination, motion, and process-
ing, including transmission, quantization, smoothing, compression, and geometric
transformations.
In many cases, useful information and high quality must be extracted from the
imaging. However, raw signals are often not directly suitable for this purpose and
must be processed in some way. Such processing is called signal reconstruction.
This book is devoted to a recent and original approach to signal reconstruction
based on combining two independent ideas: local polynomial approximation and
the intersection of confidence interval rule. Local polynomial approximation is
applied for linear and nonlinear estimation using a polynomial data fit in a sliding
window. The window size, which is also interpreted as a scale, is one of the key
parameters of this technique. The terms “window size,” “bandwidth,” and “scale”
are synonymous here.
The local polynomial approximation is combined with an adaptive window size
selection procedure called the intersection of confidence interval rule. The idea of
this adaptation is as follows. The algorithm searches for the largest local area of the
point of estimation where the local polynomial approximation assumptions fit well
to the data. The estimates are calculated for a grid of window sizes and compared.
The adaptive window size is defined as the largest of those windows in the set for
which the estimate does not differ significantly from the estimates corresponding
to the smaller window sizes.
In the approach considered in this book, the intersection of confidence interval
rule defines the best scale for each point/pixel of the signal. In this way, we arrive
at a varying adaptive-scale signal processing (imaging) with a pointwise adaptive-
scale selection. The adaptive estimator is always nonlinear, even for a linear local
polynomial approximation, because the nonlinearity of the method is incorporated
in the intersection of the confidence interval rule itself. It is proved that these
adaptive estimators allow one to get a near-optimal quality of the signal recovery.
The local polynomial approximation (LPA) plus the intersection of confidence
interval (ICI) rule is an efficient tool for signal (image) processing, especially for
denoising, interpolation, differentiation, and inverse problems.
The local approximation (nonparametric regression) techniques considered
here signify that no prior limit is placed on the number of unknown parameters
used to model the signal. Such theories of estimation are necessarily quite different
from traditional (parametric regression) statistical models with a small number of
parameters specified in advance.
Experiments demonstrate the state-of-the-art performance of the new algorithms,
which on many occasions visually and quantitatively outperform the best existing
methods.
Historically, the nonparametric regression technique is a predecessor of
wavelets. It demonstrates a tremendous breakthrough in adaptive methods overall.
This current development is almost unknown to the signal processing community,
which is mainly dominated by the wavelet paradigm.
In this book we present basic concepts, methodology, and theory of this modern,
spatially adaptive (nonparametric-regression-based) signal and image processing.
The advanced performance of the algorithms is illustrated by applications to various
image processing problems.
We present and interpret the nonparametric regression LPA as a filter design
technique.
The ICI adaptive-scale selection concerns adaptive neighborhood filtering. The
ICI as well as the nonparametric regression approach overall admit the derivation
of the adaptive algorithms from the optimization of the estimation accuracy. In this
way we develop a regular approach to the adaptive neighborhood filtering design
that is universally applicable to various problems.
We focus on material that we believe is fundamental and has a scope of appli-
cations that is not limited to solutions of specialized problems. Therefore, the book
is prepared for a broad readership. To avoid mathematical technicalities, we prefer
to postpone the presentation of proofs until the Appendix.
In addition to explanations of the formal mathematics required for background,
all methods are discussed in light of their implementations and applications.
Local polynomial approximation is an old idea. However, when combined
with the new adaptation technique, it becomes a novel and powerful tool. The
new approach and new algorithms are mainly illustrated for imaging applications.
However, they are quite general in nature and can be applied to any scalar or multi-
dimensional data. These new methods can be exploited as independent tools as well
as jointly with conventional techniques.
The emphasis of the book is on multivariate problems. However, the introduc-
tion guides the reader through the most important ideas and techniques relating to
univariate signals.
The mathematics used in the book is motivated by and restricted to the necessity
of making the ideas and algorithms clear. Its complexity remains at a level well
within the grasp of college seniors and first-year graduate students who have had
Tampere, Finland
April 2006
V. Katkovnik
K. Egiazarian
J. Astola
MATLAB is a registered trademark of The MathWorks, Inc.
Notation and Abbreviations
y true signal
z observed signal
h scalar/vector nonnegative scale parameter of estimate
ŷh estimate of y of scale h
ŷh^(r) estimate of the derivative y^(r) of scale h
x argument variable (integer or continuous)
Xs coordinates of observations
m order (power) of LPA
r order of the derivative
∆ sampling interval
φ vector of basis functions in LPA
w window function
v kernel of blurring (convolution) operator
Γ threshold in the ICI rule
d dimensionality of signals
r = (r1, . . . , rd) multi-index order of the derivative
m = (m1 , . . . , md ) multi-index order of LPA
h = (h1 , . . . , hd ) multi-index scale parameter
∆ = (∆1 , . . . , ∆d ) multi-index sampling interval
Symbols
Sets
R scalar real numbers
R+ nonnegative scalar real numbers
Z integer numbers
Z+ nonnegative integer numbers
Rd space of real d-dimensional vectors
Rd+ space of non-negative real d-dimensional vectors
Zd space of d-dimensional vectors with integer elements
Zd+ space of d-dimensional vectors with integer nonnegative elements
L2 space of square-integrable functions
l2(Zd) space of square-summable sequences defined on Zd
Abbreviations
IFT integral Fourier transform
DFT discrete Fourier transform, discrete in both spatial and
frequency variables
IDFT inverse discrete Fourier transform
Chapter 1
Introduction
The book is devoted to signal and image reconstruction based on two independent
ideas: local approximation for the design of linear and nonlinear filters (estimators)
and adaptation of these filters to unknown smoothness of the signal of interest.
As flexible universal tools we use local polynomial approximation (LPA) for
approximation and intersection of confidence intervals (ICI) for adaptation.
The LPA is applied for linear filter design using a polynomial fit in a sliding
window. The window as well as the order of the polynomial defines a desirable
filter.
The window size is considered as a varying adaptation parameter of the filter.
The ICI is an adaptation algorithm. It searches for a largest local window size
where LPA assumptions fit well to the observations. It is shown that the ICI adaptive
LPA is efficient and allows for a nearly optimal quality of estimation.
To deal with anisotropic signals, in particular those that are typical in imaging,
narrowed nonsymmetric directional windows are used. The corresponding direc-
tional (multidirectional) LPA filters equipped with the ICI adaptive window sizes
demonstrate an advanced performance.
The LPA combined with the maximum likelihood and quasi-likelihood are used
for the design of nonlinear filters (estimators). The ICI rule used for the window
size selection of these filters defines nonlinear filters as adaptive to the unknown
smoothness of the reconstructed signal.
LPA-ICI algorithms are powerful recent tools in signal and image processing.
These algorithms demonstrate state-of-the-art performance and can visually and
quantitatively outperform some of the best existing methods.
The ICI rule is universal and can be applied with many existing linear and
nonlinear filters (not only with the LPA-based filters) where the bias-variance trade-
off is valid for selecting tunable parameters.
The LPA-ICI techniques define a wide class of pointwise spatially/time adaptive
filters applicable in many scientific and engineering fields.
In this introduction we start with motivation, ideas, and constructive comments
to review basic elements of the techniques presented in the book. These com-
ments are presented for univariate signals while the book is addressed mainly to
multidimensional problems.
1.1 Linear Local Approximation

1.1.1 Windowing
zs = y(Xs) + εs,   s = 1, . . . , n,   (1.1)

where the observation coordinates Xs are known and the εs are zero-mean random
errors.
We make no global assumptions about y but assume that locally it can be well
approximated with members of a simple class of parametric functions. Taylor’s
series can serve as a good approximation of a smooth function.
In a neighborhood of x this series gives for y(Xs):

y(Xs) ≈ y(x) − y^(1)(x)(x − Xs) + y^(2)(x)(x − Xs)²/2 + · · · .   (1.2)

Since the function y and the derivatives y^(1) and y^(2) are unknown in Eq. (1.2),
we may look to fit the data y(Xs) in the form

y(Xs) ≈ C0 − C1(x − Xs) + C2(x − Xs)²/2 + · · · ,   (1.3)

where the coefficients C0, C1, and C2 in Eq. (1.3) are used as estimates for y(x),
y^(1)(x), and y^(2)(x), respectively.
Modeling in the form (1.3) allows this fruitful interpretation of the coefficients
Ck as the derivatives of y.
Note that we use the first-order polynomial x in Eq. (1.3) with the minus sign in
order to have C1 as the estimator of y^(1)(x). Of course, Eq. (1.3) can be written
with positive signs for all polynomials,

y(Xs) ≈ C0 + C1(x − Xs) + C2(x − Xs)²/2 + · · · .   (1.4)
It is clear that, in general, the coefficients C are different for each x. The
notation Ĉ(x) emphasizes that the estimate of C depends on x. The sliding window
estimation makes the coefficients Ĉ in the model (1.3) vary with x.
The model (1.3) can be of zero, first, second, or higher order depending on the
power of the polynomials used. The parameter Ĉ(x) immediately gives estimates of
the function y and the derivatives y^(r):

ŷ(x) = Ĉ0,   ŷ^(1)(x) = Ĉ1,   ŷ^(2)(x) = Ĉ2.   (1.7)
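To make the pointwise fit concrete, the following sketch implements the fit of Eq. (1.3) and the readout of Eq. (1.7) at a single point x. It is an illustration only: the Gaussian window, the test signal, and all names are our own choices, not prescriptions of the book.

import math
import numpy as np

def lpa_point_estimate(x, X, z, h, m=2):
    # Gaussian window of scale h centered at x: weights of the residuals
    w = np.exp(-0.5 * ((X - x) / h) ** 2)
    # Basis of Eq. (1.3): column k is (X_s - x)^k / k!, i.e., (1, -(x - X_s),
    # (x - X_s)^2 / 2, ...), so the coefficient C_k estimates y^(k)(x).
    T = np.column_stack([(X - x) ** k / math.factorial(k) for k in range(m + 1)])
    # Weighted least squares: C = argmin_C sum_s w_s (z_s - (T C)_s)^2
    sw = np.sqrt(w)
    C, *_ = np.linalg.lstsq(sw[:, None] * T, sw * z, rcond=None)
    return C  # C[0] ~ y(x), C[1] ~ y'(x), C[2] ~ y''(x), as in Eq. (1.7)

rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 201)
z = np.sin(2 * np.pi * X) + 0.05 * rng.standard_normal(X.size)
print(lpa_point_estimate(0.5, X, z, h=0.05))  # ~ (0, -2*pi, 0) for this signal

Repeating the fit for every x yields the sliding window estimates; the coefficients are recomputed at each point, which is exactly the nonparametric character discussed below.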
The sliding window estimates (1.5)–(1.6) belong to the class of so-called nonpara-
metric estimates. Let us clarify what we mean here by parametric and nonparametric
estimation.
For a parametric estimation, the criterion (1.5) and the estimates (1.6) are
transformed to
J(C) = Σs ws es²,   es = zs − y(Xs),
y(Xs) = C0 − C1 Xs + C2 Xs²/2,   (1.8)
Figure 1.1 A Gaussian sliding window centered at the point x shows the data included in
the estimate for the point x and the weights of these observations.
ŷ(x) = ŷ(x, Xs)|Xs=x = Ĉ0(x),

and

ŷ^(2)(x) = ∂²Xs ŷ(x, Xs)|Xs=x = Ĉ2(x).
Thus, in the nonparametric estimation, the model is explicitly local and used
in a local pointwise manner. The coefficients in Eqs. (1.5–1.6) are calculated for a
fixed x and the model is used for this x only.
The dependence on x in the parametric case is defined by the basis functions
(polynomials) of the model and the coefficients are fixed. In the nonparametric
case, everything is completely different: The dependence on x is defined by the
varying coefficients Ĉ0(x), Ĉ1(x), Ĉ2(x), while the basis functions are used with a
fixed zero value of the argument.
This nonparametric nature of the windowed estimation follows from the technique
used for each x in Eq. (1.5), where the fit applies different weights to different data.
The localization relaxes the sensitivity of the standard parametric models to the
order of the model and transforms the polynomial fit into a much more flexible tool
than can be expected from the standard parametric approach.
The estimate ŷ(x) = Ĉ0(x) (as well as the estimates of the derivatives) can be
given in the form of a kernel operator (filter) linear in the observations:

ŷ(x) = Σs g(x, Xs) zs,   (1.10)
where φ(x) = (1, −x, x²/2, . . .)^T is the vector of polynomials of the LPA. This is
a general formula in which the length of the vector φ is equal to the order of the
polynomial plus one.
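The weights g(x, Xs) in Eq. (1.10) can be formed explicitly from the window and the basis. The sketch below mimics, in discrete form, the structure of the continuous kernel formula (1.17) given later in this section; it is our own illustration, and it also checks the discrete counterparts of the vanishing moment conditions of Eq. (1.18).

import numpy as np

def lpa_kernel_weights(x, X, h, m=2):
    u = x - X
    w = np.exp(-0.5 * (u / h) ** 2)                # window w_h(x - X_s)
    # phi(u) = (1, -u, u^2/2)^T, the polynomial vector of the LPA
    phi = np.column_stack([np.ones_like(u), -u, u ** 2 / 2])[:, :m + 1]
    Phi = (w[:, None] * phi).T @ phi               # Gram matrix of the local fit
    e0 = np.zeros(m + 1); e0[0] = 1.0              # phi(0) = (1, 0, 0, ...)^T
    # g_s = w_s phi(x - X_s)^T Phi^{-1} phi(0), so that yhat(x) = sum_s g_s z_s
    return w * (phi @ np.linalg.solve(Phi, e0))

X = np.linspace(-1.0, 1.0, 101)
g = lpa_kernel_weights(0.0, X, h=0.2, m=2)
print(g.sum(), (g * X).sum(), (g * X ** 2).sum())  # ~ 1, 0, 0 for m = 2

Because the weighted fit reproduces polynomials up to degree m exactly, these sums hold up to machine precision; this is the discrete face of the reproducing property discussed in Chapter 2.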
1.1.3 Scale
The size of the window function w in Eq. (1.5) is an important and natural parameter
of the local approximation. If the window is large and prescribes equal weights to
all residuals, then the nonparametric estimate coincides with the parametric one.
The difference between the parametric and nonparametric approaches disappears.
In this case, the estimation curve is constant, linear, or quadratic, depending on the
polynomial degree used in the model, and enables a maximal smoothing of random
noise in observations.
If the window is of a minimal size, then the estimation curve goes exactly
through the nearby observation points and there is no smoothing of the data. In this
way the window size controls the level of smoothing enabled by the estimator.
The curves in Figs. 2.6 and 2.9 in Chapter 2 illustrate the role of window size
(with notation h for this parameter). The zero-order LPA is used for filtering a finite
number of observations in Fig. 2.6. For a large window size (large h) the estimate is
a constant equal to the mean value of the observations. For a very small window size,
the estimate is a piecewise constant going exactly through the observed values of the
signal. For intermediate values of the window size, we obtain curves demonstrating
different smoothness and different closeness to the observations.
The first-order LPA is used for filtering the same observations in Fig. 2.9. For
the large window size, the estimate is a linear function corresponding to the linear
least square fit of the signal. For a very small window size, the estimate is nearly
piecewise linear going through the observation points. For intermediate values of
the window size, we have curves demonstrating different smoothness and different
closeness to the observations.
We use the scale parameter h in the window function wh and for the polynomials.
In the above formulas,

φh(x) = φ(x/h).

Thus, the polynomials in φh(x) are also scaled by h.
The index h in ŷh and gh emphasizes the special role of the scale parameter h
for the estimate as well as for the kernel of the estimator.
With this notation, the local polynomial model in Eq. (1.5) has the form
yh(x, Xs) = φh^T(x − Xs) C.   (1.15)
Under some appropriate assumptions, the discrete local estimator (1.13)
becomes shift-invariant, and for the continuous data it can be represented in the
form of the integral kernel transformation:
ŷh(x) = (1/h) ∫ g((x − u)/h) z(u) du = ∫ g(u) z(x − hu) du,   (1.16)

where g is a kernel depending on the window w and the order of the polynomials,

g(x) = w(x) φ^T(x) Φ⁻¹ φ(0),   Φ = ∫ w(x) φ(x) φ^T(x) dx.   (1.17)
The integral form (1.16) makes obvious an interpretation of the window size h
as the scale parameter of the estimate and of the LPA.
In the first formula in Eq. (1.16), h defines the scale of the kernel
g((x − u)/h)/h convolved with the signal z(u) independent of h, while in the second
formula h defines a scale of the signal z(x − hu) convolved with the kernel g(u)
independent of h.
The following moment conditions hold for the LPA kernels of the order m:

∫ g(x) dx = 1,   for m = 0,
∫ g(x) dx = 1,   ∫ x g(x) dx = 0,   for m = 1,   (1.18)
∫ g(x) dx = 1,   ∫ x g(x) dx = 0,   ∫ x² g(x) dx = 0,   for m = 2.
All of these estimates give the true value of y(x) as the main term of the series.
The error of the estimation is proportional to h for m = 0, to h² for m = 1, and to
h³ for m = 2. For small h this shows the order of the error, which depends on the
order of the polynomial approximation. In any case, if h → 0, the estimate becomes
accurate as ŷh(x) → y(x).
The parameter h controls the smoothing of y. Larger and smaller h result in
stronger and weaker smoothing of y, respectively.
In the frequency domain, Eq. (1.16) becomes

Ŷh(λ) = G(λh) Y(λ),   (1.20)

where the capital letters denote the integral Fourier transform of the corresponding
variable; in particular, we have for y(x)

Y(λ) = F{y(x)} = ∫ y(x) exp(−iλx) dx.
Figure 1.2 Amplitude frequency characteristics of the continuous kernels for different scales
h1 < h2 < h3.
1.1.4 Ideal scale

If the observation model includes the random noise in Eq. (1.1), the estimation error

ey(x, h) = y(x) − ŷh(x) = mŷh(x, h) + ey0(x, h)

consists of the systematic component mŷh(x, h) and the random component ey0(x, h).
Assuming that the noise is independent with an identical distribution for all
observations, the variance of the random component is defined as

σ²ŷh(x, h) = (σ²∆/h) Bg,   where Bg = ∫ g²(u) du,   (1.21)

∆ is a small sampling period, and σ² is the variance of the noise in Eq. (1.1).
According to Eq. (1.19), we may conclude that the main term of the systematic
error is
|mŷh(x, h)| = h^(m+1) |y^(m+1)(x)| Ag,   (1.22)

where m is the order of the polynomial approximation and
Ag = |∫ g(u) u^(m+1) du| / (m + 1)!.
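For the reader's convenience, the "ideal" balance of Eqs. (1.21) and (1.22) can be carried out in closed form. This is a standard bias-variance calculation, sketched here under the assumption that the main-term expressions above hold with equality:

l_{\hat{y}_h}(x,h) = |m_{\hat{y}_h}(x,h)|^2 + \sigma^2_{\hat{y}_h}(x,h)
                   = h^{2(m+1)} |y^{(m+1)}(x)|^2 A_g^2 + \frac{\sigma^2 \Delta}{h} B_g .

Setting \partial l / \partial h = 0 gives

2(m+1)\, h^{2m+1} |y^{(m+1)}(x)|^2 A_g^2 = \frac{\sigma^2 \Delta}{h^2} B_g
\quad \Longrightarrow \quad
h^* = \left( \frac{\sigma^2 \Delta B_g}{2(m+1)\, |y^{(m+1)}(x)|^2 A_g^2} \right)^{1/(2m+3)} .

At h = h* the squared bias and the variance are of the same order, and since h* depends on the unknown derivative y^(m+1)(x), it cannot be computed directly; this is precisely what motivates the adaptive scale selection discussed next.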
1.1.5 Adaptive varying scale

Theoretical analysis and experiments show that the local approximation estimates
can be efficient only with a correct selection of the scale h. It can be varying or invari-
ant, but it should be properly selected. The scale selection is a necessary element
of a successful local approximation estimation. Without automatic scale selection,
the local approximation, even though it is a reasonable idea, does not work.
In signal processing and statistics, scale selection as it is defined above is the
subject of many publications exploiting different ideas and techniques.
Figure 1.3 The squared bias |mŷh(x, h)|² and the variance σ²ŷh(x, h) illustrating the bias-
variance trade-off. The "ideal" balance is achieved at h = h* minimizing the mean-square
loss function lŷh(x, h).
Recently, a novel class of algorithms known under the generic name Lepski’s
approach has been introduced in statistics and shown to be efficient in theory and
practice. These algorithms are proposed for the pointwise varying-scale adaptive
nonparametric estimation. In this book we develop as the basic adaptation algorithm
one version of this approach, called the intersection of confidence interval (ICI) rule.
The algorithm is simple to implement. Let H be a set of the ordered scale
values H = {h1 < h2 < · · · < hJ}. The estimates ŷh(x) are calculated for h ∈ H
and compared. A special statistic is exploited to identify a scale close to the ideal
one. This statistic needs only the estimates ŷh(x) and the variances of these
estimates, σ²ŷh(x, h), both calculated for h ∈ H.
A layout of the LPA-ICI adaptive algorithm is shown in Fig. 1.4. The LPA
designed filters with the kernels gh1 , . . . , ghJ are applied to the input signal z. The
outputs of these filters ŷh1 , . . . , ŷhJ are inputs of the ICI algorithm, which defines
the adaptive-scale estimates ŷh+(x) in a pointwise manner for each x.
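In code, the rule reduces to tracking a running intersection of the confidence intervals over the ordered grid of scales. The following minimal sketch is our own illustration of the description above (the names and the default threshold are arbitrary choices):

import numpy as np

def ici_scale_index(y_hat, sigma, gamma=2.0):
    # y_hat[j], sigma[j]: estimate and its standard deviation for the scale h_j,
    # with the scales ordered as h_1 < h_2 < ... < h_J.
    lower = y_hat - gamma * sigma
    upper = y_hat + gamma * sigma
    L, U = lower[0], upper[0]
    j_adaptive = 0
    for j in range(1, len(y_hat)):
        L = max(L, lower[j])   # running intersection of the intervals D_1,...,D_j
        U = min(U, upper[j])
        if L > U:              # intersection became empty: stop increasing h
            break
        j_adaptive = j
    return j_adaptive          # index of the adaptive scale for this point

Applied independently at every x, with sigma[j] obtained, e.g., from Eq. (1.21), this selects the pointwise adaptive scale delivered at the output of the filter bank in Fig. 1.4.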
1.2 Anisotropy
The local polynomial approximation motivated by the Taylor series in Eq. (1.2)
is based on the assumption that the function y is smooth and differentiable. This
means that (at least locally) the behavior of the function is more or less the same
in all directions around the point of interest x. This sort of isotropy assumes, in
particular, that the following three definitions of the derivative are equivalent:
∂y/∂x = lim(h→0) [y(x + h) − y(x)] / h
      = lim(h→0) [y(x) − y(x − h)] / h
      = lim(h→0) [y(x + h/2) − y(x − h/2)] / h.
Thus, for derivative estimation we can use the right, left, or symmetric finite-
differences and the results are approximately equal. This simple hypothesis, which
is used by default in most signal processing algorithms, does not hold at singular
points or so-called change points, where the behavior of y changes rapidly.
Directional local approximation is developed to deal with the anisotropy of y.
1.2.1 Univariate case

For univariate y, the directionality of estimation assumes the use of right and
left windows. We say that the window is right or left if w(x) = 0 for x < 0 or
x > 0, respectively. This allows us to get good approximations near the jump points
of curves.
Let us look at a simple example. The signal is piecewise constant having zero
value for x < a and a value of 1 for x ≥ a (Fig. 1.5a). The observation noise is
an independent standard Gaussian. Let the estimator be a sample mean (symmetric
window estimate) with the estimate for the point x⁰ in the form

ŷh(x⁰) = (1/#) Σ{|Xs − x⁰| ≤ h} zs,

and let the left-window estimate be

ŷh^left(x⁰ − 0) = (1/#) Σ{x⁰ − h ≤ Xs < x⁰} zs,

where # denotes the number of observations in the corresponding window.
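A few lines of code reproduce this effect numerically; the jump location, noise level, and window size below are arbitrary choices made for illustration only.

import numpy as np

rng = np.random.default_rng(1)
a = 0.5
X = np.linspace(0.0, 1.0, 401)
z = (X >= a).astype(float) + 0.1 * rng.standard_normal(X.size)  # step plus noise

h, x0 = 0.1, a - 0.01           # estimation point just to the left of the jump
sym = z[np.abs(X - x0) <= h].mean()           # symmetric window mean
left = z[(X >= x0 - h) & (X < x0)].mean()     # left window mean
print(sym, left)  # the symmetric mean is pulled toward the jump, the left one is not

The symmetric estimate mixes observations from both sides of the discontinuity and is strongly biased near a, while the left estimate uses only homogeneous data and stays close to the true value 0.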
1.2.2 Multivariate case

Points, lines, edges, and textures are specific elements in images. They are locally
defined by position, orientation, and size. Often being of small size, these specific
features encode a great proportion of information contained in images. The cor-
responding image intensity is a typical example of an anisotropic function y and
points, lines, and edges are singular change points and change surfaces in images.
What makes the problem more specific is that reconstruction of y near the change
points is of prime interest because these points present the most valuable imaging
information.
For a multivariate y, directional estimation considers a ball of radius h
defining a spherical neighborhood of x.

Figure 1.7 An LPA-ICI pointwise adaptive segmentation of the Lena test image.
1.3 Nonlinear Local Approximation

1.3.1 Likelihood and quasi-likelihood

Our previous discussion of the LPA concentrated on the object function y, mainly
ignoring the noise in the observation model (1.1). However, the noise in Eq. (1.1) is
an equally important element of the observation model that becomes an extremely
sensitive issue when noise characteristics (e.g., the variance) are signal-dependent.
To deal with this sort of complex scenario, local versions of the likelihood and
quasi-likelihood methods are developed. These methods allow us to exploit explicit
probabilistic characteristics of observations to obtain optimal noise reduction
procedures.
In statistics, the observation model usually means a joint probability density
f (z, y) of the observation pairs {zs , Xs }s=1,...,n . In this probability density
the argument z corresponds to observations and y is a parameter of interest,
usually y = E{z}. This modeling is quite different from the more restrictive
standard additive model (1.1).
The maximum likelihood (ML) is a universal statistical technique that is used if
the distribution of observations is known. The optimality of the ML estimate means
that this method is able to extract full information about the estimated parameters
available in the given observations.
The local ML is a nonparametric counterpart of the widely used parametric
ML. It extends the scope of the parametric ML to a broader class of functions y.
The local likelihood does not assume a global parametric modeling of y but fits this
model locally within the sliding window.
The local likelihood and local log-likelihood use the window function w for
localization in a neighborhood of the point of interest x and replace the global
parametric model with a local one.
Using the polynomial linear model (1.15), the local log-likelihood can be
presented in the form
Lh(x, C) = Σs wh(x − Xs) ln f(zs, yh(x, Xs)),
yh(x, v) = C^T φh(x − v),   φh(x) = φ(x/h),   (1.25)
wh(x) = w(x/h)/h,
where h stands for the scale parameter used both in the window function w and for
the scaling of the polynomials φ.
For y we use the model yh(x, v), linear in C. This type of model is known in
statistics as a generalized linear model. For the local log-likelihood it becomes a
local (nonparametric) generalized linear model.
The standard nonlocal likelihood uses the global parametric model on C, which
is assumed to be valid for all x. The local likelihood relaxes this assumption and
assumes that the used linear model is valid only locally in some neighborhood of
the estimation point x.
Following the idea of the local nonparametric estimation, we use the criterion
Lh(x, C) for computing Ĉ. The model yh(x, v) is exploited in the pointwise manner
only. In this way we arrive at the following local ML version of the LPA:

Ĉ = arg maxC Lh(x, C),
ŷh(x) = Ĉ^T φh(0).   (1.26)
For Gaussian noise with a constant variance, this approach gives the linear LPA
estimates in Eq. (1.10).
In general, the estimates in Eq. (1.26) are nonlinear. This nonlinearity concerns
two different aspects of the problem. First, the likelihood equation is nonlinear
with respect to C, which means that there is no analytical solution for Ĉ, and
numerical methods should be used for the maximization in Eq. (1.26). Second,
the estimate ŷh(x) is nonlinear with respect to the observations zs.
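As a concrete instance of this nonlinearity, consider Poisson counts with a first-order LPA of the intensity: the maximizer of the local log-likelihood has no closed form and must be found numerically. The sketch below is our own illustration; it assumes SciPy is available, and the clipping that keeps the mean positive is a pragmatic device, not part of the method itself.

import numpy as np
from scipy.optimize import minimize

def local_ml_poisson(x, X, z, h, m=1):
    u = X - x
    w = np.exp(-0.5 * (u / h) ** 2)                      # window w_h(x - X_s)
    T = np.column_stack([u ** k for k in range(m + 1)])  # local basis at x

    def neg_log_lik(C):
        mu = np.clip(T @ C, 1e-8, None)   # the Poisson mean must stay positive
        # local log-likelihood (1.25) for Poisson f, constant terms dropped
        return -np.sum(w * (z * np.log(mu) - mu))

    C0 = np.concatenate(([max(z.mean(), 1e-3)], np.zeros(m)))
    res = minimize(neg_log_lik, C0, method="Nelder-Mead")
    return res.x[0]                        # estimate of y(x), as in Eq. (1.26)

rng = np.random.default_rng(2)
X = np.linspace(0.0, 1.0, 300)
y = 5.0 + 20.0 * X ** 2                    # intensity; noise variance equals mean
z = rng.poisson(y).astype(float)
print(local_ml_poisson(0.5, X, z, h=0.08))  # close to y(0.5) = 10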
There exist many circumstances in which, even though the full likelihood is
unknown, one can specify the relationship between the mean and the variance. In this
situation, estimation of the mean (regression, local regression) can be achieved by
replacing the log-likelihood by a quasi-likelihood Q(z, y). The quasi-likelihood has
been proposed as a distribution-free method (note that the ML is a distribution-based
method).
One assumes that we have independent observations with the expectations y,
and the variance of these observations is modeled as a given function of y.
1.4 Multiresolution Analysis

Based on the LPA linear filters we introduce a novel multiresolution (MR) non-
parametric regression analysis. This MR approach gives varying-scale adaptive
estimators that are quite different from those defined by the ICI algorithm. Instead
of selecting the estimate with a single best scale h, we build a nonlinear estimator
using the estimate ŷh with all available scales h ∈ H .
The denoising is produced in two steps. The first step transforms the data into
noisy spectrum coefficients (MR analysis). In the second step, these noisy estimates
of the spectrum are filtered by thresholding procedures and then exploited for esti-
mation (MR synthesis). In this way an extension of the conventional nonparametric
regression approach is achieved and a wider class of adaptive-scale estimators with
potentially better performance is obtained.
1.5 Imaging Applications

The developed imaging techniques are universal and can also be applied in further
areas where our current interests are concentrated.
1.6 Overview of the Book

In Chapters 2 and 3, basic local approximation ideas and techniques are presented,
and the roles of the window function and scale are discussed. The polynomial
smoothness of LPA kernels is defined in terms of the vanishing moment equations
and in the frequency domain. It is shown that the designed filters reproduce
polynomial signals accurately. The nonparametric approach is exploited
for filtering, interpolation, and differentiation.
Shift-invariant kernel estimates are derived for regular grid signals. All results
are given in multivariate notation equally applicable for uni- and multivariate
signals.
In Chapter 4, the integral local polynomial approximation is studied. These
continuous variable filters and transformations are important on their own as well
as limit representations for the discrete ones. The limit integral forms are used for
accuracy analyses of discrete filters and for accuracy optimization. The integral
LPA estimates equipped with scale parameters are treated as delta-sequences for
estimation and differentiation. Corresponding links with a theory of generalized
functions are also discussed.
The accuracy analysis of the discrete LPA estimates is the subject of Chapter 5.
Formulas for the bias are derived for some classes of signals, and the ideal scale
balancing the bias and variance of the estimates is studied.
Adaptive-scale selection is the subject of Chapter 6. The varying adaptive-scale
selection is considered to be an efficient tool to deal with nonstationary signals
having spatially varying properties. Lepski’s general approach and the intersection
of confidence (ICI) rule are introduced as basic adaptation tools for spatially varying
adaptation. The ICI rule is derived from optimization of the mean-square error. The
idea as well as the implementation and properties of the ICI rule are presented.
Simulation experiments demonstrate a high sensitivity of the ICI rule to singularities
and change points in signals and its efficiency for adaptive filtering.
The directional LPA is presented in Chapter 7 as a further development of the
local approximation idea. It is proposed to deal with the anisotropy of signals having
different behaviors in different directions. The directionality of the approximation
is enabled by narrowed sectorial windows. The directional kernels use polynomials
of different orders and different scales for different directions. The designing of
this sort of filter for estimation (smoothing) and differentiation is given.
The directional ICI adaptive-scale estimation is the subject of Chapter 8. The
main goal is to obtain a shape-adaptive neighborhood for the best approximation
of anisotropic signals. The approach is based on a sectorial partition of the local
neighborhood with adaptive sizes of the sectors defined by the ICI rule. A bank of
scale-adaptive directional filters is used with the final estimate found by aggregation
of the directional estimates.
Recursive LPA-ICI adaptive algorithms are proposed as a special class of adap-
tive filters with a cyclic repetition of the multidirectional LPA-ICI filtering. The
advanced performance of the LPA-ICI directional algorithms is demonstrated for
image denoising problems.
For differentiation, a novel concept of the anisotropic gradient is developed.
The algorithm looks for the adaptive neighborhood where the signal allows a linear
local approximation and the gradient is calculated using observations only from
this neighborhood. Specific features of this concept are illustrated for shading-
from-depth problems.
Scale-adaptive image deblurring techniques are considered in Chapter 9. A
basic topic concerns the deblurring of a noisy image provided that the point-spread
function of the blur is shift-invariant. Developed algorithms combine the regularized
inverse and regularized Wiener inverse filtering procedures, both using the LPA-ICI
scale-adaptive technique. Fast implementation of the algorithms is based on using
convolution operations for computing.
The ICI scale adaptivity is proved to be very efficient. It is shown that the newly
developed algorithms demonstrate state-of-the-art performance. This technique is
generalized for 3D deblurring problems.
Nonlinear nonparametric estimation (filtering) is the subject of Chapter 10.
Observation modeling and performance criterion are considered as the main reasons
to replace linear estimates by more demanding nonlinear ones. At first, a class of
nonparametric robust estimates based on LPA and the M-estimation is considered.
Links between the robust and minimax estimations are discussed. Accuracy analyses
and scale optimizations show that the ICI rule is applicable for the adaptive varying-
scale selection. Adaptive-scale median filters are considered as a special class of
these adaptive M-estimates.
The nonlinear Box-Cox and Anscombe transformations allow the stabilization
of the variance in observations. In particular, this approach is efficient to deal with
the Poisson distribution data. Using this concept we develop adaptive algorithms
where the nonlinear estimation is reduced to a linear one with nonlinear transforms
before and after filtering.
A general approach to nonlinear estimation problems based on the local like-
lihood and quasi-likelihood is considered in Chapter 11. The LPA of the signal
of interest is used as the basic design tool. The accuracy results for the bias and
variance of the estimates are obtained. It is shown that the only difference versus
the corresponding results for the linear estimate concerns the formulas for the vari-
ance. It follows that the ICI rule can be modified and used for the nonlinear adaptive
varying-scale filtering.
The quasi-likelihood is a distribution-free version of the likelihood that does
not require the accurate probabilistic observation model. The reweighted least-
square algorithms are used for the quasi-likelihood estimate calculation. The quasi-
likelihood-based methods are relevant to a wide class of applications where the
variance of observations is signal-dependent.
Fast algorithms implementing the quasi-likelihood scale-adaptive estimates are
developed based on linear convolution estimates completed by the ICI rule. As a
particular application of the developed adaptive nonlinear estimates, we consider
the counting (Poisson distribution) and binary (Bernoulli distribution) observations.
Image reconstruction problems concerning Poisson distribution data are the
subject of Chapter 12. Important areas of application include digital photogra-
phy, biology, and medical imaging problems. In many cases crucial high-quality
imaging requirements are in clear contradiction to very noisy raw data, with the
noise variance depending on the signal. Nonlinear methods based on likelihood and
quasi-likelihood are standard tools for this sort of problem.
A number of novel algorithms are proposed using the nonlinear LPA joint with
the ICI adaptive-scale selection. Special attention is paid to the development of more
computationally productive methods based on fast linear estimates completed by
the ICI adaptive-scale selection. It is shown that these methods are able to give high
quality with a reasonable complexity of computing.
A novel multiresolution (MR) nonparametric regression analysis is considered
in Chapter 13 as a further development and an alternative to the LPA-ICI adap-
tive technique. This processing includes the following three successive steps: MR
analysis, thresholding (filtering in the transform domain), and synthesis. The final
estimate is a composition of the estimates of different scales. In contrast, the
ICI rule defines a unique best scale in a pointwise manner for each estimation point.
Mathematical results concerning proofs of propositions as well as derivations
of some formulas are collected in the Appendix.
Chapter 2
Discrete LPA
The idea of local smoothing and local approximation is so natural that it is not
surprising it has appeared in many branches of science. Citing [145] we men-
tion early works in statistics using local polynomials by the Italian meteorologist
Schiaparelli (1866) and the Danish actuary Gram (1879) (famous for developing
the Gram-Schmidt procedure for the orthogonalization of vectors).
In the 1960s and 1970s the idea became the subject of intensive theoretical study
and applications, in statistics by Nadaraya [158], Watson [230], Parzen [175], and
Stone [215], and in engineering sciences by Brown [19], Savitzky and Golay [196],
Petersen [179], Katkovnik [98], [99], [101], and Cleveland [29].
The local polynomial approximation as a tool appears in different modifica-
tions and under different names, such as moving (sliding, windowed) least squares,
the Savitzky-Golay filter, reproducing kernels, and moment filters. We prefer the term
LPA with a reference to publications on nonparametric estimation in mathematical
statistics where the advanced development of this technique can be seen.
In this chapter the discrete local approximation is presented in a general mul-
tivariate form. In the introductory section (Sec. 2.1), we discuss an observation
model and multi-index notation for multivariate data, signals, and estimators.
Section 2.2 starts with the basic ideas of the LPA presented initially for 2D
signals typical for image processing. The window function, the order of the LPA
model, and the scaling of the estimates are considered in detail. Further, the LPA is
presented for the general multivariate case with estimates for smoothing and differ-
entiation. Estimators for the derivatives of arbitrary orders are derived. Examples
of 1D smoothers demonstrate that the scale (window size) parameter selection is
of importance.
The LPA estimates are given in kernel form in Sec. 2.3. The polynomial smooth-
ness of the kernels and their properties are characterized by the vanishing moment
conditions.
The LPA can be treated as a design method for linear smoothing and
differentiating filters. Links of the LPA with the nonparametric regression con-
cepts are clarified in Sec. 2.4, and the LPA for interpolation is discussed briefly in
Sec. 2.5.
2.1 Introduction
Since the LPA is a method that is valid for signals of any dimensionality, we
mainly use a multidimensional notation because it is universal and convenient. For
instance, we say that the signal y(x) depends on x ∈ Rd , i.e., the argument x is a
d-dimensional vector, and we use it with d = 1 for scalar signals in time and
with d = 2 for 2D images. The multidimensional notation allows us to present
results in a general form valid for both scalar time and 2D image and signal
processing.
The symbol d is reserved for the dimensionality of the argument-variable x.
Thus, x is a vector with elements x1 , . . . , xd .
Suppose that we are given observations of y in the form ys = y(Xs), s =
1, . . . , n. Here Xs stands for the location of the sth observation, and the coordinates
of this location are
Xs = [x1 (s), . . . , xd (s)].
zs = y(Xs) + εs,   s = 1, . . . , n,   (2.1)

where the additive noise εs is an error of the sth experiment, usually assumed to be
random and zero-mean, independent for different s, with E{εs} = 0, E{εs²} = σ².
Equation (2.1) is a direct observation model where a signal y is measured
directly and the additive noise only defines a data degradation.
Suppose we are able to observe (y ∗ v)(x), where v is a blurring kernel or point
spread function (PSF) of a linear convolution. The blurring phenomenon modeled
by the PSF v (continuous or discrete) is very evident in many signal and image
applications. Moreover, we assume that the data are noisy, so that we observe zs
given by

zs = (y ∗ v)(Xs) + εs.   (2.2)
This equation is an indirect observation model with a signal y measured not directly
but after some transformation.
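The two models are easy to simulate; the following sketch (with an arbitrary test signal and a box-blur PSF chosen purely for illustration) generates data according to Eqs. (2.1) and (2.2) on a regular 1D grid.

import numpy as np

rng = np.random.default_rng(3)
n = 256
y = np.sin(2 * np.pi * np.arange(n) / n)   # true signal on a regular grid
sigma = 0.05

# Direct model (2.1): z_s = y(X_s) + eps_s
z_direct = y + sigma * rng.standard_normal(n)

# Indirect model (2.2): z_s = (y * v)(X_s) + eps_s, with a discrete PSF v
v = np.ones(9) / 9.0                        # box blur as an example PSF
z_indirect = np.convolve(y, v, mode="same") + sigma * rng.standard_normal(n)

In the indirect case the degradation is both deterministic (the convolution with v) and stochastic (the additive noise); deblurring then requires inverting the convolution while suppressing the noise, the topic of Chapter 9.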
A data degradation is defined by this transformation as well as by the additive
noise. It is assumed in image processing that all signals in Eqs. (2.1) and (2.2) are
defined on a 2D rectangular regular grid:
X = {Xs : s = 1, . . . , n}.

In this notation the image pixels are numbered by s taking values from 1 through
n. The coordinates of these pixels can be given explicitly as

X = {(k1∆1, k2∆2) : k1 = 1, . . . , n1, k2 = 1, . . . , n2},   n = n1n2,   (2.4)
where ∆1 and ∆2 are the sampling intervals for x1 and x2 , respectively.
In signal processing literature, square brackets are sometimes used for dis-
crete arguments to differentiate them from continuous arguments. For example,
the notation y(k1∆1, k2∆2) is for a continuous argument function and y[k1, k2] is
for a discrete one, with

y(k1∆1, k2∆2) = y[k1, k2].   (2.5)
For this discrete argument y we can also use the general notation y(Xs) provided
that Xs = (k1∆1, k2∆2).
The LPA considered in the book is applicable to observations given on irregular
grids, when the coordinates Xs and the distances between Xs can be arbitrary. In
image processing this sort of situation appears when data are destroyed or lost.
Figure 2.1 illustrates the cases of regular and irregular grids for 2D data.
Figure 2.1 Types of data grids: (a) regular with sampling intervals ∆1 and ∆2 on the argu-
ments x1 and x2; (b) regular with missing data; and (c) irregular with varying intervals between
observations Xs.
Cα = {y : max(r1+···+rd=α) |∂x1^r1 · · · ∂xd^rd y(x)| = Lα(x) ≤ Lα, ∀x ∈ Rd},   (2.6)

where Lα(x) and Lα are finite and α = r1 + · · · + rd is the order of the derivative
used in this definition.
The class Cα is composed of signals having all derivatives of the order α bounded.
Let d = 1. Then the class Cα is defined as

Cα = {y : |y^(α)(x)| = Lα(x) ≤ Lα, ∀x ∈ R}.

The class includes all signals with the bounded αth-order derivative.
Figure 2.2 Parametric and nonparametric models: (a) linear parametric model, (b) cosine
parametric model, and (c) nonparametric function with unknown parametric representation.
Let d = 2. Then

Cα = {y : max(r1+r2=α) |∂x1^r1 ∂x2^r2 y(x1, x2)| = Lα(x1, x2) ≤ Lα, ∀(x1, x2) ∈ R²}.

In this definition, the maximum for each x = (x1, x2) is calculated over all
derivatives of the order α = r1 + r2. For α = 1, it assumes that

max(|∂x1 y(x)|, |∂x2 y(x)|) = L1(x),

while for α = 2 it means

max(|∂²x1 y(x)|, |∂²x2 y(x)|, |∂x1 ∂x2 y(x)|) = L2(x).
Definition (2.6) is global because the conditions hold for all x. For images where
singularities and discontinuities of y carry the most important image features, a local
smoothness of y is more realistic. Therefore, the class Cα can be replaced by
CαA = {y : max(r1+···+rd=α) |∂x1^r1 · · · ∂xd^rd y(x)| = Lα(x) ≤ Lα, ∀x ∈ A ⊂ Rd},   (2.7)
The piecewise constant model

y(x) = Σi ai · 1[x ∈ Ai],   (2.9)

is a particular case of Eq. (2.8) with constant values ai within each region Ai. Figure 2.3
shows an example of a piecewise constant y defined according to Eq. (2.9) as well
as the corresponding regions Ai.
In the models (2.8) and (2.9), yi(x) and ai, as well as the regions Ai, are usually
unknown. The boundaries Gi define change points of the piecewise smooth y in
Eq. (2.8). The estimation of y can be produced in different ways. One possible
approach deals with a two-stage procedure that includes estimating the boundaries
Gi at the first stage, which defines the regions Ai . The second stage is a parametric
or nonparametric fitting yi in Ai .
Another approach is connected with the concept of spatially adaptive estima-
tion. In this context, change points can be viewed as a sort of anisotropic behavior
of the estimated signal. One may therefore apply the same procedure for all x,
for instance, nonlinear wavelet, ridgelet, or curvelet estimators, and the analysis
focuses on the estimation quality when change points are incorporated into the
model. With this approach, the main intention is to estimate the signal but not the
locations of the change points, which are treated as elements of the signal surface.
In this book we will follow the second approach. The objective is to develop a
method that simultaneously adapts to the anisotropic smoothness of the estimated
curve and is also sensitive to the discontinuities in the curves and their derivatives.
In general, conditions (2.6) and (2.7) define nonparametric signals that do not
have parametric representations. The following important class of parametric y can
be interpreted as a special case of the nonparametric class Cα .
Let Lm+1(x) ≡ 0 for x ∈ Rd. Then y is a polynomial of degree m. Indeed, the
condition Lm+1(x) ≡ 0 means that y is a solution of the set of differential equations

∂x1^r1 · · · ∂xd^rd y(x) = 0

for all ri such that r1 + · · · + rd = m + 1, and this solution is a polynomial. For
this class of polynomials we use the notation

P^m = {y : ∂x1^r1 · · · ∂xd^rd y(x) = 0, r1 + · · · + rd = m + 1, ∀x ∈ Rd},   (2.10)

where m stands for the power of the polynomial. It is obvious that P^m ⊂ C^(m+1)
with Lm+1 = 0.
All the girls spoke their lines distinctly, though Miss Gibson had
deleted some, to shorten the scene and to leave out those that were
too unpleasant for such an occasion. Olive as Macbeth made quite
an impression. She withdrew with the witches, but witches,
apparitions and Macbeth were obliged to come out again in front of
the cavern to receive further applause.
“The rest will be anti-climax,” mourned Pansy, the kitty-cat, who
had joined Madge and Shirley.
“The freshman fun will be the relaxation of the evening,” said
Shirley, “and how can you speak thus to the author of your beautiful
verses!”
Pansy laughed. “That is so. I had forgotten our beautiful poetry.”
“To tell the truth, in comparison with this, our lines may fall a little
flat; but just looking at you kittens, you black cats, I should say, will
be enough. I thought that I saw two costumes like your witch’s,
Pansy, a while ago.”
“I did, too,—I wonder whose the other is.”
The sophomore entertainment was even more gruesome than the
witches of Macbeth. When a curtain was drawn aside at the end of
the room, there, against the white background of another curtain,
which represented a wall, hung the white faces of Bluebeard’s wives.
A ghostly sophomore read the story, briefly told, in its most exciting
parts, while the wife who entered the forbidden chamber, Bluebeard,
and Sister Ann played their parts in pantomime, with the addition of
ghostly groans from the wives who had, supposedly, been disposed
of long since. This was a little too realistic and made more than one
of the audience jump a little at first. But it was soon over.
There was relief from spookdom when the juniors came in to give
very prettily a “Dance of the Pumpkins.” “Pumpkin” costumes and
one funny rolling movement gave the “motif.”
But how they laughed when the freshmen came in as black cats,
managed by a rather frisky looking witch with her tall black hat, her
black robe and the broomstick on which she expected to make her
exit. On the front of the robe was the large cat’s head with its big
yellow eyes, and a whole cat was depicted on the back between the
witch’s shoulders.
First the witch led the march, while the piano crashed and two
girls who had violins tried a little hideous jazz at certain points. Next,
the witch stopped and from the side gave orders for a standing drill
with rubber mice. A few squeals from the audience at the first
appearance of the mice, swung forth by their tails, was so natural
and suggestive that the whole audience laughed and one girl called
out, “nice kitties!”
The comical appearance made by the backs of the girls, as they
wheeled and faced away from the audience, brought more laughter.
Shirley had despaired of painting enough cats for all the freshmen in
the drill, but the bright idea occurred to her after it was decided to
put cats on their costumes, to stencil the cats. Accordingly, on the
square white patch of muslin, similar to the one upon her own
costume, which the witch wore, in stenciled patches of black, the
clawing limbs and wildly waving tail of the witch’s cat appeared.
As a result of careful measurements, this made a line of cat
pictures funny to behold, with the black whiskers and yellow eyes
added by Shirley’s brush afterward. The cat’s head in front was
striking, too, but not so funny as the whole cat between the
shoulders behind. It was scarcely necessary to do anything “smart,”
Madge declared to Shirley. Just to look at them was enough, Madge
said; and Shirley, grinning herself at some of the evolutions, nodded
assent. “Maybe that’s so,” she whispered, as the freshmen girls
made their eyes big, held out the mice with one “claw” and
scratched at them with the other. They laid them on the floor and
played with them, or took them away from each other and “howled”
in chorus, all to the music. This changed now to the lively melody of
which Shirley was the composer.
Facing the audience and lined up in one row, the freshmen pinned
the rubber mice on their costumes by the tails as badges and stood
for a moment to get their breath while one of the teachers, who had
made an accompaniment to Shirley’s melody, played a brief prelude.
“Mother Goose stuff,” said a low voice near Shirley. Shirley did not
turn to see what the speaker looked like, in some gay costume, she
supposed, for the voice was Sidney’s. Madge heard it, too, and
nudged Shirley, whose ghost costume, of course, could not indicate
to Sidney that the chairman of the committee was close by. “She’s
jealous,” whispered Madge, but the sarcastic little phrase spoiled
what followed for Shirley. “It is silly,” she thought, “but, someway,
they couldn’t think up anything better, and we had to have
something.” Quietly she stood to see how the girls would sing the
foolish song.
But the rest of the audience were in the spirit of fun and “Mother
Goose stuff” was quite acceptable to them. Youthful freshmen voices
started in after a loud crash from the accompanist and a wail from
the violins.
“MUCH ADO.”
“Yes, Irma,” said Sidney, sitting in the study shortly after dinner.
“Considering the fact that there were about half a dozen witch
costumes last night, the decision of the judges that Shirley
Harcourt’s costume was the most original was nothing short of
ridiculous. But that would not annoy me at all. What I feel provoked
about is that those girls so evidently made it up to get me to wear
the same sort of suit that Shirley did. I couldn’t get much out of
Hope, when I asked her again about it; but she certainly told me
that Caroline described a costume that would be just the thing for
me!”
“I can scarcely believe it, Sidney. Shirley Harcourt is not that sort
of a girl; and if Caroline suggested it, I don’t see that it involves
Shirley at all.”
“Oh, all right, Irma. But I think what I think. My, how cold it is
tonight! I wanted to go down to the lake, but there is frost in the air
and the wind is unpleasant.”
“You must be taking cold, Sidney. I was out and did not notice it at
all.”
A light knock came at the door of the study. Irma went to the door
and opening it, found Shirley Harcourt there. “Why, how do you do,
Shirley; come in,” Irma said.
Soberly Shirley entered with a return of Irma’s greeting. Hesitant
she stood within the room, seeing the girl in the pretty, blue negligé,
who sat on the other side of a central table. Sidney had just had
time to turn her back before Shirley came in. “I wanted to speak to
Sidney Thorne just a moment, Irma,” Shirley continued. “I had
reason to think this morning that I had offended her and I want to
ask her what is the matter. I am very willing to apologize, if I have
done anything, without knowing it.”
Shirley paused and looked at the shining hair, one well-shaped ear,
and a cheek fair and pink with only the natural tints of youth. But
Sidney made no move.
Irma stood quietly. She knew that it must have taken an effort on
Shirley’s part to say that she was willing to apologize. But Sidney,
listening, thought that Shirley knew well enough. She had not yet
been addressed. She would not turn around until she was.
Shirley looked at Irma, but Irma, puzzled and annoyed, did not
know what to do. She started to speak and then stopped, and
Shirley, wishing that she had not come, smiled at Irma as she
opened the door again, stepping outside. “It was a mistake to come,
I see,” said Shirley. “Thank you, Irma; good night.”
Irma closed the door and without a word to Sidney went into the
bedroom which she and Edith occupied. There she moved around for
some time before coming into the study again. Taking the same
chair by the table which she had occupied before Shirley knocked,
she resumed her study. With the ringing of the gong for study hours
to begin, Fleta and Edith came in, full of life, hoping that they didn’t
interrupt, but it was most important to tell the latest news, that the
“Water Nymph” was going to be married at the Christmas Holidays.
It was a relief to Irma when they came. She was not enjoying her
silent companion, though silence was better than speech if speech
should take up the subject of the call. But Sidney knew that for once
in her life, at least, she had been discourteous. Of that Irma very
likely disapproved. She would say nothing. It was a relief to her, as
well, when the other girls joined them.
Shirley had found that Hope had little recollection of what she had
said to Sidney. “Why, Caroline,” she replied to Caroline’s questions, “I
was trying to help Sidney about her decision. I remembered your
describing a cute one, and I had the impression that it was one you
had seen somewhere. I knew that you were wearing something else.
So I told Sidney about the painted cats. Mercy, what have I done? I
never even thought of it that night, for we had witches in the senior
stunt and I supposed that it was Sidney’s idea, though I did hear her
say that she would not have a part in the performance.”
“It’s just that Sidney may think Shirley had some hand in it. I only
want to let you know that Shirley did not even know that Madge had
shown me the costume when she did.”
“If you want me to say something to Sidney,—” Hope began.
“Not yet, Hope, and perhaps not at all. Haven’t you heard Sidney
say a word?”
“I have scarcely seen Sidney at all. I can’t quite understand,—did
you say that Sidney has been blaming you girls for her having
something just like Shirley’s?”
“Hope, you dear little goose! You are too broad-minded yourself to
take all this in. Just keep quiet about it. If we call you in as witness,
tell the truth!”
“I certainly can do that, Cad. I wish that Sidney weren’t quite so
proud.”
“Sid would not be herself if she were not proud. What a pity that
we can’t all be Standishes of New England!”
“You are a sad case, Cad Scott,” laughed Hope. “Good luck to
you.”
So it came about that Shirley decided to go directly to Sidney,—
with the embarrassing results. Had she persisted, it is most likely
that Sidney would have entered into conversation with her. But
Shirley’s pride came in there. It had been hard to go to Sidney’s
room. She could not stay where she was not wanted. Thinking about
it, she concluded that it was, as Madge said, “much ado about
nothing.” “Just go right on, Shirley. If Sidney is mad about anything,
you have shown that you are ready to make it right. That is enough.
If it were any other girl than Sid you would not care. I believe that
you are twins!”
Shirley laughed. “It isn’t my way to let things go, unless I’m sure
that the other side is altogether unjust. But I can’t help myself, it
seems. We’ll drop it.” Within herself Shirley decided not to avoid
Sidney, to speak if the opportunity were given, but to go right along as
usual.
Shirley’s other school-mates were more friendly than ever after
the masked party. Without trying, Shirley was taking a position of
influence among the girls. She was consulted and sought. She joined
one or two clubs, but worked busily at her lessons, encouraged often
by the warm letters from her mother. Her father was too busy to do
more than to scribble a few lines of affection and advice upon her
mother’s letters.
In one of Miss Dudley’s letters she asked, “Have you remembered,
Shirley, that you were born in Chicago? I don’t know that we have
thought of it in connection with your going to school so near the city.
Your father was getting another degree at Chicago University, and
your mother was with your grandmother and me in a house that we
had rented for a while in Glencoe,—a very attractive suburb,—you
must stop off and see it some time.”
To this Shirley wrote, “If I’ve ever been told that I was born
anywhere else than at ‘home,’ I have forgotten it. I can’t say that I
am pleased to hear it particularly, though it does not matter so much
where a body was born, I guess, as who—whom she was born to!
I’m certainly glad that I belong to your family, Auntie. Can’t you
come on at the Holidays to see me?”
But Miss Dudley could not manage it. The fact was that she was
taking every spare cent to meet the expenses for her niece, though
she had indulged in an economical summer vacation. She would not
tell Shirley this. Let Shirley think that Auntie had plenty.
As the first term speeded to its close, Caroline had several
conferences with Hope Holland relative to Shirley, who was
expecting to spend the vacation at the school with several other
pupils, for whose benefit it would not be closed. Hope wanted
Shirley at her home, but so did Caroline, and the fact that Hope
belonged to the Double Three made it embarrassing.
“I don’t have to go over to Sidney’s all the time,” she said. “We
see each other all the time at school and Mother and Father and the
boys will want me there. I suppose I’ll have to go to Sidney’s parties,
—not that they will not be fine, as they always are, but I don’t see
why I should not invite Shirley.”
“If you do, Sidney will never get over it. I’ll tell you. You let me
invite Shirley and have her part of the time. Then when you are not
in anything with the Double Three, or entertaining them yourself,
she can be with you.”
“If I have a party,” said Hope, with determination, “if I have a
party,” she repeated, “and Shirley is in Chicago, she will be invited.
Sidney can have a headache if she does not want to come!”
“Well, then, may I have Shirley?”
“Yes, on those conditions, that I have her part of the time, to stay
all night, you know.”
“All right. We’ll not quarrel, Hope. Shirley is such a big-hearted
and broad-minded girl, like yourself, Hope, that I couldn’t be jealous
of either of you if I tried.”
“That is because you are nice yourself, Cad, my dear.”
All of this was not imparted to Shirley. But she knew that she was
invited by both Caroline and Hope, and after a letter of permission
from her great-aunt, Miss Dudley, she accepted her invitations very
happily. When she heard that the Double Three were having a house
party at Sidney’s, she wondered about how things would be
managed; for she “felt it in her bones” that Sidney would not invite
her to her home, and she knew that Hope was a “Double Three.” But
Shirley said nothing. That could be handled by her hostesses, she
knew. She would go and have a wonderful time.
It had happened that Sidney’s parents had not driven to the
school that fall. It was Sidney’s second year. They were accustomed
to the separation as well as she. She spent one or two week ends in
Chicago, as well as the Thanksgiving vacation. Early in the year, also,
Sidney had asked Hope and Caroline not to speak of the strange
resemblance between Sidney and the then “new girl.” “If you write
home about it, Father and Mother will hear of it, and it will not strike
them very pleasantly I am sure,” said Sidney. And after some
consideration Hope and Caroline had promised, though Caroline had
said, “We’ll not say anything now, shall we, Hope? But if our parents
ever do see Shirley or hear about her, don’t flatter yourself, Sid, that
we can muzzle our fathers. Our mothers might hesitate to say
anything, but if I know Dad, he would be just as likely as not to
mention it.”
“I suppose he would,” said Sidney, with a look and tone that made
Caroline want to resort to “primitive measures,” she told Hope. “If
we had been about six years old, Hope,” she said, “I would have
slapped Sidney Thorne and not regretted it.”
“Tut-tut, Caroline,” laughed Hope. “It’s a primitive society, indeed,
that can’t control its angry passions.”
None of the girls had forgotten all this, and now Hope and
Caroline expected to enjoy the surprise of their respective families
upon their first sight of Shirley. “You will not mind, will you, Shirley, if
anybody takes you for Sidney?” Caroline asked.
“I am used to it by this time,” said Shirley, “and this time I shall
know why Chicago people, or some of them, think that they know
me.”
CHAPTER XV.
AN ACCIDENTAL MEETING.
Long since, Sidney Thorne had spoken to Shirley, for she had found out
that her suspicions of an intent to embarrass her were entirely
unfounded. Her manner toward Shirley had not even been unfriendly
for some time; but when she found that Shirley was going to Chicago
as the guest of Caroline, she was almost indignant. The girls knew
that it would be embarrassing for her. Why did they invite Shirley?
Now, unless she wanted to have complications arise, she could not
invite Shirley to the affairs that she wanted to have for the Double
Three. Well, she would just leave Shirley out, if she did come from
the same school. You did not have to be intimate with everybody!
Such was Sidney’s attitude. Shirley thought of it, too, and felt
rather sorry for Sidney, supposing, of course, that Sidney wanted to
be courteous, as she had always been except on that one occasion,
which had never been explained between them. But it would not
affect Shirley’s good time in the least.
The Double Threes had gone on ahead, leaving on the first train,
with the exception of Hope Holland, who waited for Caroline and
Shirley, the three preferring to go by themselves, though it was only
a tacit understanding among them.
How jolly it was to have no lessons and to be facing the best
vacation of the year in thrills and Christmas festivities. Shirley’s
winter coat was all that could be desired, and she was to buy a new
hat in Chicago, though the hat which she had brought, with her
coat, was becoming and still good. Sidney would have no reason to
be ashamed of her double.
Cards from Hope and Caroline had warned their families against
showing too much surprise at a remarkable resemblance between
Shirley Harcourt and Sidney Thorne. As a result, while they were
almost startled, in spite of the warning, there was to be no
embarrassing moment for Shirley.
She was to go first to Hope’s; but at the station two cars met the
girls, one from each household. Mr. Scott reached them first and was
introduced to Shirley. “I have met you once before, Mr. Scott,” said
Shirley after shaking hands.
“Why, when, my child?” asked kindly Mr. Scott.
“Last summer, when I was in Chicago for a few days. You came up
to me in a hotel and shook hands with me. I thought it was some
graduate of our university, till you told me that Mrs. Scott and the
girls had gone up to Wisconsin and assumed that I knew about it.”
“Then it was you instead of Sidney!” laughed Mr. Scott. “I
remember that I was puzzled, for Sidney was supposed to have left
the city some time before.”
But here came two youths hurrying across through the crowd to
them. “Hello, Hope. How do you do, Mr. Scott? Caroline, how you’ve
grown! Isn’t that always the thing to say to returning children?” The
taller of the two boys was shaking hands with Caroline, after this
speech, and put an arm around Hope, as he waited to be introduced
to her friend.
In a moment Shirley found herself in a handsome car, sitting
behind with Hope, while the two young men sat in front, the older
one driving skilfully through the traffic of Chicago. “Little did I think,
Hope,” said Shirley, “when I was here last summer, or even last fall
on the way to school, that at Christmastime I’d be back to visit with
a dear girl like you.”
“I want you many more times, Shirley. I’m sorry that Madge had
to go home, but after all, it’s nice to have you to ourselves. Some
way, people get to loving you, Shirley, did you know that?”
“No I didn’t,” laughed Shirley. “I think that it’s ‘your imagination
and a beautiful dream,’ as Auntie is fond of saying.”
“You did not know that I had such big brothers, did you? I told
them all about you, though. I have one more, and no sisters at all.”
Shirley looked at the two young men in front of her, used to the
ways of the city, capable, interesting. Mac, who was driving, looked
not in the least like Hope, though he had her serious look when his
face was in repose, as now. Good, clear features marked the profile
that Shirley saw. His face was rather thin and the hands on the
wheel were well-shaped. Ted, the other brother, was not as tall as
Mac, but looked as old; his eyes and the shape of his face were like
Hope’s.
“They look as if they were the same age, don’t they?” asked Hope.
“Ted is not quite a year older than I am, and Mac is just a year older
than Ted. We were all little together and my, how Mother ever stood
our playing and fussing I don’t know. Kenneth is fourteen, only three
years younger than I am, but he is somewhat spoiled as the ‘baby’
of the family.”
It was pleasant to be welcomed into the beautiful home of the
Hollands. Shirley shared Hope’s room and thought it “lovely;” but
Hope said that they were selling the house soon and would move
into a suburb farther out.
Shirley knew little about changes in a city and these things did not
concern her. Immediately she entered upon one happy event after
another. Mac, so full of fun, yet so serious upon occasion, took a
great fancy to Shirley and saw that she missed nothing. When she
went to Caroline’s just before Christmas Day, Mac did not desert her,
but drove over, with gifts from the Hollands, while Caroline said that
she never had so much attention in her life as now from the Holland
boys and their friends. Shirley did not even know that Sidney had
had a great party for the Double Three, for Hope was over early that
evening and went to Sidney’s late, in plenty of time for this event.
Caroline sent regrets because of a previous engagement, which was
an evening with Mac and one of his friends.
“I thought that you were like Sidney at first,” said Mac, “but I’d
never confuse you two after a good look, Shirley. Sidney is a fine girl
and she may learn a few things about people after a while; but you
have a different viewpoint and it makes you sweeter.”
“Why, that is nice of you to say,” said the surprised Shirley, “but I
didn’t know that you were so—,” she paused for a word and Mac
said, “‘observing,’ isn’t it?”
“No; that would be admitting that you are right.”
“Analytical, then, or philosophical. Remember that I am going to
college!”
“Oh, you ought to know Dick. He is in our university at home, the
one where my father teaches.” There, it was out. Shirley had
changed her mind about not speaking of her wonderful father.
“Is your father a university professor? That explains it, then.” Mac
looked as if he would like to go on, but was not sure whether he
dared or not.
“What it explains I don’t know,” laughed Shirley, “but so far as Dad
is concerned, he is mighty fine, even if he never has much money
and puts it into his line of work or gives it back to the college. And
he’s always doing things in one way or another for his students.”
“That is about what I was going to say, Shirley, doing big things
on next to nothing. The reason I know anything about it is that we
have a friend like that. But who’s Dick? Your best college friend? Don’t
tell me that I have to label you ‘Taken!’”
“I don’t know what to make of you, Mac. Ought I to be offended?
You are so funny that I can’t be. No; Dick is my cousin and I’m going
to bring him up for the Prom to meet our girls. I told him not to have
too much of a college ‘case’ till he saw them.”
“I tell you what would be delightful to do,” said Mac. They were
sitting together on a hall seat at the Hollands, while they waited for
Hope, who had gone upstairs after her gloves which were missing.
Mac was to drive them to Caroline’s.
“There are other young men who would be interested in being
entertained by some charming damsel other than their sisters.” Mac
paused and looked meaningly at Shirley. “Why not arrange for Dick
with, say, the sister of one of said young men, or one of her other
friends?”
“It would be possible, even if Dick came as my guest,” said Shirley,
“for me to see something of, well, any of ‘said young men.’”
“How dearly I love my sister, only time will prove,” said Mac, rising
and taking hold of Hope on the lowest step. Hope looked
suspiciously at her brother, stopping in her descent.
“What now, Malcolm?” she said, severely, but breaking out into
her own cheery smile as she looked at the laughing Shirley. “Such
displays of affection usually mean something, Shirley,” Hope
continued, “but I’ll do almost anything for you, Mac, for taking us
around the way you are doing.”
“I am always willing to sacrifice myself for my only sister,” asserted
Mac, with a perfectly serious face. But Mac Holland did not keep up
his joking about the Prom or indulge in any personal remarks after
this, and Shirley liked him all the better when he was his normal self,
full of fun, to be sure, but with something better than that about
him. He saw that Shirley and his sister heard some of the holiday
entertainments that Chicago can supply, quietly taking care of them
in a gentlemanly way.
The girls had two weeks’ vacation, which they enjoyed to the full.
After Shirley had visited with Caroline, she came back to Hope,
yielding to many urgings, for Mr. and Mrs. Holland liked Shirley.
There were only a few occasions on which Shirley met people who
took her for Sidney Thorne; but Hope repeated a remark that had
been made to Mac. “‘I did not know that you knew Sidney Thorne so
well, Mac, and went around with her so much,’ somebody said to
Mac the other day, Shirley,” said Hope. “And Mac never explained at
all!”
It was not until toward the last of her stay in Chicago that Shirley
met any one connected with Sidney. As the girls had told Sidney,
they could not muzzle their fathers. Mr. Scott, in particular, Caroline
made no attempt to caution. Why should she? Sidney might just as
well let her father and mother know about the lovely girl that looked
like her. It happened, then, that Mr. Scott said to Mr. Thorne, “Odd,
Thorne, but my daughter brought home from school a young girl
who looks enough like your daughter to be her twin.”
“There are close resemblances sometimes, I suppose,” returned
Mr. Thorne, who was preoccupied with the bonds about which he
had come to the bank.
“But this isn’t any ordinary close resemblance, Thorne. Did you
ever have any relatives named Harcourt?”
“None that I ever heard of. Say, Scott, I’ll drop in tomorrow to see
if you have gotten hold of what I want besides these. Regards to your
wife. Mine is happy these holidays with her daughter from school.
Good morning.”
That very afternoon the incident occurred which brought Shirley to
the notice of Sidney’s father, a surprising experience. The Holland
chauffeur, who had little to do when the Holland boys were at home,
had taken the girls to do some shopping. It was Shirley’s last
opportunity to make such purchases as she needed before going
back to school. They had run across Caroline, who accompanied
them when Shirley went to have a dress tried on, one which she had
seen before but just decided to buy. Some alterations were to be
made, and when Shirley saw how Hope looked as she sat waiting, she
suggested that the girls need not wait for her. “You have a
headache, Hope, I know, and I shall have to wait a little while. Go
on home, do. I can come by street car. I know right where to go, for
Mac told me one time, for fear I might get lost.”
Caroline looked at Hope. “Yes, Hope, you are half sick; but I tell
you what we’ll do. I’ll take you home, and you can tell your
chauffeur to wait for Shirley. Shirley knows where the car is parked.
I’d have to leave you in a minute anyhow, because I told Mother that
I’d be right back, and she will be through her shopping by this time.”
So it was arranged, and Hope was glad to go with Caroline. Shirley
did not have very long to wait, not as long as she had expected.
Hurrying from the store, she mistook the direction and had a great hunt
for the car. At last she saw it, smooth and shining, and with a sigh of
relief she approached it, entering it without waiting for the chauffeur,
whom she saw standing at a little distance in conversation with
some other man. Shirley sank back against the cushions in relief. Her
dress was a pretty one and would be sent to her at the school. Her
other packages would be delivered at the Hollands’. What luxury this
was. Could this be Shirley, ready to say, “Home, James?”
The chauffeur, whom Shirley had scarcely noticed before,
apologized for not being there to open the door, which Shirley had
found unlocked. “I was only a short distance away,” said the man,
“but I saw a man that—,” but the chauffeur was busy with getting
his car out into the street successfully and Shirley lost the rest. She
closed her eyes and leaned back again. They had not taken time
even for some ice-cream and she was really hungry. Ho for the good
dinner waiting at the Hollands’!
Shirley was almost ready to doze off, for traffic in Chicago
disturbed her no more, when the car stopped at a curb, to let a fine-
looking man of middle age enter. Shirley looked up with surprise.
Perhaps this was some guest,—but it was funny that Hope had not
mentioned it. The gentleman was dressed unobtrusively but in the
finest of business attire,—clothes, tie, shoes, the heavy, handsome
overcoat, and the well-fitting hat.
He, too, leaned back as if tired. “You may go home now,” he said
to the chauffeur.
Shirley sat up, startled. Who was this? She turned and started to
say something, but the gentleman looked at her and said, “What is
the matter, Sidney? Have you forgotten something? I see that you
left your fur coat to be fixed, but I hope that you will not take cold in
that one.”
Shirley ceased to be startled when she heard herself addressed as
Sidney. By some mistake she had gotten into the Thorne car and this
was Mr. Thorne! She smiled and said, “I see that I have made a
mistake. I am not Sidney, Mr. Thorne, I am Shirley Harcourt. Hasn’t
Sidney told you about me?”
“Do you mean to say that you are not Sidney? Why, Sidney, child,
you are just joking!” Mr. Thorne looked scarcely puzzled.
“Well, I don’t know how to convince you, but poor Sidney must be
somewhere wondering what became of her car. I thought that this
one was the Holland car that was to take me home. I should have
known the chauffeur, but the boys have driven us around most of
the time. I am visiting at the Holland home, and I go to the same
school that your daughter attends.”
Mr. Thorne was sitting forward now, looking seriously at Shirley.
The chauffeur was looking back occasionally, as much as he dared.
“I seen that she had different clothes on,” he said, and was
answered only by a sharp glance from Mr. Thorne. But the reproving
look was quite wasted.
“I was quite deceived,” said Mr. Thorne. “My friend, Mr. Scott, told
me only this morning that a young girl who resembled my daughter
was visiting his daughter from the school.”
“Yes, sir. I visited Caroline part of the time. Caroline, Hope and I
have been together nearly all the time.”
Mr. Thorne then directed the chauffeur to go back to the place
where he had parked the car to wait for Sidney. Meantime, he
exerted himself to put Shirley at her ease. “I do not wonder that you
mistook the car. Holland has one almost like it, perhaps exactly like
it, though I never thought about it. Tell me a little about yourself,
Miss Shirley. Where do you live?”
Under Mr. Thorne’s kindly look Shirley found herself telling as she
had told no one but the Holland family, about her home, her father
and mother, the university and her one year at the girls’ school.
“Has it been a happy one so far?” asked Mr. Thorne kindly. He
looked at her so thoughtfully and with so much interest that Shirley
felt comforted some way. Here was one who did not resent her
looking like Sidney.
“Not altogether,” Shirley frankly told him, “but it was all new, and
with my father and mother so far away I have been a little bit lonely
once in a while, but not very often, for there is always so much to
do.”
“Has the close resemblance between yourself and my daughter
made any complications?”
“A few, but nothing serious,” smiled Shirley. No one should criticise
Shirley from anything she might say here in Chicago.
When they arrived at the place from which Shirley had started,
Sidney and her mother could be seen, coming from the entrance of
the store where Shirley had shopped. “Oh, I hope that they have not
waited!” exclaimed Shirley.
“If they have it is not your fault.”
“I’m afraid it is.”
Mr. Thorne helped Shirley out and drew her with him to meet his
wife and Sidney. “I will take you to find the other car,” he said. “You
must be safely started to the right place this time.”
It was a curious meeting. Sidney’s face was flaming, and Mrs.
Thorne’s was full of amazement. “Mother,” said Mr. Thorne, “this is
Miss Harcourt, who attends school with Sidney. I ran across her
accidentally. Have you been waiting for the car?”
“No,” replied Mrs. Thorne, after saying a few words to Shirley and
extending her daintily gloved hand from her furs. “We have only now
finished. Sidney expected to go home alone, for I intended to join
one of the ladies for tea at the club.”
“That accounts for Carl’s expecting only Sidney in the car, then.”
Mr. Thorne was watching the two girls, who had pleasantly
exchanged greetings as school girls would. He gave his wife a long
look, then said that he must find the Holland car for Shirley. “I will
be back in a moment,” said he. “Come, Miss Harcourt; no telling
where your car may be parked by this time, but the chauffeur is
doubtless on the lookout for you.”
“I am sorry, Mother,” said Sidney, as the two entered their own car,
“that I did not tell you before about Shirley Harcourt. But I thought
that it might annoy you as it annoyed me to have some one else
look so much like me.”
“It was startling,” replied Mrs. Thorne. “It is strange, too, that she
happened to attend the same school. I am afraid that you have not
enjoyed your term. Would you prefer to go somewhere else?”
“Perhaps,” said Sidney, “but Father will want me to get my
certificate there, I think.”
To Mr. Thorne, when he joined them, Sidney again apologized
prettily for not having told them of Shirley. “I am wondering how you
happened to meet her, Father,” she said.
Mr. Thorne related the circumstances and seemed to be surprised
at Sidney’s rather critical attitude, when she said that Shirley “might
have known the difference in cars and chauffeurs.”
“It was merely a mistake, Sidney. You might almost as well say
that Carl ought not to have mistaken her for you. I found Miss
Harcourt a very charming young girl. She told me of her father when
I inquired. He is abroad on some archæological expedition this year.
I fancy that he is rather a big man in his line.”
Then Mr. Thorne changed the topic and Sidney was relieved to
find that her parents did not pursue the subject of the resemblance.
Mr. Thorne’s explanation of a delay satisfied the waiting chauffeur,
who drove home as rapidly as the traffic would permit after Shirley
was safely deposited in the car. It had not been so long after all,
since Shirley’s wait in the store had been shorter than she had
expected. Nevertheless, she found that Hope had been uneasy.
“I believe that you are ‘psychic,’ Hope,” joked Shirley, “but my
double, that ought to be ‘psychic’ where I am concerned, if she is so like me,
is not even interested.”
“You are mistaken, Shirley. Sidney is attracted to you, but fights
it.”
“I wonder if you are right,” mused Shirley.
“Sidney can’t share anything,—not even looks!”
CHAPTER XVI.
SIDNEY’S “GHOST.”
About lunchtime the next day, Mrs. Holland answered the telephone
to find Mr. Thorne on the line. After some preliminary conversation,
he came to the point of his message. “I called you to inquire about
Hope and her guest. We were so interested yesterday in meeting the
young lady who looks so much like Sidney, that Mrs. Thorne and I
would like to meet her again. Sidney’s guests left yesterday and we
have just seen Sidney off; but if your girls are not going till later,
could we not have them for dinner? I seem to remember that Miss
Harcourt spoke of its being doubtful about her leaving till late to-day.
Mrs. Thorne is right here and she will speak to you when I am
through.”
“Thank you, Mr. Thorne; the girls may not get off until to-morrow
morning. Hope is wretched and I am not sure whether it is too much
Christmas holiday excitement or an attack of la grippe coming on.
Shirley says that she will wait to go with her, if she is able, in the
morning. They will scarcely miss anything. Oh, is this Mrs. Thorne
now? How are you, my dear? Yes, Shirley can come,—I will properly
present the invitation,—but Hope is too miserable. Wait a moment,
please.”
Mrs. Holland duly called Shirley, who said that she would be very
happy to go. Mr. Thorne, again at the telephone, said that he would
call for her on his way home.
“Hope, what have you gotten me in for by being sick?” queried
Shirley of Hope, who was lying in bed, being plied with various
remedies at different intervals.
“A pleasant acquaintance, I hope, that will make up for Sidney’s
snippiness! Has Caroline gone, do you know?”