Introduction to Machine Learning
Second Edition
Adaptive Computation and Machine Learning
Ethem Alpaydın
Alpaydin, Ethem.
Introduction to machine learning / Ethem Alpaydin. — 2nd ed.
p. cm.
Includes bibliographical references and index.
ISBN 978-0-262-01243-0 (hardcover : alk. paper)
1. Machine learning. I. Title.
Q325.5.A46 2010
006.3’1—dc22 2009013169
CIP
10 9 8 7 6 5 4 3 2 1
Brief Contents
1 Introduction 1
2 Supervised Learning 21
3 Bayesian Decision Theory 47
4 Parametric Methods 61
5 Multivariate Methods 87
6 Dimensionality Reduction 109
7 Clustering 143
8 Nonparametric Methods 163
9 Decision Trees 185
10 Linear Discrimination 209
11 Multilayer Perceptrons 233
12 Local Models 279
13 Kernel Machines 309
14 Bayesian Estimation 341
15 Hidden Markov Models 363
16 Graphical Models 387
17 Combining Multiple Learners 419
18 Reinforcement Learning 447
19 Design and Analysis of Machine Learning Experiments 475
A Probability 517
Contents
Figures xix
Tables xxix
Preface xxxi
Acknowledgments xxxiii
Notations xxxix
1 Introduction 1
1.1 What Is Machine Learning? 1
1.2 Examples of Machine Learning Applications 4
1.2.1 Learning Associations 4
1.2.2 Classification 5
1.2.3 Regression 9
1.2.4 Unsupervised Learning 11
1.2.5 Reinforcement Learning 13
1.3 Notes 14
1.4 Relevant Resources 16
1.5 Exercises 18
1.6 References 19
2 Supervised Learning 21
2.1 Learning a Class from Examples 21
4 Parametric Methods 61
4.1 Introduction 61
4.2 Maximum Likelihood Estimation 62
4.2.1 Bernoulli Density 63
4.2.2 Multinomial Density 64
4.2.3 Gaussian (Normal) Density 64
4.3 Evaluating an Estimator: Bias and Variance 65
4.4 The Bayes’ Estimator 66
4.5 Parametric Classification 69
4.6 Regression 73
4.7 Tuning Model Complexity: Bias/Variance Dilemma 76
4.8 Model Selection Procedures 80
4.9 Notes 84
4.10 Exercises 84
4.11 References 85
5 Multivariate Methods 87
5.1 Multivariate Data 87
7 Clustering 143
7.1 Introduction 143
7.2 Mixture Densities 144
7.3 k-Means Clustering 145
7.4 Expectation-Maximization Algorithm 149
7.5 Mixtures of Latent Variable Models 154
7.6 Supervised Learning after Clustering 155
7.7 Hierarchical Clustering 157
7.8 Choosing the Number of Clusters 158
7.9 Notes 160
7.10 Exercises 160
7.11 References 161
A Probability 517
A.1 Elements of Probability 517
A.1.1 Axioms of Probability 518
A.1.2 Conditional Probability 518
A.2 Random Variables 519
A.2.1 Probability Distribution and Density Functions 519
A.2.2 Joint Distribution and Density Functions 520
A.2.3 Conditional Distributions 520
A.2.4 Bayes’ Rule 521
Index 529
Series Foreword
The goal of building systems that can adapt to their environments and
learn from their experience has attracted researchers from many fields,
including computer science, engineering, mathematics, physics, neuro-
science, and cognitive science. Out of this research has come a wide
variety of learning techniques that are transforming many industrial and
scientific fields. Recently, several research communities have converged
on a common set of issues surrounding supervised, semi-supervised, un-
supervised, and reinforcement learning problems. The MIT Press Series
on Adaptive Computation and Machine Learning seeks to unify the many
diverse strands of machine learning research and to foster high-quality
research and innovative applications.
The MIT Press is extremely pleased to publish this second edition of
Ethem Alpaydın’s introductory textbook. This book presents a readable
and concise introduction to machine learning that reflects these diverse
research strands while providing a unified treatment of the field. The
book covers all of the main problem formulations and introduces the
most important algorithms and techniques encompassing methods from
computer science, neural computation, information theory, and statis-
tics. The second edition expands and updates coverage of several areas,
particularly kernel machines and graphical models, that have advanced
rapidly over the past five years. This updated work continues to be a
compelling textbook for introductory courses in machine learning at the
undergraduate and beginning graduate level.
Figures
7.1 Given x, the encoder sends the index of the closest code
word and the decoder generates the code word with the
received index as x′. 147
7.2 Evolution of k-means. 148
7.3 k-means algorithm. 149
7.4 Data points and the fitted Gaussians by EM, initialized by
one k-means iteration of figure 7.2. 153
7.5 A two-dimensional dataset and the dendrogram showing
the result of single-link clustering. 159
11.7 The multilayer perceptron that solves the XOR problem. 249
11.8 Sample training data shown as ‘+’, where x^t ∼ U(−0.5, 0.5),
and y^t = f(x^t) + N(0, 0.1). 252
11.9 The mean square error on training and validation sets as a
function of training epochs. 253
11.10 (a) The hyperplanes of the hidden unit weights on the first
layer, (b) hidden unit outputs, and (c) hidden unit outputs
multiplied by the weights on the second layer. 254
11.11 Backpropagation algorithm for training a multilayer
perceptron for regression with K outputs. 255
11.12 As complexity increases, training error is fixed but the
validation error starts to increase and the network starts to
overfit. 259
11.13 As training continues, the validation error starts to increase
and the network starts to overfit. 259
11.14 A structured MLP. 260
11.15 In weight sharing, different units have connections to
different inputs but share the same weight value (denoted
by line type). 261
11.16 The identity of the object does not change when it is
translated, rotated, or scaled. 262
11.17 Two examples of constructive algorithms. 265
11.18 Optdigits data plotted in the space of the two hidden units
of an MLP trained for classification. 268
11.19 In the autoassociator, there are as many outputs as there
are inputs and the desired outputs are the inputs. 269
11.20 A time delay neural network. 271
11.21 Examples of MLP with partial recurrency. 272
11.22 Backpropagation through time. 273
12.1 Shaded circles are the centers and the empty circle is the
input instance. 282
12.2 Online k-means algorithm. 283
12.3 The winner-take-all competitive neural network, which is a
network of k perceptrons with recurrent connections at the
output. 284
12.4 The distance from x^a to the closest center is less than the
vigilance value ρ and the center is updated as in online
k-means. 285
xxiv Figures
12.5 In the SOM, not only the closest unit but also its neighbors,
in terms of indices, are moved toward the input. 287
12.6 The one-dimensional form of the bell-shaped function used
in the radial basis function network. 289
12.7 The difference between local and distributed representations. 290
12.8 The RBF network where p_h are the hidden units using the
bell-shaped activation function. 292
12.9 (-) Before and (- -) after normalization for three Gaussians
whose centers are denoted by ‘*’. 296
12.10 The mixture of experts can be seen as an RBF network
where the second-layer weights are outputs of linear models. 301
12.11 The mixture of experts can be seen as a model for
combining multiple models. 302
Tables
2.1 With two inputs, there are four possible cases and sixteen
possible Boolean functions. 37
Acknowledgments
The way you get good ideas is by working with talented people who are
also fun to be with. The Department of Computer Engineering of Boğaziçi
University is a wonderful place to work, and my colleagues gave me all the
support I needed while working on this book. I would also like to thank
my past and present students on whom I have field-tested the content
that is now in book form.
While working on this book, I was supported by the Turkish Academy
of Sciences, in the framework of the Young Scientist Award Program (EA-
TÜBA-GEBİP/2001-1-1).
My special thanks go to Michael Jordan. I am deeply indebted to him
for his support over the years, and lastly for this book. His comments on
the general organization of the book, and the first chapter, have greatly
improved the book, both in content and form. Taner Bilgiç, Vladimir
Cherkassky, Tom Dietterich, Fikret Gürgen, Olcay Taner Yıldız, and anony-
mous reviewers of the MIT Press also read parts of the book and provided
invaluable feedback. I hope that they will sense my gratitude when they
notice ideas that I have taken from their comments without proper ac-
knowledgment. Of course, I alone am responsible for any errors or short-
comings.
My parents believe in me, and I am grateful for their enduring love
and support. Sema Oktuğ is always there whenever I need her, and I will
always be thankful for her friendship. I would also like to thank Hakan
Ünlü for our many discussions over the years on several topics related to
life, the universe, and everything.
This book is set using LATEX macros prepared by Chris Manning for
which I thank him. I would like to thank the editors of the Adaptive Com-
putation and Machine Learning series, and Bob Prior, Valerie Geary,
Kathleen Caruso, Sharon Deacon Warne, Erica Schultz, and Emily Gutheinz
from the MIT Press for their continuous support and help during the
completion of the book.
Notes for the Second Edition
Machine learning has seen important developments since the first edition
appeared in 2004. First, application areas have grown rapidly. Internet-
related technologies, such as search engines, recommendation systems,
spam filters, and intrusion detection systems are now routinely using ma-
chine learning. In the field of bioinformatics and computational biology,
methods that learn from data are being used more and more widely. In
natural language processing applications—for example, machine transla-
tion—we are seeing a faster and faster move from programmed expert
systems to methods that learn automatically from very large corpora of
example text. In robotics, medical diagnosis, speech and image recogni-
tion, biometrics, finance, sometimes under the name pattern recognition,
sometimes disguised as data mining, or under one of its many cloaks,
we see more and more applications of the machine learning methods we
discuss in this textbook.
Second, there have been supporting advances in theory. In particular,
the idea of kernel functions and the kernel machines that use them allow
a better representation of the problem, and the associated convex opti-
mization framework is a step beyond multilayer perceptrons with sigmoid
hidden units trained using gradient descent. Bayesian meth-
ods through appropriately chosen prior distributions add expert know-
ledge to what the data tells us. Graphical models allow a representa-
tion as a network of interrelated nodes and efficient inference algorithms
allow querying the network. It has thus become necessary that these
three topics—namely, kernel methods, Bayesian estimation, and graphi-
cal models—which were sections in the first edition, be treated in more
length, as three new chapters.
Another revelation hugely significant for the field has been in the real-
I would like to thank all the instructors and students of the first edition,
from all over the world, including the reprint in India and the German
translation. I am grateful to those who sent me words of appreciation
and errata or who provided feedback in any other way. Please keep those
emails coming. My email address is [email protected].
The second edition also provides more support on the Web. The book’s
I would like to thank my past and present thesis students, Mehmet Gönen,
Esma Kılıç, Murat Semerci, M. Aydın Ulaş, and Olcay Taner Yıldız, and also
those who have taken CmpE 544, CmpE 545, CmpE 591, and CmpE 58E
during these past few years. The best way to test your knowledge of a
topic is by teaching it.
It has been a pleasure working with the MIT Press again on this second
edition, and I thank Bob Prior, Ada Brunstein, Erin K. Shoudy, Kathleen
Caruso, and Marcy Ross for all their help and support.
Notations
x Scalar value
x Vector
X Matrix
x^T Transpose
X^−1 Inverse
X Random variable
P (X) Probability mass function when X is discrete
p(X) Probability density function when X is continuous
P (X|Y ) Conditional probability of X given Y
E[X] Expected value of the random variable X
Var(X) Variance of X
Cov(X, Y ) Covariance of X and Y
Corr(X, Y ) Correlation of X and Y
μ Mean
σ^2 Variance
Σ Covariance matrix
m Estimator to the mean
s^2 Estimator to the variance
S Estimator to the covariance matrix
x Input
d Number of inputs (input dimensionality)
y Output
r Required output
K Number of outputs (classes)
N Number of training instances
z Hidden value, intrinsic dimension, latent factor
k Number of hidden dimensions, latent factors
Ci Class i
X Training sample
{x^t}_{t=1}^N Set of x with index t ranging from 1 to N
{x^t, r^t}_t Set of ordered pairs of input and desired output with index t
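As a concrete illustration of this notation (a minimal sketch assumed here, not taken from the book), the training sample X can be held as an N × d array with one row per instance x^t, together with a vector of desired outputs r^t; the toy data and variable names below are for illustration only.

import numpy as np

# Toy data (assumed): N instances of d inputs, with K = 2 class labels.
N, d = 100, 3
rng = np.random.default_rng(0)
X = rng.normal(size=(N, d))        # training sample {x^t}, one row per x^t
r = rng.integers(0, 2, size=N)     # desired outputs {r^t}
m = X.mean(axis=0)                 # m: estimator of the mean
S = np.cov(X, rowvar=False)        # S: estimator of the covariance matrix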
devices are digital now and record reliable data. Think, for example, of a
supermarket chain that has hundreds of stores all over a country selling
thousands of goods to millions of customers. The point of sale terminals
record the details of each transaction: date, customer identification code,
goods bought and their amount, total money spent, and so forth. This
typically amounts to gigabytes of data every day. What the supermarket
chain wants is to be able to predict who are the likely customers for a
product. Again, the algorithm for this is not evident; it changes in time
and by geographic location. The stored data becomes useful only when
it is analyzed and turned into information that we can make use of, for
example, to make predictions.
We do not know exactly which people are likely to buy this ice cream
flavor, or the next book of this author, or see this new movie, or visit this
city, or click this link. If we knew, we would not need any analysis of the
data; we would just go ahead and write down the code. But because we
do not, we can only collect data and hope to extract the answers to these
and similar questions from data.
We do believe that there is a process that explains the data we observe.
Though we do not know the details of the process underlying the gener-
ation of data—for example, consumer behavior—we know that it is not
completely random. People do not go to supermarkets and buy things
at random. When they buy beer, they buy chips; they buy ice cream in
summer and spices for Glühwein in winter. There are certain patterns in
the data.
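As a toy illustration of such a pattern (an assumed example, not the book's), the conditional probability of buying chips given that beer was bought can be estimated directly from transaction records by simple counting:

# Assumed toy transactions; estimate P(chips | beer) by counting baskets.
transactions = [
    {"beer", "chips", "milk"},
    {"beer", "chips"},
    {"bread", "milk"},
    {"beer", "milk"},
]
with_beer = [t for t in transactions if "beer" in t]
p_chips_given_beer = sum("chips" in t for t in with_beer) / len(with_beer)
print(p_chips_given_beer)   # 2 of the 3 beer baskets also contain chips

This is the kind of association taken up in section 1.2.1.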
We may not be able to identify the process completely, but we believe
we can construct a good and useful approximation. That approximation
may not explain everything, but may still be able to account for some part
of the data. We believe that though identifying the complete process may
not be possible, we can still detect certain patterns or regularities. This
is the niche of machine learning. Such patterns may help us understand
the process, or we can use those patterns to make predictions: Assuming
that the future, at least the near future, will not be much different from
the past when the sample data was collected, the future predictions can
also be expected to be right.
Application of machine learning methods to large databases is called
data mining. The analogy is that a large volume of earth and raw ma-
terial is extracted from a mine, which when processed leads to a small
amount of very precious material; similarly, in data mining, a large vol-
ume of data is processed to construct a simple model with valuable use,
for example, having high predictive accuracy. Its application areas are
abundant: In addition to retail, in finance banks analyze their past data
to build models to use in credit applications, fraud detection, and the
stock market. In manufacturing, learning models are used for optimiza-
tion, control, and troubleshooting. In medicine, learning programs are
used for medical diagnosis. In telecommunications, call patterns are an-
alyzed for network optimization and maximizing the quality of service.
In science, large amounts of data in physics, astronomy, and biology can
only be analyzed fast enough by computers. The World Wide Web is huge;
it is constantly growing, and searching for relevant information cannot be
done manually.
But machine learning is not just a database problem; it is also a part
of artificial intelligence. To be intelligent, a system that is in a changing
environment should have the ability to learn. If the system can learn and
adapt to such changes, the system designer need not foresee and provide
solutions for all possible situations.
Machine learning also helps us find solutions to many problems in vi-
sion, speech recognition, and robotics. Let us take the example of rec-
ognizing faces: This is a task we do effortlessly; every day we recognize
family members and friends by looking at their faces or from their pho-
tographs, despite differences in pose, lighting, hair style, and so forth.
But we do it unconsciously and are unable to explain how we do it. Be-
cause we are not able to explain our expertise, we cannot write the com-
puter program. At the same time, we know that a face image is not just a
random collection of pixels; a face has structure. It is symmetric. There
are the eyes, the nose, the mouth, located in certain places on the face.
Each person’s face is a pattern composed of a particular combination
of these. By analyzing sample face images of a person, a learning pro-
gram captures the pattern specific to that person and then recognizes by
checking for this pattern in a given image. This is one example of pattern
recognition.
Machine learning is programming computers to optimize a performance
criterion using example data or past experience. We have a model defined
up to some parameters, and learning is the execution of a computer pro-
gram to optimize the parameters of the model using the training data or
past experience. The model may be predictive to make predictions in the
future, or descriptive to gain knowledge from data, or both.
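For instance, the model might be a line y = wx + w0 defined up to the two parameters w and w0, and learning then means choosing the parameter values that minimize the error on the training data. The sketch below (an assumed illustration with synthetic data, not code from the book) fits such a line by least squares:

import numpy as np

# Synthetic training data (assumed): noisy outputs of a true line 2x + 0.5.
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, size=50)               # inputs x^t
y = 2.0 * x + 0.5 + rng.normal(0, 0.1, 50)    # desired outputs r^t

# The model y = w*x + w0 is defined up to the parameters (w, w0);
# learning optimizes them to minimize squared error on the training data.
A = np.column_stack([x, np.ones_like(x)])
(w, w0), *_ = np.linalg.lstsq(A, y, rcond=None)
print(w, w0)                                  # close to the true 2.0 and 0.5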
Machine learning uses the theory of statistics in building mathematical
models, because the core task is making inference from a sample. The
1.2.2 Classification