Training Data for Machine
Learning Models
With Early Release ebooks, you get books in their earliest form—
the author’s raw and unedited content as they write—so you can
take advantage of these technologies long before the official
release of these titles.
Anthony Sarkis
Training Data for Machine Learning Models
by Anthony Sarkis
Copyright © 2021 Anthony Sarkis. All rights reserved.
Printed in the United States of America.
Published by O’Reilly Media, Inc., 1005 Gravenstein Highway North,
Sebastopol, CA 95472.
O’Reilly books may be purchased for educational, business, or sales
promotional use. Online editions are also available for most titles
(http://oreilly.com). For more information, contact our
corporate/institutional sales department: 800-998-9938 or
[email protected].
Copyeditor: TK
Proofreader: TK
Indexer: TK
Model
A Machine Learning (ML) Model created as the end result of an ML Training Process.
Figure 1-1. Diagram or screenshot of common supervision interface.
Training Data is not an algorithm, nor is it tied to a specific machine learning approach. Rather, it’s the
definition of what we want to achieve. A fundamental challenge is effectively identifying and mapping
the desired human meaning into a machine-readable form.
The effectiveness of training data depends primarily on how well it relates to the human-defined
meaning and how reasonably it represents real model usage. Practically, choices around Training Data
have a huge impact on the ability to train a model effectively.
Training Data makes sense when a set of conditions is true. For example, training data for a parking
lot detection system may come from very different views. If we create a training data set based on a top-down
view (see the left side of Figure 1-2) and then attempt to use the image on the right, we will get
unexpected results. That’s why it’s important that the data we use to train a system closely matches the
data our trained system will see in production.
Figure 1-2. If the left is your training data and right is your use case, you are in trouble!
See Figure 1-2: a machine learning system trained only on images from a top-down view, as in the left
image, has a hard time running in an environment where the images are from a front view, as in
the right image. Our system would not understand the concept of a car and a parking lot from a front
view if it has never seen such an image during training.
Concepts Introduction
There are two general categories of training data: Classic and Supervised. The general focus of this
book is on Supervised. We will contrast Supervised to Classic later in more detail. The following
presentation of concepts is intended to be introductory to provide a baseline understanding around
definitions and assumptions. These themes will be explored in greater detail throughout the book.
Representations
Let’s imagine we are working on a Strawberry picking system. We need to know what a strawberry is,
where it is, and how ripe it is. Let’s introduce a few new terms to help us work more efficiently:
Label
A Label, also called a class,2 represents the highest level of meaning (other names include label
template). For example, a label can be “Strawberry” or “Leaf”. For technical folks, you can think
of it as a table in a database. It is usually attached3 to a specific Instance.
Instance
A single example. It is connected to a Label to define what it is, and usually contains positional or
spatial information defining where it is. Continuing the technical analogy, this is like a row in a
database. An Instance may have many Attributes.
Figure 1-3. Diagram showing labeled and not labeled instances.
Attributes
Attributes are things unique to a specific Instance. Imagine you only want the system to pick
Strawberries of a certain ripeness. You may represent the ripeness as a slider, or you may have
a multiple-choice question about ripeness. From the database perspective this is somewhat
akin to a column. This choice will affect the speed of supervision. A single instance may have many
unique attributes; for example, in addition to ripeness there may be disease identification, produce
quality grading, etc.
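To ground the database analogy, here is a hypothetical instance record in plain Python. The schema keys (label, spatial, attributes) are illustrative only, not any particular tool’s format:

```python
# A hypothetical instance record for the strawberry picker.
instance = {
    "label": "Strawberry",          # the class: the highest level of meaning
    "spatial": {                     # where it is (a box here; could be a polygon)
        "type": "box",
        "x_min": 120, "y_min": 45, "x_max": 210, "y_max": 130,
    },
    "attributes": {                  # things unique to this specific instance
        "ripeness": 0.8,             # e.g., a slider value from 0.0 to 1.0
        "disease": "none",           # e.g., a multiple-choice answer
    },
}

# Like a row in a database: the label is the table, the instance is the row,
# and each attribute is akin to a column.
print(instance["label"], instance["attributes"]["ripeness"])
```

Note how a second instance could share the same label (“Strawberry”) while having entirely different spatial and attribute values, just as two rows share a table’s schema.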
Choices
Your choices here are as much a part of Training Data as the literal supervision itself. You also make
choices about what type of Spatial representation to use.
Spatial
We can represent the Strawberries as boxes, polygons, or other options. In later chapters I will discuss these
trade-offs.
There are other choices which will be detailed in later chapters. From a system design perspective there
can be choices around what type of raw data to use, such as image, audio, text, or video, and
sometimes even combining multiple modalities together. Angle, size, and other attributes may also apply
here. Unlike attributes generally, spatial locations are singular in nature: an object is usually only in one
place at a given moment in time.4
This is all in the context of explicit (or direct) supervision of the data. Someone is directly viewing the
data and taking action. This contrasts with classic training data where the data is implicitly observed “in
the wild” and not editable by humans.
We will cover this in more detail in Chapter 3.
Sets of Assumptions
Imagine you are an Olympic runner training to compete in the set of conditions that are expected to be
present at the Olympics. It’s likely that if you are training for a specific event, say the 100 Meter, then you
will train only for the 100 Meter, and not for the 400, 800, or High Jump, because while similar, those
events are outside the scope of what you expect to encounter: the 100 Meter.
Training Data is very similar. We define a set of assumptions and expect the Training Data to be useful
in that context, and only in that context. Similar to the above, we can start with high-level assumptions.
Our strawberry picker is assumed to be on a commercial strawberry field. Then, like the 100 Meter
specificity, we can get into more specific assumptions. For example, perhaps we assume the camera will
be mounted to a drone or a ground-based robot, or that the system will be used during the day.
Assumptions exist in any system; however, they take on a different context with Training Data because of
the inherent randomness involved. Somewhat surprisingly, human analogies around training (for sports,
work, etc.) are actually very applicable to Training Data.
Randomness
Let’s zoom in on this human-centric example of training for the Olympics. I can train how to do
something all my life, such as trying to beat an Olympic record, and still not be 100% certain that I will
be able to do it. In fact, for many things, I can probably only be certain I won’t be able to do it. The
intuition that I am not guaranteed to be an Olympian is clear. Getting a similar intuition around AI
training is part of the challenge.
This is especially hard because we typically think of computer systems as being deterministic, meaning
if I run the same operation twice I will get the same result. The way AI models get trained is not
deterministic. The world in which AIs operate is not deterministic. The processes around creation of
training data involve humans and are not deterministic. Therefore, at the heart of training data is an
inherent randomness. Much of the work with training data, especially around the abstractions, is
defining what is and is not possible in the system: essentially trying to rein in and control the
randomness within more reasonable bounds.
We create training data to cover the expected cases: what we expect to happen. And as we will cover
in more depth later, we primarily use rapid retraining of the data to handle the expected randomness of
the world.
Relevancy
Continuing the theme of validating existing data: how do we know if our data is actually relevant to
the real world? For example, we could get a perfect 100% on a test set, but if the test set isn’t relevant
to the real-world data then we are in trouble! There is currently no known direct test that can determine
whether the data is relevant to the production data; only time will truly tell. This is similar to the way a
traditional system can be load tested for x number of users, but the true usage pattern will always be
different. Of course, we can do our best to plan for the expected use and design the system accordingly.
What-To-Label
As part of Dataset Construction we know we need to create a smaller set from a larger set of raw data -
but how? This is the concern of What-To-Label. Generally, the goal of these approaches is to find
“interesting” data. For example if I have thousands of images that are similar, but 3 are obviously
different, I may get the most bang for my buck if I start with the 3 most different ones.
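A minimal sketch of one such approach, assuming each image has already been reduced to a numeric feature vector (real systems typically use learned embeddings and richer diversity measures): rank items by distance from the pool’s mean, so the few “obviously different” images surface first.

```python
import math

def most_different(features, k):
    """Return indices of the k items farthest from the pool's mean vector.

    A toy stand-in for What-To-Label ranking: near-duplicate items sit
    close to the mean, while unusual ones rank first.
    """
    dims = len(features[0])
    mean = [sum(f[d] for f in features) / len(features) for d in range(dims)]

    def dist(f):
        return math.sqrt(sum((f[d] - mean[d]) ** 2 for d in range(dims)))

    ranked = sorted(range(len(features)), key=lambda i: dist(features[i]),
                    reverse=True)
    return ranked[:k]

# Seven near-identical images plus three obviously different ones.
pool = [[1.0, 1.0]] * 7 + [[9.0, 0.0], [0.0, 9.0], [8.0, 8.0]]
print(most_different(pool, 3))  # [9, 7, 8]: the three outliers rank first
```

Starting supervision with those top-ranked items is the “most bang for my buck” idea from the paragraph above.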
Iterations
In traditional programming we iterate on the design of functions, features, and systems. In Training
Data there is a similar form of iteration on all of the concepts we discuss here. The models
themselves are also iterative; for example, they may be retrained at a predetermined frequency, such as
daily. Two of the biggest competing approaches here are “fire and forget” and “continual
retrain”. In some cases it may be impractical to retrain, and so a single, final model is created.
Transfer Learning
The idea of transfer learning is to start off from an existing knowledge base before training a new
model.
Transfer learning is used to dramatically speed up training new models. From a Training Data view,
transfer learning introduces challenges around bias, because we are indirectly using the training data from
that prior model’s training. If there was undesirable bias in that model, it may carry over to our new use case.
Essentially it creates a dependency on that prior training data set. Dependencies are an unavoidable
reality of software, but it’s important to be aware of them and surface the trade-offs clearly.
Technical Specifics
There are a variety of technical specifics, such as formats and representations that I will cover in some
detail. While generally these representations have a “flavor of the month” feel, I will cover some of the
currently popular ones and speak to the general goals the formats are aiming to achieve.
Dataset
A dataset is like a folder. It usually has the special meaning that there are both “raw” data (such as images) and
annotations in the same place. For example a folder of 100 images plus a text file that lists the annotations.
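The folder idea can be sketched in a few lines. The file names and JSON layout here are purely illustrative, not a standard format:

```python
import json
from pathlib import Path
from tempfile import mkdtemp

# A hypothetical dataset folder: raw images plus one annotations file.
dataset = Path(mkdtemp()) / "strawberry_dataset"
dataset.mkdir()

for name in ["img_000.jpg", "img_001.jpg"]:
    (dataset / name).write_bytes(b"")  # empty stand-ins for real image bytes

annotations = {"img_000.jpg": "Strawberry", "img_001.jpg": "Leaf"}
(dataset / "annotations.json").write_text(json.dumps(annotations))

# Both the "raw" data and the annotations now live in the same place.
print(sorted(p.name for p in dataset.iterdir()))
# ['annotations.json', 'img_000.jpg', 'img_001.jpg']
```

Real tooling adds versioning, schemas, and remote storage on top, but the “folder with raw data plus annotations” shape is the same.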
These systems continue to have significant value - however - they have some limits. They won’t help us
build systems to interpret a computerized tomography (CT) Scan, understand football tactics, or drive a
car. As models require less and less data to be effective, it puts more emphasis on creating application
specific training data.
The idea behind Supervised Learning is generally a human expressly saying “here’s an example of what
a player passing a ball looks like,” “here’s what a tumor looks like,” or “this section of the apple is rotten.”
Control
The question in any system: control.
How Training Data Controls the Model
Where is the control? In normal computer code this is human written logic in the form of loops, if
statements, etc. This logic defines the system.
In Machine Learning I define features of interest and a dataset. The algorithm generates a model which
is effectively the control logic. I exercise control by choosing features.
In a Deep Learning system, the algorithm does its own Feature Selection. The algorithm attempts to
determine what features are relevant to a given goal. That goal is defined by Training Data. In fact,
Training Data is the entire definition of the goal.
Here’s how it works. An internal part of the algorithm, called a loss function, describes a key part of how
the algorithm can learn a good representation of this goal. This is not the goal itself. The algorithm uses
this loss function to determine how close it is to the goal defined in the training data. The training data
is the “ground truth” for correctness of the model’s relationship to the human defined goal.5
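To make that relationship concrete, here is a toy sketch in pure Python (not any framework’s API) of a loss function scoring model predictions against the ground truth defined by the training data:

```python
import math

def cross_entropy(predicted_probs, true_labels):
    """Average negative log-likelihood of the ground-truth labels.

    predicted_probs: per-example class-probability lists from a model.
    true_labels: ground-truth class indices from the training data.
    Lower is better: the loss measures how far the model sits from the
    human-defined goal encoded in the training data.
    """
    total = 0.0
    for probs, label in zip(predicted_probs, true_labels):
        total += -math.log(probs[label])
    return total / len(true_labels)

truth = [1, 0]                          # ground truth from the training data
confident = [[0.1, 0.9], [0.9, 0.1]]    # model agrees with the supervisors
uncertain = [[0.5, 0.5], [0.5, 0.5]]    # model has learned nothing yet

# Training drives this number down toward the goal the data defines.
assert cross_entropy(confident, truth) < cross_entropy(uncertain, truth)
```

The loss is not the goal itself; it is the algorithm’s internal yardstick for how close its current representation is to the goal the training data defines.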
Dependencies
In traditional software development there is a degree of dependency between the end user and
engineering. The end user cannot truly say if the program is “correct”, and neither can the engineer.
Their definitions of correctness may be similar, but are most likely not exactly equal. It’s hard for an end
user to say what they want until a “prototype” of it has been built. Therefore both the end user and the
engineer are dependent on each other. This is called a circular dependency. The ability to improve the
software comes from the interplay between both.
With Training Data, the AI Supervisors control the meaning of the system when doing the literal
supervision. The Data Scientists control it when choosing abstractions such as label templates.
For example, if I as a supervisor were to label a tumor as cancerous when in fact it’s benign, I would be
controlling the output of the system in a detrimental way. In this context, it’s worth understanding there
is no validation possible to ever 100% eliminate this control. Engineering cannot, in a reasonable time
frame, look at all the data.
Historical Aside
There used to be an assumption that Data Science knew what ‘correct’ was. The theory was that they could
define some examples of correct, and then, as long as the human supervisors generally stuck to that guide, they
knew what correct was. The problem is, how can an English-speaking data scientist know if a translation to French
is correct? How can a data scientist know if a doctor’s medical opinion on an X-Ray image is correct? The short
answer is: they can’t. As the role of AI systems grows, subject matter experts increasingly exercise control over the
system that supersedes Data Science.6
To understand why, consider that this goes beyond the “garbage in, garbage out” phrase. In a traditional
program, while the end user may not be happy, the engineer can, through a concept called unit tests, at
least guarantee that the code is “correct”.
This is impossible in the context of training data, because the controls available to engineering, such as
a validation set, are still based on the control executed by the individual AI supervisors.
Note: Classic cases are those where there’s existing data that can’t be edited (say, sales statistics that are fixed),
in contrast to the supervised context of changing the underlying data.
Further, the AI supervisors are generally bound by the control exerted by engineering in defining the
abstractions they are allowed to use. It’s almost as though anything an end user writes, starts to
become part of the fabric of the system itself.
This blurring of the lines between “content” and “system” is important. This is distinctly different from
classic systems. For example, on a social media platform, your content may be the value, but it’s still
clear what is the literal system (the box you type in, the results you see, etc) and the content you post
(text, pictures, etc).
While this entire book is about the concepts around (control of) training data, it’s worth understanding
that:
Training Data abstractions define Data Science’s control, not just algorithm selection.
Training Data literals define Supervisors’ control. Their control can supersede Data Science’s.
Discovery
Training Data classically has been about the discovery of new insights and useful heuristics. The starting
point is often text, tabular, and time series data. This data is usually used for a form of discovery, such as
recommender systems (Netflix suggested movies), anomaly detection, and car reconditioning cost prediction.
Crucially, there is no form of human “supervision”. In the modern deep learning context, there may not
even be feature engineering. To slightly oversimplify, the data is fixed, a very high-level goal is defined,
and the algorithm goes to work.
Feature engineering
A practice of selecting specific columns (or slices) that are more relevant, for example the odometer column for a
vehicle reconditioning cost predictor
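As an illustrative sketch (all column names made up), feature engineering here just means slicing the relevant columns out of a fixed table:

```python
# Hypothetical vehicle records; in the Classic context this table is fixed.
rows = [
    {"vin": "A1", "odometer": 120000, "color": "red",  "recon_cost": 2400},
    {"vin": "B2", "odometer": 30000,  "color": "blue", "recon_cost": 600},
]

# Feature engineering: keep only the slice judged relevant to the target,
# dropping columns (vin, color) assumed to carry little signal.
features = [[r["odometer"]] for r in rows]
targets = [r["recon_cost"] for r in rows]

print(features, targets)  # [[120000], [30000]] [2400, 600]
```

The human judgment lives in the column selection; no one is “supervising” individual examples the way the next section describes.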
Monkey See, Monkey Do
With Supervised, we already know what the correct answer is, and the goal is essentially to copy that
understanding. This direct control makes supervised learning applicable to a new variety of use cases,
especially in the realm of “dense” data such as video, images, and audio. Instead of the data being
fixed, we can even control generating new data, such as taking new images.
We will cover a refresher on the Classic context and an in-depth comparison of how it relates to this new
Supervised context.
Introduction
Imagine we are building an autonomous system, such as a traffic light detection system.
The system will have a deep learning model that has been trained on a set of training data.
This training data consists of:
Raw images (or video)
Labels
Here we will discuss a few different approaches and the appropriate training data.
Figure 2 TK
To supervise Example One, we need only two things:
1. To capture the relation to the file itself, e.g., that it’s “sensor_front_2020_10_10_01_000.”
This is the “link” to the raw pixels. Eventually this file will be read, and the values converted into
tensors, e.g., position (0,0) having RGB values.
2. To declare what it is in a way meaningful to us: “Traffic_light” or `1`.
This is the core mapping idea. It can be done on pen and paper and can also be done in code.
Realistically we will need proper tools to create production training data, but from a conceptual
standpoint this is equally correct.
For example, consider this Python code. Here we create a list and populate it with lists where the 0th
index is the file path and the 1st index is the label ID. The completed result (assuming the literal bytes of the
.jpgs are available in the same folder) is a set of training data.7
Training_Data = [
    ['tmp/sensor_front_2020_10_10_01_000.jpg', 1],
    ['tmp/sensor...001.jpg', 0],
    ['tmp/sensor...002.jpg', 0]]
This is missing a label map (what does 0 mean?). We can represent this as a simple dictionary:
Label_map = {
    0: "Traffic_light",
    1: "No Traffic light"
}
Congrats! You have just created a Training Data Set, from scratch, with no tooling, and minimal effort!
With a little wrangling, this would be a completely valid starting point for a basic classification algorithm.
You can also see how, by simply adding more items to the list, you can increase the training data set.
While “real” sets are typically larger, and typically have more complex annotations, this is a great start!
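As a taste of that wrangling, a minimal sanity check (a hypothetical helper, not part of any tool) can confirm that every label ID in the list resolves through the label map; the extra file names below are made up for illustration:

```python
def validate(training_data, label_map):
    """Return (path, label_id) pairs whose label ID is missing from the map."""
    return [(path, label_id) for path, label_id in training_data
            if label_id not in label_map]

training_data = [
    ['tmp/sensor_front_2020_10_10_01_000.jpg', 1],
    ['tmp/img_b.jpg', 0],
    ['tmp/img_c.jpg', 3],   # 3 has no entry in the label map: a bug
]
label_map = {0: "Traffic_light", 1: "No Traffic light"}

print(validate(training_data, label_map))  # [('tmp/img_c.jpg', 3)]
```

An analogous check that each file path actually exists on disk would catch the other common failure mode of a hand-built set.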
Supervision vs Annotation
Annotation is a popular phrase for anything to do with training data. Annotation generally implies adding or
drawing information, without regard to any concept of a system. In other words, to annotate is generally a “secondary”
action. This masks the importance and the context of the work being done. Supervision more accurately reflects
the overall scope and context of the work. It also better reflects the increasingly common context of correcting
(supervising!) an existing model or system.
Getting Started
This process will require several stages.
For those with technical knowledge, let’s first dispel a notion: this is not about balancing the dataset.
Try to forget the concept of balancing while considering this.
To illustrate the need for net lift, consider a raw, unlabeled dataset in which 10% is labeled [shown in Fig
as Dataset]. As a baseline approach we will randomly sample data at 3 points (10/30/80%). At each point
we will look at model performance. If the performance is unchanged we will stop.
By chance we draw all hearts each time. Each additional heart we supervise provides minimal value,
since we have already seen many hearts. Further, we don’t really understand the complete production
picture because we did not encounter circles or triangles.
Two different things here:
1. The idea of identifying previously unknown cases
2. The idea of wanting to maximize the value of each net annotation.
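The diminishing value of drawing yet another heart can be simulated. This toy sketch uses made-up class proportions for a heavily skewed pool; random sampling keeps returning the majority class, which is what motivates smarter What-To-Label strategies:

```python
from collections import Counter
import random

random.seed(0)  # fixed seed so the sketch is reproducible

# A hypothetical skewed raw pool: mostly "heart", rare "circle"/"triangle".
pool = ["heart"] * 940 + ["circle"] * 40 + ["triangle"] * 20

for fraction in (0.10, 0.30, 0.80):
    sample = random.sample(pool, int(len(pool) * fraction))
    seen = set(sample)
    # Hearts dominate every draw, so each extra labeled heart adds little
    # net lift, and rare classes may be missed at small sample sizes.
    print(f"{fraction:.0%} sampled -> {dict(Counter(sample))}")
```

Every draw is dominated by hearts, so the marginal annotation is nearly redundant, while the rare circles and triangles, the cases we most need to understand, may not surface at all.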