Selected Solutions for Chapter 2:
Getting Started
SELECTION-SORT(A)
    n = A.length
    for j = 1 to n - 1
        smallest = j
        for i = j + 1 to n
            if A[i] < A[smallest]
                smallest = i
        exchange A[j] with A[smallest]
The algorithm maintains the loop invariant that at the start of each iteration of the outer for loop, the subarray A[1..j-1] consists of the j-1 smallest elements in the array A[1..n], and this subarray is in sorted order. After the first n-1 elements, the subarray A[1..n-1] contains the smallest n-1 elements, sorted, and therefore element A[n] must be the largest element.
The running time of the algorithm is Θ(n²) for all cases.
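For reference, here is a minimal runnable sketch of the same selection sort in Python; the function name and the small test array are illustrative, not from the text.

def selection_sort(A):
    """Sort list A in place by repeatedly selecting the smallest remaining element."""
    n = len(A)
    for j in range(n - 1):             # corresponds to j = 1 to n - 1
        smallest = j
        for i in range(j + 1, n):      # corresponds to i = j + 1 to n
            if A[i] < A[smallest]:
                smallest = i
        A[j], A[smallest] = A[smallest], A[j]   # exchange A[j] with A[smallest]
    return A

print(selection_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]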
Modify the algorithm so that it tests whether the input satisfies some special-case condition and, if it does, outputs a pre-computed answer. The best-case running time is generally not a good measure of an algorithm.
The procedures search for the value v in the subarray A[low..high]. The initial call to either version should have the parameters A, v, 1, n.
Both procedures terminate the search unsuccessfully when the range is empty (i.e., low > high) and terminate it successfully if the value v has been found. Based on the comparison of v to the middle element in the searched range, the search continues with the range halved. The recurrence for these procedures is therefore T(n) = T(n/2) + Θ(1), whose solution is T(n) = Θ(lg n).
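Since the pseudocode for the two versions is not reproduced in this excerpt, the following is a minimal iterative sketch in Python, assuming a sorted array and 0-based indexing (names are illustrative).

def binary_search(A, v):
    """Return an index of v in sorted list A, or None if v is absent."""
    low, high = 0, len(A) - 1
    while low <= high:                 # unsuccessful termination when low > high
        mid = (low + high) // 2
        if A[mid] == v:
            return mid                 # successful termination
        elif A[mid] < v:
            low = mid + 1              # continue in the upper half
        else:
            high = mid - 1             # continue in the lower half
    return None

print(binary_search([1, 3, 5, 7, 9, 11], 7))   # 3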
a. The inversions are (1, 5), (2, 5), (3, 4), (3, 5), (4, 5). (Remember that inversions are specified by indices rather than by the values in the array.)
b. The array with elements from {1, 2, ..., n} with the most inversions is ⟨n, n-1, n-2, ..., 2, 1⟩. For all 1 ≤ i < j ≤ n, there is an inversion (i, j). The number of such inversions is (n choose 2) = n(n-1)/2.
c. Suppose that the array A starts out with an inversion (k, j). Then k < j and A[k] > A[j]. At the time that the outer for loop of lines 1-8 sets key = A[j], the value that started in A[k] is still somewhere to the left of A[j]. That is, it's in A[i], where 1 ≤ i < j, and so the inversion has become (i, j). Some iteration of the while loop of lines 5-7 moves A[i] one position to the right. Line 8 will eventually drop key to the left of this element, thus eliminating the inversion. Because line 5 moves only elements that are less than key, it moves only elements that correspond to inversions. In other words, each iteration of the while loop of lines 5-7 corresponds to the elimination of one inversion.
d. We follow the hint and modify merge sort to count the number of inversions in Θ(n lg n) time.
To start, let us define a merge-inversion as a situation within the execution of merge sort in which the MERGE procedure, after copying A[p..q] to L and A[q+1..r] to R, has values x in L and y in R such that x > y. Consider an inversion (i, j), and let x = A[i] and y = A[j], so that i < j and x > y. We claim that if we were to run merge sort, there would be exactly one merge-inversion involving x and y. To see why, observe that the only way in which array elements change their positions is within the MERGE procedure. Moreover, since MERGE keeps elements within L in the same relative order to each other, and correspondingly for R, the only way in which two elements can change their ordering relative to each other is for the greater one to appear in L and the lesser one to appear in R. Thus, there is at least one merge-inversion involving x and y. To see that there is exactly one such merge-inversion, observe that after any call of MERGE that involves both x and y, they are in the same sorted subarray and will therefore both appear in L or both appear in R in any given call thereafter. Thus, we have proven the claim.
We have shown that every inversion implies one merge-inversion. In fact, the correspondence between inversions and merge-inversions is one-to-one. Suppose we have a merge-inversion involving values x and y, where x originally was A[i] and y was originally A[j]. Since we have a merge-inversion, x > y. And since x is in L and y is in R, x must be within a subarray preceding the subarray containing y. Therefore x started out in a position i preceding y's original position j, and so (i, j) is an inversion.
Having shown a one-to-one correspondence between inversions and merge-inversions, it suffices for us to count merge-inversions.
Consider a merge-inversion involving y in R. Let z be the smallest value in L that is greater than y. At some point during the merging process, z and y will be the "exposed" values in L and R, i.e., we will have z = L[i] and y = R[j] in line 13 of MERGE. At that time, there will be merge-inversions involving y and L[i], L[i+1], L[i+2], ..., L[n₁], and these n₁ - i + 1 merge-inversions will be the only ones involving y. Therefore, we need to detect the first time that z and y become exposed during the MERGE procedure and add the value of n₁ - i + 1 at that time to our total count of merge-inversions.
The following pseudocode, modeled on merge sort, works as we have just de-
scribed. It also sorts the array A.
COUNT-INVERSIONS(A, p, r)
    inversions = 0
    if p < r
        q = ⌊(p + r)/2⌋
        inversions = inversions + COUNT-INVERSIONS(A, p, q)
        inversions = inversions + COUNT-INVERSIONS(A, q + 1, r)
        inversions = inversions + MERGE-INVERSIONS(A, p, q, r)
    return inversions
MERGE-INVERSIONS(A, p, q, r)
    n₁ = q - p + 1
    n₂ = r - q
    let L[1..n₁+1] and R[1..n₂+1] be new arrays
    for i = 1 to n₁
        L[i] = A[p + i - 1]
    for j = 1 to n₂
        R[j] = A[q + j]
    L[n₁+1] = ∞
    R[n₂+1] = ∞
    i = 1
    j = 1
    inversions = 0
    counted = FALSE
    for k = p to r
        if counted == FALSE and R[j] < L[i]
            inversions = inversions + n₁ - i + 1
            counted = TRUE
        if L[i] ≤ R[j]
            A[k] = L[i]
            i = i + 1
        else A[k] = R[j]
            j = j + 1
            counted = FALSE
    return inversions
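The following Python sketch implements the same idea of counting merge-inversions while sorting; it avoids the ∞ sentinels, and the names are illustrative rather than taken from the text.

def count_inversions(A, p=None, r=None):
    """Sort A[p..r] in place and return the number of inversions in it."""
    if p is None:
        p, r = 0, len(A) - 1
    if p >= r:
        return 0
    q = (p + r) // 2
    inversions = count_inversions(A, p, q) + count_inversions(A, q + 1, r)
    L, R = A[p:q + 1], A[q + 1:r + 1]
    i = j = 0
    for k in range(p, r + 1):
        if j < len(R) and (i == len(L) or R[j] < L[i]):
            # R[j] is smaller than every remaining element of L,
            # so it forms len(L) - i merge-inversions.
            inversions += len(L) - i
            A[k] = R[j]
            j += 1
        elif i < len(L):
            A[k] = L[i]
            i += 1
    return inversions

A = [2, 3, 8, 6, 1]
print(count_inversions(A), A)   # 5 [1, 2, 3, 6, 8]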
Selected Solutions for Chapter 3:
Growth of Functions

To show that (n + a)^b = Θ(n^b), we want to find constants c₁, c₂, n₀ > 0 such that
    0 ≤ c₁ n^b ≤ (n + a)^b ≤ c₂ n^b for all n ≥ n₀.
Note that
    n + a ≤ n + |a| ≤ 2n when |a| ≤ n,
and
    n + a ≥ n - |a| ≥ (1/2)n when |a| ≤ (1/2)n.
Thus, when n ≥ 2|a|,
    0 ≤ (1/2)n ≤ n + a ≤ 2n.
Since b > 0, the inequality still holds when all parts are raised to the power b:
    0 ≤ ((1/2)n)^b ≤ (n + a)^b ≤ (2n)^b,
    0 ≤ (1/2)^b n^b ≤ (n + a)^b ≤ 2^b n^b.
Thus, c₁ = (1/2)^b, c₂ = 2^b, and n₀ = 2|a| satisfy the definition.
Let the running time be T(n). T(n) ≥ O(n²) means that T(n) ≥ f(n) for some function f(n) in the set O(n²). This statement holds for any running time T(n), since the function g(n) = 0 for all n is in O(n²), and running times are always nonnegative. Thus, the statement tells us nothing about the running time.
The last step above follows from the property that any polylogarithmic function grows more slowly than any positive polynomial function, i.e., that for constants a, b > 0, we have lg^b n = o(n^a). Substitute lg n for n, 2 for b, and 1 for a, giving lg²(lg n) = o(lg n).
Therefore, lg(⌈lg lg n⌉!) = O(lg n), and so ⌈lg lg n⌉! is polynomially bounded.
Selected Solutions for Chapter 4:
Divide-and-Conquer
If you can multiply 3 × 3 matrices using k multiplications, then you can multiply n × n matrices by recursively multiplying n/3 × n/3 matrices, in time T(n) = kT(n/3) + Θ(n²).
Using the master method to solve this recurrence, consider the ratio of n^(log₃ k) and n²:
• If log₃ k = 2, case 2 applies and T(n) = Θ(n² lg n). In this case, k = 9 and T(n) = o(n^(lg 7)).
• If log₃ k < 2, case 3 applies and T(n) = Θ(n²). In this case, k < 9 and T(n) = o(n^(lg 7)).
• If log₃ k > 2, case 1 applies and T(n) = Θ(n^(log₃ k)). In this case, k > 9. T(n) = o(n^(lg 7)) when log₃ k < lg 7, i.e., when k < 3^(lg 7) ≈ 21.85. The largest such integer k is 21.
Thus, k = 21 and the running time is Θ(n^(log₃ k)) = Θ(n^(log₃ 21)) = O(n^2.80) (since log₃ 21 ≈ 2.77).
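As a quick numeric check of the boundary case (illustrative, not part of the solution), the snippet below confirms that 21 is the largest integer k with log₃ k < lg 7.

import math

lg7 = math.log2(7)                  # exponent of Strassen's algorithm, about 2.807
threshold = 3 ** lg7                # k must satisfy k < 3^(lg 7)
print(threshold)                    # about 21.85, so the largest valid integer k is 21
print(math.log(21, 3), lg7)         # log_3 21 ≈ 2.771 < lg 7 ≈ 2.807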
The shortest path from the root to a leaf in the recursion tree is n → (1/3)n → (1/3)²n → ⋯ → 1. Since (1/3)^k n = 1 when k = log₃ n, the height of the part of the tree in which every node has two children is log₃ n. Since the values at each of these levels of the tree add up to cn, the solution to the recurrence is at least cn log₃ n = Ω(n lg n).
Selected Solutions for Chapter 5:
Probabilistic Analysis and Randomized Algorithms

Since HIRE-ASSISTANT always hires candidate 1, it hires exactly once if and only if no candidates other than candidate 1 are hired. This event occurs when candidate 1 is the best candidate of the n, which occurs with probability 1/n.
HIRE-ASSISTANT hires n times if each candidate is better than all those who were interviewed (and hired) before. This event occurs precisely when the list of ranks given to the algorithm is ⟨1, 2, ..., n⟩, which occurs with probability 1/n!.
Another way to think of the hat-check problem is that we want to determine the expected number of fixed points in a random permutation. (A fixed point of a permutation π is a value i for which π(i) = i.) We could enumerate all n! permutations, count the total number of fixed points, and divide by n! to determine the average number of fixed points per permutation. This would be a painstaking process, and the answer would turn out to be 1. We can use indicator random variables, however, to arrive at the same answer much more easily.
Define a random variable X that equals the number of customers that get back their own hat, so that we want to compute E[X].
For i = 1, 2, ..., n, define the indicator random variable
    Xᵢ = I{customer i gets back his own hat} .
Then X = X₁ + X₂ + ⋯ + Xₙ.
Since the ordering of hats is random, each customer has a probability of 1/n of getting back his or her own hat. In other words, Pr{Xᵢ = 1} = 1/n, which, by Lemma 5.1, implies that E[Xᵢ] = 1/n.
Thus,
    E[X] = E[Σ_{i=1}^{n} Xᵢ]
         = Σ_{i=1}^{n} E[Xᵢ]    (linearity of expectation)
         = Σ_{i=1}^{n} 1/n
         = 1 ,
and so we expect that exactly 1 customer gets back his own hat.
Note that this is a situation in which the indicator random variables are not independent. For example, if n = 2 and X₁ = 1, then X₂ must also equal 1. Conversely, if n = 2 and X₁ = 0, then X₂ must also equal 0. Despite the dependence, Pr{Xᵢ = 1} = 1/n for all i, and linearity of expectation holds. Thus, we can use the technique of indicator random variables even in the presence of dependence.
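A small simulation sketch (not from the text) that estimates the expected number of fixed points of a random permutation; for any n the estimate should come out close to 1.

import random

def average_fixed_points(n, trials=100_000):
    """Estimate E[X], where X counts customers who get their own hat back."""
    total = 0
    for _ in range(trials):
        perm = list(range(n))
        random.shuffle(perm)                       # random assignment of hats
        total += sum(1 for i, h in enumerate(perm) if i == h)
    return total / trials

print(average_fixed_points(10))   # close to 1.0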
Let X_ij be an indicator random variable for the event where the pair A[i], A[j] for i < j is inverted, i.e., A[i] > A[j]. More precisely, we define X_ij = I{A[i] > A[j]} for 1 ≤ i < j ≤ n. We have Pr{X_ij = 1} = 1/2, because given two distinct random numbers, the probability that the first is bigger than the second is 1/2. By Lemma 5.1, E[X_ij] = 1/2.
Let X be the random variable denoting the total number of inverted pairs in the array, so that
    X = Σ_{i=1}^{n-1} Σ_{j=i+1}^{n} X_ij .
We want the expected number of inverted pairs, so we take the expectation of both sides of the above equation to obtain
    E[X] = E[Σ_{i=1}^{n-1} Σ_{j=i+1}^{n} X_ij]
         = Σ_{i=1}^{n-1} Σ_{j=i+1}^{n} E[X_ij]    (linearity of expectation)
         = Σ_{i=1}^{n-1} Σ_{j=i+1}^{n} 1/2
         = (n choose 2) · (1/2)
         = (n(n-1)/2) · (1/2)
         = n(n-1)/4 .
Thus the expected number of inverted pairs is n(n-1)/4.
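As an illustrative exhaustive check for small n (not part of the solution), averaging the number of inversions over all n! permutations gives exactly n(n-1)/4.

from itertools import permutations

def inversions(a):
    """Count pairs i < j with a[i] > a[j]."""
    return sum(1 for i in range(len(a)) for j in range(i + 1, len(a)) if a[i] > a[j])

for n in range(2, 7):
    perms = list(permutations(range(n)))
    avg = sum(inversions(p) for p in perms) / len(perms)
    print(n, avg, n * (n - 1) / 4)    # the last two columns agree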
Although PERMUTE-WITHOUT-IDENTITY will not produce the identity permutation, there are other permutations that it fails to produce. For example, consider its operation when n = 3, when it should be able to produce the n! - 1 = 5 non-identity permutations. The for loop iterates for i = 1 and i = 2. When i = 1, the call to RANDOM returns one of two possible values (either 2 or 3), and when i = 2, the call to RANDOM returns just one value (3). Thus, PERMUTE-WITHOUT-IDENTITY can produce only 2 · 1 = 2 possible permutations, rather than the 5 that are required.
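A small Python check, assuming the procedure swaps A[i] with A[RANDOM(i+1, n)] for i = 1 to n-1 as described above (the procedure's pseudocode is not reproduced in this excerpt), that enumerates every possible sequence of RANDOM outcomes for n = 3 and confirms that only two permutations can be produced.

from itertools import product

def permute_without_identity(choices, n=3):
    """Apply the swaps A[i] <-> A[r_i] for the given sequence of RANDOM outcomes r_i."""
    A = list(range(1, n + 1))
    for i, r in enumerate(choices, start=1):       # i = 1, ..., n-1
        A[i - 1], A[r - 1] = A[r - 1], A[i - 1]    # swap A[i] with A[RANDOM(i+1, n)]
    return tuple(A)

n = 3
outcomes = product(*[range(i + 1, n + 1) for i in range(1, n)])   # all RANDOM outcomes
results = {permute_without_identity(c, n) for c in outcomes}
print(results)          # only 2 permutations, far fewer than the 5 required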
Selected Solutions for Chapter 6:
Heapsort

Since a heap is an almost-complete binary tree (complete at all levels except possibly the lowest), it has at most 2^(h+1) - 1 elements (if it is complete) and at least 2^h - 1 + 1 = 2^h elements (if the lowest level has just 1 element and the other levels are complete).
If you put a value at the root that is less than every value in the left and right subtrees, then MAX-HEAPIFY will be called recursively until a leaf is reached. To make the recursive calls traverse the longest path to a leaf, choose values that make MAX-HEAPIFY always recurse on the left child. It follows the left branch when the left child is greater than or equal to the right child, so putting 0 at the root and 1 at all the other nodes, for example, will accomplish that. With such values, MAX-HEAPIFY will be called h times (where h is the heap height, which is the number of edges in the longest path from the root to a leaf), so its running time will be Θ(h) (since each call does Θ(1) work), which is Θ(lg n). Since we have a case in which MAX-HEAPIFY's running time is Θ(lg n), its worst-case running time is Ω(lg n).
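A Python sketch (0-based array indices, illustrative names) showing that with 0 at the root and 1 at every other node, MAX-HEAPIFY recurses all the way down a longest root-to-leaf path.

def max_heapify(A, i):
    """Standard MAX-HEAPIFY on a 0-based array; returns the number of recursive calls made."""
    l, r = 2 * i + 1, 2 * i + 2
    largest = i
    if l < len(A) and A[l] > A[largest]:
        largest = l
    if r < len(A) and A[r] > A[largest]:   # on ties, the left child wins
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        return 1 + max_heapify(A, largest)
    return 0

n = 31
A = [0] + [1] * (n - 1)      # 0 at the root, 1 at every other node
print(max_heapify(A, 0))     # 4 = floor(lg 31) = the heap height h, so Theta(lg n) work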
To show a lower bound of Ω(n lg n), consider the case in which the input array is given in strictly increasing order. Each call to MAX-HEAP-INSERT causes HEAP-INCREASE-KEY to go all the way up to the root. Since the depth of node i is ⌊lg i⌋, the total time is
    Σ_{i=1}^{n} Θ(⌊lg i⌋) ≥ Σ_{i=⌈n/2⌉}^{n} Θ(⌊lg ⌈n/2⌉⌋)
                          ≥ Σ_{i=⌈n/2⌉}^{n} Θ(⌊lg(n/2)⌋)
                          = Σ_{i=⌈n/2⌉}^{n} Θ(⌊lg n - 1⌋)
                          ≥ (n/2) · Θ(lg n)
                          = Ω(n lg n) .
In the worst case, therefore, BUILD-MAX-HEAP′ requires Θ(n lg n) time to build an n-element heap.
Selected Solutions for Chapter 7:
Quicksort
The minimum depth follows a path that always takes the smaller part of the partition, i.e., the part that multiplies the number of elements by α. One iteration reduces the number of elements from n to αn, and i iterations reduce the number of elements to α^i n. At a leaf, there is just one remaining element, and so at a minimum-depth leaf of depth m, we have α^m n = 1. Thus, α^m = 1/n. Taking logs, we get m lg α = -lg n, or m = -lg n / lg α.
Similarly, maximum depth corresponds to always taking the larger part of the partition, i.e., keeping a fraction 1 - α of the elements each time. The maximum depth M is reached when there is one element left, that is, when (1 - α)^M n = 1. Thus, M = -lg n / lg(1 - α).
All these equations are approximate because we are ignoring floors and ceilings.
Selected Solutions for Chapter 8:
Sorting in Linear Time
If the sort runs in linear time for m input permutations, then the height h of the portion of the decision tree consisting of the m corresponding leaves and their ancestors is linear.
Use the same argument as in the proof of Theorem 8.1 to show that this is impossible for m = n!/2, n!/n, or n!/2^n.
We have 2^h ≥ m, which gives us h ≥ lg m. For all the possible m's given here, lg m = Ω(n lg n), hence h = Ω(n lg n).
In particular,
    lg(n!/2) = lg n! - 1 ≥ n lg n - n lg e - 1 ,
    lg(n!/n) = lg n! - lg n ≥ n lg n - n lg e - lg n ,
    lg(n!/2^n) = lg n! - n ≥ n lg n - n lg e - n .
Basis: If d = 1, there's only one digit, so sorting on that digit sorts the array.
Inductive step: Assuming that radix sort works for d - 1 digits, we'll show that it works for d digits.
Radix sort sorts separately on each digit, starting from digit 1. Thus, radix sort of d digits, which sorts on digits 1, ..., d, is equivalent to radix sort of the low-order d - 1 digits followed by a sort on digit d. By our induction hypothesis, the sort of the low-order d - 1 digits works, so just before the sort on digit d, the elements are in order according to their low-order d - 1 digits.
The sort on digit d will order the elements by their dth digit. Consider two elements, a and b, with dth digits a_d and b_d respectively.
• If a_d < b_d, the sort will put a before b, which is correct, since a < b regardless of the low-order digits.
• If a_d > b_d, the sort will put a after b, which is correct, since a > b regardless of the low-order digits.
• If a_d = b_d, the sort will leave a and b in the same order they were in, because it is stable. But that order is already correct, since the correct order of a and b is determined by the low-order d - 1 digits when their dth digits are equal, and the elements are already sorted by their low-order d - 1 digits.
If the intermediate sort were not stable, it might rearrange elements whose dth digits were equal, elements that were in the right order after the sort on their lower-order digits.
Treat the numbers as 3-digit numbers in radix n. Each digit ranges from 0 to n - 1. Sort these 3-digit numbers with radix sort.
There are 3 calls to counting sort, each taking Θ(n + n) = Θ(n) time, so that the total time is Θ(n).
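A Python sketch of this approach (illustrative): each number in the range 0 to n³ - 1 is treated as three base-n digits and sorted with a stable counting sort per digit.

def sort_cubed_range(a):
    """Sort integers in the range 0..n^3 - 1, where n = len(a)."""
    n = len(a)

    def counting_sort_by_digit(arr, d):
        # Stable counting sort on the d-th base-n digit (d = 0 is least significant).
        count = [0] * n
        for x in arr:
            count[(x // n ** d) % n] += 1
        for i in range(1, n):
            count[i] += count[i - 1]
        out = [0] * len(arr)
        for x in reversed(arr):              # right-to-left placement keeps the sort stable
            digit = (x // n ** d) % n
            count[digit] -= 1
            out[count[digit]] = x
        return out

    for d in range(3):                       # three digits, so 3 calls of Theta(n) each
        a = counting_sort_by_digit(a, d)
    return a

print(sort_cubed_range([7, 0, 26, 3, 15, 23, 1, 9]))   # n = 8, all values < 8^3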
a. For a comparison algorithm A to sort, no two input permutations can reach the same leaf of the decision tree, so there must be at least n! leaves reached in T_A, one for each possible input permutation. Since A is a deterministic algorithm, it must always reach the same leaf when given a particular permutation as input, so at most n! leaves are reached (one for each permutation). Therefore exactly n! leaves are reached, one for each input permutation.
These n! leaves will each have probability 1/n!, since each of the n! possible permutations is the input with probability 1/n!. Any remaining leaves will have probability 0, since they are not reached for any input.
Without loss of generality, we can assume for the rest of this problem that paths leading only to 0-probability leaves aren't in the tree, since they cannot affect the running time of the sort. That is, we can assume that T_A consists of only the n! leaves labeled 1/n! and their ancestors.
b. If k > 1, then the root of T is not a leaf. This implies that all of T's leaves are leaves in LT and RT. Since every leaf at depth h in LT or RT has depth h + 1 in T, D(T) must be the sum of D(LT), D(RT), and k, the total number of leaves. To prove this last assertion, let d_T(x) = depth of node x in tree T. Then,
    D(T) = Σ_{x ∈ leaves(T)} d_T(x)
         = Σ_{x ∈ leaves(LT)} d_T(x) + Σ_{x ∈ leaves(RT)} d_T(x)
         = Σ_{x ∈ leaves(LT)} (d_LT(x) + 1) + Σ_{x ∈ leaves(RT)} (d_RT(x) + 1)
         = Σ_{x ∈ leaves(LT)} d_LT(x) + Σ_{x ∈ leaves(RT)} d_RT(x) + Σ_{x ∈ leaves(T)} 1
         = D(LT) + D(RT) + k .
c. To show that d(k) = min_{1≤i≤k-1} {d(i) + d(k-i) + k}, we will show separately that
    d(k) ≤ min_{1≤i≤k-1} {d(i) + d(k-i) + k}
and
    d(k) ≥ min_{1≤i≤k-1} {d(i) + d(k-i) + k} .
To show that d(k) ≤ min_{1≤i≤k-1} {d(i) + d(k-i) + k}, we need only show that d(k) ≤ d(i) + d(k-i) + k, for i = 1, 2, ..., k-1. For any i from 1 to k-1 we can find trees RT with i leaves and LT with k-i leaves such that D(RT) = d(i) and D(LT) = d(k-i). Construct T such that RT and LT are the right and left subtrees of T's root respectively. Then
    d(k) ≤ D(T)                   (by definition of d as min D(T) value)
         = D(RT) + D(LT) + k      (by part (b))
         = d(i) + d(k-i) + k      (by choice of RT and LT) .
To show that d(k) ≥ min_{1≤i≤k-1} {d(i) + d(k-i) + k}, we need only show that d(k) ≥ d(i) + d(k-i) + k, for some i in {1, 2, ..., k-1}. Take the tree T with k leaves such that D(T) = d(k), let RT and LT be T's right and left subtrees, respectively, and let i be the number of leaves in RT. Then k-i is the number of leaves in LT and
    d(k) = D(T)                   (by choice of T)
         = D(RT) + D(LT) + k      (by part (b))
         ≥ d(i) + d(k-i) + k      (by definition of d as min D(T) value) .
D(T_A) is the sum of the decision-tree path lengths for sorting all input permutations, and the path lengths are proportional to the run time. Since the n! permutations have equal probability 1/n!, the expected time to sort n random elements (1 input permutation) is the total time for all permutations divided by n!:
    Ω(n! lg(n!)) / n! = Ω(lg(n!)) = Ω(n lg n) .
f. We will show how to modify a randomized decision tree (algorithm) to define a
deterministic decision tree (algorithm) that is at least as good as the randomized
one in terms of the average number of comparisons.
At each randomized node, pick the child with the smallest subtree (the subtree
with the smallest average number of comparisons on a path to a leaf). Delete all
the other children of the randomized node and splice out the randomized node
itself.
The deterministic algorithm corresponding to this modified tree still works, be-
cause the randomized algorithm worked no matter which path was taken from
each randomized node.
The average number of comparisons for the modified algorithm is no larger
than the average number for the original randomized tree, since we discarded
the higher-average subtrees in each case. In particular, each time we splice out
a randomized node, we leave the overall average less than or equal to what it
was, because
• the same set of input permutations reaches the modified subtree as before, but those inputs are handled in average time less than or equal to what it was before, and
• the rest of the tree is unmodified.
The randomized algorithm thus takes at least as much time on average as the corresponding deterministic one. (We've shown that the expected running time for a deterministic comparison sort is Ω(n lg n), hence the expected time for a randomized comparison sort is also Ω(n lg n).)
Selected Solutions for Chapter 9:
Medians and Order Statistics
For groups of 7, the algorithm still works in linear time. The number of elements greater than x (and similarly, the number less than x) is at least
    4 (⌈(1/2) ⌈n/7⌉⌉ - 2) ≥ 2n/7 - 8 ,
and the recurrence becomes
    T(n) ≤ T(⌈n/7⌉) + T(5n/7 + 8) + O(n) ,
which can be shown to be O(n) by substitution, as for the groups of 5 case in the text.
For groups of 3, however, the algorithm no longer works in linear time. The number of elements greater than x, and the number of elements less than x, is at least
    2 (⌈(1/2) ⌈n/3⌉⌉ - 2) ≥ n/3 - 4 ,
and the recurrence becomes
    T(n) ≤ T(⌈n/3⌉) + T(2n/3 + 4) + O(n) ,
which does not have a linear solution.
We can prove that the worst-case time for groups of 3 is Ω(n lg n). We do so by deriving a recurrence for a particular case that takes Ω(n lg n) time.
In counting up the number of elements greater than x (and similarly, the number less than x), consider the particular case in which there are exactly ⌈(1/2) ⌈n/3⌉⌉ groups with medians ≥ x and in which the "leftover" group does contribute 2 elements greater than x. Then the number of elements greater than x is exactly 2 (⌈(1/2) ⌈n/3⌉⌉ - 1) + 1 (the -1 discounts x's group, as usual, and the +1 is contributed by x's group) = 2⌈n/6⌉ - 1, and the recursive step for elements ≤ x has n - (2⌈n/6⌉ - 1) ≥ n - (2(n/6 + 1) - 1) = 2n/3 - 1 elements. Observe also that the O(n) term in the recurrence is really Θ(n), since the partitioning in step 4 takes Θ(n) (not just O(n)) time. Thus, we get the recurrence
    T(n) ≥ T(⌈n/3⌉) + T(2n/3 - 1) + Θ(n) ≥ T(n/3) + T(2n/3 - 1) + Θ(n) ,
from which you can show that T(n) ≥ cn lg n by substitution. You can also see that T(n) is nonlinear by noticing that each level of the recursion tree sums to n.
In fact, any odd group size ≥ 5 works in linear time.
A modification to quicksort that allows it to run in O(n lg n) time in the worst case uses the deterministic PARTITION algorithm that was modified to take an element to partition around as an input parameter.
SELECT takes an array A, the bounds p and r of the subarray in A, and the rank i of an order statistic, and in time linear in the size of the subarray A[p..r] it returns the ith smallest element in A[p..r].
For an n-element array, the largest subarray that BEST-CASE-QUICKSORT recurses on has n/2 elements. This situation occurs when n = r - p + 1 is even; then the subarray A[q+1..r] has n/2 elements, and the subarray A[p..q-1] has n/2 - 1 elements.
Because BEST-CASE-QUICKSORT always recurses on subarrays that are at most half the size of the original array, the recurrence for the worst-case running time is T(n) ≤ 2T(n/2) + Θ(n) = O(n lg n).
We assume that we are given a procedure MEDIAN that takes as parameters an array A and subarray indices p and r, and returns the value of the median element of A[p..r] in O(n) time in the worst case.
Given MEDIAN, here is a linear-time algorithm SELECT′ for finding the ith smallest element in A[p..r]. This algorithm uses the deterministic PARTITION algorithm that was modified to take an element to partition around as an input parameter.
SELECT′(A, p, r, i)
    if p == r
        return A[p]
    x = MEDIAN(A, p, r)
    q = PARTITION(A, p, r, x)
    k = q - p + 1
    if i == k
        return A[q]
    elseif i < k
        return SELECT′(A, p, q - 1, i)
    else return SELECT′(A, q + 1, r, i - k)
Because x is the median of A[p..r], each of the subarrays A[p..q-1] and A[q+1..r] has at most half the number of elements of A[p..r]. The recurrence for the worst-case running time of SELECT′ is T(n) ≤ T(n/2) + O(n) = O(n).
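A Python sketch of SELECT′ (illustrative). Here MEDIAN is a placeholder computed by sorting, so this sketch is not O(n) overall; the linear-time bound requires a true worst-case linear-time MEDIAN routine, and partition_around is the PARTITION variant modified to take the pivot value as a parameter.

def select_prime(A, p, r, i):
    """Return the i-th smallest element (1-based) of A[p..r], partitioning around the median."""
    if p == r:
        return A[p]
    x = sorted(A[p:r + 1])[(r - p) // 2]     # placeholder MEDIAN; assume an O(n) routine here
    q = partition_around(A, p, r, x)
    k = q - p + 1
    if i == k:
        return A[q]
    elif i < k:
        return select_prime(A, p, q - 1, i)
    else:
        return select_prime(A, q + 1, r, i - k)

def partition_around(A, p, r, x):
    """Partition A[p..r] around the value x and return x's final index."""
    j = A.index(x, p, r + 1)
    A[j], A[r] = A[r], A[j]                  # move the pivot to the end
    i = p - 1
    for j in range(p, r):
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]
    return i + 1

A = [9, 2, 7, 4, 5, 1, 8]
print(select_prime(A, 0, len(A) - 1, 3))     # 4, the 3rd smallest element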