Justin Wyss-Gallifent
October 2, 2023
1 The Theorem (Straightforward Version)
Of course we would rather not do this sort of calculation every time, so we might ask whether there are reliable formulas which emerge in specific situations. The answer is yes, and they are encapsulated in the Master Theorem:
Theorem 1.0.1. Suppose T(n) satisfies the recurrence relation:

T(n) = aT(n/b) + f(n)

for positive integers a ≥ 1 and b > 1, and where n/b can mean either ⌊n/b⌋ or ⌈n/b⌉, it doesn't matter which. Then we have:
1. If f(n) = O(n^c) and log_b a > c then T(n) = Θ(n^{log_b a}).
2. If f(n) = Θ(n^c) and log_b a = c then T(n) = Θ(n^{log_b a} lg n).
2f. (Fancy Version) If f(n) = Θ(n^c lg^k n) and log_b a = c then T(n) = Θ(n^{log_b a} lg^{k+1} n).
3. If f(n) = Ω(n^c) and log_b a < c, and f satisfies the regularity condition af(n/b) ≤ δf(n) for some constant δ < 1 and all sufficiently large n, then T(n) = Θ(f(n)).
Proof. Formal proof omitted. See the intuition section later, if you’re interested.
QED
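Since the cases are decided entirely by comparing c (and k) against log_b a, the decision procedure is easy to mechanize. Here is a minimal Python sketch (my own, not part of the original notes) that classifies a recurrence of the form T(n) = aT(n/b) + n^c lg^k n; note it does not check the regularity condition that the general Case 3 requires.

```python
import math

def master_theorem(a, b, c, k=0):
    """Classify T(n) = a*T(n/b) + f(n) with f(n) = n^c * (lg n)^k.

    A sketch of Theorem 1.0.1 for this polynomial/polylog form of f(n)
    only; it does not verify the regularity condition of general Case 3.
    """
    crit = math.log(a, b)               # the critical exponent log_b(a)
    if math.isclose(c, crit):           # Case 2 (fancy Case 2f when k > 0)
        return f"Theta(n^{crit:.3g} lg^{k + 1} n)"
    if c < crit:                        # Case 1: the recursion dominates
        return f"Theta(n^{crit:.3g})"
    if k == 0:                          # Case 3: f(n) dominates
        return f"Theta(n^{c:.3g})"
    return f"Theta(n^{c:.3g} lg^{k} n)"

print(master_theorem(4, 2, 2))   # Theta(n^2 lg^1 n), as in Example 2.1 below
```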
2 Examples

Example 2.1. Consider:

T(n) = 4T(n/2) + n^2 + lg n

Observe that f(n) = n^2 + lg n and log_b a = log_2 4 = 2 and then:

• f(n) = Θ(n^2) with c = 2: Observe log_2 4 = 2 = c so Case 2 applies and so:

T(n) = Θ(n^2 lg n)
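As a quick empirical sanity check (my own experiment, assuming a base case T(1) = 1), we can compute T(n) exactly from the recurrence and watch T(n)/(n^2 lg n) settle toward a constant:

```python
from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T(n):
    """Example 2.1's recurrence with an assumed base case T(1) = 1."""
    if n <= 1:
        return 1
    return 4 * T(n // 2) + n**2 + log2(n)

for p in range(4, 21, 4):
    n = 2**p
    print(n, T(n) / (n**2 * log2(n)))   # the ratio flattens toward 1
```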
Example 2.2. Consider:

T(n) = 2T(n/8) + n

Observe that f(n) = n = Θ(n^1) with c = 1 and log_b a = log_8 2 = 1/3. Since c = 1 > 1/3, Case 3 applies (the regularity condition holds: 2f(n/8) = n/4 ≤ (1/4)n) and so:

T(n) = Θ(n)

Example 2.3. Consider:

T(n) = 3T(n/4) + n lg n + n

Observe that f(n) = n lg n + n = Θ(n lg n) = Ω(n^1) and log_b a = log_4 3 < 1. Since c = 1 > log_4 3, Case 3 applies (the regularity condition holds: 3f(n/4) ≤ (3/4)f(n)) and so:

T(n) = Θ(n lg n)
Example 2.5. Consider:

T(n) = Θ(n^{log_2 10} lg n)
Example 2.6. Consider:

T(n) = 9T(n/3) + n^2 lg n + lg n

Observe that f(n) = n^2 lg n + lg n = Θ(n^2 lg n), so c = 2 and k = 1, and log_b a = log_3 9 = 2 = c. The fancy version of Case 2 applies and so:

T(n) = Θ(n^2 lg^2 n)
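Example 2.6 exercises the fancy version, so it is worth the same kind of empirical check (again my own, assuming a base case T(1) = 1): if Θ(n^2 lg^2 n) is right, the ratio below should level off at a constant.

```python
from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T(n):
    """Example 2.6's recurrence with an assumed base case T(1) = 1."""
    if n <= 1:
        return 1
    return 9 * T(n // 3) + n**2 * log2(n) + log2(n)

for p in range(2, 13, 2):
    n = 3**p
    print(n, T(n) / (n**2 * log2(n)**2))   # settles toward a constant (~0.32)
```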
3 Motivation Behind the Theorem
3.1 Intuition Without the f(n)
If f(n) = 0 then f(n) = O(1) = O(n^0), and 0 < log_b a as long as a > 1 (which it is), and so everything that follows here lies in the first case of the Master Theorem.
Consider a divide-and-conquer algorithm which breaks a problem of size n into
a subproblems each of size n/b. In such a case we would have:
T(n) = aT(n/b)
Now observe:
• It seems reasonable that if a = b then we have no overall gain because the number of new problems equals the reduction ratio (for example, two problems of half the size don't help), but we can actually say more.
If we assume a reasonable T(1) = α for some constant α then this is essentially saying, for example, that T(2) = 2T(2/2) = 2T(1) = 2α, T(4) = 2T(4/2) = 2T(2) = 4α, T(8) = 2T(8/2) = 2T(4) = 8α, and so on, and in general it seems reasonable that T(n) = nα = Θ(n).
This also seems reasonable for any a = b (not just 2): we still get T(n) = Θ(n).
This arises in the first case of the Master Theorem because if a = b then log_b a = 1 and then T(n) = Θ(n^{log_b a}) = Θ(n^1) = Θ(n).
• On the other hand if b > a then we have an overall decrease in time, for
example if T (n) = 2T (n/3) then the subproblems are 1/3 the size and
there are only two, that’s good, better than Θ(n)!
This arises in the first case of the Master Theorem because if b > a then log_b a < 1 and then T(n) = Θ(n^{log_b a}), where the exponent is less than 1.
• And on the other hand if b < a then we have an overall increase in time; for example if T(n) = 3T(n/2) then the subproblems are 1/2 the size but there are three, that's bad, worse than Θ(n)!
This arises in the first case of the Master Theorem because if b < a then log_b a > 1 and then T(n) = Θ(n^{log_b a}), where the exponent is more than 1. These three regimes are illustrated numerically in the sketch after this list.
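A short experiment (my own, with an assumed base case T(1) = 1) comparing T(n) against n^{log_b a} in each of the three regimes:

```python
from math import log

def T(n, a, b):
    """T(n) = a*T(n/b), floors allowed, assumed base case T(1) = 1."""
    return 1 if n <= 1 else a * T(n // b, a, b)

n = 6**8   # divisible by both 2 and 3, so the early floors are exact
for a, b in [(2, 2), (2, 3), (3, 2)]:
    print(f"a={a} b={b}: T(n) = {T(n, a, b)}, n^(log_b a) = {n ** log(a, b):.0f}")
# a = b gives linear growth, a < b sublinear, a > b superlinear; each
# column pair agrees up to a constant factor, matching Theta(n^(log_b a)).
```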
3.2 Proof Without the f (n)
If f(n) = 0 we can in fact solve the recurrence relation easily using the digging-down approach. If we accept a base case of T(1) then we have:

T(n) = aT(n/b)
     = a^2 T(n/b^2)
     = a^3 T(n/b^3)
     = ...

For any k ≥ 1 we have T(n) = a^k T(n/b^k), which ends when n/b^k = 1, which is k = log_b n.
Thus we get:

T(n) = a^k T(1)
     = a^{log_b n} T(1)
     = a^{log_a n / log_a b} T(1)
     = (a^{log_a n})^{1/log_a b} T(1)
     = n^{1/log_a b} T(1)
     = n^{log_b a} T(1)
     = Θ(n^{log_b a})
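The pivot in this chain is the identity a^{log_b n} = n^{log_b a}. For reference, the same step can be done entirely in base b (my own restatement, not in the original):

```latex
a^{\log_b n} = \left(b^{\log_b a}\right)^{\log_b n}
             = b^{(\log_b a)(\log_b n)}
             = \left(b^{\log_b n}\right)^{\log_b a}
             = n^{\log_b a}.
```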
Now let's suppose there is some additional time requirement f(n) for a problem of size n.
• If this new time requirement is at most (meaning O) polynomially smaller than the recursive part then the recursive part is the dominating factor. This is represented in the theorem by the line:
If f(n) = O(n^c) for c < log_b a then T(n) = Θ(n^{log_b a}).
• If this new time requirement is polynomially the same (meaning Θ) as the recursive part then they combine and a logarithmic factor is introduced (this is not obvious). This is represented in the theorem by the line:
If f(n) = Θ(n^c) for c = log_b a then T(n) = Θ(n^{log_b a} lg n).
• If this new time requirement is at least (meaning Ω) polynomially larger than the recursive part then this new time requirement is the dominating factor. This is represented in the theorem by the line:
If f(n) = Ω(n^c) for c > log_b a then T(n) = Θ(f(n)), along with the regularity condition.
Think of the aT(n/b) as the tree structure and f(n) as the node weights. Basically the Master Theorem is saying (see the sketch after this list):
1. If the tree structure is greater than the node weights then the node weights
don’t matter and the tree structure wins - winter.
2. If they balance out nicely then they combine - spring.
3. If the tree structure is less than the node weights then the tree structure
doesn’t matter and the node weights win - summer.
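This tree picture can be made concrete. In the recursion tree for T(n) = aT(n/b) + f(n), level i has a^i nodes each carrying weight f(n/b^i). The sketch below (my own illustration, taking a = b = 2 so that log_b a = 1) prints the per-level work and shows which end of the tree dominates in each season.

```python
def level_sums(a, b, f, n):
    """Total node weight at each level of the tree for T(n) = a*T(n/b) + f(n)."""
    sums, i = [], 0
    while n > 1:
        sums.append(a**i * f(n))
        n //= b
        i += 1
    return sums

n = 2**12
for season, f in [("winter: f(n) = 1", lambda m: 1),
                  ("spring: f(n) = n", lambda m: m),
                  ("summer: f(n) = n^2", lambda m: m * m)]:
    s = level_sums(2, 2, f, n)
    print(season, "-> root level:", s[0], " deepest level:", s[-1])
# winter: the deepest levels dominate (the tree structure wins);
# spring: every level contributes the same; summer: the root dominates.
```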
5 Thoughts, Problems, Ideas
1. Suppose T(n) = 5T(n/5) + f(n).
(a) Apply the Master Theorem with f(n) = √n.
(b) Apply the Master Theorem with f(n) = √n + n.
(c) Apply the Master Theorem with f(n) = 3n + √(n^3).
(d) Apply the Master Theorem with f(n) = n lg n.
2. Suppose T(n) = 4T(n/8) + f(n).
(a) Apply the Master Theorem with f(n) = √n.
(b) Apply the Master Theorem with f(n) = n^{2/3} + lg n.
(c) Apply the Master Theorem with f(n) = n.
3. Suppose T(n) = 4T(n/2) + f(n).
(a) Apply the Master Theorem with f(n) = n + √n.
(b) Apply the Master Theorem with f(n) = n^2 + n + 1.
(c) Apply the Master Theorem with f(n) = n^3 lg n.
4. Binary Search has T(n) = T(n/2) + Θ(1). What is T(n)?
5. Merge Sort has T(n) = 2T(n/2) + Θ(n). What is T(n)?
6. The Max-Heapify routine in Heap Sort has T(n) ≤ T(2n/3) + Θ(1). What is T(n)?
7. An optimal sorted matrix search has T(n) = 2T(n/2) + Θ(n). What is T(n)?
8. A divide and conquer algorithm which splits a list of length n into two equally sized lists, makes recursive calls on both, and in addition uses constant time will have T(n) = 2T(n/2) + C. What is T(n)?
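For checking answers to problems like these, a small harness helps (my own sketch, assuming a base case T(1) = 1 throughout): if a guess g(n) for T(n) is correct, then T(n)/g(n) should flatten out to a constant as n grows.

```python
from functools import lru_cache
from math import log2

def make_T(a, b, f):
    """Build T(n) = a*T(n/b) + f(n) with an assumed base case T(1) = 1."""
    @lru_cache(maxsize=None)
    def T(n):
        return 1 if n <= 1 else a * T(n // b) + f(n)
    return T

# e.g. Problem 5 (Merge Sort), with the guess T(n) = Theta(n lg n):
T = make_T(2, 2, lambda n: n)
for p in range(4, 21, 4):
    n = 2**p
    print(n, T(n) / (n * log2(n)))   # flattens toward 1, so Theta(n lg n)
```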