
CMSC 351: The Master Theorem

Justin Wyss-Gallifent

October 2, 2023

1 The Theorem (Straightforward Version)
2 Application and Examples
3 Motivation Behind the Theorem
    3.1 Intuition Without the f(n)
    3.2 Proof Without the f(n)
    3.3 Intuition with the f(n)
4 Even More Rudimentary Intuition
5 Thoughts, Problems, Ideas

1 The Theorem (Straightforward Version)
Of course we would rather not do this sort of calculation every time, so we might
ask whether there are reliable formulas that emerge in specific situations. The
answer is yes, and they are encapsulated in the Master Theorem:
Theorem 1.0.1. Suppose T(n) satisfies the recurrence relation:

    T(n) = aT(n/b) + f(n)

for positive integers a ≥ 1 and b > 1, and where n/b can mean either ⌊n/b⌋ or
⌈n/b⌉; it doesn't matter which. Then we have:

1. If f(n) = O(n^c) and log_b a > c then T(n) = Θ(n^{log_b a}).

2. If f(n) = Θ(n^c) and log_b a = c then T(n) = Θ(n^{log_b a} lg n).

2f. (Fancy Version) If f(n) = Θ(n^c lg^k n) and log_b a = c then T(n) = Θ(n^{log_b a} lg^{k+1} n).

3. If f(n) = Ω(n^c) and log_b a < c then T(n) = Θ(f(n)).


Note: For this case, f(n) must also satisfy a regularity condition which
states that there is some C < 1 and n_0 such that af(n/b) ≤ Cf(n) for all
n ≥ n_0. This regularity condition is almost always true and we will not
worry about it.
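As a quick added illustration (not from the original handout), the condition is easy to check for the kinds of f(n) in the examples below: in Example 2.2, where a = 2, b = 8, and f(n) = n, we have af(n/b) = 2(n/8) = n/4 = (1/4)f(n), so the condition holds with C = 1/4.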

Proof. Formal proof omitted. See the intuition section later, if you’re interested.
QED

2 Application and Examples


When applying the Master Theorem it can be helpful to successively try the
cases until we find one that works. It's often easiest to check Cases 2 and 2f first,
as we'll see.
Example 2.1. Consider:

    T(n) = 4T(n/2) + n^2 + lg n

Observe that f(n) = n^2 + lg n and log_b a = log_2 4 = 2, and then:

• f(n) = Θ(n^2) with c = 2:
  Observe log_2 4 = 2 = c, so Case 2 applies and so:

    T(n) = Θ(n^{log_2 4} lg n) = Θ(n^2 lg n)

Example 2.2. Consider:

    T(n) = 2T(n/8) + n

Observe that f(n) = n and log_b a = log_8 2 = 1/3, and then:

• f(n) = Θ(n) with c = 1:
  Observe log_8 2 = 1/3 ≠ 1 = c, so Cases 2/2f do not apply.
• f(n) = O(n) with c = 1:
  Observe log_8 2 = 1/3 ≯ 1 = c, so Case 1 does not apply.
• f(n) = Ω(n) with c = 1:
  Observe log_8 2 = 1/3 < 1 = c, so Case 3 applies and so:

    T(n) = Θ(f(n)) = Θ(n)


Example 2.3. Consider:

    T(n) = 3T(n/3) + √n + 1

Observe that f(n) = √n + 1 and log_b a = log_3 3 = 1, and then:

• f(n) = Θ(n^{1/2}) with c = 1/2:
  Observe log_3 3 = 1 ≠ 1/2 = c, so Cases 2/2f do not apply.
• f(n) = O(n^{1/2}) with c = 1/2:
  Observe log_3 3 = 1 > 1/2 = c, so Case 1 applies and so:

    T(n) = Θ(n^{log_3 3}) = Θ(n)


Example 2.4. Consider:

    T(n) = 3T(n/4) + n lg n + n

Observe that f(n) = n lg n + n and log_b a = log_4 3 ≈ 0.79, and then:

• f(n) = Θ(n lg n) with c = 1:
  Observe log_4 3 ≈ 0.79 ≠ 1 = c, so Cases 2/2f do not apply.
• f(n) = O(n^2) with c = 2:
  Observe log_4 3 ≈ 0.79 ≯ 2 = c, so Case 1 does not apply.
• f(n) = Ω(n) with c = 1:
  Observe log_4 3 ≈ 0.79 < 1 = c, so Case 3 applies and so:

    T(n) = Θ(f(n)) = Θ(n lg n)

Example 2.5. Consider:

    T(n) = 10T(n/2) + n^2 lg n + n^2 + 1

Observe that f(n) = n^2 lg n + n^2 + 1 and log_b a = log_2 10 ≈ 3.32, and then:

• f(n) = Θ(n^2 lg n) with c = 2:
  Observe log_2 10 ≈ 3.32 ≠ 2 = c, so Cases 2/2f do not apply.
• f(n) = O(n^3) with c = 3:
  Observe log_2 10 ≈ 3.32 > 3 = c, so Case 1 applies and so:

    T(n) = Θ(n^{log_2 10})
Example 2.6. Consider:

    T(n) = 9T(n/3) + n^2 lg n + lg n

Observe that f(n) = n^2 lg n + lg n and log_b a = log_3 9 = 2, and then:

• f(n) = Θ(n^2 lg n) with c = 2:
  Observe log_3 9 = 2 = c, so Case 2f applies with k = 1 and so:

    T(n) = Θ(n^{log_3 9} lg^2 n) = Θ(n^2 lg^2 n)


Here are some examples where it does not apply:
Example 2.7. Suppose T(n) = 2T(n/4) + 3T(n/2) + n.
The Master Theorem does not apply because the recurrence has the wrong form.
Note that there is another method which often applies, called the Akra-Bazzi
method. It applies to recurrence relations of the following form under certain
conditions:

    T(n) = f(n) + Σ_{i=1}^{k} a_i T(b_i n + h_i(n))

We will not cover it.


Example 2.8. Suppose T(n) = 2T(n/4) + f(n) and all you know is that
f(n) = O(n^2).
The fact that f(n) = O(n^2) implies that we could only use Case 1 and insists
that c = 2. However log_b a = log_4 2 = 0.5 ≯ 2 = c and so Case 1 does not
apply.
Example 2.9. Suppose T(n) = 8T(n/4) + f(n) and all you know is that
f(n) = Ω(n).
The fact that f(n) = Ω(n) implies that we could only use Case 3 and insists
that c = 1. However log_b a = log_4 8 = 3/2 ≮ 1 = c and so Case 3 does not
apply.

3 Motivation Behind the Theorem
3.1 Intuition Without the f(n)
If f(n) = 0 then f(n) = O(1) = O(n^0), and 0 < log_b a as long as a > 1 (which
it is), so everything that follows here lies in the first case of the Master Theorem.
Consider a divide-and-conquer algorithm which breaks a problem of size n into
a subproblems each of size n/b. In such a case we would have:

T (n) = aT (n/b)

Now observe:
• It seems reasonable that if a = b then we have no overall gain, because
the number of new problems equals the reducing ratio (for example, two
problems of half the size don't help), but we can actually say more.
If we assume a reasonable T(1) = α for some constant α then this is
essentially saying, for example, that T(2) = 2T(2/2) = 2T(1) = 2α, T(4) =
2T(4/2) = 2T(2) = 4α, T(8) = 2T(8/2) = 2T(4) = 8α, and so on, and in
general it seems reasonable that T(n) = nα = Θ(n).
This also seems reasonable for any a = b (not just 2): we still get
T(n) = Θ(n).
This arises in the first case of the Master Theorem because if a = b then
log_b a = 1 and then T(n) = Θ(n^{log_b a}) = Θ(n^1) = Θ(n).
• On the other hand, if b > a then we have an overall decrease in time. For
example, if T(n) = 2T(n/3) then the subproblems are 1/3 the size and
there are only two; that's good, better than Θ(n)!
This arises in the first case of the Master Theorem because if b > a then
log_b a < 1 and then T(n) = Θ(n^{log_b a}) = Θ(n^{less than 1}).
• And on the other hand, if b < a then we have an overall increase in time.
For example, if T(n) = 3T(n/2) then the subproblems are 1/2 the size but
there are three; that's bad, worse than Θ(n)!
This arises in the first case of the Master Theorem because if b < a then
log_b a > 1 and then T(n) = Θ(n^{log_b a}) = Θ(n^{more than 1}).

3.2 Proof Without the f (n)
If f (n) = 0 we can in fact solve the recurrence relation easily using the digging-
down approach. If we accept a base case of T (1) then we have:

    T(n) = aT(n/b)
         = a^2 T(n/b^2)
         = a^3 T(n/b^3)
         ⋮

For any k ≥ 1 we have T(n) = a^k T(n/b^k), which ends when n/b^k = 1, that is,
when k = log_b n.

Thus we get:

    T(n) = a^k T(1)
         = a^{log_b n} T(1)
         = a^{(log_a n)/(log_a b)} T(1)
         = (a^{log_a n})^{1/(log_a b)} T(1)
         = n^{1/(log_a b)} T(1)
         = n^{log_b a} T(1)
         = Θ(n^{log_b a})
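A quick numeric sanity check of this calculation (an added Python sketch, not part of the handout): unroll T(n) = aT(n/b) with T(1) = 1 for n an exact power of b, and compare the result to n^{log_b a}.

import math

def T(n, a, b):
    # Unrolls T(n) = a*T(n/b) with T(1) = 1; n should be an exact power of b.
    return 1 if n == 1 else a * T(n // b, a, b)

for a, b in [(2, 2), (3, 2), (2, 3), (4, 2)]:
    n = b ** 10                                        # an exact power of b
    print(a, b, T(n, a, b), round(n ** math.log(a, b)))   # the last two columns agree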


3.3 Intuition with the f(n)

Now that we've intuitively and formally accepted that:

    T(n) = aT(n/b)   =⇒   T(n) = Θ(n^{log_b a})

let's suppose there is some additional time requirement f(n) for a problem
of size n.
• If this new time requirement is at most (meaning O) polynomially smaller
than the recursive part, then the recursive part is the dominating factor.
This is represented in the theorem by the line:
If f(n) = O(n^c) for c < log_b a then T(n) = Θ(n^{log_b a}).
• If this new time requirement is the same (meaning Θ) polynomially as the
recursive part, then they combine and a logarithmic factor is introduced
(this is not obvious; the expansion after this list shows where it comes
from). This is represented in the theorem by the line:
If f(n) = Θ(n^c) for c = log_b a then T(n) = Θ(n^{log_b a} lg n).

• If this new time requirement is at least (meaning Ω) polynomially larger
than the recursive part, then this new time requirement is the dominating
factor. This is represented in the theorem by the line:
If f(n) = Ω(n^c) for c > log_b a then T(n) = Θ(f(n)).
Along with the regularity condition.
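The expansion behind these three bullets is worth seeing once (this is the standard calculation, added here as a sketch rather than quoted from the handout). With f(n) = Θ(n^c), digging down as in Section 3.2 gives:

    T(n) = a^{log_b n} T(1) + Σ_{i=0}^{(log_b n) - 1} a^i f(n/b^i)
         = Θ(n^{log_b a}) + Θ(n^c Σ_{i=0}^{(log_b n) - 1} (a/b^c)^i)

If c = log_b a then a/b^c = 1, so all log_b n terms of the sum are equal and the
second piece is Θ(n^c log_b n) = Θ(n^{log_b a} lg n); this is where the extra
logarithm comes from. If c < log_b a then a/b^c > 1, the geometric sum is
dominated by its last term, and the first piece Θ(n^{log_b a}) wins. If c > log_b a
then a/b^c < 1, the sum is Θ(1), and we get Θ(n^c) = Θ(f(n)).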

4 Even More Rudimentary Intuition


To think even more crudely, but not inaccurately, when you see the recurrence
relation:

T (n) = aT (n/b) + f (n)

Think of the aT(n/b) as the tree structure and f(n) as the node weights (a small
numeric sketch follows the list below). Basically the Master Theorem is saying:
1. If the tree structure is greater than the node weights then the node weights
don’t matter and the tree structure wins - winter.
2. If they balance out nicely then they combine - spring.
3. If the tree structure is less than the node weights then the tree structure
doesn’t matter and the node weights win - summer.

5 Thoughts, Problems, Ideas
1. Suppose T (n) = 5T (n/5) + f (n).

(a) Apply the Master Theorem with f (n) = n.

(b) Apply the Master Theorem with f(n) = √n + n.

(c) Apply the Master Theorem with f(n) = 3n + n^3.
(d) Apply the Master Theorem with f (n) = n lg n.
2. Suppose T (n) = 4T (n/8) + f (n).

(a) Apply the Master Theorem with f (n) = n.
(b) Apply the Master Theorem with f(n) = n^{2/3} + lg n.
(c) Apply the Master Theorem with f (n) = n.

3. Suppose T (n) = 4T (n/2) + f (n).

(a) Apply the Master Theorem with f(n) = √n + n.
(b) Apply the Master Theorem with f(n) = n^2 + n + 1.
(c) Apply the Master Theorem with f(n) = n^3 lg n.
4. Binary Search has T (n) = T (n/2) + Θ(1). What is T (n)?
5. Merge Sort has T (n) = 2T (n/2) + Θ(n). What is T (n)?
6. The Max-Heapify routine in Heap Sort has T (n) ≤ T (2n/3) + Θ(1). What
is T (n)?
7. An optimal sorted matrix search has T (n) = 2T (n/2) + Θ(n). What is
T (n)?
8. A divide-and-conquer algorithm which splits a list of length n into two
equally sized lists, makes recursive calls to both, and in addition uses
constant time will have T(n) = 2T(n/2) + C. What is T(n)?
