
Chapter 4: Recurrences

Professor Sun-Yuan Hsieh
Department of Computer Science and Information Engineering, National Cheng Kung University
Overview

A recurrence is a function that is defined in terms of
  one or more base cases, and
  itself, with smaller arguments.

2
Examples:

  T(n) = 1            if n = 1
       = T(n-1) + 1   if n > 1
  Solution: T(n) = n.

  T(n) = 1            if n = 1
       = 2T(n/2) + n  if n > 1
  Solution: T(n) = n lg n + n.

  T(n) = 0            if n = 2
       = T(√n) + 1    if n > 2
  Solution: T(n) = lg lg n.

  T(n) = 1                      if n = 1
       = T(n/3) + T(2n/3) + n   if n > 1
  Solution: T(n) = Θ(n lg n).

(A quick numerical check of the first and third solutions appears after this slide.)
3
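As a sanity check (not part of the original slides), here is a minimal Python sketch that evaluates the first and third recurrences directly and compares them with the stated closed forms; the function names are illustrative, and the third check is restricted to n of the form 2^(2^k) so that √n stays exact.

import math

def t_linear(n):
    # T(n) = 1 if n = 1, else T(n-1) + 1; claimed solution: T(n) = n
    return 1 if n == 1 else t_linear(n - 1) + 1

def t_loglog(n):
    # T(n) = 0 if n = 2, else T(sqrt(n)) + 1; claimed solution: T(n) = lg lg n
    return 0 if n == 2 else t_loglog(math.isqrt(n)) + 1

assert all(t_linear(n) == n for n in range(1, 500))
assert all(t_loglog(2 ** (2 ** k)) == k for k in range(0, 5))   # lg lg(2^(2^k)) = k
print("closed forms match on the tested inputs")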
Many technical issues:
Floors and ceilings
[Floors and ceilings can easily be removed and
don’t affect the solution to the recurrence.]
Exact vs. asymptotic functions
Boundary conditions
4
In algorithm analysis, we usually express both the recurrence and its solution using asymptotic notation.
  E.g., T(n) = 2T(n/2) + Θ(n), with solution T(n) = Θ(n lg n).
The boundary conditions are usually expressed as "T(n) = O(1) for sufficiently small n."
When we desire an exact, rather than an asymptotic, solution, we need to deal with boundary conditions.
In practice, we just use asymptotic notation most of the time, and we ignore boundary conditions.
5
Substitution method

1. Guess the solution.
2. Use induction to find the constants and show that the solution works.

E.g.,
  T(n) = 1               if n = 1,
       = 2T(⌊n/2⌋) + n   if n > 1.

1. Guess: T(n) = n lg n + n. [Here, we have a recurrence with an exact function, rather than asymptotic notation, and the solution is also exact rather than asymptotic. We'll have to check boundary conditions and the base case.]
6
Substitution method (cont’d)
2. Induction:
Basis: n = 1 ⇒ n lg n + n = 1 = T(n).
Inductive step: The inductive hypothesis is that T(k) = k lg k + k for all k < n.
We'll use this inductive hypothesis for T(n/2).
  T(n) = 2T(n/2) + n
       = 2((n/2) lg(n/2) + n/2) + n
       = n lg(n/2) + n + n
       = n(lg n - lg 2) + n + n
       = n lg n - n + n + n
       = n lg n + n.
7
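The induction above can be spot-checked by direct evaluation. A minimal sketch (not from the slides), restricted to powers of 2 so that ⌊n/2⌋ = n/2:

def T(n):
    # T(1) = 1; T(n) = 2*T(n/2) + n for n > 1
    return 1 if n == 1 else 2 * T(n // 2) + n

for k in range(0, 20):
    n = 2 ** k
    assert T(n) == n * k + n      # n lg n + n, since lg n = k when n = 2^k
print("T(n) = n lg n + n holds for n = 2^0 .. 2^19")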
Substitution method (cont’d)

Generally, we use asymptotic notation:
  T(n) = 2T(n/2) + Θ(n).
Assume T(n) = O(1) for sufficiently small n.
Express the solution using asymptotic notation: T(n) = Θ(n lg n).
Don't worry about boundary cases, nor do we show base cases in the substitution proof.
8
Substitution method (cont’d)

T(n) is always constant for any constant n.
Since we are ultimately interested in an asymptotic solution to a recurrence, it will always be possible to choose base cases that work.
When we want an asymptotic solution to a recurrence, we don't worry about the base cases in our proofs.
When we want an exact solution, then we have to deal with base cases.
9
Substitution method (cont’d)
For the substitution method:
  Name the constant in the additive term.
  Show the upper (O) and lower (Ω) bounds separately.
  Might need to use different constants for each.

E.g.: T(n) = 2T(n/2) + Θ(n). If we want to show an upper bound of T(n) = O(n lg n), we write T(n) ≤ 2T(n/2) + cn for some positive constant c.
10
1. Upper bound:
Guess: T(n) ≤ dn lg n for some positive constant d. We are given c in the recurrence, and we get to choose d as any positive constant. It's OK for d to depend on c.
Substitution:
  T(n) ≤ 2T(n/2) + cn
       ≤ 2(d(n/2) lg(n/2)) + cn
       = dn lg(n/2) + cn
       = dn lg n - dn + cn
       ≤ dn lg n    if -dn + cn ≤ 0, i.e., d ≥ c.
Therefore, T(n) = O(n lg n).
11
2. Lower bound:
Write T(n) ≥ 2T(n/2) + cn for some positive constant c.
Guess: T(n) ≥ dn lg n for some positive constant d.
Substitution:
  T(n) ≥ 2T(n/2) + cn
       ≥ 2(d(n/2) lg(n/2)) + cn
       = dn lg(n/2) + cn
       = dn lg n - dn + cn
       ≥ dn lg n    if -dn + cn ≥ 0, i.e., d ≤ c.
Therefore, T(n) = Ω(n lg n).
Therefore, T(n) = Θ(n lg n).
[For this particular recurrence, we can use d = c for both the upper-bound and lower-bound proofs. That won't always be the case; a small numerical illustration follows this slide.]
12
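To illustrate the bracketed remark (this demo is not in the slides; c = 1.0 and the base case T(1) = c are assumptions), the sketch below evaluates T(n) = 2T(n/2) + cn and prints T(n)/(n lg n); for n = 2^k the ratio equals c(1 + 1/lg n), so it approaches the same constant c that serves as d in both bound proofs.

c = 1.0

def T(n):
    # T(1) = c; T(n) = 2*T(n/2) + c*n for n > 1 (powers of 2 only)
    return c if n == 1 else 2 * T(n // 2) + c * n

for k in (4, 8, 16, 32):
    n = 2 ** k
    print(n, T(n) / (n * k))      # lg n = k; prints 1.25, 1.125, 1.0625, 1.03125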
Substitution method (cont’d)

Make sure you show the same exact form when doing a substitution proof.
Consider the recurrence
  T(n) = 8T(n/2) + Θ(n^2).
For an upper bound:
Guess: T(n) ≤ dn^3.
  T(n) ≤ 8T(n/2) + cn^2
       ≤ 8d(n/2)^3 + cn^2
       = 8d(n^3/8) + cn^2
       = dn^3 + cn^2.
This is not ≤ dn^3, so the guess dn^3 doesn't work!
13
Substitution method (cont’d)

Remedy: Subtract off a lower-order term.
Guess: T(n) ≤ dn^3 - d'n^2.
  T(n) ≤ 8(d(n/2)^3 - d'(n/2)^2) + cn^2
       = 8d(n^3/8) - 8d'(n^2/4) + cn^2
       = dn^3 - 2d'n^2 + cn^2
       ≤ dn^3 - d'n^2    if -2d'n^2 + cn^2 ≤ -d'n^2, i.e., d' ≥ c.
(A numerical check of this strengthened guess follows this slide.)
14
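A minimal numerical check of the strengthened guess (not from the slides; c = 1, T(1) = 1, d = 2, and d' = 1 are demo assumptions chosen so that d' ≥ c and the base case T(1) ≤ d - d' both hold):

def T(n):
    # T(1) = 1; T(n) = 8*T(n/2) + n^2 for n > 1 (powers of 2 only)
    return 1 if n == 1 else 8 * T(n // 2) + n * n

d, dp = 2, 1
for k in range(0, 15):
    n = 2 ** k
    assert T(n) <= d * n**3 - dp * n**2   # the guess T(n) <= d*n^3 - d'*n^2
print("guess holds for n = 2^0 .. 2^14 (with these constants it happens to hold with equality)")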
Substitution method (cont’d)

Be careful when using asymptotic notation.
The false proof for the recurrence T(n) = 4T(n/4) + n, that T(n) = O(n):
  T(n) ≤ 4(c(n/4)) + n
       ≤ cn + n
       = O(n)    wrong!
Because we haven't proven the exact form of our inductive hypothesis (which is that T(n) ≤ cn), this proof is false. (The sketch after this slide shows that T(n)/n in fact grows without bound.)
15
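The sketch below (illustrative; the base case T(1) = 1 is assumed) shows why the O(n) claim cannot be right: T(n)/n grows like log_4 n + 1 rather than staying below a constant c.

def T(n):
    # T(1) = 1; T(n) = 4*T(n/4) + n for n > 1 (powers of 4 only)
    return 1 if n == 1 else 4 * T(n // 4) + n

for k in range(1, 11):
    n = 4 ** k
    print(n, T(n) / n)            # prints k + 1 = log_4(n) + 1: unbounded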
Recurrence trees

Goal of the recursion-tree method:
  a good guess for the substitution method, or
  a direct proof of a solution to a recurrence (provided the recursion tree is drawn carefully)
16
Recurrence trees

T(n) = 3T(⌊n/4⌋) + Θ(n^2)
17
Recurrence trees

[Figure: the recursion tree for T(n) = 3T(n/4) + cn^2. The root costs cn^2; its 3 children cost c(n/4)^2 each, so level i contributes (3/16)^i cn^2. The tree is log_4 n levels deep and has n^(log_4 3) leaves, each costing T(1), for Θ(n^(log_4 3)) in total at the leaves. Total: O(n^2).]
18
Recurrence trees

The cost of the entire tree:
  T(n) = cn^2 + (3/16)cn^2 + (3/16)^2 cn^2 + ... + (3/16)^(log_4 n - 1) cn^2 + Θ(n^(log_4 3))
       = Σ_{i=0}^{log_4 n - 1} (3/16)^i cn^2 + Θ(n^(log_4 3))
       = ((3/16)^(log_4 n) - 1) / ((3/16) - 1) · cn^2 + Θ(n^(log_4 3)).
19
Recurrence trees

  T(n) = Σ_{i=0}^{log_4 n - 1} (3/16)^i cn^2 + Θ(n^(log_4 3))
       < Σ_{i=0}^{∞} (3/16)^i cn^2 + Θ(n^(log_4 3))
       = 1/(1 - 3/16) · cn^2 + Θ(n^(log_4 3))
       = (16/13)cn^2 + Θ(n^(log_4 3))
       = O(n^2).
(A numerical check of the (16/13)cn^2 level-sum bound follows this slide.)
20
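As a quick check of the series bound (illustrative; c = 1.0 and n = 4^10 are arbitrary demo values), the internal level costs of the tree indeed sum to less than (16/13)cn^2; the Θ(n^(log_4 3)) leaf cost is lower order and omitted here.

c, n = 1.0, 4 ** 10
levels, m = 0, n
while m > 1:                      # count the log_4 n levels of internal nodes
    m //= 4
    levels += 1
level_total = sum((3 / 16) ** i * c * n * n for i in range(levels))
print(level_total, "<", (16 / 13) * c * n * n)
assert level_total < (16 / 13) * c * n * n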
Recurrence trees

Verify by the substitution method:
Show that T(n) ≤ dn^2 for some constant d > 0.
  T(n) ≤ 3T(⌊n/4⌋) + cn^2
       ≤ 3d⌊n/4⌋^2 + cn^2
       ≤ 3d(n/4)^2 + cn^2
       = (3/16)dn^2 + cn^2
       ≤ dn^2,
where the last step holds as long as d ≥ (16/13)c.
21
Recurrence trees

Use the recursion tree to generate a guess, then verify by the substitution method.
E.g.: T(n) = T(n/3) + T(2n/3) + Θ(n). For an upper bound, rewrite as T(n) ≤ T(n/3) + T(2n/3) + cn; for a lower bound, as T(n) ≥ T(n/3) + T(2n/3) + cn. By summing across each level, the recursion tree shows the cost at each level of recursion (minus the costs of recursive calls, which appear in subtrees):

[Figure: the recursion tree for T(n) = T(n/3) + T(2n/3) + cn. The root costs cn; its children cost c(n/3) and c(2n/3), which sum to cn; the next level (T(n/9), T(2n/9), T(2n/9), T(4n/9)) again sums to cn. The leftmost branch peters out after log_3 n levels; the rightmost branch peters out after log_{3/2} n levels.]
22
Recurrence trees (cont’d)

There are log_3 n full levels, and after log_{3/2} n levels, the problem size is down to 1.
Each level contributes ≤ cn.
Lower-bound guess: T(n) ≥ dn log_3 n = Ω(n lg n) for some positive constant d.
Upper-bound guess: T(n) ≤ dn log_{3/2} n = O(n lg n) for some positive constant d.
Then prove by substitution. (A sketch that measures the two branch depths follows this slide.)
23
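A minimal sketch (not from the slides) that follows the two extreme branches of the tree for a sample n, using integer division as a stand-in for exact thirds, and compares the depths with log_3 n and log_{3/2} n:

import math

def depth(n, shrink):
    # walk one root-to-leaf branch, shrinking the subproblem until it reaches 1
    d = 0
    while n > 1:
        n = shrink(n)
        d += 1
    return d

n = 3 ** 12
print("leftmost depth: ", depth(n, lambda m: m // 3), "  log_3 n   ~", round(math.log(n, 3), 2))
print("rightmost depth:", depth(n, lambda m: 2 * m // 3), " log_3/2 n ~", round(math.log(n, 1.5), 2))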
Recurrence trees (cont’d)

1. Upper bound:
Guess: T(n) ≤ dn lg n.
Substitution:
  T(n) ≤ T(n/3) + T(2n/3) + cn
       ≤ d(n/3) lg(n/3) + d(2n/3) lg(2n/3) + cn
       = (d(n/3) lg n - d(n/3) lg 3) + (d(2n/3) lg n - d(2n/3) lg(3/2)) + cn
       = dn lg n - d((n/3) lg 3 + (2n/3) lg(3/2)) + cn
       = dn lg n - d((n/3) lg 3 + (2n/3) lg 3 - (2n/3) lg 2) + cn
       = dn lg n - dn(lg 3 - 2/3) + cn
       ≤ dn lg n    if -dn(lg 3 - 2/3) + cn ≤ 0, i.e., d ≥ c / (lg 3 - 2/3).
Therefore, T(n) = O(n lg n).
Note: Make sure that symbolic constants used in the recurrence (e.g., c) and the guess (e.g., d) are different.
24
Recurrence trees (cont’d)

2. Lower bound:
Guess: T(n) ≥ dn lg n.
Substitution: Same as for the upper bound, but replacing ≤ by ≥. End up needing
  0 < d ≤ c / (lg 3 - 2/3).
Therefore, T(n) = Ω(n lg n).
Since T(n) = O(n lg n) and T(n) = Ω(n lg n), we conclude that T(n) = Θ(n lg n). (A quick ratio check appears after this slide.)
25
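As a rough ratio check (not in the slides; the base cases T(1) = T(2) = 1 and the use of floors are demo assumptions), T(n)/(n lg n) stays within a narrow constant band, which is consistent with Θ(n lg n); the proofs above allow any d up to c/(lg 3 - 2/3) ≈ 1.09c.

import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = T(n/3) + T(2n/3) + n with floors; T(1) = T(2) = 1 assumed
    return 1 if n <= 2 else T(n // 3) + T(2 * n // 3) + n

for k in (6, 8, 10, 12):
    n = 3 ** k
    print(n, T(n) / (n * math.log2(n)))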
Master method

Used for many divide-and-conquer recurrences of the form
  T(n) = aT(n/b) + f(n),
where a ≥ 1, b > 1, and f(n) > 0.
Based on the master theorem (Theorem 4.1).
Compare n^(log_b a) vs. f(n):

Case 1: f(n) = O(n^(log_b a - ε)) for some constant ε > 0.
  (f(n) is polynomially smaller than n^(log_b a).)
  Solution: T(n) = Θ(n^(log_b a)).
26
Master method (cont’d)

Case 2: f(n) = Θ(n^(log_b a) lg^k n), where k ≥ 0.
  (f(n) is within a polylog factor of n^(log_b a), but not smaller.)
  Solution: T(n) = Θ(n^(log_b a) lg^(k+1) n).
Simple case: k = 0 ⇒ f(n) = Θ(n^(log_b a)) ⇒ T(n) = Θ(n^(log_b a) lg n).
27
Master method (cont’d)

Case 3: f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and f(n) satisfies the regularity condition a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n.
  (f(n) is polynomially greater than n^(log_b a).)
  Solution: T(n) = Θ(f(n)).
(A small case-classifier sketch follows this slide.)
28
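The three cases can be mechanized for the common situation f(n) = Θ(n^k lg^p n). A minimal sketch (the function name, its signature, and the restriction to polynomial-times-polylog driving functions are my own simplifications, not part of the theorem's statement):

import math

def master(a, b, k, p=0):
    # Classify T(n) = a*T(n/b) + Theta(n^k * lg^p n) by the master theorem.
    # Returns a string describing Theta(T(n)), or None if no case applies.
    e = math.log(a, b)                        # the critical exponent log_b a
    if k < e - 1e-9:                          # Case 1: f(n) polynomially smaller
        return f"Theta(n^{e:.3g})"
    if k > e + 1e-9:                          # Case 3: f(n) polynomially larger
        lg = f" lg^{p} n" if p else ""        # regularity holds for such f(n)
        return f"Theta(n^{k}{lg})"
    if p >= 0:                                # Case 2, with the lg^k extension
        return f"Theta(n^{e:.3g} lg^{p + 1} n)"
    return None                               # e.g. f(n) = n^(log_b a)/lg n: no case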
Master method (cont’d)

What's with the Case 3 regularity condition?
  Generally not a problem.
  It always holds whenever f(n) = n^k and f(n) = Ω(n^(log_b a + ε)) for constant ε > 0. So you don't need to check it when f(n) is a polynomial.
29
Master method (cont’d)

Examples:
  T(n) = 5T(n/2) + Θ(n^2)
    n^(log_2 5) vs. n^2
    Since log_2 5 - ε = 2 for some constant ε > 0, use Case 1 ⇒ T(n) = Θ(n^(lg 5)).
  T(n) = 27T(n/3) + Θ(n^3 lg n)
    n^(log_3 27) = n^3 vs. n^3 lg n
    Use Case 2 with k = 1 ⇒ T(n) = Θ(n^3 lg^2 n).
30
Master method (cont’d)

  T(n) = 5T(n/2) + Θ(n^3)
    n^(log_2 5) vs. n^3
    Now lg 5 + ε = 3 for some constant ε > 0.
    Check the regularity condition (we don't really need to, since f(n) is a polynomial):
      a·f(n/b) = 5(n/2)^3 = 5n^3/8 ≤ cn^3 for c = 5/8 < 1.
    Use Case 3 ⇒ T(n) = Θ(n^3).
  T(n) = 27T(n/3) + Θ(n^3/lg n)
    n^(log_3 27) = n^3 vs. n^3/lg n = n^3 lg^(-1) n ≠ Θ(n^3 lg^k n) for any k ≥ 0.
    Cannot use the master method. (The classifier sketch from the Case 3 slide reports this as "no case"; see below.)
31
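Assuming the master() sketch defined after the Case 3 slide is in scope, the four examples above classify as follows (the lg-exponent argument p is my own encoding of the driving function):

print(master(5, 2, 2))           # Theta(n^2.32), i.e. Theta(n^(lg 5))   -- Case 1
print(master(27, 3, 3, p=1))     # Theta(n^3 lg^2 n)                     -- Case 2
print(master(5, 2, 3))           # Theta(n^3)                            -- Case 3
print(master(27, 3, 3, p=-1))    # None: the master method does not apply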
