
Chapter 24:

Single-Source Shortest Paths


Shortest paths

How to find the shortest route between two points on a map.


Input:
• Directed graph G = (V, E)
• Weight function ω : E → R
Weight of path p = 〈υ0, υ1, . . . , υk〉:
ω(p) = Σ_{i=1}^{k} ω(υi−1, υi) = sum of edge weights on path p.
Shortest-path weight from u to υ:
δ(u, υ) = min { ω(p) : p is a path from u to υ }  if there exists a path from u to υ,
δ(u, υ) = ∞  otherwise.
A shortest path from u to υ is any path p such that ω(p) = δ(u, υ).

2
Shortest paths

Example: shortest paths from s


[δ values appear inside vertices. Shaded edges show shortest paths.]
[Figure: two shortest-path trees from s over vertices s, t, x, y, z, with δ values 0, 3, 9, 5, 11.]
This example shows that the shortest path might not be unique.
It also shows that when we look at shortest paths from one vertex to all other
vertices, the shortest paths are organized as a tree.
Can think of weights as representing any measure that
• accumulates linearly along a path,
• we want to minimize.
Examples: time, cost, penalties, loss.
Generalization of breadth-first search to weighted graphs.

3
Shortest paths

Variants

• Single-source: Find shortest paths from a given source vertex s ∈ V to every vertex υ ∈ V.
• Single-destination: Find shortest paths to a given destination vertex.
• Single-pair: Find shortest path from u to υ. No way known that’s better in
worst case than solving single-source.
• All-pairs: Find shortest path from u to υ for all u, υ ∈ V. We’ll see algorithms
for all-pairs in the next chapter.

Negative-weight edges

OK, as long as no negative-weight cycles are reachable from the source.


• If there is a negative-weight cycle reachable from the source, we can just keep going
around it, and get δ(s, υ) = −∞ for all υ on the cycle.
• But OK if the negative-weight cycle is not reachable from the source.
• Some algorithms work only if there are no negative-weight edges in the graph.
We’ll be clear when they’re allowed and not allowed.

4
Shortest paths

Optimal substructure

Lemma
Any subpath of a shortest path is a shortest path.
Proof Cut-and-paste.
[Figure: path p from u to υ decomposed into subpaths pux (u ⇝ x), pxy (x ⇝ y), and pyυ (y ⇝ υ).]
Suppose this path p is a shortest path from u to υ.


Then δ(u, υ) = ω(p) = ω(pux) + ω(pxy) + ω(pyυ).
Now suppose there exists a shorter path p′xy from x to y.
Then ω(p′xy) < ω(pxy).
Construct p′:
[Figure: p′ = pux (u ⇝ x), then p′xy (x ⇝ y), then pyυ (y ⇝ υ).]
5
Shortest paths

Then
ω(p′) = ω(pux) + ω(p′xy) + ω(pyυ)
< ω(pux) + ω(pxy) + ω(pyυ)
= ω(p) .
So p wasn’t a shortest path after all! ■ (lemma)

Cycles

Shortest paths can’t contain cycles:


• Already ruled out negative-weight cycles.
• Positive-weight ⇒ we can get a shorter path by omitting the cycle.
• Zero-weight: no reason to use them ⇒ assume that our solutions won't use
them.

6
Shortest paths

Output of single-source shortest-path algorithm

For each vertex υ ∈ V :


• d[υ] = δ(s, υ).
• Initially, d[υ] = ∞.
• Reduces as algorithms progress. But always maintain d[υ] ≥ δ(s, υ).
• Call d[υ] a shortest-path estimate.
• π[υ] = predecessor of υ on a shortest path from s.
• If no predecessor, π[υ] = NIL.
• π induces a tree—shortest-path tree.
• We won’t prove properties of π in lecture—see text.

7
Shortest paths

Initialization

All the shortest-paths algorithms start with INIT-SINGLE-SOURCE.

INIT-SINGLE-SOURCE(V, s)
for each υ ∈ V
do d[υ] ← ∞
π[υ] ← NIL
d[s] ← 0

Relaxing an edge (u, υ)

Can we improve the shortest-path estimate for υ by going through u and taking
(u, υ)?

RELAX(u, υ, ω)
if d[υ] > d[u] + ω(u, υ)
then d[υ] ← d[u] + ω(u, υ)
π[υ] ← u
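
A minimal Python sketch of these two routines, for illustration only; the dictionary-based d and π arrays and the w[u][v] lookup standing in for ω(u, υ) are assumptions of this sketch, not the book's interface.

# Sketch of INIT-SINGLE-SOURCE and RELAX; w[u][v] plays the role of ω(u, υ).
import math

def init_single_source(vertices, s):
    d = {v: math.inf for v in vertices}   # shortest-path estimates
    pi = {v: None for v in vertices}      # predecessors (None = NIL)
    d[s] = 0
    return d, pi

def relax(u, v, w, d, pi):
    # Can we improve the estimate for v by going through u and taking (u, v)?
    if d[v] > d[u] + w[u][v]:
        d[v] = d[u] + w[u][v]
        pi[v] = u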

8
Shortest paths

[Figure: relaxing edge (u, υ) with ω(u, υ) = 3. Left: d[u] = 4, d[υ] = 10; RELAX lowers d[υ] to 7. Right: d[u] = 4, d[υ] = 6; RELAX changes nothing.]
For all the single-source shortest-paths algorithms we’ll look at,


• start by calling INIT-SINGLE-SOURCE,
• then relax edges.
The algorithms differ in the order and how many times they relax each edge.

9
Shortest-paths properties

Based on calling INIT-SINGLE-SOURCE once and then calling RELAX zero or


more times.

Triangle inequality

For all (u, υ) ∈ E, we have δ(s, υ) ≤ δ(s, u) + ω(u, υ).


Proof Weight of shortest path s ⇝ υ is ≤ weight of any path s ⇝ υ. Path
s ⇝ u → υ is a path s ⇝ υ, and if we use a shortest path s ⇝ u, its weight is
δ(s, u) + ω(u, υ). ■

Upper-bound property

Always have d[υ] ≥ δ(s, υ) for all υ. Once d[υ] = δ(s, υ), it never changes.
Proof Initially true.
Suppose there exists a vertex υ such that d[υ] < δ(s, υ).
Without loss of generality, υ is first vertex for which this happens.
Let u be the vertex that causes d[υ] to change.

10
Shortest-paths properties

Then d[υ] = d[u] + ω(u, υ).


So,
d[υ] < δ(s, υ)
≤ δ(s, u) + ω(u, υ) (triangle inequality)
≤ d[u] + ω(u, υ) (υ is first violation)
⇒ d[υ] < d[u] + ω(u, υ) .
Contradicts d[υ] = d[u] + ω(u, υ).
Once d[υ] reaches δ(s, υ), it never goes lower. It never goes up, since relaxations
only lower shortest-path estimates. ■

No-path property

If δ(s, υ) = ∞, then d[υ] = ∞ always.


Proof d[υ] ≥ δ(s, υ) = ∞ ⇒ d[υ] = ∞. ■

11
Shortest-paths properties

Convergence property

If s ⇝ u → υ is a shortest path, d[u] = δ(s, u), and we call RELAX(u, υ, ω), then
d[υ] = δ(s, υ) afterward.
Proof After relaxation:
d[υ] ≤ d[u] + ω(u, υ) (RELAX code)
= δ(s, u) + ω(u, υ)
= δ(s, υ) (lemma—optimal substructure)
Since d[υ] ≥ δ(s, υ), must have d[υ] = δ(s, υ). ■

Path relaxation property

Let p = 〈υ0, υ1, . . . , υk〉 be a shortest path from s = υ0 to υk. If we relax,
in order, (υ0, υ1), (υ1, υ2), . . . , (υk−1, υk), even intermixed with other relaxations,
then d[υk] = δ(s, υk).
Proof Induction to show that d[υi] = δ(s, υi) after (υi−1, υi) is relaxed.
Basis: i = 0. Initially, d[υ0] = 0 = δ(s, υ0) = δ(s, s).
Inductive step: Assume d[υi−1] = δ(s, υi−1). Relax (υi−1, υi). By convergence
property, d[υi] = δ(s, υi) afterward and d[υi] never changes. ■

12
The Bellman-Ford algorithm

• Allows negative-weight edges.


• Computes d[υ] and π[υ] for all υ ∈ V.
• Returns TRUE if no negative-weight cycles reachable from s, FALSE otherwise.

BELLMAN-FORD(V, E, ω, s)
INIT-SINGLE-SOURCE(V, s)
for i ← 1 to |V| − 1
do for each edge (u, υ) ∈ E
do RELAX(u, υ, ω)
for each edge (u, υ) ∈ E
do if d[υ] > d[u] + ω(u, υ)
then return FALSE
return TRUE

Core: The first for loop relaxes all edges |V| − 1 times.
Time: Θ(V E).
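
For illustration, a Python sketch of BELLMAN-FORD; the edge-list input (a list of (u, v, weight) triples) and the returned (ok, d, pi) triple are choices of this sketch, not the book's interface.

# Sketch of BELLMAN-FORD; relaxes every edge |V| - 1 times, then checks for a cycle.
import math

def bellman_ford(vertices, edges, s):
    d = {v: math.inf for v in vertices}
    pi = {v: None for v in vertices}
    d[s] = 0
    for _ in range(len(vertices) - 1):     # relax all edges |V| - 1 times
        for u, v, weight in edges:
            if d[u] + weight < d[v]:       # RELAX(u, v, w)
                d[v] = d[u] + weight
                pi[v] = u
    for u, v, weight in edges:             # any edge still relaxable => negative cycle
        if d[u] + weight < d[v]:
            return False, d, pi
    return True, d, pi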

13
The Bellman-Ford algorithm

Example:
[Figure: example graph for BELLMAN-FORD with source s (d[s] = 0); several edges have negative weights (−1, −2, −3).]
The values you get on each pass, and how quickly the estimates converge, depend on
the order in which the edges are relaxed.
But guaranteed to converge after |V| − 1 passes, assuming no negative-weight
cycles.

14
The Bellman-Ford algorithm

Proof Use path-relaxation property.


Let υ be reachable from s, and let p =〈υ0, υ1, . . . , υk〉 be a shortest path from s
to υ, where υ0 = s and υk = υ. Since p is acyclic, it has ≤ |V| − 1 edges, so
k ≤ |V| − 1.
Each iteration of the for loop relaxes all edges:
• First iteration relaxes (υ0, υ1).
• Second iteration relaxes (υ1, υ2).
• kth iteration relaxes (υk−1, υk).
By the path-relaxation property, d[υ] = d[υk] = δ(s, υk) = δ(s, υ). ■

How about the TRUE/FALSE return value?


• Suppose there is no negative-weight cycle reachable from s.
At termination, for all (u, υ) ∈ E,
d[υ] = δ(s, υ)
≤ δ(s, u) + ω(u, υ) (triangle inequality)
= d[u] + ω(u, υ) .
So BELLMAN-FORD returns TRUE.

15
The Bellman-Ford algorithm

• Now suppose there exists a negative-weight cycle c =〈υ0, υ1, . . . , υk〉, where
υ0 = υk, reachable from s.
Then Σ_{i=1}^{k} ω(υi−1, υi) < 0.
Suppose (for contradiction) that BELLMAN-FORD returns TRUE.
Then d[υi] ≤ d[υi−1] + ω(υi−1, υi) for i = 1, 2, . . . , k.
Sum around c:
Σ_{i=1}^{k} d[υi] ≤ Σ_{i=1}^{k} (d[υi−1] + ω(υi−1, υi))
            = Σ_{i=1}^{k} d[υi−1] + Σ_{i=1}^{k} ω(υi−1, υi) .
Each vertex appears exactly once in each of the summations Σ_{i=1}^{k} d[υi] and
Σ_{i=1}^{k} d[υi−1] (since υ0 = υk), so these terms cancel, leaving
0 ≤ Σ_{i=1}^{k} ω(υi−1, υi) .
This contradicts c being a negative-weight cycle! ■

16
Single-source shortest paths in a directed acyclic graph

Since the graph is a dag, we're guaranteed that there are no negative-weight cycles.

DAG-SHORTEST-PATHS(V, E, ω, s)
topologically sort the vertices
INIT-SINGLE-SOURCE(V, s)
for each vertex u, taken in topologically sorted order
do for each vertex υ ∈ Adj[u]
do RELAX(u, υ, ω)

Example:
[Figure: dag with vertices s, t, x, y, z in topologically sorted order; edge weights 2, 6, 7, 4, 2, 1, −1, −2; resulting shortest-path weights from s are 0, 2, 6, 5, 3.]
Time: Θ(V + E).
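
For illustration, a Python sketch of DAG-SHORTEST-PATHS. It assumes the graph is given as an adjacency dict adj[u] = {v: weight} in which every vertex (including sinks) appears as a key; the DFS-based topological sort helper is likewise an assumption of this sketch.

import math

def topological_sort(adj):
    order, visited = [], set()
    def visit(u):
        visited.add(u)
        for v in adj[u]:
            if v not in visited:
                visit(v)
        order.append(u)                    # u finishes after all its descendants
    for u in adj:
        if u not in visited:
            visit(u)
    return list(reversed(order))           # reverse finishing order = topological order

def dag_shortest_paths(adj, s):
    d = {v: math.inf for v in adj}
    pi = {v: None for v in adj}
    d[s] = 0
    for u in topological_sort(adj):        # vertices in topologically sorted order
        for v, weight in adj[u].items():
            if d[u] + weight < d[v]:       # RELAX(u, v, w)
                d[v] = d[u] + weight
                pi[v] = u
    return d, pi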

Correctness: Because we process vertices in topologically sorted order, edges of


any path must be relaxed in order of appearance in the path.
⇒ Edges on any shortest path are relaxed in order.
⇒ By path-relaxation property, correct. ■
17
Dijkstra’s algorithm

No negative-weight edges.
Essentially a weighted version of breadth-first search.
• Instead of a FIFO queue, uses a priority queue.
• Keys are shortest-path weights (d[υ]).
Have two sets of vertices:
• S = vertices whose final shortest-path weights are determined
• Q = priority queue = V − S.

DIJKSTRA(V, E, ω, s)
INIT-SINGLE-SOURCE(V, s)
S ← ∅
Q ← V ▷ i.e., insert all vertices into Q
while Q ≠ ∅
do u ← EXTRACT-MIN(Q)
S ← S ∪ {u}
for each vertex υ ∈ Adj[u]
do RELAX(u, υ, ω)
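
For illustration, a Python sketch of DIJKSTRA using heapq as the min-priority queue. heapq has no DECREASE-KEY, so this sketch pushes a fresh entry on every successful relaxation and skips stale entries when they are popped; that workaround, and the adjacency-dict input adj[u] = {v: weight}, are assumptions of the sketch rather than the book's formulation.

import heapq, math

def dijkstra(adj, s):
    d = {v: math.inf for v in adj}
    pi = {v: None for v in adj}
    d[s] = 0
    S = set()                              # vertices whose shortest-path weights are final
    pq = [(0, s)]                          # min-priority queue keyed on d[v]
    while pq:
        du, u = heapq.heappop(pq)          # EXTRACT-MIN
        if u in S:
            continue                       # stale entry from an earlier relaxation
        S.add(u)
        for v, weight in adj[u].items():
            if du + weight < d[v]:         # RELAX(u, v, w)
                d[v] = du + weight
                pi[v] = u
                heapq.heappush(pq, (d[v], v))
    return d, pi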

18
Dijkstra’s algorithm

• Looks a lot like Prim’s algorithm, but computing d[υ], and using shortest-path
weights as keys.
• Dijkstra’s algorithm can be viewed as greedy, since it always chooses the “lightest”
(“closest”?) vertex in V − S to add to S.

Example:
[Figure: Dijkstra example with source s and vertices x, y, z; shortest-path weights appear inside the vertices.]
Order of adding to S: s, y, z, x.

19
Dijkstra’s algorithm

Correctness:
Loop invariant: At the start of each iteration of the while loop, d[υ] =
δ(s, υ) for all υ ∈ S.

Initialization: Initially, S = ∅, so trivially true.


Termination: At end, Q = ∅ ⇒ S = V ⇒ d[υ] = δ(s, υ) for all υ ∈ V.
Maintenance: Need to show that d[u] = δ(s, u) when u is added to S in each
iteration.
Suppose there exists u such that d[u] ≠ δ(s, u). Without loss of generality,
let u be the first vertex for which d[u] ≠ δ(s, u) when u is added to S.
Observations:
• u ≠ s, since d[s] = δ(s, s) = 0.
• Therefore, s ∈ S, so S ≠ ∅.
• There must be some path s ⇝ u, since otherwise d[u] = δ(s, u) = ∞ by the
no-path property.
So, there’s a path s ⇝ u.

20
Dijkstra’s algorithm

This means there’s a shortest path p from s to u.
Just before u is added to S, path p connects a vertex in S (i.e., s) to a vertex in
V − S (i.e., u).
Let y be first vertex along p that’s in V − S, and let x ∈ S be y’s predecessor.

[Figure: the set S contains s and x; y and u lie outside S; subpath p1 runs from s to x, edge (x, y) crosses out of S, and subpath p2 runs from y to u.]
Decompose p into s ⇝ x → y ⇝ u, where p1 is the subpath s ⇝ x and p2 is the
subpath y ⇝ u. (Could have x = s or y = u, so that p1 or p2 may have no edges.)

21
Dijkstra’s algorithm

Claim
d[y] = δ(s, y) when u is added to S.
Proof x ∈ S and u is the first vertex such that d[u] ≠ δ(s, u) when u is added
to S ⇒ d[x] = δ(s, x) when x is added to S. Relaxed (x, y) at that time, so by
the convergence property, d[y] = δ(s, y). ■(claim)

Now can get a contradiction to d[u] ≠ δ(s, u):


y is on shortest path s ⇝ u, and all edge weights are nonnegative
⇒ δ(s, y) ≤ δ(s, u) ⇒
d[y] = δ(s, y)
≤ δ(s, u)
≤ d[u] (upper-bound property) .
Also, both y and u were in Q when we chose u, so
d[u] ≤ d[y] ⇒ d[u] = d[y] .
Therefore, d[y] = δ(s, y) = δ(s, u) = d[u].
Contradicts assumption that d[u] ≠ δ(s, u). Hence, Dijkstra’s algorithm is
correct. ■

22
Dijkstra’s algorithm

Analysis: Like Prim’s algorithm, depends on implementation of priority queue.


• If binary heap, each operation takes O(lg V) time ⇒ O(E lg V).
• If a Fibonacci heap:
• Each EXTRACT-MIN takes O(lg V) amortized time.
• There are O(E) DECREASE-KEY operations, each taking O(1) amortized time.
• Therefore, time is O(V lg V + E).

23
Difference constraints

Given a set of inequalities of the form xj − xi ≤ bk.


• x’s are variables, 1 ≤ i, j ≤ n,
• b’s are constants, 1 ≤ k ≤ m.
Want to find a set of values for the x’s that satisfy all m inequalities, or determine
that no such values exist. Call such a set of values a feasible solution.

Example:
x1 − x2 ≤ 5
x1 − x3 ≤ 6
x2 − x4 ≤ −1
x3 − x4 ≤ −2
x4 − x1 ≤ −3
Solution: x = (0, −4, −5, −3)
Also: x = (5, 1, 0, 2) = [above solution] + 5
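As a quick check of the first solution x = (0, −4, −5, −3): x1 − x2 = 0 − (−4) = 4 ≤ 5,
x1 − x3 = 0 − (−5) = 5 ≤ 6, x2 − x4 = −4 − (−3) = −1 ≤ −1, x3 − x4 = −5 − (−3) = −2 ≤ −2,
and x4 − x1 = −3 − 0 = −3 ≤ −3, so all five constraints hold.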

Lemma
If x is a feasible solution, then so is x + d for any constant d.
Proof x is a feasible solution ⇒ xj − xi ≤ bk for all i, j, k
⇒ (xj + d) − (xi + d) ≤ bk. ■ (lemma)

24
Difference constraints

Constraint graph

G = (V, E), weighted, directed.


• V = {υ0, υ1, υ2, . . . , υn}: one vertex per variable + υ0
• E = {(υi, υj) : xj − xi ≤ bk is a constraint} ∪ {(υ0, υ1), (υ0, υ2), . . . , (υ0, υn)}
• ω(υ0, υj) = 0 for all j
• ω(υi, υj) = bk if xj − xi ≤ bk

[Figure: constraint graph for the example system. υ0 has weight-0 edges to υ1, . . . , υ4; the constraint edges have weights 5, 6, −1, −2, −3; the shortest-path weights δ(υ0, υi) inside the vertices are 0, −4, −5, −3.]

25
Difference constraints

Theorem
Given a system of difference constraints, let G = (V, E) be the corresponding
constraint graph.
1. If G has no negative-weight cycles, then
x = (δ(υ0, υ1), δ(υ0, υ2), . . . , δ(υ0, υn))
is a feasible solution.
2. If G has a negative-weight cycle, then there is no feasible solution.

Proof
1. Show no negative-weight cycles ⇒ feasible solution.
Need to show that xj − xi ≤ bk for all constraints. Use
xj = δ(υ0, υj) ,
xi = δ(υ0, υi) ,
bk = ω(υi, υj) .
By the triangle inequality,
δ(υ0, υj) ≤ δ(υ0, υi) + ω(υi, υj)
xj ≤ xi + bk
xj − xi ≤ bk .
Therefore, feasible.

26
Difference constraints

2. Show negative-weight cycles ⇒ no feasible solution.


Without loss of generality, let a negative-weight cycle be c =〈υ1, υ2, . . . , υk〉,
where υ1 = υk. (υ0 can’t be on c, since υ0 has no entering edges.) c corresponds
to the constraints
x2 − x1 ≤ ω(υ1, υ2) ,
x3 − x2 ≤ ω(υ2, υ3) ,
⋮
xk−1 − xk−2 ≤ ω(υk−2, υk−1) ,
xk − xk−1 ≤ ω(υk−1, υk) .

If x is a solution satisfying these inequalities, it must satisfy their sum.


So add them up.
Each xi is added once and subtracted once. (υ1 = υk ⇒ x1 = xk)
We get 0 ≤ ω(c).
But ω(c) < 0, since c is a negative-weight cycle.
Contradiction ⇒ no such feasible solution x exists. ■(theorem)

27
Difference constraints

How to find a feasible solution

1. Form constraint graph.


• n + 1 vertices.
• m + n edges.
• Θ(m + n) time.
2. Run BELLMAN-FORD from υ0.
• O((n + 1)(m + n)) = O(n² + nm) time.
3. If BELLMAN-FORD returns FALSE ⇒ no feasible solution.
If BELLMAN-FORD returns TRUE ⇒ set xi = δ(υ0, υi) for all i .
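
For illustration, a Python sketch of this recipe, reusing the bellman_ford sketch given earlier; the (j, i, b) encoding of the constraint xj − xi ≤ b is an assumption of the sketch.

def solve_difference_constraints(n, constraints):
    # constraints: list of (j, i, b) meaning x_j - x_i <= b, with 1 <= i, j <= n.
    # Returns [x_1, ..., x_n], or None if no feasible solution exists.
    vertices = list(range(n + 1))                      # v_0, v_1, ..., v_n
    edges = [(i, j, b) for (j, i, b) in constraints]   # edge (v_i, v_j) with weight b
    edges += [(0, j, 0) for j in range(1, n + 1)]      # weight-0 edges out of v_0
    ok, d, _ = bellman_ford(vertices, edges, 0)        # run BELLMAN-FORD from v_0
    if not ok:
        return None                                    # negative-weight cycle => infeasible
    return [d[j] for j in range(1, n + 1)]             # x_j = delta(v_0, v_j)

On the example system above, solve_difference_constraints(4, [(1, 2, 5), (1, 3, 6), (2, 4, -1), (3, 4, -2), (4, 1, -3)]) returns [0, -4, -5, -3], the feasible solution given earlier.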

28
