Single-Source Shortest Paths - Cormen book Ch 24
Shortest paths
[Figure: a small weighted graph rooted at source s, with the shortest-path weights shown at vertices y and z; two different paths to the same vertex have equal weight, and the shortest-path edges form a tree rooted at s.]
This example shows that the shortest path might not be unique.
It also shows that when we look at shortest paths from one vertex to all other
vertices, the shortest paths are organized as a tree.
Can think of weights as representing any measure that
• accumulates linearly along a path,
• we want to minimize.
Examples: time, cost, penalties, loss.
Generalization of breadth-first search to weighted graphs.
Variants: single-source (the focus here), single-destination, single-pair, and all-pairs shortest paths.
Negative-weight edges: allowed by some algorithms (Bellman-Ford) but not others (Dijkstra); a negative-weight cycle reachable from s makes shortest-path weights undefined (−∞).
Optimal substructure
Lemma
Any subpath of a shortest path is a shortest path.
Proof Cut-and-paste.
[Figure: a path from u to v through intermediate vertices x and y, decomposed into subpaths pux, pxy, and pyυ.]
Suppose a shortest path p from u to υ decomposes as u ⇝ x ⇝ y ⇝ υ, with subpaths pux, pxy, pyυ, and suppose the subpath pxy is not a shortest path from x to y. Then there is a shorter path p′xy from x to y; let p′ be p with pxy replaced by p′xy. Then
ω(p′) = ω(pux) + ω(p′xy) + ω(pyυ)
< ω(pux) + ω(pxy) + ω(pyυ)
= ω(p) .
So p wasn’t a shortest path after all! ■ (lemma)
Cycles: shortest paths cannot contain negative-weight cycles (the weight would be −∞) or positive-weight cycles (removing the cycle gives a shorter path), and 0-weight cycles can be removed without changing the weight. So we can assume shortest paths have no cycles, hence at most |V| − 1 edges.
Initialization
INIT-SINGLE-SOURCE(V, s)
for each υ ∈ V
do d[υ] ← ∞
π[υ] ← NIL
d[s] ← 0
Can we improve the shortest-path estimate for υ by going through u and taking
(u, υ)?
RELAX(u, υ, ω)
if d[υ] > d[u] + ω(u, υ)
then d[υ] ← d[u] + ω(u, υ)
π[υ] ← u
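The two routines above can be sketched in Python (a minimal sketch; the dictionary-based representation of d, π, and the weight function is my own choice, not from the text):

```python
import math

def init_single_source(vertices, s):
    """INIT-SINGLE-SOURCE: every estimate starts at infinity, the source at 0."""
    d = {v: math.inf for v in vertices}  # shortest-path estimates d[v]
    pi = {v: None for v in vertices}     # predecessors pi[v] (NIL)
    d[s] = 0
    return d, pi

def relax(u, v, w, d, pi):
    """RELAX(u, v, w): improve d[v] if going through u and edge (u, v) is shorter."""
    if d[v] > d[u] + w[(u, v)]:
        d[v] = d[u] + w[(u, v)]
        pi[v] = u
```

Note that `math.inf + weight` is still `math.inf` in Python, so unreached vertices never relax their neighbors.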
[Figure: two relaxations of an edge (u, υ) with weight 3. In the first, d[u] = 4 and d[υ] = 10, so RELAX lowers d[υ] to 7. In the second, d[u] = 4 and d[υ] = 6 ≤ 4 + 3, so d[υ] is unchanged.]
Shortest-paths properties
Triangle inequality
For all (u, υ) ∈ E, δ(s, υ) ≤ δ(s, u) + ω(u, υ).
Upper-bound property
Always have d[υ] ≥ δ(s, υ) for all υ. Once d[υ] = δ(s, υ), it never changes.
Proof Initially true, since d[υ] = ∞ for all υ ≠ s and d[s] = 0 ≥ δ(s, s).
Suppose (for contradiction) there exists a vertex such that d[υ] < δ(s, υ).
Without loss of generality, υ is the first vertex for which this happens.
Let u be the vertex that causes d[υ] to change; then d[υ] = d[u] + ω(u, υ). But
d[υ] < δ(s, υ)
≤ δ(s, u) + ω(u, υ) (triangle inequality)
≤ d[u] + ω(u, υ) (υ is the first violation, so d[u] ≥ δ(s, u)),
contradicting d[υ] = d[u] + ω(u, υ). And once d[υ] = δ(s, υ), it can neither drop below δ(s, υ) (by the above) nor increase, so it never changes. ■
No-path property
If there is no path from s to υ, then d[υ] = δ(s, υ) = ∞ always.
Convergence property
If s ⇝ u → υ is a shortest path, d[u] = δ(s, u), and we call RELAX(u, υ, ω), then
d[υ] = δ(s, υ) afterward.
Proof After relaxation:
d[υ] ≤ d[u] + ω(u, υ) (RELAX code)
= δ(s, u) + ω(u, υ)
= δ(s, υ) (lemma—optimal substructure)
Since d[υ] ≥ δ(s, υ), must have d[υ] = δ(s, υ). ■
The Bellman-Ford algorithm
BELLMAN-FORD(V, E, ω, s)
INIT-SINGLE-SOURCE(V, s)
for i ← 1 to |V| − 1
do for each edge (u, υ) ∈ E
do RELAX(u, υ, ω)
for each edge (u, υ) ∈ E
do if d[υ] > d[u] + ω(u, υ)
then return FALSE
return TRUE
The core of the algorithm: the first for loop relaxes all edges |V| − 1 times.
Time: Θ(V E).
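A runnable Python sketch of BELLMAN-FORD; the vertex list, edge list, and weight dictionary are illustrative representation choices, not from the text:

```python
import math

def bellman_ford(vertices, edges, w, s):
    """vertices: list; edges: list of (u, v) pairs; w: (u, v) -> weight.
    Returns (d, pi, ok); ok is False iff a negative-weight cycle is reachable from s."""
    d = {v: math.inf for v in vertices}
    pi = {v: None for v in vertices}
    d[s] = 0
    for _ in range(len(vertices) - 1):   # |V| - 1 passes over all edges
        for (u, v) in edges:
            if d[v] > d[u] + w[(u, v)]:  # RELAX(u, v, w)
                d[v] = d[u] + w[(u, v)]
                pi[v] = u
    for (u, v) in edges:                 # any still-relaxable edge => negative cycle
        if d[v] > d[u] + w[(u, v)]:
            return d, pi, False
    return d, pi, True
```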
Example:
[Figure: a five-vertex example graph with source s and several negative-weight edges; the d values obtained after each pass are shown.]
Values you get on each pass and how quickly it converges depends on order of
relaxation.
But guaranteed to converge after |V| − 1 passes, assuming no negative-weight
cycles.
• Now suppose there exists a negative-weight cycle c = 〈υ0, υ1, . . . , υk〉, where υ0 = υk, reachable from s. Then

∑_{i=1}^{k} ω(υ_{i−1}, υ_i) < 0 .

Suppose (for contradiction) that BELLMAN-FORD returns TRUE.
Then d[υ_i] ≤ d[υ_{i−1}] + ω(υ_{i−1}, υ_i) for i = 1, 2, . . . , k.
Sum around c:

∑_{i=1}^{k} d[υ_i] ≤ ∑_{i=1}^{k} (d[υ_{i−1}] + ω(υ_{i−1}, υ_i)) = ∑_{i=1}^{k} d[υ_{i−1}] + ∑_{i=1}^{k} ω(υ_{i−1}, υ_i) .

Each vertex appears exactly once in each of the summations ∑_{i=1}^{k} d[υ_i] and ∑_{i=1}^{k} d[υ_{i−1}] (since υ0 = υk), so the two d-sums are equal. Subtracting them gives

0 ≤ ∑_{i=1}^{k} ω(υ_{i−1}, υ_i) .

This contradicts c being a negative-weight cycle! ■
Single-source shortest paths in a directed acyclic graph
DAG-SHORTEST-PATHS(V, E, ω, s)
topologically sort the vertices
INIT-SINGLE-SOURCE(V, s)
for each vertex u, taken in topologically sorted order
do for each vertex υ ∈ Adj[u]
do RELAX(u, υ, ω)
Example:
[Figure: a DAG with vertices s, t, x, y, z in topologically sorted order, including negative-weight edges; the final shortest-path estimates are d = 0, 2, 6, 5, 3 respectively.]
Time: Θ(V + E).
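A Python sketch of DAG-SHORTEST-PATHS; the adjacency-dict representation and the use of the standard-library `graphlib` module for the topological sort are my own choices:

```python
import math
from graphlib import TopologicalSorter  # Python 3.9+

def dag_shortest_paths(adj, w, s):
    """adj: vertex -> list of successors (every vertex must be a key); w: (u, v) -> weight."""
    # graphlib expects each node mapped to its predecessors, so invert the adjacency lists
    preds = {u: set() for u in adj}
    for u in adj:
        for v in adj[u]:
            preds[v].add(u)
    order = list(TopologicalSorter(preds).static_order())
    d = {v: math.inf for v in adj}
    pi = {v: None for v in adj}
    d[s] = 0
    for u in order:                      # vertices in topologically sorted order
        for v in adj[u]:
            if d[v] > d[u] + w[(u, v)]:  # RELAX(u, v, w)
                d[v] = d[u] + w[(u, v)]
                pi[v] = u
    return d, pi
```

Each edge is relaxed exactly once, after its tail's estimate is final, which is why one pass suffices.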
Dijkstra’s algorithm
No negative-weight edges.
Essentially a weighted version of breadth-first search.
• Instead of a FIFO queue, uses a priority queue.
• Keys are shortest-path weights (d[υ]).
Have two sets of vertices:
• S = vertices whose final shortest-path weights are determined
• Q = priority queue = V − S.
DIJKSTRA(V, E, ω, s)
INIT-SINGLE-SOURCE(V, s)
S ← ∅
Q ← V ▷ i.e., insert all vertices into Q
while Q ≠ ∅
do u ← EXTRACT-MIN(Q)
S ← S ∪ {u}
for each vertex υ ∈ Adj[u]
do RELAX(u, υ, ω)
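A Python sketch of DIJKSTRA using the standard-library `heapq` as the priority queue. Since `heapq` has no DECREASE-KEY, this version pushes a fresh entry on every successful relaxation and skips stale ones, a common substitution that is not in the pseudocode above:

```python
import heapq
import math

def dijkstra(adj, w, s):
    """adj: vertex -> list of successors (every vertex must be a key);
    w: (u, v) -> nonnegative weight."""
    d = {v: math.inf for v in adj}   # shortest-path estimates
    pi = {v: None for v in adj}      # predecessors
    d[s] = 0
    done = set()                     # the set S of finished vertices
    q = [(0, s)]                     # priority queue Q, keyed on d[v]
    while q:
        _, u = heapq.heappop(q)      # EXTRACT-MIN(Q)
        if u in done:
            continue                 # stale entry left by an earlier relaxation
        done.add(u)
        for v in adj[u]:             # RELAX each edge leaving u
            if d[v] > d[u] + w[(u, v)]:
                d[v] = d[u] + w[(u, v)]
                pi[v] = u
                heapq.heappush(q, (d[v], v))
    return d, pi
```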
• Looks a lot like Prim’s algorithm, but computing d[υ], and using shortest-path
weights as keys.
• Dijkstra’s algorithm can be viewed as greedy, since it always chooses the “lightest” (“closest”?) vertex in V − S to add to S.
Example:
[Figure: a four-vertex example graph with source s; the final shortest-path weights are d[s] = 0, d[y] = 5, d[z] = 6, d[x] = 8.]
Order of adding to S: s, y, z, x.
Correctness:
Loop invariant: At the start of each iteration of the while loop, d[υ] =
δ(s, υ) for all υ ∈ S.
Proof Suppose (for contradiction) that u is the first vertex added to S with d[u] ≠ δ(s, u). Then u ≠ s, and by the upper-bound property δ(s, u) < d[u], so δ(s, u) < ∞.
This means there’s a shortest path p from s to u.
Just before u is added to S, path p connects a vertex in S (i.e., s) to a vertex in V − S (i.e., u).
Let y be the first vertex along p that’s in V − S, and let x ∈ S be y’s predecessor.
[Figure: path p leaving the set S: s and x inside S, y and u outside; subpath p1 runs from s to x and subpath p2 from y to u.]
Decompose p into s ⇝ x → y ⇝ u, calling the subpath s ⇝ x “p1” and the subpath y ⇝ u “p2”. (Could have x = s or y = u, so that p1 or p2 may have no edges.)
Claim
d[y] = δ(s, y) when u is added to S.
Proof x ∈ S and u is the first vertex such that d[u] ≠ δ(s, u) when u is added
to S ⇒ d[x] = δ(s, x) when x is added to S. Relaxed (x, y) at that time, so by
the convergence property, d[y] = δ(s, y). ■(claim)
By the claim, just before u is added: d[y] = δ(s, y) ≤ δ(s, u) ≤ d[u], since y precedes u on p and weights are nonnegative. But EXTRACT-MIN chose u, so d[u] ≤ d[y]. Hence d[u] = δ(s, u), a contradiction. ■
Difference constraints
Example:
x1 − x2 ≤ 5
x1 − x3 ≤ 6
x2 − x4 ≤ −1
x3 − x4 ≤ −2
x4 − x1 ≤ −3
Solution: x = (0, −4, −5, −3)
Also: x = (5, 1, 0, 2) = [above solution] + 5
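A quick arithmetic check that both vectors above satisfy every constraint (the tuple encoding is my own):

```python
# Each constraint x_j - x_i <= b encoded as a tuple (j, i, b)
constraints = [(1, 2, 5), (1, 3, 6), (2, 4, -1), (3, 4, -2), (4, 1, -3)]

def feasible(x):
    """x[0] is a dummy so that x[k] holds the value of x_k."""
    return all(x[j] - x[i] <= b for (j, i, b) in constraints)

print(feasible([None, 0, -4, -5, -3]))  # the first solution
print(feasible([None, 5, 1, 0, 2]))     # the shifted solution
```

Both checks print `True`.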
Lemma
If x is a feasible solution, then so is x + d for any constant d.
Proof x is a feasible solution ⇒ xj − xi ≤ bk for each constraint
⇒ (xj + d) − (xi + d) ≤ bk . ■ (lemma)
Constraint graph
[Figure: constraint graph for the example system: one vertex υi per unknown xi, an edge (υi, υj) of weight bk for each constraint xj − xi ≤ bk, plus an extra source υ0 with a 0-weight edge to every υi. The shortest-path weights from υ0 are 0, −4, −5, −3, matching the solution above.]
Theorem
Given a system of difference constraints, let G = (V, E) be the corresponding
constraint graph.
1. If G has no negative-weight cycles, then
x = (δ(υ0, υ1), δ(υ0, υ2), . . . , δ(υ0, υn))
is a feasible solution.
2. If G has a negative-weight cycle, then there is no feasible solution.
Proof
1. Show no negative-weight cycles ⇒ feasible solution.
Need to show that xj − xi ≤ bk for all constraints. Use
xj = δ(υ0, υj)
xi = δ(υ0, υi)
bk = ω(υi, υj) .
By the triangle inequality,
δ(υ0, υj) ≤ δ(υ0, υi) + ω(υi, υj)
xj ≤ xi + bk
xj − xi ≤ bk .
Therefore, feasible.
2. Show negative-weight cycle ⇒ no feasible solution.
A cycle cannot contain υ0 (no edges enter it), so each edge of the cycle corresponds to a constraint. Summing those constraints, the left-hand sides telescope to 0 around the cycle, forcing 0 ≤ (sum of the bk’s) = (cycle weight) < 0, a contradiction. So no x satisfies all of them at once. ■ (theorem)
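The theorem yields an algorithm: build the constraint graph and run Bellman-Ford from υ0. A Python sketch (the representation and function name are my own):

```python
import math

def solve_difference_constraints(n, constraints):
    """constraints: list of (j, i, b) meaning x_j - x_i <= b, with 1 <= i, j <= n.
    Returns [x_0 .. x_n] (x_0 is the dummy source value 0), or None if infeasible."""
    # Constraint graph: vertex 0 is v0; each constraint gives an edge (v_i, v_j)
    # of weight b; v0 gets a 0-weight edge to every other vertex.
    edges = [(i, j, b) for (j, i, b) in constraints]
    edges += [(0, j, 0) for j in range(1, n + 1)]
    d = [math.inf] * (n + 1)
    d[0] = 0
    for _ in range(n):                   # |V| - 1 = n Bellman-Ford passes
        for (u, v, wt) in edges:
            if d[u] + wt < d[v]:
                d[v] = d[u] + wt
    for (u, v, wt) in edges:             # negative-weight cycle => no solution
        if d[u] + wt < d[v]:
            return None
    return d                             # x_k = delta(v0, v_k)
```

On the example system above this recovers the solution x = (0, −4, −5, −3).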