CS 332: Algorithms: NP-Complete: The Exciting Conclusion (Review for Final)

This document provides a review for the final exam in the CS 332: Algorithms course. It discusses administrative details like homework due dates and exam format. It then reviews key concepts from the course, including the definitions of P, NP, NP-complete, and NP-hard problems. Several example problems are shown to be NP-complete through polynomial-time reductions from other known NP-complete problems, such as 3-SAT being reduced to the clique problem and the clique problem being reduced to vertex cover. The document emphasizes that hundreds of problems have been shown to be NP-complete and that the final exam may include a simple NP-completeness proof.


CS 332: Algorithms

NP-Complete: The Exciting Conclusion


Review For Final

Administrivia
• Homework 5 due now
• All previous homeworks available after class
• Undergrad TAs still needed (before finals)
• Final exam
  • Wednesday, December 13
  • 9 AM - noon
  • You are allowed two 8.5" x 11" cheat sheets
    • Both sides okay
    • Mechanical reproduction okay (sans microfiche)

Homework 5
• Optimal substructure:
  • Given an optimal subset A of items, if we remove item j, the remaining subset A' = A - {j} is an optimal solution to the smaller knapsack problem (S' = S - {j}, W' = W - w_j)
• Key insight is figuring out a formula for c[i,w], the value of the solution for items 1..i and maximum weight w:

  c[i, w] = 0                                      if i = 0 or w = 0
  c[i, w] = c[i-1, w]                              if w_i > w
  c[i, w] = max(v_i + c[i-1, w - w_i], c[i-1, w])  if i > 0 and w_i ≤ w

• Time: O(nW)
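
A minimal bottom-up sketch of this recurrence (the names values, weights, and capacity are illustrative, not from the slides):

    def knapsack(values, weights, capacity):
        """0-1 knapsack via the c[i][w] recurrence above; O(nW) time and space."""
        n = len(values)
        c = [[0] * (capacity + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            for w in range(1, capacity + 1):
                if weights[i - 1] > w:      # item i does not fit in weight budget w
                    c[i][w] = c[i - 1][w]
                else:                       # better of skipping item i or taking it
                    c[i][w] = max(c[i - 1][w],
                                  values[i - 1] + c[i - 1][w - weights[i - 1]])
        return c[n][capacity]

For example, knapsack([60, 100, 120], [10, 20, 30], 50) returns 220.
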
Review: P and NP
• What do we mean when we say a problem is in P?
• What do we mean when we say a problem is in NP?
• What is the relation between P and NP?

Review: P and NP
• What do we mean when we say a problem is in P?
  • A: A solution can be found in polynomial time
• What do we mean when we say a problem is in NP?
  • A: A solution can be verified in polynomial time
• What is the relation between P and NP?
  • A: P ⊆ NP, but no one knows whether P = NP

Review: NP-Complete
• What, intuitively, does it mean if we can reduce problem P to problem Q?
• How do we reduce P to Q?
• What does it mean if Q is NP-Hard?
• What does it mean if Q is NP-Complete?

Review: NP-Complete
• What, intuitively, does it mean if we can reduce problem P to problem Q?
  • P is "no harder than" Q
• How do we reduce P to Q?
  • Transform instances of P to instances of Q in polynomial time, s.t. Q: "yes" iff P: "yes"
• What does it mean if Q is NP-Hard?
  • Every problem P ∈ NP satisfies P ≤p Q
• What does it mean if Q is NP-Complete?
  • Q is NP-Hard and Q ∈ NP

Review: Proving Problems NP-Complete
• What was the first problem shown to be NP-Complete?
  • A: Boolean satisfiability (SAT), by Cook
• How do we usually prove that a problem R is NP-Complete?
  • A: Show R ∈ NP, and reduce a known NP-Complete problem Q to R

Review: Directed → Undirected Ham. Cycle
• Given: directed hamiltonian cycle is NP-Complete (draw the example)
• Transform graph G = (V, E) into G' = (V', E'):
  • Every vertex v in V transforms into 3 vertices v1, v2, v3 in V', with edges (v1,v2) and (v2,v3) in E'
  • Every directed edge (v, w) in E transforms into the undirected edge (v3, w1) in E' (draw it)

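A small sketch of this transformation (graphs as vertex/edge collections; the function name and representation are illustrative, not from the slides):

    def directed_to_undirected_hc(vertices, directed_edges):
        """Build G' for the directed -> undirected hamiltonian cycle reduction.
        Each vertex v becomes (v,1), (v,2), (v,3); each directed edge (v,w) becomes {(v,3), (w,1)}."""
        new_vertices = [(v, i) for v in vertices for i in (1, 2, 3)]
        new_edges = set()
        for v in vertices:                        # the path v1 - v2 - v3 inside each triple
            new_edges.add(frozenset([(v, 1), (v, 2)]))
            new_edges.add(frozenset([(v, 2), (v, 3)]))
        for v, w in directed_edges:               # directed (v, w) becomes undirected (v3, w1)
            new_edges.add(frozenset([(v, 3), (w, 1)]))
        return new_vertices, new_edges

G has a directed hamiltonian cycle iff the returned graph has an undirected one, as the next slide argues.
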
Review: Directed → Undirected Ham. Cycle
• Prove the transformation correct:
  • If G has a directed hamiltonian cycle, G' will have an undirected hamiltonian cycle (straightforward)
  • If G' has an undirected hamiltonian cycle, G will have a directed hamiltonian cycle
    • The three vertices that correspond to a vertex v in G must be traversed in order v1, v2, v3 or v3, v2, v1, since v2 cannot be reached from any other vertex in G'
    • Since 1's are connected to 3's, the order is the same for all triples; assume w.l.o.g. the order is v1, v2, v3
    • Then G has a corresponding directed hamiltonian cycle

Review: Hamiltonian Cycle → TSP
• The well-known traveling salesman problem:
  • Complete graph with cost c(i,j) from city i to city j
  • ∃ a simple cycle over the cities with cost < k?
• How can we prove the TSP is NP-Complete?
  • A: Prove TSP ∈ NP; reduce the undirected hamiltonian cycle problem to TSP
  • TSP ∈ NP: straightforward
  • Reduction: need to show that if we can solve TSP, we can solve the ham. cycle problem

Review: Hamiltonian Cycle → TSP
• To transform the ham. cycle problem on graph G = (V, E) to TSP, create graph G' = (V, E'):
  • G' is a complete graph
  • Edges in E' also in E have weight 0
  • All other edges in E' have weight 1
  • TSP: is there a TSP tour on G' with weight 0?
• If G has a hamiltonian cycle, G' has a cycle w/ weight 0
• If G' has a cycle w/ weight 0, every edge of that cycle has weight 0 and is thus in G. Thus G has a ham. cycle

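A sketch of this weighting step (complete-graph weights as a dictionary; names are illustrative, not from the slides):

    from itertools import combinations

    def ham_cycle_to_tsp(vertices, edges):
        """Weights for the ham. cycle -> TSP reduction: edges of G get weight 0,
        every other vertex pair gets weight 1."""
        edge_set = {frozenset(e) for e in edges}
        weight = {}
        for u, v in combinations(vertices, 2):
            weight[frozenset((u, v))] = 0 if frozenset((u, v)) in edge_set else 1
        return weight    # G has a hamiltonian cycle iff G' has a tour of total weight 0
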
Review: Conjunctive Normal Form
• 3-CNF is a useful NP-Complete problem:
  • Literal: an occurrence of a Boolean variable or its negation
  • A Boolean formula is in conjunctive normal form, or CNF, if it is an AND of clauses, each of which is an OR of literals
    • Ex: (x1 ∨ x2) ∧ (x1 ∨ x3 ∨ x4) ∧ (x5)
  • 3-CNF: each clause has exactly 3 distinct literals
    • Ex: (x1 ∨ x2 ∨ x3) ∧ (x1 ∨ x3 ∨ x4) ∧ (x5 ∨ x3 ∨ x4)
  • Notice: true if at least one literal in each clause is true

3-CNF → Clique
• What is a clique of a graph G?
  • A: a subset of vertices fully connected to each other, i.e. a complete subgraph of G
• The clique problem: how large is the maximum-size clique in a graph?
• Can we turn this into a decision problem?
  • A: Yes, we call this the k-clique problem
• Is the k-clique problem within NP?

3-CNF → Clique
• What should the reduction do?
  • A: Transform a 3-CNF formula to a graph for which a k-clique will exist (for some k) iff the 3-CNF formula is satisfiable

3-CNF → Clique
• The reduction:
  • Let B = C1 ∧ C2 ∧ … ∧ Ck be a 3-CNF formula with k clauses, each of which has 3 distinct literals
  • For each clause, put a triple of vertices in the graph, one for each literal
  • Put an edge between two vertices if they are in different triples and their literals are consistent, meaning not each other's negation
  • Run an example:
    B = (x ∨ y ∨ z) ∧ (¬x ∨ ¬y ∨ ¬z) ∧ (¬x ∨ y ∨ z)

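A small sketch of this construction (clauses as lists of signed literals, e.g. ('x', True) for x and ('x', False) for ¬x; names are illustrative, not from the slides):

    def cnf3_to_clique_graph(clauses):
        """Graph for the 3-CNF -> clique reduction: one vertex per literal occurrence,
        edges between consistent literals in different triples. B is satisfiable iff
        the graph has a clique of size k = len(clauses)."""
        vertices = [(i, lit) for i, clause in enumerate(clauses) for lit in clause]
        edges = set()
        for a, (i, (var1, sign1)) in enumerate(vertices):
            for (j, (var2, sign2)) in vertices[a + 1:]:
                different_triples = (i != j)
                consistent = not (var1 == var2 and sign1 != sign2)
                if different_triples and consistent:
                    edges.add(((i, (var1, sign1)), (j, (var2, sign2))))
        return vertices, edges
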
3-CNF → Clique
• Prove the reduction works:
  • If B has a satisfying assignment, then each clause has at least one literal (vertex) that evaluates to 1
  • Picking one such "true" literal from each clause gives a set V' of k vertices; V' is a clique (Why?)
  • If G has a clique V' of size k, it must contain one vertex from each triple (Why?)
  • We can assign 1 to each literal corresponding with a vertex in V', without fear of contradiction

Clique → Vertex Cover
• A vertex cover for a graph G is a set of vertices incident to every edge in G
• The vertex cover problem: what is the minimum-size vertex cover in G?
  • Restated as a decision problem: does a vertex cover of size k exist in G?
• Thm 36.12: vertex cover is NP-Complete

Clique → Vertex Cover
• First, show vertex cover in NP (How?)
• Next, reduce k-clique to vertex cover
  • The complement G^C of a graph G contains exactly those edges not in G
  • Compute G^C in polynomial time
  • G has a clique of size k iff G^C has a vertex cover of size |V| - k

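A one-function sketch of the complement step (names are illustrative, not from the slides):

    from itertools import combinations

    def complement(vertices, edges):
        """G^C: exactly those vertex pairs that are NOT edges of G; computable in O(V^2) time."""
        edge_set = {frozenset(e) for e in edges}
        return {frozenset(p) for p in combinations(vertices, 2)} - edge_set

G then has a clique of size k iff complement(V, E) has a vertex cover of size |V| - k, as the next two slides argue.
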
Clique → Vertex Cover
• Claim: If G has a clique of size k, G^C has a vertex cover of size |V| - k
  • Let V' be the k-clique
  • Then V - V' is a vertex cover in G^C
    • Let (u,v) be any edge in G^C
    • Then u and v cannot both be in V' (Why?)
    • Thus at least one of u or v is in V - V' (why?), so edge (u,v) is covered by V - V'
    • Since true for any edge in G^C, V - V' is a vertex cover

Clique → Vertex Cover
• Claim: If G^C has a vertex cover V' ⊆ V, with |V'| = |V| - k, then G has a clique of size k
  • For all u,v ∈ V, if (u,v) ∈ G^C then u ∈ V' or v ∈ V' or both (Why?)
  • Contrapositive: if u ∉ V' and v ∉ V', then (u,v) ∈ E
    • In other words, all vertices in V - V' are connected by an edge, thus V - V' is a clique
    • Since |V| - |V'| = k, the size of the clique is k

General Comments
• Literally hundreds of problems have been shown to be NP-Complete
• Some reductions are profound, some are comparatively easy, many are easy once the key insight is given
• You can expect a simple NP-Completeness proof on the final

Other NP-Complete Problems
• Subset-sum: Given a set of integers, does there exist a subset that adds up to some target T?
• 0-1 knapsack: you know this one
• Hamiltonian path: obvious
• Graph coloring: can a given graph be colored with k colors such that no adjacent vertices are the same color?
• Etc…

Final Exam
• Coverage: 60% stuff since midterm, 40% stuff before midterm
• Goal: doable in 2 hours
• This review just covers material since the midterm review

Final Exam: Study Tips
• Study tips:
  • Study each lecture since the midterm
  • Study the homework and homework solutions
  • Study the midterm
  • Re-make your midterm cheat sheet
    • I recommend handwriting or typing it
    • Think about what you should have had on it the first time… the cheat sheet is about identifying important concepts

Graph Representation
• Adjacency list
• Adjacency matrix
• Tradeoffs:
  • What makes a graph dense?
  • What makes a graph sparse?
  • What about planar graphs?

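A minimal illustration of the two representations for a small undirected graph (the example graph and names are illustrative, not from the slides):

    # Graph with vertices 0..3 and edges (0,1), (0,2), (1,2), (2,3)
    n, edges = 4, [(0, 1), (0, 2), (1, 2), (2, 3)]

    # Adjacency list: O(V + E) space, good for sparse graphs
    adj_list = [[] for _ in range(n)]
    for u, v in edges:
        adj_list[u].append(v)
        adj_list[v].append(u)

    # Adjacency matrix: O(V^2) space, O(1) edge lookup, good for dense graphs
    adj_matrix = [[0] * n for _ in range(n)]
    for u, v in edges:
        adj_matrix[u][v] = adj_matrix[v][u] = 1
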
Basic Graph Algorithms
• Breadth-first search
  • What can we use BFS to calculate?
  • A: shortest-path distance to source vertex
• Depth-first search
  • Tree edges, back edges, cross and forward edges
  • What can we use DFS for?
  • A: finding cycles, topological sort

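A short BFS sketch that computes shortest-path distances from a source, using the adjacency-list form above (names are illustrative, not from the slides):

    from collections import deque

    def bfs_distances(adj_list, source):
        """BFS: shortest-path distance (in edges) from source to every reachable vertex."""
        dist = [None] * len(adj_list)
        dist[source] = 0
        queue = deque([source])
        while queue:
            u = queue.popleft()
            for v in adj_list[u]:
                if dist[v] is None:          # v not yet discovered
                    dist[v] = dist[u] + 1
                    queue.append(v)
        return dist
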
Topological Sort, MST
• Topological sort
  • Examples: getting dressed, project dependency
  • What kind of graph do we do topological sort on?
• Minimum spanning tree
  • Optimal substructure
  • Min edge theorem (enables greedy approach)

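A DFS-based topological sort sketch for a DAG given as a dictionary of successor lists (names are illustrative, not from the slides):

    def topological_sort(adj):
        """Return the vertices of a DAG {u: [successors]} in topological order."""
        visited, order = set(), []
        def visit(u):
            visited.add(u)
            for v in adj.get(u, []):
                if v not in visited:
                    visit(v)
            order.append(u)              # u finishes only after all its descendants
        for u in adj:
            if u not in visited:
                visit(u)
        return list(reversed(order))     # reverse of finish order = topological order
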
MST Algorithms
• Prim's algorithm
  • What is the bottleneck in Prim's algorithm?
  • A: priority queue operations
• Kruskal's algorithm
  • What is the bottleneck in Kruskal's algorithm?
  • Answer: depends on disjoint-set implementation
    • As covered in class, disjoint-set union operations
    • As described in book, sorting the edges

Single-Source Shortest Path
• Optimal substructure
• Key idea: relaxation of edges
• What does the Bellman-Ford algorithm do?
  • What is the running time?
• What does Dijkstra's algorithm do?
  • What is the running time?
  • When does Dijkstra's algorithm not apply?

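A compact Dijkstra sketch showing the relaxation step (binary heap; adjacency given as {u: [(v, weight), ...]} with every vertex present as a key; requires non-negative weights; names are illustrative, not from the slides):

    import heapq

    def dijkstra(adj, source):
        """Single-source shortest paths with non-negative edge weights."""
        dist = {u: float('inf') for u in adj}
        dist[source] = 0
        pq = [(0, source)]
        while pq:
            d, u = heapq.heappop(pq)
            if d > dist[u]:                  # stale queue entry, skip
                continue
            for v, w in adj[u]:
                if dist[u] + w < dist[v]:    # relaxation: found a shorter path to v
                    dist[v] = dist[u] + w
                    heapq.heappush(pq, (dist[v], v))
        return dist

Bellman-Ford applies the same relaxation |V| - 1 times over all edges, which is slower (O(VE)) but also handles negative edge weights.
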
Disjoint-Set Union
• We talked about representing sets as linked lists, where every element stores a pointer to the list head
• What is the cost of merging sets A and B?
  • A: O(max(|A|, |B|))
• What is the maximum cost of merging n 1-element sets into a single n-element set?
  • A: O(n^2)
• How did we improve this? By how much?
  • A: always copy the smaller into the larger: O(n lg n)

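A sketch of the list-based representation with the "copy smaller into larger" (weighted-union) improvement (class and method names are illustrative, not from the slides):

    class DisjointSets:
        """Each element stores its set's representative; union relabels the smaller set."""
        def __init__(self, elements):
            self.rep = {x: x for x in elements}        # element -> representative (the "head")
            self.members = {x: [x] for x in elements}  # representative -> list of members

        def find(self, x):
            return self.rep[x]

        def union(self, a, b):
            ra, rb = self.find(a), self.find(b)
            if ra == rb:
                return
            if len(self.members[ra]) < len(self.members[rb]):
                ra, rb = rb, ra                        # make ra the larger set
            for x in self.members[rb]:                 # copy the smaller set into the larger
                self.rep[x] = ra
            self.members[ra].extend(self.members.pop(rb))

Since an element's set at least doubles each time that element is relabeled, it is relabeled at most lg n times, so n unions cost O(n lg n) total.
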
Amortized Analysis
• Idea: worst-case cost of an operation may overestimate its cost over the course of the algorithm
• Goal: get a tighter amortized bound on its cost
  • Aggregate method: total cost of the operation over the course of the algorithm, divided by # operations
    • Example: disjoint-set union
  • Accounting method: "charge" a cost to each operation, accumulate unused cost in a bank, never go negative
    • Example: dynamically-doubling arrays

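A toy dynamically-doubling array, the standard accounting-method example: each append is "charged" a small constant that pays for the occasional O(n) copy, so n appends cost O(n) total (a sketch; names are illustrative, not from the slides):

    class DoublingArray:
        """Array that doubles its capacity when full; append is O(1) amortized."""
        def __init__(self):
            self.capacity, self.size = 1, 0
            self.data = [None]

        def append(self, x):
            if self.size == self.capacity:       # full: copy everything into a table twice as big
                self.capacity *= 2
                new_data = [None] * self.capacity
                new_data[:self.size] = self.data
                self.data = new_data
            self.data[self.size] = x
            self.size += 1
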
Dynamic Programming
• Indications: optimal substructure, repeated subproblems
• What is the difference between memoization and dynamic programming?
• A: same basic idea, but:
  • Memoization: recursive algorithm, looking up subproblem solutions after computing once
  • Dynamic programming: build table of subproblem solutions bottom-up

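To illustrate the contrast, here is a memoized (top-down) version of the knapsack recurrence from the Homework 5 slide; the earlier sketch was the bottom-up dynamic-programming version (names are illustrative, not from the slides):

    from functools import lru_cache

    def knapsack_memo(values, weights, capacity):
        """Same c[i, w] recurrence, but subproblems are solved recursively and cached on first use."""
        @lru_cache(maxsize=None)
        def c(i, w):
            if i == 0 or w == 0:
                return 0
            if weights[i - 1] > w:
                return c(i - 1, w)
            return max(c(i - 1, w), values[i - 1] + c(i - 1, w - weights[i - 1]))
        return c(len(values), capacity)
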
LCS Via Dynamic Programming
• Longest common subsequence (LCS) problem:
  • Given two sequences x[1..m] and y[1..n], find the longest subsequence which occurs in both
• Brute-force algorithm: 2^m subsequences of x to check against n elements of y: O(n 2^m)
• Define c[i,j] = length of LCS of x[1..i], y[1..j]
• Theorem:

  c[i, j] = c[i-1, j-1] + 1              if x[i] = y[j]
  c[i, j] = max(c[i, j-1], c[i-1, j])    otherwise
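
A bottom-up sketch of this recurrence (0-indexed Python strings; names are illustrative, not from the slides):

    def lcs_length(x, y):
        """Length of the longest common subsequence of x and y, via the c[i, j] recurrence above."""
        m, n = len(x), len(y)
        c = [[0] * (n + 1) for _ in range(m + 1)]
        for i in range(1, m + 1):
            for j in range(1, n + 1):
                if x[i - 1] == y[j - 1]:
                    c[i][j] = c[i - 1][j - 1] + 1
                else:
                    c[i][j] = max(c[i][j - 1], c[i - 1][j])
        return c[m][n]

For example, lcs_length("ABCBDAB", "BDCABA") returns 4.
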
Greedy Algorithms
• Indicators:
  • Optimal substructure
  • Greedy choice property: a locally optimal choice leads to a globally optimal solution
• Example problems:
  • Activity selection: set of activities, with start and end times; maximize compatible set of activities
  • Fractional knapsack: sort items by $/lb, then take items in sorted order
  • MST

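A short greedy sketch for fractional knapsack: sort by value per pound and take items (or fractions) in that order (names are illustrative, not from the slides):

    def fractional_knapsack(items, capacity):
        """items: list of (value, weight) pairs. Returns the maximum total value that fits,
        allowing the last item taken to be split fractionally."""
        total, remaining = 0.0, capacity
        for value, weight in sorted(items, key=lambda it: it[0] / it[1], reverse=True):
            if remaining <= 0:
                break
            take = min(weight, remaining)      # whole item if it fits, otherwise the fraction that does
            total += value * (take / weight)
            remaining -= take
        return total
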
The End

