
Solutions for HW10-CS 6033 Fall 2023

Q1 → Prim’s MST
Q2 → UNION-FIND
Q3 → Rod Cutting Optimal Revenue
Q4 → UNION-FIND Properties
Q5 → Kruskal’s Time Complexity
Q6 → Rooms Connection
Q7 → Kruskal’s Opposite Approach
Q8 → Prim’s Negative Weights
Q9 → Greedy Counterexample
Q10 → Rod Cutting Modification
Q11 → Houses on lots

Q1 → Prim’s MST

This is the result of running Prim’s algorithm:


Q2 → UNION-FIND

After the first 4 unions:

After UNION(1,4) and UNION(5,7):

After UNION(2,9):
After UNION(5,1):

After FIND-SET(7), which returns 4:

Q3 → Rod Cutting Optimal Revenue

Using the rod-cutting algorithm with optimal cuts, we get the following revenue table:

length i   0   1   2   3   4   5
r[i]       0   2   5   8   10  13
s[i]       0   1   2   3   1   2

So the maximum revenue is 13, obtained by cutting the rod into two pieces of lengths 3 and 2.
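For reference, the table can be reproduced with the standard bottom-up rod-cutting procedure. The price list below is hypothetical, chosen to be consistent with the r and s rows shown (the homework's actual price table is not reproduced in this document):

```python
def cut_rod(p, n):
    """Bottom-up rod cutting (in the style of CLRS EXTENDED-BOTTOM-UP-CUT-ROD).

    p[j] is the price of a piece of length j (p[0] = 0).
    Returns (r, s): r[i] is the maximum revenue for a rod of length i,
    and s[i] is the size of the first piece in an optimal cut.
    """
    r = [0] * (n + 1)
    s = [0] * (n + 1)
    for i in range(1, n + 1):
        q = float("-inf")
        for j in range(1, i + 1):
            if p[j] + r[i - j] > q:   # strict '>' keeps the first maximizer
                q = p[j] + r[i - j]
                s[i] = j
        r[i] = q
    return r, s

# Hypothetical prices consistent with the table above.
p = [0, 2, 5, 8, 9, 13]
r, s = cut_rod(p, 5)
print(r)  # [0, 2, 5, 8, 10, 13]
print(s)  # [0, 1, 2, 3, 1, 2]
```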

Q4 → UNION-FIND Properties

(a) 2^rank - 1.
A new rank is created only during the UNION of two trees whose roots have the same rank. When rank 1 is created, two rank-0 trees (one node each) are combined, so the new rank-1 root has at least 1 node in its subtree. When rank 2 is created, two trees with root rank 1 are combined; each has at least 2 nodes, one root becomes the new parent with rank 2, and the remaining 3 nodes lie in its subtree. So a tree with root rank 2 has at least 4 nodes. To create rank 3 with the fewest nodes, we combine two rank-2 trees of 4 nodes each, and the new rank-3 root has the remaining 7 nodes in its subtree. And so on for any rank: a root of rank r has at least 2^r - 1 nodes below it.
(b) The height of the tree is bounded by the rank of the root node.
From (a), 2^r <= n, where r is the rank of the root and n is the number of nodes in that tree.
This implies r <= log2(n).
So the upper bound on the height of the tree is O(log n).
(c) The total number of nodes equals the number of MAKE-SET operations, which is n. From (a), each tree of height h contains at least 2^h nodes. What happens when two such trees are joined? If their heights differ, the combined tree has the same height as the taller one, so it still has at least 2^h nodes (same height, more nodes). If their heights are equal, the resulting tree's height increases by one and it contains at least 2^h + 2^h = 2^(h+1) nodes, so the invariant still holds. The base case (1 node, height 0) also satisfies it, so by induction every tree built by these unions satisfies it. The height is exactly the maximal number of steps followed during a FIND, so a tree with n nodes and height h (n >= 2^h) immediately gives steps <= h <= log2(n).
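As an illustration of the 2^rank bound, here is a minimal union-by-rank sketch (no path compression; illustrative code, not the homework's): with n = 16 elements unioned pairwise, the single root's rank is exactly log2(16) = 4.

```python
class DisjointSet:
    """Union-find with union by rank only (no path compression),
    to illustrate the 2^rank <= n bound from parts (a) and (b)."""
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        while self.parent[x] != x:      # walk up to the root
            x = self.parent[x]
        return x

    def union(self, x, y):
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx            # attach lower-rank root under higher
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1          # equal ranks: a new rank is created

ds = DisjointSet(16)
for i in range(0, 16, 2):
    ds.union(i, i + 1)                  # eight rank-1 trees
for i in range(0, 16, 4):
    ds.union(i, i + 2)                  # four rank-2 trees
for i in range(0, 16, 8):
    ds.union(i, i + 4)                  # two rank-3 trees
ds.union(0, 8)                          # one rank-4 tree of 16 nodes
print(ds.rank[ds.find(0)])              # 4, i.e. log2(16)
```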
Q5 → Kruskal’s Time Complexity

If the edge weights are integers in the range from 1 to |V|, we can make Kruskal’s algorithm run
in O(E ɑ(V)) time by using counting sort to sort the edges by weight in linear time. The same
approach works if the edge weights are integers bounded by a constant, since once sorting is
linear the runtime is dominated by deciding whether an edge joins two distinct trees in the
forest, which is independent of the edge weights.
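A sketch of this variant, assuming vertices 0..n-1 and an edge list of (u, v, w) triples (illustrative code, not part of the original solution):

```python
def kruskal_int_weights(n, edges):
    """Kruskal's MST where weights are integers in 1..n (n = |V|).
    Counting sort (bucketing by weight) replaces the comparison sort,
    so the total cost is dominated by the union-find operations."""
    buckets = [[] for _ in range(n + 1)]    # counting sort by weight
    for u, v, w in edges:
        buckets[w].append((u, v, w))

    parent = list(range(n))
    rank = [0] * n

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    mst, total = [], 0
    for w in range(1, n + 1):               # weights in increasing order
        for u, v, wt in buckets[w]:
            ru, rv = find(u), find(v)
            if ru != rv:                    # edge joins two distinct trees
                if rank[ru] < rank[rv]:
                    ru, rv = rv, ru
                parent[rv] = ru
                if rank[ru] == rank[rv]:
                    rank[ru] += 1
                mst.append((u, v))
                total += wt
    return mst, total

edges = [(0, 1, 1), (1, 2, 2), (2, 3, 1), (0, 3, 4), (0, 2, 3)]
mst, total = kruskal_int_weights(4, edges)
print(total)  # 4
```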

Q6 → Rooms Connection

The problem can be modeled as a graph. Edges between rooms closer than 12 feet have weight 10,
and the remaining edges have weight 30.
We can exploit the fact that each edge weight takes only one of two possible values, and
run a DFS on this graph, moving from one node to another only if the edge between
them has weight 10. This results in one or more disconnected trees. Since the original graph
was connected, these trees can still be connected, albeit with a longer cable.
We assign each of these trees an identity and run DFS again, this time considering only edges
of weight 30, making sure never to connect two trees that are already connected (which
we can track with a hash table). Once all trees have been connected, we obtain the MST.
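The two-pass idea can be sketched as follows, under a hypothetical interface (rooms 0..n-1, with the two edge classes given as pair lists; none of these names come from the original problem statement):

```python
from collections import defaultdict

def wire_rooms(n, close_pairs, far_pairs):
    """Pass 1: DFS over the weight-10 edges labels each room with a tree id.
    Pass 2: accept a weight-30 edge only when it joins two different trees,
    merging their identities in a dictionary, as the text describes."""
    adj = defaultdict(list)
    for u, v in close_pairs:
        adj[u].append(v)
        adj[v].append(u)
    tree = [-1] * n
    num_trees = 0
    for s in range(n):                      # pass 1: label the cheap trees
        if tree[s] == -1:
            num_trees += 1
            stack = [s]
            tree[s] = s
            while stack:
                u = stack.pop()
                for v in adj[u]:
                    if tree[v] == -1:
                        tree[v] = s
                        stack.append(v)
    comp = {u: tree[u] for u in range(n)}   # hash table of tree identities
    long_cables = []
    for u, v in far_pairs:                  # pass 2: join distinct trees only
        cu, cv = comp[u], comp[v]
        if cu != cv:
            long_cables.append((u, v))
            for w in comp:                  # relabel one side (fine for a sketch)
                if comp[w] == cv:
                    comp[w] = cu
    # (n - num_trees) cheap cables inside trees, plus the long cables
    return 10 * (n - num_trees) + 30 * len(long_cables)

cost = wire_rooms(5, [(0, 1), (1, 2)], [(2, 3), (3, 4), (0, 4)])
print(cost)  # 80: two 10-foot cables plus two 30-foot cables
```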

Q7 → Kruskal’s Opposite Approach

The idea here is to remove edges in decreasing order of weight until none can be removed without
disconnecting the graph. If removing an edge would disconnect the graph, we skip it and move on
to the next edge. The output is a spanning tree: every edge except the tree edges gets removed,
and an edge whose removal would disconnect the graph is kept. It is also a minimum spanning
tree: whenever an edge is removed, the graph stays connected, so the removed edge lies on a
cycle, and since we process edges in decreasing order it is a heaviest edge on that cycle; by
the cycle property, such an edge is never needed for an MST. Hence, evil Kruskal is right after all.
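The procedure can be sketched as follows (a quadratic-time illustration that re-checks connectivity with a BFS after each tentative removal; not the original solution's code):

```python
from collections import deque

def is_connected(n, edges):
    """BFS from vertex 0; True iff all n vertices are reachable."""
    adj = [[] for _ in range(n)]
    for u, v, _ in edges:
        adj[u].append(v)
        adj[v].append(u)
    seen = {0}
    queue = deque([0])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return len(seen) == n

def reverse_delete(n, edges):
    """Evil Kruskal: scan edges in decreasing weight order and drop an
    edge unless removing it would disconnect the graph."""
    kept = list(edges)
    for e in sorted(edges, key=lambda e: e[2], reverse=True):
        trial = [f for f in kept if f != e]
        if is_connected(n, trial):
            kept = trial                    # safe to drop: still connected
    return kept

mst = reverse_delete(4, [(0, 1, 1), (1, 2, 2), (2, 3, 1), (0, 3, 4), (0, 2, 3)])
print(sum(w for _, _, w in mst))  # 4, matching Kruskal's answer
```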

Q8 → Prim’s Negative Weights

Yes, Prim’s algorithm works with negative edge weights, because the cut property still applies:
its correctness proof never assumes the weights are nonnegative. Negative edges only cause
trouble for shortest-path algorithms in the presence of negative-weight cycles, and since a
spanning tree contains no cycles at all, negative weights pose no problem for MST algorithms.
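One way to see this concretely: shifting every weight by a constant C changes any spanning tree's total by exactly C(n-1), so it cannot change which tree is minimal. A lazy-Prim sketch (illustrative code, not from the original solution) confirms this on a graph with negative edges:

```python
import heapq

def prim_total(n, edges):
    """Lazy Prim's algorithm; returns the MST weight.
    Nothing in it assumes nonnegative edge weights."""
    adj = [[] for _ in range(n)]
    for u, v, w in edges:
        adj[u].append((v, w))
        adj[v].append((u, w))
    seen = [False] * n
    heap = [(0, 0)]                 # (weight of connecting edge, vertex)
    total, added = 0, 0
    while heap and added < n:
        w, u = heapq.heappop(heap)
        if seen[u]:
            continue                # stale entry: u already in the tree
        seen[u] = True
        total += w
        added += 1
        for v, wt in adj[u]:
            if not seen[v]:
                heapq.heappush(heap, (wt, v))
    return total

neg = [(0, 1, -2), (1, 2, 1), (2, 3, -1), (0, 2, 4), (1, 3, 5)]
shifted = [(u, v, w + 10) for u, v, w in neg]   # shift all weights by C = 10
print(prim_total(4, neg))      # -2
print(prim_total(4, shifted))  # 28 = -2 + 10 * (4 - 1): same tree, shifted total
```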

Q9 → Greedy Counterexample

Here is one counterexample:

i   price   density (price/i)
1   0.5     0.5
2   3       1.5
3   9       3
4   10      2.5

Here the length of the rod is i = 4.


The cut with the highest density is i = 3. Making that cut leaves a rod of length 1, so the
overall price is 9 + 0.5 = 9.5. However, if we leave the rod uncut (i = 4), we get a price of 10,
which is better.
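The counterexample can be checked mechanically (a small sketch; the brute-force dp plays the role of the true optimum):

```python
def greedy_by_density(p, n):
    """Repeatedly cut off the piece with the highest price/length ratio --
    the strategy this counterexample defeats."""
    total = 0.0
    while n > 0:
        best = max(range(1, n + 1), key=lambda i: p[i] / i)
        total += p[best]
        n -= best
    return total

def optimal(p, n):
    """Standard rod-cutting dp: the true optimal revenue."""
    r = [0.0] * (n + 1)
    for i in range(1, n + 1):
        r[i] = max(p[j] + r[i - j] for j in range(1, i + 1))
    return r[n]

p = [0, 0.5, 3, 9, 10]          # prices from the table above (p[0] unused)
print(greedy_by_density(p, 4))  # 9.5
print(optimal(p, 4))            # 10.0
```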
Q10 → Rod Cutting Modification

Recurrence formula, where dp[i] is the best profit we can make from a rod of length i:
dp[0] = 0
dp[i] = max(p[i], max over j = 1 … i-1 of (dp[i-j] + p[j] - c)), for i > 0

MEMOIZED-CUT-ROD(p, n):
    let r[0..n] be a new array
    for i = 0 to n:
        r[i] = -inf
    return MEMOIZED-CUT-ROD-AUX(p, n, r)

MEMOIZED-CUT-ROD-AUX(p, n, r):
    if r[n] >= 0:
        return r[n]
    if n == 0:
        q = 0
    else:
        q = p[n]
        for i = 1 to n-1:
            q = max(q, p[i] + MEMOIZED-CUT-ROD-AUX(p, n-i, r) - c)
    r[n] = q
    return q
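The pseudocode above translates directly to Python; this sketch makes the per-cut fee c an explicit parameter and uses a hypothetical price list for illustration:

```python
from functools import lru_cache

def cut_rod_with_cost(p, n, c):
    """Memoized rod cutting where every cut costs c.
    p[i] is the price of a piece of length i; returns the best profit
    for a rod of length n."""
    @lru_cache(maxsize=None)
    def best(m):
        if m == 0:
            return 0
        q = p[m]                     # sell the piece uncut: no cut fee
        for i in range(1, m):        # first piece of length i, then recurse
            q = max(q, p[i] + best(m - i) - c)
        return q
    return best(n)

p = (0, 1, 5, 8, 9)                  # hypothetical prices for illustration
print(cut_rod_with_cost(p, 4, 1))    # 9: with a fee, selling uncut is best
print(cut_rod_with_cost(p, 4, 0))    # 10: with free cuts, 2 + 2 is best
```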
Q11 → Houses on lots

Recurrence formula, where dp[i] is the max revenue we can make from lots in [0, xi]:
dp[0] = 0
dp[i] = max(dp[i-1], dp[k] + r[i]) for i > 0, where xk is the largest lot position satisfying xi - xk > t (or k = 0 if no such lot exists)

Note that dp[i] = dp[i-1] indicates that we do not build a house on lot i.
xk is the largest (rightmost) lot position satisfying xk < xi - t. Since the dp array is
non-decreasing (dp[0] <= dp[1] <= dp[2] …), taking dp[k] for this largest k is optimal.

MAX-REVENUE(x, r, t, n):
    let s[0..n] be a new array
    s[0] = 0
    for i = 1 to n:
        s[i] = max(s[i-1], r[i] + s[CLOSEST(x, i, t)])
    return s[n]

CLOSEST(x, i, t):
    ans = 0
    for j = 1 to i-1:
        if x[i] - x[j] <= t:
            break    // first lot within t miles of x[i] stops the scan
        ans = j
    return ans

CLOSEST takes O(n) per call, giving O(n^2) overall. A quicker way is to precompute a lookup
table so that each query takes constant time.

BUILD-CLOSEST(x, t, n):
    closest = n
    for current = n downto 1:
        while closest > 0 and x[current] - x[closest] <= t:
            closest = closest - 1
        CLOSEST[current] = closest

Both pointers only ever move left, so the whole precomputation runs in O(n).

To find the closest lot more than t miles before lot i, we just look up CLOSEST[i] in O(1) time.
There are n subproblems in total, and each takes O(1) to solve, so the total time complexity
is O(n) and the space complexity is O(n) as well.

One worked example (I used n = 4 here to also verify the case dp[i] = dp[i-1]):
x = [2, 5, 10, 12] (x1 = 2, x2 = 5, x3 = 10, x4 = 12)
r = [2, 5, 6, 1] (r1 = 2, r2 = 5, r3 = 6, r4 = 1)
t=3

dp[0] = 0
dp[1] = max(dp[0], dp[0] + r[1]) = 2, CLOSEST(x, 1, 3) = 0
dp[2] = max(dp[1], dp[0] + r[2]) = 5, CLOSEST(x, 2, 3) = 0
dp[3] = max(dp[2], dp[2] + r[3]) = 11, CLOSEST(x, 3, 3) = 2
dp[4] = max(dp[3], dp[2] + r[4]) = 11, CLOSEST(x, 4, 3) = 2
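Putting this together in Python (1-indexed arrays with a sentinel at index 0, matching the trace above):

```python
def max_revenue(x, r, t, n):
    """DP from Q11 with the O(n) two-pointer CLOSEST precomputation.
    x[i] is lot i's position (sorted ascending), r[i] its revenue;
    built houses must be more than t miles apart. Index 0 is a sentinel."""
    closest = [0] * (n + 1)
    k = n
    for i in range(n, 0, -1):       # two pointers: k only ever moves left
        while k > 0 and x[i] - x[k] <= t:
            k -= 1
        closest[i] = k              # largest k with x[i] - x[k] > t, else 0
    dp = [0] * (n + 1)
    for i in range(1, n + 1):       # skip lot i, or build and jump to closest[i]
        dp[i] = max(dp[i - 1], r[i] + dp[closest[i]])
    return dp

x = [0, 2, 5, 10, 12]               # sentinel at index 0
r = [0, 2, 5, 6, 1]
dp = max_revenue(x, r, 3, 4)
print(dp)  # [0, 2, 5, 11, 11], matching the trace above
```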
