
6.006 Intro to Algorithms    Recitation 04    February 11, 2011

AVL Trees
Recall the operations (e.g. find, insert, delete) of a binary search tree. The runtime of
these operations was O(h), where h represents the height of the tree, defined as the length of
the longest branch. In the worst case, all the nodes of a tree could be on the same branch. In this
case, h = n, so the runtime of these binary search tree operations is O(n). However, we can
maintain a much better upper bound on the height of the tree if we make efforts to balance the
tree and even out the length of all branches. AVL trees are binary search trees that balance themselves
every time an element is inserted or deleted. Each node of an AVL tree has the property that
the heights of the sub-trees rooted at its children differ by at most one.

Upper Bound of AVL Tree Height

We can show that an AVL tree with n nodes has O(log n) height. Let Nh represent the minimum
number of nodes that can form an AVL tree of height h.

If we know Nh−1 and Nh−2, we can determine Nh. Since this Nh-noded tree must have height
h, the root must have a child that has height h − 1. To minimize the total number of nodes in this
tree, we would have this sub-tree contain Nh−1 nodes. By the property of an AVL tree, if one child
has height h − 1, the minimum height of the other child is h − 2. By creating a tree with a root
whose left sub-tree has Nh−1 nodes and whose right sub-tree has Nh−2 nodes, we have constructed
the AVL tree of height h with the least nodes possible. This AVL tree has a total of Nh−1 + Nh−2 + 1
nodes (Nh−1 and Nh−2 coming from the sub-trees at the children of the root, the 1 coming from
the root itself).



The base cases are N1 = 1 and N2 = 2. From here, we can iteratively construct Nh using
the recurrence Nh = Nh−1 + Nh−2 + 1 that we derived above.



Using this recurrence, we can bound the height as follows:



Nh = Nh−1 + Nh−2 + 1                          (1)
Nh−1 = Nh−2 + Nh−3 + 1                        (2)
Nh = (Nh−2 + Nh−3 + 1) + Nh−2 + 1             (3)
Nh > 2Nh−2                                    (4)
Nh > 2^(h/2)                                  (5)
log Nh > log 2^(h/2) = h/2                    (6)
2 log Nh > h                                  (7)
h = O(log Nh)                                 (8)

Applying inequality (4) repeatedly, roughly h/2 times, gives inequality (5). This shows that the height of an AVL tree is indeed O(log n).
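As a sanity check, here is a short Python sketch (not from the original notes; the function name min_nodes is made up for illustration) that tabulates Nh from the base cases and recurrence above and prints 2 log2 Nh next to h for comparison.

import math

def min_nodes(max_height):
    # N[h] is the minimum number of nodes in an AVL tree of height h.
    # N[0] is a placeholder so that indices line up with heights.
    N = [0, 1, 2]                             # base cases: N1 = 1, N2 = 2
    for h in range(3, max_height + 1):
        N.append(N[h - 1] + N[h - 2] + 1)     # Nh = Nh-1 + Nh-2 + 1
    return N

N = min_nodes(20)
for h in range(1, 21):
    print(h, N[h], 2 * math.log2(N[h]))       # for h >= 3, the last column exceeds h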




AVL Rotation
We’ve shown that if we can indeed keep the tree balanced, we can keep the height of the tree at
O(log n), which speeds up the worst-case runtime of the tree operations. The next step is to show
how to keep the tree balanced as we insert and delete nodes from the tree.


Since we need to maintain the property that the heights of the children must not differ by more
than 1 for every node, it would be useful if we could access a node’s height without needing to
examine the entire length of the branch that it’s on. Recall that for a binary search tree, each
node contains a key, left, right, and a parent. AVL trees also contain an additional
parameter, height, to help us keep track of balance.
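A minimal Python sketch of such a node (the class and helper names here are ours, not fixed by the notes); an empty subtree is treated as having height 0, so a single node has height 1, matching the base case N1 = 1 above.

class AVLNode:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None
        self.parent = None
        self.height = 1          # a newly created node is a leaf of height 1

def height(node):
    # Height of a possibly empty subtree; an empty subtree has height 0.
    return node.height if node is not None else 0

def update_height(node):
    # Recompute a node's height from its children in O(1) time.
    node.height = 1 + max(height(node.left), height(node.right))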


wis
wv
we
or
rM
Fo

There are two operations needed to help balance an AVL tree: a left rotation and a right rotation.
Rotations simply re-arrange the nodes of a tree to shift around the heights while maintaining the
order of its elements. Making a rotation requires re-assigning left, right, and parent of a
few nodes, but nothing more than that. Rotations are O(1) time operations.
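The sketch below (reusing the AVLNode and update_height helpers assumed above) shows one way to write the two rotations: each one reassigns a constant number of left/right/parent pointers and recomputes two heights, which is why rotations are O(1) operations.

def left_rotate(x):
    # Rotate the edge between x and its right child y; return y, the new subtree root.
    y = x.right
    x.right = y.left                 # y's left subtree (B) becomes x's right subtree
    if y.left is not None:
        y.left.parent = x
    y.parent = x.parent              # splice y into x's old position
    if x.parent is not None:
        if x.parent.left is x:
            x.parent.left = y
        else:
            x.parent.right = y
    y.left = x
    x.parent = y
    update_height(x)                 # x is now below y, so fix x first
    update_height(y)
    return y

def right_rotate(y):
    # Mirror image of left_rotate: y's left child becomes the new subtree root.
    x = y.left
    y.left = x.right
    if x.right is not None:
        x.right.parent = y
    x.parent = y.parent
    if y.parent is not None:
        if y.parent.left is y:
            y.parent.left = x
        else:
            y.parent.right = x
    x.right = y
    y.parent = x
    update_height(y)
    update_height(x)
    return x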

AVL Insertion
The following discussion is adapted from the recitation notes from fall of 2009:





When we insert a node below a node n, we have three possible cases.

1. The children of n have the same height k. Inserting into either sub-tree will still result in a valid AVL tree.

2. The left child of n is heavier than the right child. Inserting into the left child may imbalance the AVL tree.

3. The right child of n is heavier than the left child. Inserting into the right child may imbalance the AVL tree.

When the AVL tree becomes imbalanced, we must make rotations in the tree to re-arrange the nodes
so that the AVL tree becomes balanced once again. Note that adding a node to a tree of height k
does not always turn it into a tree of height k + 1, since we could have inserted the node on a shorter
branch of that tree. However, for now, we are looking specifically at the situations where adding
a node to a tree of height k does turn it into a tree of height k + 1. Let’s examine the case where we
insert a node into a heavy right child.





There are two cases here that will imbalance the AVL tree. We will once again look at the
problem on a case-by-case basis.



In the first case, B had height k − 1, C had height k − 1, and a node was inserted into C,
making its current height k. We call a left rotation on n to make the node y the new root,
shifting the B sub-tree over to be n’s child. The order of the elements is preserved (in both trees,
A < n < B < y < C), but after the rotation, we get a balanced tree.


In the second case, B had height k − 1, C had height k − 1, and a node was inserted into B,
making its current height k. In this case, no single rotation on a node will result in a balanced tree,
but if we make a right rotation on y and then a left rotation on x, we end up with a happy AVL tree.

If we insert a node into a heavy left child instead, the balancing solutions are flipped (i.e. right
rotations instead of left rotations and vice versa), but the same concepts apply.
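The case analysis above can be summarized in a small helper. This is a sketch under the same assumed names (height, left_rotate, right_rotate), not code from the notes: a positive balance means the right child is heavier, and the sign of that child's balance distinguishes the single-rotation case from the double-rotation case.

def balance(node):
    # Positive when the right child is heavier, negative when the left child is.
    return height(node.right) - height(node.left)

def rebalance_node(n):
    # Fix a node whose children's heights differ by two; return the new subtree root.
    if balance(n) > 1:                 # right child too heavy
        if balance(n.right) < 0:       # second case: the insertion went into B
            right_rotate(n.right)      # right rotation on the child first...
        return left_rotate(n)          # ...then a left rotation on n
    elif balance(n) < -1:              # mirror image: left child too heavy
        if balance(n.left) > 0:
            left_rotate(n.left)
        return right_rotate(n)
    update_height(n)
    return n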

AVL insertions are binary search tree insertions plus at most two rotations. Since binary search
tree insertions take O(h) time, rotations are O(1) time, and AVL trees have h = O(log n), AVL
insertions take O(log n) time.



There are only a finite number of ways to imbalance an AVL tree after insertion. AVL insertion
is simply a matter of identifying whether the insertion will imbalance the tree, figuring out which
imbalance case it causes, and making the rotations that fix that particular case.
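Putting the pieces together, here is a hedged end-to-end sketch (again using the helper names assumed above): an ordinary BST insertion followed by a walk back toward the root that refreshes heights and applies rebalance_node at the first node found out of balance.

def avl_insert(root, key):
    # Insert key and return the (possibly new) root of the whole tree.
    new = AVLNode(key)
    if root is None:
        return new
    node = root
    while True:                        # standard BST insertion; duplicates go right here
        if key < node.key:
            if node.left is None:
                node.left = new
                break
            node = node.left
        else:
            if node.right is None:
                node.right = new
                break
            node = node.right
    new.parent = node
    node = new.parent
    while node is not None:            # walk up, refreshing heights
        update_height(node)
        if abs(balance(node)) > 1:
            node = rebalance_node(node)
            break                      # for insertion, one (possibly double) rotation suffices
        node = node.parent
    while root.parent is not None:     # the root may have changed if a rotation happened at the top
        root = root.parent
    return root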

AVL Deletion
AVL deletion is not too different from insertion. The key difference here is that unlike insertion,
which can only imbalance one node at a time, a single deletion of a node may imbalance several of
its ancestors. Thus, when we delete a node and cause an imbalance of that node’s parent, not only
do we have to make the necessary rotation on the parent, but we have to traverse up the ancestry
line, checking the balance, and possibly make some more rotations to fix the AVL tree.
Fixing the AVL tree after a deletion may require O(log n) additional rotations, but since each
rotation is an O(1) operation, the additional rotations do not affect the overall O(log n) runtime
of a deletion.
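A sketch of just this fix-up pass (the BST deletion itself is unchanged; fix_after_delete is an illustrative name, not from the notes): starting from the parent of the removed node, keep walking toward the root, since a rotation here can shrink a subtree and push the imbalance further up.

def fix_after_delete(node):
    # node is the parent of the node that was physically removed.
    last = node
    while node is not None:
        update_height(node)
        if abs(balance(node)) > 1:
            node = rebalance_node(node)   # one or two rotations at this node
        last = node
        node = node.parent
    return last                           # topmost node visited, i.e. the tree root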




Tree Augmentation
In AVL trees, we augmented each node to maintain the node’s height and saw how that helped us
maintain balance. Augmentation is a very useful tool to help us solve problems that a vanilla binary
search tree cannot solve efficiently. We will learn about another useful augmentation, subtree size
augmentation.


In subtree size augmentation, each node maintains the size of the subtree rooted at that node in
a parameter size. The root of a tree containing n elements will have size = n and each leaf of
a tree will have size = 1.
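A minimal sketch of the augmented node, extending the AVLNode sketch assumed earlier (the names are illustrative):

class SizedNode(AVLNode):
    def __init__(self, key):
        super().__init__(key)
        self.size = 1            # a newly created node is a subtree of size 1

def size(node):
    # Size of a possibly empty subtree; an empty subtree has size 0.
    return node.size if node is not None else 0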


The operations of this tree must maintain the size value of every node so that it is always
correct.

For insertion, as we traverse down a branch to find where to insert a node, we need to increment
size for every node that we visit. This is because passing through a node during insertion means
the new node will be inserted in the subtree rooted at that node.
Deletion will also require some maintenance. Every time we remove a node, we must traverse
up its ancestry and decrement the size of all its ancestors.
If we wanted to augment an AVL tree with subtree size, we would also have to make sure that
the rotation operations maintain size of all the nodes being moved around (Hint: the fact that
x.size = x.left.size + x.right.size + 1 is useful here).
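A sketch of that maintenance, following the hint (and again using the rotation helpers assumed above): recompute size bottom-up for the two nodes a rotation moves, lower node first, exactly where update_height is already called.

def update_size(x):
    # The hint: a node's size is the sizes of its two subtrees plus itself.
    x.size = size(x.left) + size(x.right) + 1

def left_rotate_sized(x):
    # left_rotate from before, plus size maintenance for the two nodes that move.
    y = left_rotate(x)       # after the rotation, x is y's left child
    update_size(x)           # x's children changed, so fix x first
    update_size(y)           # y's subtree has the same total size as x's did before
    return y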
As you’ll see in the problem set, using subtree augmentation can help speed up operations that
normally would be slow on a regular binary search tree or AVL tree.

