
Lecture #11
• Sorting Algorithms, part II:
– Quicksort
– Mergesort
• Trees
– Introduction
– Implementation & Basic Properties
– Traversals: The Pre-order Traversal
• On-your-own Study
– Full binary trees

But first… STL Challenge

Give me a data structure that I can use to maintain a bunch of people's names and, for each person, allows me to easily get all of the streets they lived on.

Assuming I have P total people and each person has lived on an average of E former streets…

What is the Big-Oh cost of:

A. Finding the names of all people who have lived on "Levering street"?
B. Determining if "Bill" ever lived on "Westwood blvd"?
C. Printing out every name along with each person's street addresses, in alphabetical order?
D. Printing out all of the streets that "Tala" has lived on?
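One possible answer (a sketch, not necessarily the intended one) is a nested STL container: a map from each person's name to the set of streets they've lived on. The variable name streets and the sample entries below are just illustrations.

#include <iostream>
#include <map>
#include <set>
#include <string>
using namespace std;

int main()
{
    // Map each person's name to the set of streets they've lived on.
    // std::map keeps the names in alphabetical order, which helps with question C.
    map<string, set<string>> streets;

    streets["Bill"].insert("Levering street");
    streets["Tala"].insert("Westwood blvd");

    // D. Print all of the streets "Tala" has lived on.
    for (const string &s : streets["Tala"])
        cout << s << "\n";
}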
Advanced Sorting Algos.
Advanced Sorting Algorithms
Why should you care?

Because this is basically how folks sort stuff in real life.

You can sort billions of values in seconds. SECONDS BABY!

And because you'll be asked about them in job interviews and on exams.

So pay attention!

Divide and Conquer Sorting

The last two sorts we'll learn (for now) are Quicksort and Mergesort.

These sorts generally work as follows:

1. Divide the elements to be sorted into two groups of roughly equal size.
2. Sort each of these smaller groups of elements (conquer).
3. Combine the two sorted groups into one large sorted list.

Any time you see "divide and conquer," you should think recursion... EEK!

The Quicksort Algorithm

1. If the array contains only 0 or 1 element, return.
2. Select an arbitrary element P from the array (typically the first element in the array).
3. Divide: Move all elements that are less than or equal to P to the left of the array and all elements greater than P to the right (this is called partitioning).
4. Conquer: Recursively repeat this process on the left sub-array and then the right sub-array.

(Figure: an example 8-element array being partitioned around pivot P.)
QuickSort

Select an arbitrary item P from the array. Move items smaller than or equal to P to the left and larger items to the right; P goes in-between. Then recursively repeat this process on the left items, and then on the right items.

Everything on the left side is smaller than item P, everything on the right side is larger than item P, and item P is exactly in the right spot in between!

(Figures: the process applied to a shelf of people – EE Major, MBA Major, History Major, Bio Major, Drop-out, CS Major, USC Grad. After partitioning around the first pivot P, and then recursively around pivots P2 and P3, everything left of the first pivot is sorted, then everything right of it, and finally all items are sorted!)
D&C Sorts: Quicksort

And here's an actual Quicksort C++ function. First specifies the starting element of the array to sort, and Last specifies the last element of the array to sort; we only bother sorting arrays of at least two elements! The call to Partition is the divide step (pick an element, move <= items left, move > items right), and the two recursive calls are the conquer step: we apply our QS algorithm to the left half of the array, then to the right half of the array.

void QuickSort(int Array[], int First, int Last)
{
    if (Last - First >= 1)
    {
        int PivotIndex;

        PivotIndex = Partition(Array, First, Last);   // divide
        QuickSort(Array, First, PivotIndex - 1);      // conquer the left half
        QuickSort(Array, PivotIndex + 1, Last);       // conquer the right half
    }
}

(Figure: the QuickSort steps applied to an example 8-element array, indices 0 through 7.)

The QS Partition Function

The Partition function uses the first item as the pivot value and moves less-than-or-equal items to the left and larger ones to the right. It selects the first item as our pivot value, repeatedly finds the next value greater than the pivot (scanning from the left) and the next value less than or equal to the pivot (scanning from the right), and swaps the larger with the smaller. When the scans meet, it swaps the pivot into its proper position in the array and finally returns the pivot's index to the QuickSort function.

int Partition(int a[], int low, int high)
{
    int pi = low;
    int pivot = a[low];                 // select the first item as our pivot value
    do
    {
        while (low <= high && a[low] <= pivot)   // find the next value > the pivot
            low++;
        while (a[high] > pivot)                  // find the next value <= the pivot
            high--;
        if (low < high)
            swap(a[low], a[high]);               // swap the larger with the smaller
    }
    while (low < high);

    swap(a[pi], a[high]);   // swap the pivot into its proper position in the array
    pi = high;
    return pi;              // and finally, return the pivot's index to QuickSort
}

(Figure: tracing low and high across an example 12-element array, indices 0 through 11, with 4 selected as the pivot value.)

Big-oh of Quicksort

We first partition the array, at a cost of n steps. (The partition function moves values smaller than the pivot to the left of the array and larger values to the right.)

Then we repeat the process for each half: we partition each of the 2 halves, each taking n/2 steps, at a total cost of n steps.

Then we repeat the process for each of those halves: we partition each of the 4 sub-arrays, each taking n/4 steps, again at a total cost of n steps.

So at each level, we do n operations, and we have log2(n) levels, so we get: n·log2(n).
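One way to see the total, written as a sum over the log2(n) levels (assuming the pivots keep splitting the array roughly in half):

n + 2·(n/2) + 4·(n/4) + … = n + n + … + n  (log2(n) terms)  = n·log2(n) steps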

Quicksort – Is It Always Fast?

Are there any kinds of input data where Quicksort is either more or less efficient?

Yes! If our array is already sorted or mostly sorted, then quicksort becomes very slow!

1 10 20 30 40 50 60 70

Let's see why.

Worst-case Big-oh of Quicksort

We first partition the array, at a cost of n steps:

1 10 20 30 40 50 60 70

Wait! Our pivot value (1) was our smallest value, so the partition algorithm didn't move any values to the left; all the bigger ones were already on the right side! There is no group to the left of the pivot value, and our right group STILL has nearly n items (it has n-1 items).

Then we repeat the process for the left & right groups. Ok, let's partition our right group then, at a cost of n-1 steps:

10 20 30 40 50 60 70

Once again the pivot (10) is the smallest value, so nothing moves left and our right group STILL has nearly n items (it has n-2 items).

The same thing happens every time we repeat the process for the left & right groups: each pivot is the smallest remaining value, so partitioning the right group costs n-1 steps, then n-2 steps, then n-3 steps, and the right group keeps nearly all of the items:

1 10 20 30 40 50 60 70
  10 20 30 40 50 60 70
     20 30 40 50 60 70
        30 40 50 60 70

What you'll notice is that each time we partition, we remove only one item off the left side! And if we only remove one item off the left side each time, we're going to have to go through this partitioning process n times to process the entire array.

So the partition algorithm requires ~n steps at each level, and we go n levels deep… which means our algorithm is O(n²)!
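Summing the per-level work makes the worst case concrete:

n + (n-1) + (n-2) + … + 1 = n(n+1)/2 ≈ n²/2 steps, which is O(n²)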

Other Quicksort Worst Cases?

So, as you can see, an array that's mostly in order will require roughly N² steps!

As you can probably guess, Quicksort also has the same problem with arrays that are in reverse order!

So if you happen to know your data will be mostly sorted (or in reverse) order, avoid Quicksort! It's a DOG!

QuickSort Questions

Can QuickSort be applied easily to sort items within a linked list?

Is QuickSort a "stable" sort?

Does QuickSort use a fixed amount of RAM, or can it vary?

Can QuickSort be parallelized across multiple cores?

When might you use QuickSort?

Mergesort
The Mergesort is another extremely efficient sort – yet it's pretty easy to understand.

But before we learn the Mergesort, we need to learn another algorithm called "merge".
The basic merge algorithm takes two presorted arrays as inputs and outputs a combined, third sorted array.

Think of two sorted shelves of books: consider the left-most book on both shelves, take the smaller of the two books, and add it to the new shelf. Repeat the whole process until all books are moved. By always selecting and moving the smallest book from either shelf, we guarantee all of our books will end up sorted!

Merge Algorithm:
1. Initialize counter variables i1, i2 to zero.
2. While there are more items to copy…
   If A1[i1] is less than A2[i2], copy A1[i1] to output array B and i1++.
   Else copy A2[i2] to output array B and i2++.
3. If either array runs out, copy the entire contents of the other array over.

Ok, let's look at the C++ code for the merge algorithm!

Merge Algorithm in C++

Here's the C++ version of our merge function! You pass in an input array called data and the sizes of the two parts of it to merge: n1 and n2. The last parameter, temp, is a dynamically allocated array that temporarily holds the merged results as we loop. Finally, we copy our merged results back to the data array.

void merge(int data[], int n1, int n2, int temp[])
{
    int i1 = 0, i2 = 0, k = 0;
    int *A1 = data, *A2 = data + n1;     // the two presorted halves of data

    while (i1 < n1 || i2 < n2)
    {
        if (i1 == n1)                    // left half used up: copy from the right
            temp[k++] = A2[i2++];
        else if (i2 == n2)               // right half used up: copy from the left
            temp[k++] = A1[i1++];
        else if (A1[i1] <= A2[i2])       // otherwise copy the smaller of the two
            temp[k++] = A1[i1++];
        else
            temp[k++] = A2[i2++];
    }

    for (int i = 0; i < n1 + n2; i++)    // copy the merged results back into data
        data[i] = temp[i];
}

(Example: data = {1, 13, 21, 4, 11, 25, 30} with n1 = 3 and n2 = 4 merges into temp = {1, 4, 11, 13, 21, 25, 30}.)

Mergesort

OK – so what's the full mergesort algorithm?

Mergesort function:
1. If array has one element, then return (it's sorted).
2. Split up the array into two equal sections.
3. Recursively call Mergesort function on the left half.
4. Recursively call Mergesort function on the right half.
5. Merge the two halves using our merge function (see the sketch below).
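Here's a minimal sketch of what that recursion might look like in C++, using the merge function from the previous slide. The name MergeSort and the exact signature are illustrative (not the official class code), and temp is assumed to be a scratch array at least as large as the region being sorted.

void MergeSort(int data[], int n, int temp[])
{
    if (n <= 1)                      // 1. zero or one elements: already sorted
        return;

    int n1 = n / 2, n2 = n - n1;     // 2. split the array into two equal sections

    MergeSort(data, n1, temp);       // 3. recursively sort the left half
    MergeSort(data + n1, n2, temp);  // 4. recursively sort the right half

    merge(data, n1, n2, temp);       // 5. merge the two sorted halves
}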

Ok, let’s see how to mergesort a shelf full of books!


(Figures: the Mergesort recursion traced on a shelf of books. We repeatedly apply the five steps – return if the array has one item, split the array into two equal sections, call Mergesort on the left half, call Mergesort on the right half, then merge the halves back together – first on the left half, down to single books. To save time, we skip the tracing on the right side. After the final merge of the two sorted halves, our array is sorted!)

Mergesort – One Final Detail

While I showed the Mergesort moving books into a bunch of small piles… the real algorithm sorts the data in-place in the array, and only uses a separate array for merging.

Let's see how it really works!

1. If array has one element, then return.
2. Split the array in two equal sections.
3. Call Mergesort on the left half.
4. Call Mergesort on the right half.
5. Merge the two halves back together.
Big-oh of Mergesort

The recursion goes log2(n) levels deep. Why? Because we keep dividing our piles in half… until our piles are just 1 book!

And at every level of the recursion, a total of n items get merged: n items in the final merge, n items (across two merges) at the level above it, n items (across four merges) at the level above that, and so on.

Overall, this gives us n·log2(n) steps to sort n items of data. Not bad!
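In other words: n items merged per level × log2(n) levels = n·log2(n) total steps.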

Mergesort – Any Problem Cases?

So, are there any cases where mergesort is less efficient? No! Mergesort works equally well regardless of the ordering of the data…

However, because the merge function needs secondary arrays to merge, this can slow things down a bit…

In contrast, quicksort doesn't need to allocate any new arrays to work.

MergeSort Questions

Can MergeSort be applied easily to sort items within a linked list?

Is MergeSort a "stable" sort?

Are there any special uses for MergeSort that other sorts can't handle?

Can MergeSort be parallelized across multiple cores?
Sorting Overview

Selection Sort – Unstable. Always O(n²), but simple to implement. Can be used with linked lists. Minimizes the number of item-swaps (important if swaps are slow).

Insertion Sort – Stable. O(n) for already or nearly-ordered arrays, O(n²) otherwise. Can be used with linked lists. Easy to implement.

Bubble Sort – Stable. O(n) for already or nearly-ordered arrays (with a good implementation), O(n²) otherwise. Can be used with linked lists. Easy to implement. Rarely a good answer on an interview!

Shell Sort – Unstable. Approximately O(n^1.25). OK for linked lists. Used in some embedded systems (e.g., in a car) instead of quicksort due to fixed RAM usage.

Quick Sort – Unstable. O(n·log2(n)) average, O(n²) for already/mostly/reverse ordered arrays or arrays with the same value repeated many times. Can be used with linked lists. Can be parallelized across multiple cores. Can require up to O(n) slots of extra RAM (for recursion) in the worst case, O(log2(n)) on average.

Merge Sort – Stable. O(n·log2(n)) always. Used for sorting large amounts of data on disk (aka "external sorting"). Can be used to sort linked lists. Can be parallelized across multiple cores. Downside: requires n slots of extra RAM for the temporary merge array.

Challenge Problems

1. Give an algorithm to efficiently determine which element occurs the largest number of times in the array.

2. What's the best algorithm to sort 1,000,000 random numbers that are all between 1 and 5?
Trees
Why should you care?

Trees are used to organize data in many software applications, including:

Databases (B-TREES)
Data Compression (Huffman Trees)
Bitcoin (Merkle Trees)
Medical Diagnosis (Decision Trees)

And because you'll be asked about them in job interviews and on exams.

So pay attention!
Trees

"I think that I shall never see a data structure as lovely as a tree." - Carey Nachenberg

A Tree is a special linked list-based data structure that has many uses in Computer Science:

• To organize hierarchical data
• To make information easily searchable
• To simplify the evaluation of mathematical expressions
• To make decisions

(Figures: a Family Tree, a Binary Search Tree of names, an Expression Tree for a mathematical expression, and a Decision Tree for medical diagnosis – does the patient have a fever? spots on his/her face? a sore throat? …He has COOTIES!)
Basic Tree Facts

1. Trees are made of nodes (just like linked list nodes).
2. Every tree has a "root" pointer. (The tree's root pointer is like a linked list's head pointer!)
3. The top node of a tree is called its "root" node.
4. Every node may have zero or more "children" nodes. (Instead of just one next pointer, a tree node can have two or more next pointers!)
5. A node with 0 children is called a "leaf" node.
6. A tree with no nodes is called an "empty tree" (its root pointer is just NULL).

struct node
{
    int value;            // some value
    node *left, *right;
};

node *rootPtr;

(Figure: a tree whose root node holds 5, with children -33 (2 children) and 17 (1 child), leading down to the leaf nodes 53, 91, and -115.)
Tree Nodes Can Have Many Children

A tree node can have more than just two children:

struct node
{
    int value;   // node data
    node *pChild1, *pChild2, *pChild3, …;
};

struct node
{
    int value;   // node data
    node *pChildren[26];
};

(Figure: a tree whose root node holds 3 and has three children holding 7, 4, and 15.)
Binary Trees

A binary tree is a special form of tree. In a binary tree, every node has at most two children nodes: a left child and a right child.

struct BTNODE       // binary tree node
{
    string value;   // node data
    BTNODE *pLeft, *pRight;
};

(Figure: A Binary Tree with "carey" at the root, children "leon" and "andrea", and grandchildren "sheila", "simon", "martha", and "milton".)
Binary Tree Subtrees

We can pick any node in the tree… and then focus on its "subtree", which includes it and all of the nodes below it.

For example, if we pick the "leon" node in the tree above, its subtree includes four different nodes ("leon", "sheila", "simon", and "ziggy"), and it has the "leon" node as its root.
Binary Tree Subtrees

If we pick a node from our tree… we can also identify its left and right sub-trees.

For example, if we pick the "carey" node, Carey's left subtree is the one rooted at "leon", and Carey's right subtree is the one rooted at "andrea".
Operations on Binary Trees

The following are common operations that we might perform on a Binary Tree:
• enumerating all the items
• searching for an item
• adding a new item at a certain position on the tree
• deleting an item
• deleting the entire tree (destruction)
• removing a whole section of a tree (called pruning)
• adding a whole section to a tree (called grafting)

We'll learn about many of these operations over the next two classes.
A Simple Tree

As with linked lists, we use dynamic memory to allocate our nodes: each "new BTNODE" asks the OS to reserve a chunk of memory (say 12 bytes) for a node, and the OS hands back its address.

struct BTNODE           // node
{
    int value;          // data
    BTNODE *left, *right;
};

int main()
{
    BTNODE *temp, *pRoot;

    pRoot = new BTNODE;         // the root node
    pRoot->value = 5;

    temp = new BTNODE;          // the root's left child
    temp->value = 7;
    temp->left = NULL;
    temp->right = NULL;
    pRoot->left = temp;

    temp = new BTNODE;          // the root's right child
    temp->value = -3;
    temp->left = NULL;
    temp->right = NULL;
    pRoot->right = temp;
    // etc…
}

And of course, later we'd have to delete our tree's nodes.
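As a sketch of that cleanup (the name FreeTree is just illustrative; this isn't from the slides), one common approach is a recursive, children-first deletion:

void FreeTree(BTNODE *cur)
{
    if (cur == NULL)        // empty subtree: nothing to delete
        return;

    FreeTree(cur->left);    // delete every node in the left subtree
    FreeTree(cur->right);   // delete every node in the right subtree
    delete cur;             // then delete the node itself
}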
We've created a binary tree… now what?

Now that we've created a binary tree, what can we do with it? Well, next class we'll learn how to use the binary tree to speed up searching for data. But for now, let's learn how to iterate through each item in a tree, one at a time. This is called "traversing" the tree, and there are several ways to do it.
Binary Tree Traversals

When we iterate through all the nodes in a tree, it's called a traversal. Any time we traverse through a tree, we always start with the root node.

There are four common ways to traverse a tree. Each technique differs in the order that each node is visited during the traversal:

1. Pre-order traversal
2. In-order traversal
3. Post-order traversal
4. Level-order traversal

(Figure: an example tree with root "a", children "b" and "c", and "d" and "e" as the children of "b".)
The Preorder Traversal

Preorder:
1. Process the current node.
2. Process the nodes in the left sub-tree.
3. Process the nodes in the right sub-tree.

By "process the current node" we typically mean one of the following:

1. Print the current node's value out.
2. Search the current node to see if its value matches the one you're searching for.
3. Add the current node's value to a total for the tree.
4. Etc…
The Pre-order Traversal

Let's trace PreOrder on the example tree (root "a", children "b" and "c", with "d" and "e" under "b").

Our base case checks to see if cur points at an empty sub-tree. If so, there's nothing to process and we return (a simplifying step). Otherwise we "process" the value in the current node, then we use recursion to process cur's entire left subtree, and once the entire left subtree has been processed, we process cur's entire right subtree.

void PreOrder(Node *cur)
{
    if (cur == NULL)        // if empty, return…
        return;

    cout << cur->value;     // Process the current node…
    PreOrder(cur->left);    // Process nodes in left sub-tree.
    PreOrder(cur->right);   // Process nodes in right sub-tree.
}

main()
{
    Node *root;
    …
    PreOrder(root);
}

Tracing the calls: we process "a", then recurse into "a"'s left subtree and process "b", then "b"'s left subtree ("d"), then "b"'s right subtree ("e"), and finally "a"'s right subtree ("c").

Output: a b d e c

Appendix – On Your Own Study


• Full Binary Trees
Full Binary Trees

A full binary tree is one in which every leaf node has the same depth, and every non-leaf has exactly two children.

(Figure: "carey" at depth 0 has two children; "leon" and "andrea" at depth 1 each have two children; and all of the leaf nodes – "sheila", "simon", "martha", and "milton" – are at the same depth, 2.)
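A handy fact to verify on your own: a full binary tree whose leaves are all at depth d has 2^(d+1) − 1 nodes in total. The tree above has depth 2, so it has 2³ − 1 = 7 nodes.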
Full Binary Trees

A full binary tree is one in which every leaf node has the same depth, and every non-leaf has exactly two children.

(Figures, to test yourself: several small trees – two-level trees rooted at "carey" with children "leon" and "andrea", and a single-node tree holding "simon" – each asking: Is it full?)