Alg Wks1 2
3rd ed.
Ch. 1–2
2024
Getting Started
Algorithms
An algorithm is any well-defined computational
procedure that takes some value, or set of values, as
input and produces some value, or set of values, as
output. An algorithm is thus a sequence of
computational steps that transform the input into the
output.
Data structures
A data structure is a way to store and organize data
in order to facilitate access and modifications.
The Role of Algorithms in Computing
Algorithms
Example: the sorting problem
What kinds of problems are solved by algorithms?
What are suitable data structures?
Is it a hard problem?
So, what about the efficiency of the solution?
• Computer A takes:
• Computer B takes:
By using an algorithm whose running time grows more slowly, even with a poor
compiler, computer B runs more than 17 times faster than computer A.
Example:
Input: 8 2 4 9 3 6
Output: 2 3 4 6 8 9
Complexities
• Complexity: the amount of resources (such as time or memory)
required to solve a problem or perform a task.
Insertion Sort
Analysis
• Analyzing algorithms
• Space
• Time (Running Time)
Analysis
• Best case: we can express this running time as an + b, for
constants a and b that depend on the statement costs cᵢ; it is
thus a linear function of n.
• Average case
• Worst case
• Order of growth
Designing algorithms
• The divide-and-conquer approach:
• Divide the problem into a number of subproblems that are smaller
instances of the same problem.
• Conquer the subproblems by solving them recursively. If the
subproblem sizes are small enough, however, just solve the
subproblems in a straightforward manner.
• Combine the solutions to the subproblems into the solution for the
original problem.

[Figure: the problem splits into two smaller problems; recurse on each.]
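The three steps above can be sketched with a toy problem. The slide names no particular example here, so finding the maximum of a list is assumed purely for illustration:

```python
def dac_max(a):
    """Find the maximum of a non-empty list by divide-and-conquer."""
    n = len(a)
    if n == 1:                      # small enough: solve directly
        return a[0]
    left = dac_max(a[:n // 2])      # Divide + Conquer: recurse on each half
    right = dac_max(a[n // 2:])
    return max(left, right)         # Combine the two sub-solutions

print(dac_max([8, 2, 4, 9, 3, 6]))  # 9
```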
MergeSort
It works. Let’s assume n = 2^t.

MERGESORT(A):
  n = length(A)
  if n ≤ 1:
    return A
  L = MERGESORT(A[1 : n/2])
  R = MERGESORT(A[n/2+1 : n])
  return MERGE(L, R)

• Invariant: “In every recursive call, MERGESORT returns a sorted
array.” (Not technically a “loop invariant,” but a “recursion
invariant” that should hold at the beginning of every recursive call.)
• Base case (n = 1): a 1-element array is always sorted.
• Maintenance: Suppose that L and R are sorted. Then MERGE(L, R) is sorted.
• Termination: “In the top recursive call, MERGESORT returns a sorted array.”
The maintenance step needs more details! Why is this statement true?
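One way to realize the MERGESORT pseudocode in Python (0-indexed; the slide does not spell out MERGE, so a standard two-pointer merge is assumed):

```python
def merge(L, R):
    """Merge two sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(L) and j < len(R):
        if L[i] <= R[j]:
            out.append(L[i]); i += 1
        else:
            out.append(R[j]); j += 1
    out.extend(L[i:])   # at most one of these tails is non-empty
    out.extend(R[j:])
    return out

def mergesort(A):
    """Recursion invariant: every recursive call returns a sorted array."""
    n = len(A)
    if n <= 1:
        return A
    L = mergesort(A[: n // 2])
    R = mergesort(A[n // 2 :])
    return merge(L, R)

print(mergesort([8, 2, 4, 9, 3, 6]))  # [2, 3, 4, 6, 8, 9]
```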
Analysis
Why cn? Because the two subproblems at each level together do
c(n/2) + c(n/2) = cn work.
Simple extra example
n log(n) vs n²

n      n log(n)   n²
8      24         64
16     64         256
32     160        1024
64     384        4096
128    896        16384
256    2048       65536

[Plot of n log(n) vs n², y-axis up to 1,000,000.]
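The table values can be reproduced directly (log base 2, as the powers-of-two rows suggest):

```python
import math

# Print n, n*log2(n), and n^2 for the powers of two in the table.
print(f"{'n':>4} {'n log(n)':>9} {'n^2':>7}")
for n in [8, 16, 32, 64, 128, 256]:
    print(f"{n:>4} {n * int(math.log2(n)):>9} {n * n:>7}")
```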
Growth of Functions
(later)
Back to Insertion sort
“pseudocode”:

INSERTION-SORT(A, n)        ⊳ A[1 . . n]
  for j ← 2 to n
    key ← A[j]
    i ← j − 1
    while i > 0 and A[i] > key
      do A[i+1] ← A[i]
         i ← i − 1
    A[i+1] ← key

[Figure: array A[1 . . n]; A[1 . . j−1] is sorted, and key is inserted into place.]
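The same pseudocode translated to Python (0-indexed, so the outer loop starts at index 1 instead of 2):

```python
def insertion_sort(A):
    """In-place insertion sort, following the slide's pseudocode."""
    for j in range(1, len(A)):        # pseudocode: for j <- 2 to n
        key = A[j]
        i = j - 1
        while i >= 0 and A[i] > key:  # shift larger elements one slot right
            A[i + 1] = A[i]
            i -= 1
        A[i + 1] = key                # drop key into the hole
    return A

print(insertion_sort([8, 2, 4, 9, 3, 6]))  # [2, 3, 4, 6, 8, 9]
```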
Back to Insertion sort pseudocode
Go one-at-a-time until things are in the right place.
The total work is 1 + 2 + 3 + … + n = n(n + 1)/2, so it is O(n²).
Back to Insertion sort: running time
n − 1 iterations of the outer loop
To summarize
Can we do better?
Example of insertion sort (Back)
8 2 4 9 3 6
2 8 4 9 3 6
2 4 8 9 3 6
2 4 8 9 3 6
2 3 4 8 9 6
2 3 4 6 8 9   done
Running time
BIG IDEAS:
“Asymptotic Analysis”
Asymptotic notations
introduction
Which is larger: 10n² or 2n³?
What is n₀?
--------------------------------------------------------------
2n³ ≤ 3n³ + 5 ≤ 5n³ for all n ≥ n₀ = 2

Θ-notation
DEF:
2n³ ≤ 3n³ + 5 ≤ 5n³ for all n ≥ n₀ = 2, so 3n³ + 5 = Θ(n³)
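The sandwich 2n³ ≤ 3n³ + 5 ≤ 5n³ can be spot-checked numerically for n ≥ n₀ = 2 (n = 1 fails the upper bound, which is why n₀ = 2):

```python
# Verify the two-sided bound that witnesses 3n^3 + 5 = Theta(n^3).
for n in range(2, 1000):
    f = 3 * n**3 + 5
    assert 2 * n**3 <= f <= 5 * n**3, n
print("3n^3 + 5 is sandwiched between 2n^3 and 5n^3 for all n >= 2")
```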
Growth
Asymptotic performance
When n gets large enough, a Θ(n²) algorithm always beats a Θ(n³) algorithm.
• Asymptotic analysis is a useful tool to help structure our thinking
toward better algorithms.
• We shouldn’t ignore asymptotically slower algorithms, however.
• Real-world design situations often call for a careful balancing.

[Plot: T(n) for both algorithms, crossing at n₀.]
Insertion sort analysis (back)
Worst case: input reverse sorted.
T(n) = Σⱼ₌₂ⁿ Θ(j) = Θ(n²)   [arithmetic series]
Average case: all permutations equally likely.
T(n) = Σⱼ₌₂ⁿ Θ(j/2) = Θ(n²)
Is insertion sort a fast sorting algorithm?
• Moderately so, for small n.
• Not at all, for large n.
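A quick count confirms the arithmetic-series bound: on reverse-sorted input, insertion sort performs exactly n(n − 1)/2 element shifts.

```python
def shifts_on(A):
    """Run insertion sort on a copy of A and count element shifts."""
    A = list(A)
    shifts = 0
    for j in range(1, len(A)):
        key = A[j]
        i = j - 1
        while i >= 0 and A[i] > key:
            A[i + 1] = A[i]   # one shift
            i -= 1
            shifts += 1
        A[i + 1] = key
    return shifts

n = 100
worst = list(range(n, 0, -1))              # reverse sorted: the worst case
print(shifts_on(worst), n * (n - 1) // 2)  # both are 4950
```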
Example 2: Integer Multiplication
• Let X = A B and Y = C D, where A, B, C, and D are integer digits.
• Simple method: XY takes 2 × 2 multiplications.
• Running time recurrence: T(n) is the max number of multiplications.

[Recursion tree: each level sums to cn, height h = lg n, Θ(1) at each of
the n leaves for Θ(n) total; Total = Θ(n lg n).]
Example 3: Merge sort
MERGE-SORT A[1 . . n]
1. If n = 1, done.
2. Recursively sort A[1 . . n/2]
   and A[n/2+1 . . n].
3. “Merge” the 2 sorted lists.
Analyzing merge sort
MERGE-SORT A[1 . . n]                         T(n)
1. If n = 1, done.                            Θ(1)
2. Recursively sort A[1 . . n/2]
   and A[n/2+1 . . n].                        2T(n/2)
3. “Merge” the 2 sorted lists.                Θ(n)
Sloppiness: should be T(⌈n/2⌉) + T(⌊n/2⌋), but it turns out not to matter
asymptotically.
Recurrence for merge sort
T(n) = Θ(1)             if n = 1;
       2T(n/2) + Θ(n)   if n > 1.
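The recurrence can be unrolled numerically for powers of two. With T(1) = c, the closed form is T(n) = cn lg n + cn, which matches the recursion-tree picture: lg n levels of cn each, plus Θ(n) at the leaves.

```python
import math

def T(n, c=1):
    """Merge sort recurrence for n a power of two: T(1)=c, T(n)=2T(n/2)+cn."""
    if n == 1:
        return c
    return 2 * T(n // 2, c) + c * n

# Closed form check: T(n) = c*n*lg(n) + c*n (here with c = 1).
for n in [2, 4, 8, 16, 1024]:
    assert T(n) == n * int(math.log2(n)) + n

print(T(1024))  # 11264
```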
Finally
Recursion tree
Solve T(n) = 2T(n/2) + cn, where c > 0 is constant.

Level 0:  cn                              → cn
Level 1:  cn/2  cn/2                      → cn
Level 2:  cn/4  cn/4  cn/4  cn/4          → cn
…                                           (height h = lg n)
Leaves:   Θ(1) each, #leaves = n          → Θ(n)

Total = cn lg n + Θ(n) = Θ(n lg n)
Conclusions
• Θ(n lg n) grows more slowly than Θ(n²).
• Therefore, merge sort asymptotically beats insertion sort
in the worst case.
• In practice, merge sort beats insertion sort for n > 30 or so.
• O(1) < O(lg n) < O(n) < …, as discussed.
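To see the asymptotic gap concretely, one can count key comparisons in both sorts on the same worst-case input. Counts, unlike the "n > 30" wall-clock crossover (which depends on implementation and hardware), are machine-independent:

```python
def insertion_comparisons(A):
    """Count A[i] > key comparisons made by insertion sort."""
    A, count = list(A), 0
    for j in range(1, len(A)):
        key, i = A[j], j - 1
        while i >= 0:
            count += 1                 # one comparison of A[i] with key
            if A[i] <= key:
                break
            A[i + 1] = A[i]
            i -= 1
        A[i + 1] = key
    return count

def merge_comparisons(A):
    """Count element comparisons made by merge sort; returns (count, sorted)."""
    if len(A) <= 1:
        return 0, list(A)
    c1, L = merge_comparisons(A[: len(A) // 2])
    c2, R = merge_comparisons(A[len(A) // 2 :])
    out, i, j, c = [], 0, 0, c1 + c2
    while i < len(L) and j < len(R):
        c += 1                         # one comparison of L[i] with R[j]
        if L[i] <= R[j]:
            out.append(L[i]); i += 1
        else:
            out.append(R[j]); j += 1
    return c, out + L[i:] + R[j:]

n = 256
data = list(range(n, 0, -1))           # reverse sorted: insertion's worst case
print(insertion_comparisons(data))     # 32640 = n(n-1)/2
print(merge_comparisons(data)[0])      # far fewer, roughly n lg n
```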
Next:
Time Complexity and Asymptotic Notations