12 Algorithm Analysis with Big Oh
Cost
for (int i = 1; i <= n; i++)   -> 1 + (n+1) + n = 2n + 2
    sum++;                     -> n
Total Cost: 3n + 2
Cost
k = 0;                             -> 1
for (int i = 0; i < n; i++)        -> 2n + 2
    for (int j = 0; j < n; j++)    -> n(2n + 2) = 2n² + 2n
        k++;                       -> n²
Total Cost: 3n² + 4n + 3
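As a sanity check on the totals above, here is a small Java sketch (my addition, not from the slides) that simulates the nested-loop example and tallies one unit of cost for each initialization, loop test, increment, and body statement; the counted total matches the formula 3n² + 4n + 3.

public class OperationCount {
    public static void main(String[] args) {
        int n = 10;
        long cost = 0;

        cost++;                     // k = 0
        int k = 0;

        cost++;                     // outer loop initialization: i = 0
        int i = 0;
        while (true) {
            cost++;                 // outer loop test: i < n
            if (!(i < n)) break;

            cost++;                 // inner loop initialization: j = 0
            int j = 0;
            while (true) {
                cost++;             // inner loop test: j < n
                if (!(j < n)) break;
                cost++;             // loop body: k++
                k++;
                cost++;             // inner loop increment: j++
                j++;
            }

            cost++;                 // outer loop increment: i++
            i++;
        }

        System.out.println("counted cost          = " + cost);
        System.out.println("formula 3n^2 + 4n + 3 = " + (3L * n * n + 4L * n + 3));
    }
}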
Total Cost of Sequential Search
[Figure: total cost of sequential search plotted against n]
Linear Search Continued
Power of 2    n             log₂ n
2^4           16            4
2^8           256           8
2^12          4,096         12
2^24          16,777,216    24
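A short Java sketch (my addition) that regenerates this table; it uses Long.numberOfLeadingZeros to recover log₂ n exactly, since each n here is a power of two.

public class PowersOfTwoTable {
    public static void main(String[] args) {
        int[] exponents = {4, 8, 12, 24};
        System.out.printf("%-12s %-14s %s%n", "Power of 2", "n", "log2 n");
        for (int k : exponents) {
            long n = 1L << k;                                // n = 2^k
            long log2 = 63 - Long.numberOfLeadingZeros(n);   // exact for powers of two
            System.out.printf("2^%-10d %-14d %d%n", k, n, log2);
        }
    }
}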
Graph Illustrating Relative Growth of n, log n, and n²
[Figure: f(n) plotted against n, showing the curves n², n, and log n]
Other logarithm examples
The guessing game:
Guess a number from 1 to 100
try the middle, you could be right
if it is too high
– check near middle of 1..49
if it is too low
– check near middle of 51..100
Should find the answer in a maximum of 7 tries, since 2^7 >= 100
If 1..250: 2^c >= 250 gives c == 8, so at most 8 tries
If 1..500: 2^c >= 500 gives c == 9
If 1..1000: 2^c >= 1000 gives c == 10
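The halving strategy is exactly binary search on the range. Below is a brief Java sketch (class and method names are my own) that plays the game against every possible secret number and reports the worst-case number of guesses, which comes out to 7 for 1..100.

public class GuessingGame {
    // Number of guesses the halving strategy needs to find 'secret' in low..high.
    static int guessCount(int secret, int low, int high) {
        int guesses = 0;
        while (low <= high) {
            int mid = low + (high - low) / 2;   // guess the middle of the remaining range
            guesses++;
            if (mid == secret) return guesses;
            if (mid > secret) high = mid - 1;   // too high: keep the lower half
            else              low  = mid + 1;   // too low: keep the upper half
        }
        return guesses;   // only reached if secret is outside low..high
    }

    public static void main(String[] args) {
        int worst = 0;
        for (int secret = 1; secret <= 100; secret++)
            worst = Math.max(worst, guessCount(secret, 1, 100));
        System.out.println("Worst case for 1..100: " + worst + " guesses");   // prints 7
    }
}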
Logarithmic Explosion
Assuming an infinitely large piece of paper
that can be cut in half, layered, and cut in half
again as often as you wish.
How many times do you need to cut and layer
until paper thickness reaches the moon?
Assumptions
paper is 0.002 inches thick
distance to moon is 240,000 miles
– 240,000 miles * 5,280 feet per mile * 12 inches per foot =
15,206,400,000 inches to the moon
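Under the stated assumptions, a quick Java sketch (my addition) answers the question: the stack thickness doubles with every cut-and-layer step, and it takes 43 steps to pass 15,206,400,000 inches.

public class PaperToMoon {
    public static void main(String[] args) {
        double thickness = 0.002;                 // inches per sheet
        double moon = 240_000.0 * 5_280 * 12;     // distance to the moon in inches
        int cuts = 0;
        while (thickness < moon) {
            thickness *= 2;                       // each cut-and-layer doubles the stack
            cuts++;
        }
        System.out.println(cuts + " cuts");       // prints 43 under these assumptions
    }
}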
Examples of Logarithmic Explosion
The number of bits required to store a binary number is logarithmic in its value; adding just 1 more bit doubles the range of representable ints
8 bits store 256 values; log₂ 256 = 8
log₂ 2,147,483,648 = 31
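As a quick illustration (my addition), the bit count for a positive int value v is floor(log₂ v) + 1, which Java can compute with Integer.numberOfLeadingZeros:

public class BitsNeeded {
    public static void main(String[] args) {
        // Bits needed for a positive value v: floor(log2 v) + 1.
        int[] samples = {255, 256, Integer.MAX_VALUE};   // MAX_VALUE = 2^31 - 1
        for (int v : samples) {
            int bits = 32 - Integer.numberOfLeadingZeros(v);
            System.out.println(v + " needs " + bits + " bits");
        }
    }
}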
The inventor of chess asked the Emperor to be
paid like this:
1 grain of rice on the first square, 2 on the next,
doubling the grains on each successive square, up to 2^63 grains on the 64th square
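Since the grains double 63 times, the last square alone holds 2^63 grains and the whole board holds 2^64 - 1. A short Java sketch (my addition) uses BigInteger to avoid overflowing long:

import java.math.BigInteger;

public class ChessboardRice {
    public static void main(String[] args) {
        BigInteger two = BigInteger.valueOf(2);
        BigInteger lastSquare = two.pow(63);                     // grains on square 64
        BigInteger total = two.pow(64).subtract(BigInteger.ONE); // sum over all 64 squares
        System.out.println("Grains on last square: " + lastSquare);
        System.out.println("Total grains:          " + total);
    }
}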
Compare Sequential and Binary Search
Output from CompareSearches.java (1995)
Search for 20000 objects
Binary Search
#Comparisons: 267248
Average: 13
Run time: 20ms
Sequential Search
#Comparisons: 200010000
Average: 10000
Run time: 9930ms
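The original CompareSearches.java is not reproduced here, but a minimal sketch along the same lines (my addition; the comparison-counting convention may differ from the original program's) searches for every value in a sorted array of 20,000 ints and totals the comparisons for each strategy: sequential search averages roughly n/2 comparisons per search, while binary search stays near log₂ n.

public class CompareSearchesSketch {
    // Sequential search, counting element comparisons until the target is found.
    static long seqComparisons(int[] a, int target) {
        long count = 0;
        for (int value : a) {
            count++;
            if (value == target) break;
        }
        return count;
    }

    // Binary search, counting one comparison per probe of the middle element.
    static long binComparisons(int[] a, int target) {
        long count = 0;
        int low = 0, high = a.length - 1;
        while (low <= high) {
            int mid = low + (high - low) / 2;
            count++;
            if (a[mid] == target) break;
            else if (a[mid] < target) low = mid + 1;
            else high = mid - 1;
        }
        return count;
    }

    public static void main(String[] args) {
        int n = 20_000;
        int[] a = new int[n];
        for (int i = 0; i < n; i++) a[i] = i;   // already sorted

        long seqTotal = 0, binTotal = 0;
        for (int target = 0; target < n; target++) {
            seqTotal += seqComparisons(a, target);
            binTotal += binComparisons(a, target);
        }
        System.out.println("Sequential total: " + seqTotal + ", average: " + seqTotal / n);
        System.out.println("Binary total:     " + binTotal + ", average: " + binTotal / n);
    }
}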
If f(n) is a sum of several terms, the one with the largest growth
rate is kept, and all others omitted
Example:
f(n) = 100*n
then f(n) is O(n) (the constant factor 100 is also omitted)
Summation of same Orders
The property is useful when an algorithm contains
several loops of the same order
Example:
f1(n) is O(n)
f2(n) is O(n)
then f1(n) + f2(n) is O(n) + O(n), which is O(n)
Summation of different Orders
This property works because we are only concerned
with the term of highest growth rate
Example:
f1(n) is O(n²)
f2(n) is O(n)
so f1(n) + f2(n) = n² + n, which is O(n²)
Product
This property is useful for analyzing segments of an
algorithm with nested loops
Example:
f1(n) is O(n²)
f2(n) is O(n)
then f1(n) x f2(n) is O(n²) x O(n), which is O(n³)
Limitations of Big-Oh Analysis
Constants sometimes make a difference
n log n may be faster than 10000n
Doesn't differentiate between data in cache memory,
in main memory, and on disk; accessing disk data is
thousands of times slower
The worst case often doesn't happen
so worst-case analysis can be an overestimate
Quick Analysis
O(n)
for(int j = 0; j < n; j++)
x[j] = 0;
O(n²)
int sum = 0;
for (int j = 0; j < n; j++)
for (int k = 0; k < n; k++)
sum += j * k;
Run times with for loops
O(n³)
for (int j = 0; j < n; j++)
for (int k = 0; k < n; k++)
for (int l = 0; l < n; l++)
sum += j * k * l;
O(n)
for (int j = 0; j < n; j++)
    sum++;
for (int j = 0; j < n; j++)   // two sequential O(n) loops are still O(n)
    sum--;
O(log n)
for (int j = 1; j < n; j = 2 * j)
    sum += j;   // j doubles each pass, so the loop body runs about log2(n) times
Analyze this