Algorithms and C coding for informatics olympiads.
Algorithm Analysis
• (X^a)^b = X^(ab)
Logarithms
• log_a X = Y  ⇔  a^Y = X,  a > 0, X > 0
e.g.: log_2 8 = 3, since 2^3 = 8
• log_a 1 = 0 because a^0 = 1
log X means log_2 X
lg X means log_10 X
ln X means log_e X,
where e is the base of the natural logarithm
Logarithms
• log_a(XY) = log_a X + log_a Y
• log_a(X/Y) = log_a X − log_a Y
• log_a(X^n) = n log_a X
• log_a b = (log_2 b) / (log_2 a)
• a^(log_a x) = x
(a numeric check in C follows below)
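These identities can be checked numerically. A minimal C sketch (my own illustration, not part of the original slides) using the standard math.h functions:

#include <math.h>
#include <stdio.h>

/* Numerically check some of the logarithm identities for a = 3, x = 8. */
int main(void) {
    double a = 3.0, x = 8.0;

    /* change of base: log_a(x) = log2(x) / log2(a) */
    double log_a_x = log2(x) / log2(a);
    printf("log_3(8)     = %f\n", log_a_x);

    /* a^(log_a x) = x */
    printf("3^(log_3 8)  = %f\n", pow(a, log_a_x));

    /* log_a(x^n) = n * log_a(x), with n = 5 */
    printf("log_3(8^5)   = %f\n", log2(pow(x, 5)) / log2(a));
    printf("5 * log_3(8) = %f\n", 5 * log_a_x);
    return 0;
}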
Recursive Definitions
• Basic idea: to define objects, processes and properties in terms of simpler objects, simpler processes, or properties of simpler objects/processes.
Recursive Definitions
• Terminating rule - defining the object explicitly:
f(0) = 1, i.e. 0! = 1
• Recursive rule - defining the object in terms of a smaller instance:
f(n) = n * f(n-1), i.e. n! = n * (n-1)!
(a C sketch of this definition follows below)
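A minimal recursive factorial in C, following this definition (my own sketch; the function name and types are assumptions, not from the slides):

/* Recursive factorial: terminating rule f(0) = 1,
   recursive rule f(n) = n * f(n-1). */
long factorial(int n) {
    if (n == 0)                        /* terminating rule */
        return 1;
    return n * factorial(n - 1);       /* recursive rule */
}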
Example
• Fibonacci numbers
F(0) = 1
F(1) = 1
F(k+1) = F(k) + F(k-1)
1, 1, 2, 3, 5, 8, …
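A direct recursive C translation of this definition (my own sketch, using the indexing above; note that this naive version makes exponentially many calls, so it is for illustration only):

/* Naive recursive Fibonacci: F(0) = 1, F(1) = 1, F(k+1) = F(k) + F(k-1). */
long fib(int k) {
    if (k == 0 || k == 1)
        return 1;
    return fib(k - 1) + fib(k - 2);
}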
Function Growth
• lim (n) = ∞, n → ∞
• lim (n^a) = ∞, n → ∞, a > 0
• lim (1/n) = 0, n → ∞
• lim (1/n^a) = 0, n → ∞, a > 0
• lim (log n) = ∞, n → ∞
• lim (a^n) = ∞, n → ∞, a > 1
Function Growth
• lim (f(x) + g(x)) = lim f(x) + lim g(x)
Proof by Induction
• Inductive base:
– Show that the claim is true for the smallest case, usually k = 0 or k = 1.
• Inductive hypothesis:
– Assume that the claim is true for some k.
• Inductive step:
– Prove that the claim is then true for k+1.
Example of Proof by Induction
Prove by induction that
S(N) = Σ_{i=0..N} 2^i = 2^(N+1) - 1, for any integer N ≥ 0
1. Inductive base
Let n = 0. S(0) = 2^0 = 1.
On the other hand, by the formula, S(0) = 2^(0+1) - 1 = 1.
Therefore the formula is true for n = 0.
2. Inductive hypothesis
Assume that S(k) = 2^(k+1) - 1.
We have to show that S(k+1) = 2^(k+2) - 1.
By the definition of S(n):
S(k+1) = S(k) + 2^(k+1) = 2^(k+1) - 1 + 2^(k+1)
= 2 * 2^(k+1) - 1 = 2^(k+2) - 1.
Proof by Counterexample
• Used when we want to prove that a statement is false.
• Type of statement it applies to: a claim that refers to all members of a class.
Example
Compare n and (n+1)/2:
lim( n / ((n+1)/2) ) = 2,
same rate of growth:
(n+1)/2 = Θ(n)
- rate of growth of a linear function
Example
Compare n^2 and n^2 + 6n:
lim( n^2 / (n^2 + 6n) ) = 1,
same rate of growth:
n^2 + 6n = Θ(n^2)
- rate of growth of a quadratic function
Example
Compare log n and log n^2:
log n^2 = 2 log n, so
log n^2 = Θ(log n)
- logarithmic rate of growth
Example
Θ(n^3): n^3
5n^3 + 4n
105n^3 + 4n^2 + 6n
Θ(n^2): n^2
5n^2 + 4n + 6
n^2 + 5
Θ(log n): log n
log n^2
log (n + n^3)
Comparing Functions
• same rate of growth: g(n) = Θ(f(n))
• different rate of growth:
either g(n) = o(f(n))
- g(n) grows slower than f(n), and hence f(n) = ω(g(n)),
or g(n) = ω(f(n))
- g(n) grows faster than f(n), and hence f(n) = o(g(n)).
The Big-Oh Notation
f(n) = O(g(n))
if f(n) grows at the same rate as or slower than g(n), i.e.
f(n) = Θ(g(n)) or f(n) = o(g(n)).
Example
n + 5 = Θ(n) = O(n) = O(n^2)
= O(n^3) = O(n^5)
If g(n) = O(f(n)), then f(n) = Ω(g(n)).
Rule 3:
log^k N = O(N) for any constant k,
i.e., logarithms grow very slowly.
Examples
n^2 + n = O(n^2)
- we disregard any lower-order term
n log(n) = O(n log(n))
n^2 + n log(n) = O(n^2)
Typical Growth Rates
c          constant (we write O(1))
log N      logarithmic
log^2 N    log-squared
N          linear
N log N
N^2        quadratic
N^3        cubic
2^N        exponential
N!         factorial
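To see how quickly these functions pull apart, here is a small C sketch (my own addition, not from the slides) that prints several of them for a few values of N:

#include <math.h>
#include <stdio.h>

/* Print the typical growth functions for N = 10, 20, 30. */
int main(void) {
    int values[] = {10, 20, 30};
    for (int i = 0; i < 3; i++) {
        double N = values[i];
        printf("N=%2.0f: logN=%5.1f N=%3.0f NlogN=%6.1f N^2=%5.0f N^3=%6.0f 2^N=%11.0f\n",
               N, log2(N), N, N * log2(N), N * N, N * N * N, pow(2.0, N));
    }
    return 0;
}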
Problems
N^2 = O(N^2)   true
2N  = O(N^2)   true
N   = O(N^2)   true
N^2 = O(N)     false
2N  = O(N)     true
N   = O(N)     true
Problems
N^2 = Θ(N^2)   true
2N  = Θ(N^2)   false
N   = Θ(N^2)   false
N^2 = Θ(N)     false
2N  = Θ(N)     true
N   = Θ(N)     true
Running Time Calculations
APPLICATION OF BIG-OH TO PROGRAM ANALYSIS
Background
• The work done by an algorithm, i.e. its complexity, is determined by the number of basic operations necessary to solve the problem.
The Task
Determine how the number of operations depends on the size of the input:
N - size of input
F(N) - number of operations
Basic operations in an algorithm
• Problem: Find x in an array
Operation: comparison of x with an array element
Size of input: the number of elements in the array
• Problem: Multiply two matrices
Operation: multiplication of two real numbers
Size of input: the dimensions of the matrices
Basic operations in an algorithm
• Problem: Sort an array of numbers
Operation: comparison of two array elements
Size of input: the number of elements in the array
(a counting sketch follows below)
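A selection sort sketch in C (my own illustration, not from the slides) that counts the basic operation, comparisons of two array elements; for n elements it performs n*(n-1)/2 comparisons, i.e. O(n^2):

/* Selection sort that returns the number of element comparisons performed. */
long selection_sort(int a[], int n) {
    long comparisons = 0;
    for (int i = 0; i < n - 1; i++) {
        int min = i;                       /* index of the smallest remaining element */
        for (int j = i + 1; j < n; j++) {
            comparisons++;
            if (a[j] < a[min])
                min = j;
        }
        int tmp = a[i]; a[i] = a[min]; a[min] = tmp;   /* put it in place */
    }
    return comparisons;
}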
O(n^2):
sum = 0;
for (i = 0; i < n; i++)
    for (j = 0; j < 2*n; j++)
        sum++;

O(n^3):
sum = 0;
for (i = 0; i < n; i++)
    for (j = 0; j < n*n; j++)
        sum++;
Example
O(n^2):
sum = 0;
for (i = 0; i < n; i++)
    for (j = 0; j < i; j++)
        sum++;
Example
O(n^3 * log n), assuming each call compute_val(j) itself takes O(n log n):
for (j = 0; j < n*n; j++)
    compute_val(j);
Sequential search in an array: O(n)
for (i = 0; i < n; i++)
    if (a[i] == x)
        return 1;
return -1;
Search in a table n x m: O(n*m)
for (i = 0; i < n; i++)
    for (j = 0; j < m; j++)
        if (a[i][j] == x)
            return 1;
return -1;
LOGARITHMS IN RUNNING TIME
Logarithms in Running Time
• Binary search
• Euclid’s algorithm
• Exponentials
• Rules to count operations
Divide-and-Conquer Algorithms
• Repeatedly reducing the problem size by a factor of two requires O(log N) operations (see the binary search sketch below).
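A standard iterative binary search in C (my own sketch; it assumes the array a[] is already sorted in ascending order). Each iteration halves the search range, hence O(log N) comparisons:

/* Returns the index of x in the sorted array a[0..n-1], or -1 if not found. */
int binary_search(const int a[], int n, int x) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   /* middle of the current range */
        if (a[mid] == x)
            return mid;
        else if (a[mid] < x)
            lo = mid + 1;               /* discard the left half */
        else
            hi = mid - 1;               /* discard the right half */
    }
    return -1;
}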
Why log N?
• A complete binary tree with N leaves has log N levels.

Example trace of Euclid's algorithm for m = 15, n = 9:
m    n    m % n
15   9    6
9    6    3
6    3    0
so gcd(15, 9) = 3.
Euclid’s Algorithm
(non-recursive implementation)
long gcd(long m, long n) {
    long rem;
    while (n != 0) {
        rem = m % n;   /* remainder of m divided by n */
        m = n;
        n = rem;
    }
    return m;          /* when n reaches 0, m holds the gcd */
}
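A small usage sketch (my own addition) reproducing the 15, 9 trace shown above:

#include <stdio.h>

/* Assumes the gcd() function above is defined in the same file. */
int main(void) {
    printf("%ld\n", gcd(15, 9));   /* prints 3; remainders 6, 3, 0 as in the table */
    return 0;
}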
Why O(log N)
• If M ≥ N, then M % N ≤ M / 2.
• After the 1st iteration N appears as the first argument and the remainder M % N as the second.
• After the 2nd iteration the new remainder is less than N / 2, so the arguments are at least halved every two iterations.
• Hence the number of iterations, and the running time, is O(log N).
Computing X^N
• X^N = X * (X^2)^(N/2), if N is odd (N/2 is integer division)
• X^N = (X^2)^(N/2), if N is even
Computing X^N
long pow(long x, int n) {
    if (n == 0)
        return 1;
    if (n % 2 == 0)                      /* n is even */
        return pow(x * x, n / 2);
    else                                 /* n is odd */
        return x * pow(x * x, n / 2);
}
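A quick usage sketch (my own addition); computing 2^10 needs only a handful of recursive calls:

#include <stdio.h>

/* Assumes the pow() function above is defined in the same file
   (and that math.h, with its own pow(), is not included). */
int main(void) {
    printf("%ld\n", pow(2, 10));   /* prints 1024 after 5 calls: n = 10, 5, 2, 1, 0 */
    return 0;
}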
Why O(log N)
• If N is odd: two multiplications, since
X^N = X * X^(N-1) and N - 1 is even;
the exponent is still at least halved every two multiplications, so the total number of multiplications is O(log N).