Answer-
Example-
Here LA is a linear array with N elements and ITEM is a given element to be searched for. This algorithm
finds the location LOC of ITEM in LA, or sets LOC = 0 if the search is unsuccessful.
LINEAR(LA, N, ITEM, LOC)
Step- 1 [Initialize] Set K = 1 and LOC = 0.
Step- 2 Repeat Steps 3 and 4 while LOC = 0 and K ≤ N.
Step- 3 If ITEM = LA[K], then: Set LOC = K.
Step- 4 Set K = K + 1. [End of loop]
Step- 5 If LOC = 0, then: Write "ITEM is not in the array."
Else: Write "LOC is the location of ITEM."
Exit.
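The steps above can be sketched in Python as follows (a minimal illustration; the function name is ours, not from the original, and K is kept 1-based to match the algorithm):

```python
def linear_search(LA, ITEM):
    """Linear search following the steps above.

    Returns the 1-based location LOC of ITEM in LA,
    or 0 if the search is unsuccessful.
    """
    N = len(LA)
    LOC = 0                        # Step 1: initialize
    K = 1
    while LOC == 0 and K <= N:     # Step 2: repeat while not found
        if LA[K - 1] == ITEM:      # Step 3: compare ITEM with LA[K]
            LOC = K
        else:
            K += 1                 # Step 4: move to the next element
    return LOC                     # Step 5: exit with LOC (0 if unsuccessful)
```

For example, searching for 4 in [10, 3, 9, 5, 10, 11, 4] returns location 7, and searching for an absent element returns 0.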
The formal representation of an algorithm consists of two parts:
a. Paragraph - It states the purpose of the algorithm and lists the input data.
b. Steps - The steps that are to be executed.
Characteristics of an Algorithm: -
a. Input - It takes zero or more inputs.
b. Output - It produces at least one output.
c. Definiteness - Each step is clear and unambiguous.
d. Finiteness - It terminates after a finite number of steps.
e. Effectiveness - Each step is basic enough to be carried out exactly.
Answer-
Algorithm analysis provides a theoretical estimate of the resources an algorithm requires to
solve a specific computational problem.
Analysis of algorithms is the determination of the amount of time and space resources required
to execute an algorithm.
For example, in linear search: when the element is not present, or it is the last element, the
search() function compares it with all n elements of arr[] one by one.
Therefore, the worst-case time complexity of linear search is O(n).
The number of operations in the best case is constant (not dependent on n).
So the time complexity in the best case is Ω(1).
Answer-
Asymptotic notations are mathematical tools to represent the time complexity of algorithms
for asymptotic analysis.
Mathematically-
O(g(n)) = { f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0 }
Where-
n0 = the point where the equation starts being true and does so until infinity.
Big-O gives an upper bound on the running time: for large enough n, the running time never grows faster than c * g(n).
Example-
For linear search, the worst case happens when the element to be searched for is not present in
the array, or is the last element.
Let LA be a linear array, and let the element to be searched for be ITEM = 4:
10 3 9 5 10 11 4
When the element is not present, or it is the last element, the search() function compares it with
all the elements of arr[] one by one.
Therefore, the worst-case time complexity of linear search is O(n); here n = 7, so 7 comparisons
are made.
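The worst case can be checked by counting comparisons (search_count is a hypothetical helper written for this sketch, not from the original):

```python
def search_count(arr, item):
    """Linear search that also counts how many comparisons were made."""
    comparisons = 0
    for i, x in enumerate(arr):
        comparisons += 1
        if x == item:
            return i, comparisons   # found: 0-based index and comparison count
    return -1, comparisons          # not found

# Worst case: ITEM = 4 is the last of the 7 elements,
# so all 7 comparisons are made.
idx, comps = search_count([10, 3, 9, 5, 10, 11, 4], 4)
```

Here comps is 7, matching the O(n) worst case for n = 7.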
Omega Notation (Ω-Notation): -
Omega notation represents the lower bound of the running time of an algorithm.
Ω(g(n)) = { f(n): there exist positive constants c and n0 such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n0 }
Where-
n0 = the point where the equation starts being true and does so until infinity.
In the linear search problem, the best case occurs when x is present at the first location.
Let LA be a linear array, and let the element to be searched for be ITEM = 4:
4 10 3 9 5 10 11
The number of operations in the best case is constant (not dependent on n).
So the time complexity in the best case is Ω(1).
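The best case can be checked the same way, by counting comparisons (search_count is a hypothetical helper written for this sketch, not from the original):

```python
def search_count(arr, item):
    """Linear search that also counts how many comparisons were made."""
    comparisons = 0
    for i, x in enumerate(arr):
        comparisons += 1
        if x == item:
            return i, comparisons   # found: 0-based index and comparison count
    return -1, comparisons          # not found

# Best case: ITEM = 4 is at the first location,
# so only one comparison is made regardless of n.
idx, comps = search_count([4, 10, 3, 9, 5, 10, 11], 4)
```

Here comps is 1, matching the Ω(1) best case.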
Theta Notation (Θ-Notation): -
Theta notation bounds the running time from both above and below, so it represents a tight bound.
Θ(g(n)) = { f(n): there exist positive constants c1, c2 and n0 such that 0 ≤ c2 * g(n) ≤ f(n) ≤ c1
* g(n) for all n ≥ n0 }
Where-
n0 = the point where the equation starts being true and does so until infinity.
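As a quick numerical sanity check of the Θ definition (the function f and the constants below are illustrative choices, not from the original): f(n) = 2n + 3 is Θ(n), since with c2 = 2, c1 = 3 and n0 = 3 we have 0 ≤ c2 * g(n) ≤ f(n) ≤ c1 * g(n) for all n ≥ n0.

```python
# Verify the inequality 0 <= c2*g(n) <= f(n) <= c1*g(n) for n >= n0
# over a large range of n, for f(n) = 2n + 3 and g(n) = n.
f = lambda n: 2 * n + 3
g = lambda n: n
c2, c1, n0 = 2, 3, 3

ok = all(0 <= c2 * g(n) <= f(n) <= c1 * g(n) for n in range(n0, 1000))
```

A finite check like this is not a proof, but it is a useful way to test candidate constants.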
---------------------------------------------------------------------------------------------------------------------------
6. What is the difference between Big O, Big Omega and Big Theta?
---------------------------------------------------------------------------------------------------------------------------
Answer-
Big O gives an upper bound on the running time (worst case), Big Omega (Ω) gives a lower bound (best case), and Big Theta (Θ) gives both an upper and a lower bound (a tight bound).
Properties of asymptotic notations:
1. General Properties:
We can say,
If f(n) is O(g(n)) then a*f(n) is also O(g(n)), where a is a constant.
If f(n) is Θ(g(n)) then a*f(n) is also Θ(g(n)), where a is a constant.
If f(n) is Ω(g(n)) then a*f(n) is also Ω(g(n)), where a is a constant.
2. Transitive Properties:
We can say,
If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) = O(h(n)).
If f(n) is Θ(g(n)) and g(n) is Θ(h(n)), then f(n) = Θ(h(n)).
If f(n) is Ω(g(n)) and g(n) is Ω(h(n)), then f(n) = Ω(h(n)).
3. Reflexive Properties:
If f(n) is given, then f(n) is O(f(n)); similarly, f(n) is Θ(f(n)) and f(n) is Ω(f(n)).
For example, f(n) = n² is O(n²).
4. Symmetric Properties:
If f(n) is Θ(g(n)), then g(n) is also Θ(f(n)).