The document provides an introduction to notation and asymptotic analysis in algorithm design, focusing on evaluating algorithm performance through time and space complexity. It explains various asymptotic notations such as Big-O, Omega, and Theta, which describe upper, lower, and tight bounds on algorithm growth rates. Additionally, it includes examples and questions to assess understanding of these concepts.
Introduction to notation and asymptotic analysis
CSE408 Design And Analysis of Algorithms
Asymptotic Analysis
- Evaluates algorithm performance
- Measures time and space complexity
- Examines behavior as input grows
- Ignores constant factors and lower-order terms
- Focuses on the dominant growth term
- Aims to understand efficiency

Why Use Asymptotic Analysis?
- Machine-independent evaluation
- Focus on algorithm growth rates
- Predicts scalability with input size
- Compares algorithm efficiencies
- Evaluates resource requirements
- Handles large-input performance

Types of Asymptotic Notation
- Big-O (O): upper bound
- Omega (Ω): lower bound
- Theta (Θ): tight bound
- Little-o (o): loose upper bound
- Little-omega (ω): loose lower bound
- Used for analyzing algorithms

Big-O Notation (O)
- Describes the upper bound.
- Represents the worst case.
- Example: T(n) ∈ O(n²).
- Time ≤ c⋅f(n) for n ≥ n₀.
- Focuses on maximum growth.
- Ensures performance is no slower than the bound.

Omega Notation (Ω)
- Describes the lower bound.
- Represents the best case.
- Example: T(n) ∈ Ω(n).
- Time ≥ c⋅g(n) for n ≥ n₀.
- Focuses on minimum growth.
- Ensures performance is no faster than the bound.

Theta Notation (Θ)
- Describes the tight bound.
- Represents the average case.
- Example: T(n) ∈ Θ(n log n).
- c₁⋅h(n) ≤ T(n) ≤ c₂⋅h(n) for n ≥ n₀.
- Bounds both upper and lower.
- Indicates exact growth rate.

Properties of the notations: reflexive, symmetric, transitive
Notation                                   Reflexive   Symmetric   Transitive
Big-O:         f(n) ≤ c⋅g(n)               yes         no          yes
Big-Omega:     f(n) ≥ c⋅g(n)               yes         no          yes
Theta:         c₁⋅g(n) ≤ f(n) ≤ c₂⋅g(n)    yes         yes         yes
Little-o:      f(n) < c⋅g(n)               no          no          yes
Little-omega:  f(n) > c⋅g(n)               no          no          yes

Practice Questions
1. Which of the following functions has the largest growth rate?
   a. O(1)  b. O(log n)  c. O(n)  d. O(n²)
2. If an algorithm has a time complexity of O(3n + 5), what is its simplified Big-O notation?
   a. O(3n)  b. O(n)  c. O(5)  d. O(n²)
3. What is the time complexity of binary search on a sorted array of size n?
   a. O(n)  b. O(log n)  c. O(n²)  d. O(1)
4. Which of these time complexities represents the fastest-growing function?
   a. O(n)  b. O(2ⁿ)  c. O(n log n)  d. O(√n)
5. A sorting algorithm takes O(n²) time in the worst case and O(n) time in the best case. Which of the following algorithms fits this description?
   a. Merge Sort  b. Quick Sort  c. Bubble Sort  d. Heap Sort
6. One algorithm has T₁(n) = n² + 10n + 5 and another has T₂(n) = 100n. Which algorithm is asymptotically better as n → ∞, and why?
   T₂(n) is better because it grows linearly (O(n)), while T₁(n) grows quadratically (O(n²)).
7. Arrange the following complexities in increasing order of growth rate: O(log n), O(n), O(n log n), O(n²), O(2ⁿ).
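The definitions and several of the quiz answers above can be checked numerically. The following is a minimal Python sketch, not part of the original slides; the names `holds_big_o` and `binary_search` are illustrative, and the finite range check is a spot-check of the witness pair (c, n₀), not a proof (Big-O is a claim about all n ≥ n₀).

```python
import math

def holds_big_o(f, g, c, n0, n_max=10**6):
    """Spot-check f(n) <= c*g(n) over sampled n in [n0, n_max]."""
    step = max(1, (n_max - n0) // 1000)
    return all(f(n) <= c * g(n) for n in range(n0, n_max, step))

# Question 2: T(n) = 3n + 5 is O(n); witnesses c = 4, n0 = 5 work,
# since 3n + 5 <= 4n exactly when n >= 5.
assert holds_big_o(lambda n: 3 * n + 5, lambda n: n, c=4, n0=5)

# Question 6: the quadratic T1 overtakes the linear T2 for large n.
t1 = lambda n: n * n + 10 * n + 5
t2 = lambda n: 100 * n
assert t1(1000) > t2(1000)  # quadratic term dominates by n = 1000

# Question 7: increasing order of growth, evaluated at n = 64.
n = 64
values = [math.log2(n), n, n * math.log2(n), n ** 2, 2 ** n]
assert values == sorted(values)  # log n < n < n log n < n^2 < 2^n

def binary_search(a, target):
    """Binary search on a sorted list; returns (index, iterations).
    Each iteration halves the range, giving O(log n) time (Question 3)."""
    lo, hi, steps = 0, len(a) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid, steps
        if a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps

a = list(range(1024))
_, steps = binary_search(a, 1023)
assert steps <= math.log2(len(a)) + 1  # at most ~log2(n) halvings
```

Note that a numerical check like this can only refute a bound at the sampled points; establishing O, Ω, or Θ for all n ≥ n₀ requires the algebraic argument given in the definitions above.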