Asymptotic Notations for Algorithm Complexity Analysis


Learn about asymptotic notations such as Big Theta, Big O, and Big Omega to analyze the running time of algorithms in relation to input size. Explore how these notations describe the rate of growth of functions and establish bounds for algorithm efficiency.

  • Asymptotic Notations
  • Algorithm Complexity
  • Big O
  • Big Theta
  • Efficiency


Presentation Transcript


  1. Asymptotic Notations

  2. Asymptotic Complexity The running time of an algorithm as a function of input size n, for large n. Expressed using only the highest-order term in the expression for the exact running time: instead of the exact running time, we say Θ(n²). Describes the behavior of the function in the limit. Written using asymptotic notation. asymp - 1 KCS-503

  3. Asymptotic Notation Θ, O, Ω, o, ω. Defined for functions over the natural numbers. Ex: f(n) = Θ(n²). Describes how f(n) grows in comparison to n². Each notation defines a set of functions; in practice it is used to compare the sizes of two functions. The notations describe different rate-of-growth relations between the defining function and the defined set of functions.

  4. Θ-notation For a function g(n), we define Θ(g(n)), big-Theta of g(n), as the set: Θ(g(n)) = { f(n) : there exist positive constants c₁, c₂, and n₀ such that for all n ≥ n₀, 0 ≤ c₁g(n) ≤ f(n) ≤ c₂g(n) }. Intuitively: the set of all functions that have the same rate of growth as g(n). g(n) is an asymptotically tight bound for f(n).

  5. Θ-notation For a function g(n), we define Θ(g(n)), big-Theta of g(n), as the set: Θ(g(n)) = { f(n) : there exist positive constants c₁, c₂, and n₀ such that for all n ≥ n₀, 0 ≤ c₁g(n) ≤ f(n) ≤ c₂g(n) }. Technically, f(n) ∈ Θ(g(n)); older usage writes f(n) = Θ(g(n)). I'll accept either. Both f(n) and g(n) are assumed nonnegative for large n.

  6. Example Θ(g(n)) = { f(n) : there exist positive constants c₁, c₂, and n₀ such that for all n ≥ n₀, 0 ≤ c₁g(n) ≤ f(n) ≤ c₂g(n) }. 10n² − 3n = Θ(n²). What constants n₀, c₁, and c₂ will work? Make c₁ a little smaller than the leading coefficient, and c₂ a little bigger. To compare orders of growth, look at the leading term. Exercise: prove that n²/2 − 3n = Θ(n²).
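The constants can be sanity-checked numerically. A minimal sketch; the specific values c₁ = 9, c₂ = 10, n₀ = 3 are my assumption, chosen by the slide's rule of thumb around the leading coefficient 10:

```python
# Numerically check candidate Theta constants for 10n^2 - 3n = Theta(n^2).
# Assumed constants (not from the slides): c1 = 9, c2 = 10, n0 = 3.
def theta_holds(n, c1=9, c2=10):
    f = 10 * n * n - 3 * n
    g = n * n
    return 0 <= c1 * g <= f <= c2 * g

# The lower bound 9n^2 <= 10n^2 - 3n holds exactly when n >= 3.
print(all(theta_holds(n) for n in range(3, 10_000)))  # True
print(theta_holds(2))  # False: below n0 the lower bound fails
```

A check like this is no substitute for the algebraic proof, but it makes a good sanity test when guessing constants.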

  7. Example Θ(g(n)) = { f(n) : there exist positive constants c₁, c₂, and n₀ such that for all n ≥ n₀, 0 ≤ c₁g(n) ≤ f(n) ≤ c₂g(n) }. Is 3n³ ∈ Θ(n⁴)? How about 2²ⁿ ∈ Θ(2ⁿ)?

  8. O-notation For a function g(n), we define O(g(n)), big-O of g(n), as the set: O(g(n)) = { f(n) : there exist positive constants c and n₀ such that for all n ≥ n₀, 0 ≤ f(n) ≤ cg(n) }. Intuitively: the set of all functions whose rate of growth is the same as or lower than that of g(n). g(n) is an asymptotic upper bound for f(n). f(n) = Θ(g(n)) ⇒ f(n) = O(g(n)), i.e., Θ(g(n)) ⊆ O(g(n)).

  9. Examples O(g(n)) = { f(n) : there exist positive constants c and n₀ such that for all n ≥ n₀, 0 ≤ f(n) ≤ cg(n) }. Any linear function an + b is in O(n²). How? Show that 3n³ = O(n⁴) for appropriate c and n₀.
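Both claims can be spot-checked with assumed witnesses: for an + b take c = |a| + |b| and n₀ = 1 (then an + b ≤ |a|n² + |b|n² = cn² for n ≥ 1); for 3n³ = O(n⁴) take c = 3 and n₀ = 1. A sketch:

```python
# Spot-check the two O-bounds from the slide for sample inputs.
# an + b <= (|a| + |b|) n^2 for n >= 1, so an + b is O(n^2)  (c = |a|+|b|, n0 = 1).
# 3n^3 <= 3n^4 for n >= 1, so 3n^3 is O(n^4)                 (c = 3, n0 = 1).
def linear_in_O_n2(a, b, n):
    return a * n + b <= (abs(a) + abs(b)) * n * n

def cubic_in_O_n4(n):
    return 3 * n**3 <= 3 * n**4

print(all(linear_in_O_n2(5, 100, n) for n in range(1, 1000)))  # True
print(all(cubic_in_O_n4(n) for n in range(1, 1000)))           # True
```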

  10. Ω-notation For a function g(n), we define Ω(g(n)), big-Omega of g(n), as the set: Ω(g(n)) = { f(n) : there exist positive constants c and n₀ such that for all n ≥ n₀, 0 ≤ cg(n) ≤ f(n) }. Intuitively: the set of all functions whose rate of growth is the same as or higher than that of g(n). g(n) is an asymptotic lower bound for f(n). f(n) = Θ(g(n)) ⇒ f(n) = Ω(g(n)), i.e., Θ(g(n)) ⊆ Ω(g(n)).

  11. Example Ω(g(n)) = { f(n) : there exist positive constants c and n₀ such that for all n ≥ n₀, 0 ≤ cg(n) ≤ f(n) }. n = Ω(lg n). Choose c and n₀.
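One choice that works is c = 1 and n₀ = 1, since lg n ≤ n for every n ≥ 1. A quick numeric check of that claim:

```python
import math

# n = Omega(lg n) with assumed witnesses c = 1, n0 = 1:
# check that 0 <= 1 * lg(n) <= n for every n >= 1 in a sample range.
print(all(0 <= math.log2(n) <= n for n in range(1, 10_000)))  # True
```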

  12. Relations Between Θ, O, Ω

  13. Relations Between Θ, O, Ω Theorem: For any two functions g(n) and f(n), f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)). I.e., Θ(g(n)) = O(g(n)) ∩ Ω(g(n)). In practice, asymptotically tight bounds are obtained from asymptotic upper and lower bounds.

  14. Running Times "Running time is O(f(n))" means the worst case is O(f(n)): an O(f(n)) bound on the worst-case running time gives an O(f(n)) bound on the running time of every input. A Θ(f(n)) bound on the worst-case running time does not give a Θ(f(n)) bound on the running time of every input. "Running time is Ω(f(n))" means the best case is Ω(f(n)). We can still say "the worst-case running time is Ω(f(n))": it means the worst-case running time is given by some unspecified function g(n) ∈ Ω(f(n)).

  15. Example Insertion sort takes Θ(n²) in the worst case, so sorting (as a problem) is O(n²). Why? Any sorting algorithm must look at each item, so sorting is Ω(n). In fact, using (e.g.) merge sort, sorting is Θ(n lg n) in the worst case. Later, we will prove that no comparison sort can hope to do better in the worst case.

  16. Asymptotic Notation in Equations Asymptotic notation can be used in equations to replace expressions containing lower-order terms. For example, 4n³ + 3n² + 2n + 1 = 4n³ + 3n² + Θ(n) = 4n³ + Θ(n²) = Θ(n³). How to interpret this? In equations, Θ(f(n)) always stands for an anonymous function g(n) ∈ Θ(f(n)). In the example above, Θ(n²) stands for 3n² + 2n + 1.

  17. o-notation For a given function g(n), the set little-o: o(g(n)) = { f(n) : for every constant c > 0, there exists n₀ > 0 such that for all n ≥ n₀, 0 ≤ f(n) < cg(n) }. f(n) becomes insignificant relative to g(n) as n approaches infinity: lim n→∞ [f(n) / g(n)] = 0. g(n) is an upper bound for f(n) that is not asymptotically tight. Observe the difference from the previous definitions: the bound must hold for every constant c, not just for some c. Why?

  18. ω-notation For a given function g(n), the set little-omega: ω(g(n)) = { f(n) : for every constant c > 0, there exists n₀ > 0 such that for all n ≥ n₀, 0 ≤ cg(n) < f(n) }. f(n) becomes arbitrarily large relative to g(n) as n approaches infinity: lim n→∞ [f(n) / g(n)] = ∞. g(n) is a lower bound for f(n) that is not asymptotically tight.

  19. Comparison of Functions The asymptotic relations between f and g behave like comparisons between numbers a and b: f(n) = O(g(n)) ≈ a ≤ b; f(n) = Ω(g(n)) ≈ a ≥ b; f(n) = Θ(g(n)) ≈ a = b; f(n) = o(g(n)) ≈ a < b; f(n) = ω(g(n)) ≈ a > b.

  20. Limits lim n→∞ [f(n) / g(n)] = 0 ⇒ f(n) ∈ o(g(n)). lim n→∞ [f(n) / g(n)] < ∞ ⇒ f(n) ∈ O(g(n)). 0 < lim n→∞ [f(n) / g(n)] < ∞ ⇒ f(n) ∈ Θ(g(n)). 0 < lim n→∞ [f(n) / g(n)] ⇒ f(n) ∈ Ω(g(n)). lim n→∞ [f(n) / g(n)] = ∞ ⇒ f(n) ∈ ω(g(n)). If the limit is undefined, we can't say.
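These limit rules can be illustrated (not proved) by sampling the ratio f(n)/g(n) at one large n. The function name and the thresholds below are my own choices for the sketch:

```python
import math

# Heuristic sketch of the limit rules: sample f(n)/g(n) at one large n
# and classify by the ratio. An illustration, not a proof device;
# the thresholds eps and big are assumptions.
def classify(f, g, n=10**6, eps=1e-4, big=1e4):
    r = f(n) / g(n)
    if r < eps:
        return "f in o(g)"      # ratio heading to 0
    if r > big:
        return "f in omega(g)"  # ratio heading to infinity
    return "f in Theta(g)"      # ratio near a positive constant

print(classify(lambda n: 3 * n * n + n, lambda n: n * n))  # f in Theta(g)
print(classify(lambda n: math.log2(n), lambda n: n))       # f in o(g)
print(classify(lambda n: n**3, lambda n: n))               # f in omega(g)
```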

  21. Properties Transitivity: f(n) = Θ(g(n)) & g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n)); f(n) = O(g(n)) & g(n) = O(h(n)) ⇒ f(n) = O(h(n)); f(n) = Ω(g(n)) & g(n) = Ω(h(n)) ⇒ f(n) = Ω(h(n)); f(n) = o(g(n)) & g(n) = o(h(n)) ⇒ f(n) = o(h(n)); f(n) = ω(g(n)) & g(n) = ω(h(n)) ⇒ f(n) = ω(h(n)). Reflexivity: f(n) = Θ(f(n)); f(n) = O(f(n)); f(n) = Ω(f(n)).

  22. Properties Symmetry: f(n) = Θ(g(n)) iff g(n) = Θ(f(n)). Complementarity: f(n) = O(g(n)) iff g(n) = Ω(f(n)); f(n) = o(g(n)) iff g(n) = ω(f(n)).

  23. Common Functions

  24. Monotonicity f(n) is monotonically increasing if m ≤ n ⇒ f(m) ≤ f(n); monotonically decreasing if m ≤ n ⇒ f(m) ≥ f(n); strictly increasing if m < n ⇒ f(m) < f(n); strictly decreasing if m < n ⇒ f(m) > f(n).

  25. Exponentials Useful identities: a⁻¹ = 1/a; (aᵐ)ⁿ = aᵐⁿ; aᵐ · aⁿ = aᵐ⁺ⁿ. Exponentials and polynomials: for all real constants a > 1 and b, lim n→∞ (nᵇ / aⁿ) = 0, so nᵇ = o(aⁿ).

  26. Logarithms x = log_b a is the exponent for which bˣ = a. Natural log: ln a = logₑ a. Binary log: lg a = log₂ a. lg²a = (lg a)²; lg lg a = lg(lg a). Useful identities: a = b^(log_b a); log_c(ab) = log_c a + log_c b; log_b(aⁿ) = n log_b a; log_b a = log_c a / log_c b; log_b(1/a) = −log_b a; log_b a = 1 / log_a b; a^(log_b c) = c^(log_b a).

  27. Logarithms and exponentials Bases: if the base of a logarithm is changed from one constant to another, the value is altered by only a constant factor. Ex: log₁₀ n · log₂ 10 = log₂ n. So the base of a logarithm is not an issue in asymptotic notation. Exponentials with different bases differ by an exponential factor (not a constant factor). Ex: 2ⁿ = (2/3)ⁿ · 3ⁿ.
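Both facts are easy to confirm numerically; a small sketch:

```python
import math

# Base change is a constant factor: log10(n) * log2(10) == log2(n).
n = 1_000_000
print(math.isclose(math.log10(n) * math.log2(10), math.log2(n)))  # True

# Exponential bases differ by an exponential factor: 2^k = (2/3)^k * 3^k.
k = 50
print(math.isclose(2.0**k, (2 / 3) ** k * 3.0**k))  # True
```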

  28. Polylogarithms For a ≥ 0, b > 0, lim n→∞ (lgᵃn / nᵇ) = 0, so lgᵃn = o(nᵇ) and nᵇ = ω(lgᵃn). Prove using L'Hôpital's rule repeatedly. lg(n!) = Θ(n lg n). Prove using Stirling's approximation (in the text) for lg(n!).
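The lg(n!) = Θ(n lg n) claim can be watched numerically. `math.lgamma(n + 1)` returns ln(n!) without computing the huge factorial; the helper name below is mine:

```python
import math

# Sketch: lg(n!) = Theta(n lg n). lgamma(n + 1) gives ln(n!);
# dividing by ln 2 converts natural log to binary log.
def ratio(n):
    lg_factorial = math.lgamma(n + 1) / math.log(2)
    return lg_factorial / (n * math.log2(n))

for n in (10**3, 10**6, 10**9):
    print(ratio(n))  # a value below 1, climbing slowly toward 1
```

The slow climb matches Stirling: lg(n!) = n lg n − n lg e + O(lg n), so the ratio is roughly 1 − (lg e)/(lg n).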

  29. Exercise Express each function in column A in asymptotic notation using the function in column B.
  A = 5n² + 100n, B = 3n² + 2: A ∈ Θ(B), since A ∈ Θ(n²) and n² ∈ Θ(B).
  A = log₃(n²), B = log₂(n³): A ∈ Θ(B). log_b a = log_c a / log_c b, so A = 2 lg n / lg 3 and B = 3 lg n; A/B = 2/(3 lg 3), a positive constant.
  A = n^(lg 4), B = 3^(lg n): A ∈ ω(B). a^(log b) = b^(log a), so B = 3^(lg n) = n^(lg 3); A/B = n^(lg(4/3)) → ∞ as n → ∞.
  A = lg²n, B = n^(1/2): A ∈ o(B), since lim (lgᵃn / nᵇ) = 0 (here a = 2 and b = 1/2).

  30. Summations Review

  31. Review on Summations Why do we need summation formulas? For computing the running times of iterative constructs (loops). (CLRS Appendix A) Example: Maximum Subvector. Given an array A[1..n] of numeric values (they can be positive, zero, or negative), determine the subvector A[i..j] (1 ≤ i ≤ j ≤ n) whose sum of elements is maximum over all subvectors. Example instance: [1, −2, 2, 2].

  32. Review on Summations
  MaxSubvector(A, n)
    maxsum ← 0
    for i ← 1 to n
      do for j ← i to n
           sum ← 0
           for k ← i to j
             do sum ← sum + A[k]
           maxsum ← max(sum, maxsum)
    return maxsum
  T(n) = Σ_{i=1}^{n} Σ_{j=i}^{n} Σ_{k=i}^{j} 1
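The pseudocode translates directly to Python (zero-based indexing; the function name is mine). The three nested loops are exactly the triple summation in T(n), which evaluates to Θ(n³):

```python
def max_subvector(A):
    """Brute-force maximum subvector sum, mirroring the slide's triply
    nested pseudocode: try every pair (i, j) and sum A[i..j] explicitly."""
    n = len(A)
    maxsum = 0  # the empty subvector has sum 0
    for i in range(n):
        for j in range(i, n):
            s = 0
            for k in range(i, j + 1):
                s += A[k]
            maxsum = max(maxsum, s)
    return maxsum

print(max_subvector([1, -2, 2, 2]))  # 4 (the subvector [2, 2])
```

Note that initializing maxsum to 0 means an all-negative array returns 0, the sum of the empty subvector, as in the pseudocode.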

  33. Review on Summations Constant series: for integers a and b with a ≤ b, Σ_{i=a}^{b} 1 = b − a + 1. Linear (arithmetic) series: for n ≥ 0, Σ_{i=1}^{n} i = 1 + 2 + … + n = n(n+1)/2. Quadratic series: for n ≥ 0, Σ_{i=1}^{n} i² = 1² + 2² + … + n² = n(n+1)(2n+1)/6.
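The closed forms can be verified against direct summation; a quick sketch (helper name mine):

```python
# Check the linear and quadratic closed forms against direct summation.
def check(n):
    linear = sum(range(1, n + 1)) == n * (n + 1) // 2
    quadratic = sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6
    return linear and quadratic

print(all(check(n) for n in range(0, 200)))  # True
```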

  34. Review on Summations Cubic series: for n ≥ 0, Σ_{i=1}^{n} i³ = 1³ + 2³ + … + n³ = n²(n+1)²/4. Geometric series: for real x ≠ 1, Σ_{k=0}^{n} xᵏ = 1 + x + x² + … + xⁿ = (xⁿ⁺¹ − 1)/(x − 1). For |x| < 1, Σ_{k=0}^{∞} xᵏ = 1/(1 − x).
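Likewise for the cubic and geometric series; a quick numeric check (with x = 0.5 for the |x| < 1 case, approximating the infinite sum by a long partial sum):

```python
import math

# Cubic series: sum of i^3 equals n^2 (n+1)^2 / 4.
n = 100
print(sum(i**3 for i in range(1, n + 1)) == n * n * (n + 1) ** 2 // 4)  # True

# Finite geometric series for x != 1, and the |x| < 1 infinite limit.
x = 0.5
finite = sum(x**k for k in range(n + 1))
print(math.isclose(finite, (x ** (n + 1) - 1) / (x - 1)))           # True
print(math.isclose(sum(x**k for k in range(200)), 1 / (1 - x)))     # True
```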

  35. Review on Summations Linear-geometric series: for n ≥ 0 and real c ≠ 1, Σ_{i=1}^{n} i·cⁱ = c + 2c² + … + n·cⁿ = (n·cⁿ⁺² − (n+1)·cⁿ⁺¹ + c)/(c − 1)². Harmonic series: the nth harmonic number, for n ∈ I⁺, Hₙ = 1 + 1/2 + 1/3 + … + 1/n = Σ_{k=1}^{n} 1/k = ln n + O(1).
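A sketch verifying the linear-geometric closed form and watching Hₙ − ln n stay bounded (it tends to the Euler-Mascheroni constant, about 0.5772):

```python
import math

# Linear-geometric series: sum of i * c^i matches the closed form.
n, c = 20, 2
lhs = sum(i * c**i for i in range(1, n + 1))
rhs = (n * c ** (n + 2) - (n + 1) * c ** (n + 1) + c) // (c - 1) ** 2
print(lhs == rhs)  # True

# Harmonic numbers: H_n - ln n stays bounded, i.e. H_n = ln n + O(1).
def H(n):
    return sum(1 / k for k in range(1, n + 1))

for m in (10, 1000, 100_000):
    print(round(H(m) - math.log(m), 4))  # drifts down toward ~0.5772
```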

  36. Any Query?
