
Asymptotic Analysis - Ch. 3

Names for classes of algorithms (growth rate increasing):

  constant         Θ(n^0) = Θ(1)
  logarithmic      Θ(lg n)
  polylogarithmic  Θ(lg^k n), k ≥ 1
  linear           Θ(n)
  n log n          Θ(n lg n)
  quadratic        Θ(n^2)
  cubic            Θ(n^3)
  polynomial       Θ(n^k), k ≥ 1
  exponential      Θ(a^n), a > 1

Asymptotic Analysis
Example: an algorithm with running time of order n^2 will "eventually" (i.e., for sufficiently large n) run slower than one with running time of order n, which in turn will eventually run slower than one with running time of order lg n.

Asymptotic analysis in terms of "Big Oh", "Theta", and "Omega" gives us the tools to make these notions precise.
Note: our conclusions will only be valid "in the limit" or "asymptotically"; that is, they may not hold for small values of n.

"Big Oh" - Upper Bounding Running Time

Definition: f(n) ∈ O(g(n)) if there exist constants c > 0 and n0 > 0 such that f(n) ≤ c·g(n) for all n ≥ n0.

Intuition:
- f(n) ∈ O(g(n)) means f(n) is of order at most, or less than or equal to, g(n) when we ignore small values of n and constants
- f(n) is eventually trapped below (or equal to) some constant multiple of g(n) (for large enough n)
- some constant multiple of g(n) is an upper bound for f(n) (for large enough n)

Example: (log n)^2 is O(n)
With f(n) = (log n)^2 and g(n) = n: (log n)^2 ≤ n for all n ≥ 16, so (log n)^2 is O(n).
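A quick numeric sanity check of this example (a sketch over a finite sample, not a proof) can use the witnesses c = 1 and n0 = 16 from the slide, taking log base 2:

```python
import math

# Witnesses from the slide: c = 1, n0 = 16.
c, n0 = 1, 16

# Check f(n) = (log2 n)^2 <= c * g(n) = c * n over a sampled range n >= n0.
ok = all(math.log2(n) ** 2 <= c * n for n in range(n0, 10_000))
print(ok)  # the bound holds everywhere on the sampled range
```

At n = 16 the bound is tight ((log2 16)^2 = 16), and g(n) = n pulls ahead from there on.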

Confused?
Basic idea: ignore constant factor differences and lower-order terms
- 617n^3 + 277n^2 + 720n + 7 = ?
- 2 = ?
- sin(n) = ?

Logarithms
Big-O interacts with logarithms in a particular way:

  log_b n = (log_2 n) / (log_2 b)

- changing the base b changes only the constant factor
- when we say f(n) = O(log n), the base is unimportant
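The base-change identity is easy to confirm numerically; this sketch (using base 10 as an arbitrary alternative base) shows log10 n is just a constant multiple of log2 n:

```python
import math

# log_b n = log2(n) / log2(b), so log10(n) = log2(n) * (1 / log2(10)):
# one fixed constant factor, independent of n.
factor = 1 / math.log2(10)
for n in (10, 1000, 10**6):
    assert abs(math.log10(n) - math.log2(n) * factor) < 1e-9
print("log10(n) is a constant multiple of log2(n)")
```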

Important Notation
Sometimes you will see f(n) = O(n^2) + O(n)
- each occurrence of the O symbol is a distinct constant
- but the O(n^2) term dominates the O(n) term, so this is equivalent to f(n) = O(n^2)

Example: InsertionSort
INPUT: an array A of n numbers {a1, a2, ..., an}
OUTPUT: a permutation of the input array {a1', a2', ..., an'} such that a1' ≤ a2' ≤ ... ≤ an'.

InsertionSort(A)
1. for j ← 2 to length(A)
2.   do key ← A[j]
3.      i ← j - 1
4.      while i > 0 and A[i] > key
5.        do A[i+1] ← A[i]
6.           i ← i - 1
7.      A[i+1] ← key

Time for execution on an input array of length n:
o best-case: b(n) = 5n - 4
o worst-case: w(n) = 3n^2/2 + 11n/2 - 4
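As a runnable counterpart to the pseudocode above, here is a straightforward Python translation (0-based indices instead of the slide's 1-based ones):

```python
def insertion_sort(a):
    """Insertion sort following the slide's pseudocode, 0-based."""
    for j in range(1, len(a)):        # slide: for j <- 2 to length(A)
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:  # shift larger elements right
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key                # drop key into its slot
    return a

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```

The best case (already sorted input) never enters the while loop; the worst case (reverse-sorted input) shifts every earlier element, which is where the quadratic term in w(n) comes from.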

Insertion Sort - Time Complexity

Time complexities for insertion sort are:
o best-case: b(n) = 5n - 4
o worst-case: w(n) = 3n^2/2 + 11n/2 - 4

Questions:
1. is b(n) = O(n)?   Yes (5n - 4 < 6n for all n ≥ 0)
2. is w(n) = O(n)?   No (3n^2/2 + 11n/2 - 4 ≥ 3n for all n ≥ 1, and no constant multiple of n stays above the n^2 term)
3. is w(n) = O(n^2)? Yes (3n^2/2 + 11n/2 - 4 < 4n^2 for all n ≥ 0)
4. is w(n) = O(n^3)? Yes (3n^2/2 + 11n/2 - 4 ≤ 2n^3 for all n ≥ 2)

Revisiting Plotting Run-Time Graphically

For functions f(n) and g(n) (plotted to the right) there are positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.
Conclusion: 2n + 6 is O(n), since 2n + 6 ≤ 4n for all n ≥ n0 = 3.
(Plot: f(n) = 2n + 6 and g(n) = 4n, crossing at n0 = 3.)
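The Big-Oh answers above can be spot-checked numerically; the constants 6 and 4 are the witnesses from the slide, and the last check merely illustrates (it does not prove) that w(n)/n grows without bound:

```python
# Insertion sort's best case b(n) = 5n - 4 and worst case
# w(n) = 3n^2/2 + 11n/2 - 4, as given on the slide.
b = lambda n: 5 * n - 4
w = lambda n: 3 * n * n / 2 + 11 * n / 2 - 4

# b(n) = O(n) with witness c = 6; w(n) = O(n^2) with witness c = 4.
assert all(b(n) <= 6 * n for n in range(1, 1000))
assert all(w(n) <= 4 * n * n for n in range(1, 1000))

# w(n) is not O(n): the ratio w(n)/n keeps increasing.
assert w(10) / 10 < w(100) / 100 < w(1000) / 1000
print("witnesses verified")
```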

Revisiting Plotting Run-Time Graphically

On the other hand, n^2 is not O(n), because there are no constants c and n0 such that n^2 ≤ c·n for all n ≥ n0.
(As the graph to the right illustrates with c = 4: no matter how large c is chosen, there is an n0 big enough that n0^2 > c·n0. Plot: n^2 and 4n, crossing at n0 = 4.)

"Big Omega" - Lower Bounding Running Time

Definition: f(n) ∈ Ω(g(n)) if there exist constants c > 0 and n0 > 0 such that f(n) ≥ c·g(n) for all n ≥ n0.

Intuition:
- f(n) ∈ Ω(g(n)) means f(n) is of order at least, or greater than or equal to, g(n) when we ignore small values of n
- f(n) is eventually trapped above (or equal to) some constant multiple of g(n) (for large enough n)
- some constant multiple of g(n) is a lower bound for f(n) (for large enough n)

Insertion Sort - Time Complexity

Time complexities for insertion sort are:
o best-case: b(n) = 5n - 4
o worst-case: w(n) = 3n^2/2 + 11n/2 - 4

Questions:
1. is b(n) = Ω(n)?   Yes (5n - 4 ≥ 2n for all n ≥ 2)
3. is w(n) = Ω(n)?   Yes (3n^2/2 + 11n/2 - 4 ≥ 3n for all n ≥ 1)
4. is w(n) = Ω(n^2)? Yes (3n^2/2 + 11n/2 - 4 ≥ n^2 for all n ≥ 1)
5. is w(n) = Ω(n^3)? No (3n^2/2 + 11n/2 - 4 < n^3 for all n ≥ 3)

"Theta" - Tightly Bounding Running Time

Definition: f(n) ∈ Θ(g(n)) if there exist constants c1, c2 > 0 and n0 > 0 such that c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.

Intuition:
- f(n) ∈ Θ(g(n)) means f(n) is of the same order as, or equal to, g(n) when we ignore small values of n
- f(n) is eventually trapped between two constant multiples of g(n)

Useful way to show "Theta" relationships: show both a "Big Oh" and a "Big Omega" relationship.
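The "show both bounds" recipe can be illustrated numerically for w(n) = Θ(n^2), reusing the witnesses c1 = 1 (from the Ω question) and c2 = 4 (from the O question); a finite sample, not a proof:

```python
# w(n) = Theta(n^2): trap w(n) between c1*n^2 and c2*n^2 for n >= n0.
w = lambda n: 3 * n * n / 2 + 11 * n / 2 - 4
g = lambda n: n * n

c1, c2, n0 = 1, 4, 1
trapped = all(c1 * g(n) <= w(n) <= c2 * g(n) for n in range(n0, 10_000))
print(trapped)  # w(n) stays between 1*n^2 and 4*n^2 on the sampled range
```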

Insertion Sort - Time Complexity

Time complexities for insertion sort are:
o best-case: b(n) = 5n - 4
o worst-case: w(n) = 3n^2/2 + 11n/2 - 4

Questions:
1. is b(n) = Θ(n)?   Yes, because b(n) = O(n) and b(n) = Ω(n)
3. is w(n) = Θ(n)?   No, because w(n) ∉ O(n)
4. is w(n) = Θ(n^2)? Yes, because w(n) = O(n^2) and w(n) = Ω(n^2)
5. is w(n) = Θ(n^3)? No, because w(n) ∉ Ω(n^3)

Asymptotic Analysis
Classifying algorithms is generally done in terms of worst-case running time:
- O(f(n)): Big Oh - asymptotic upper bound
- Ω(f(n)): Big Omega - asymptotic lower bound
- Θ(f(n)): Theta - asymptotic tight bound

Useful Properties for Asymptotic Analysis

- If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)) (transitivity)
  intuition: if f(n) "≤" g(n) and g(n) "≤" h(n), then f(n) "≤" h(n)
  (transitivity also holds for Ω and Θ)
- f(n) = O(g(n)) iff g(n) = Ω(f(n))
  intuition: f(n) "≤" g(n) iff g(n) "≥" f(n)
- f(n) = Θ(g(n)) iff g(n) = Θ(f(n))
  intuition: f(n) "=" g(n) iff g(n) "=" f(n)

Useful Properties for Asymptotic Analysis

- O(f(n) + g(n)) = O(max(f(n), g(n)))
  e.g., O(n^3 + n) = O(n^3)
- O(f(n)) · O(g(n)) = O(f(n) · g(n))
  e.g., O(n^3) · O(n) = O(n^4)
  (both properties also hold for Ω and Θ)

Little Oh
"Little Oh" notation is used to denote strict upper bounds (Big-Oh bounds are not necessarily strict inequalities).
Definition: f(n) ∈ o(g(n)) if for every c > 0, there exists some n0 > 0 such that f(n) < c·g(n) for all n ≥ n0.
Intuition:
- f(n) ∈ o(g(n)) means f(n) is "strictly less than" any constant multiple of g(n) when we ignore small values of n
- f(n) is trapped below every constant multiple of g(n) for large enough n

Little Omega
"Little Omega" notation is used to denote strict lower bounds (Ω bounds are not necessarily strict inequalities).
Definition: f(n) ∈ ω(g(n)) if for every c > 0, there exists some n0 > 0 such that f(n) > c·g(n) for all n ≥ n0.
Intuition:
- f(n) ∈ ω(g(n)) means f(n) is "strictly greater than" any constant multiple of g(n) when we ignore small values of n
- f(n) is trapped above every constant multiple of g(n) for large enough n

Little Oh and Little Omega

Showing "Little Oh" and "Little Omega" relationships:

  f(n) ∈ o(g(n))  iff  lim_{n→∞} f(n) / g(n) = 0
  f(n) ∈ ω(g(n))  iff  lim_{n→∞} f(n) / g(n) = ∞

Handy Asymptotic Facts

a) If T(n) is a polynomial function of degree k, then T(n) = O(n^k)
b) n^b = O(a^n) for any constants a > 1, b > 0 (exponentials dominate polynomials); in particular, any exponential function with a base strictly greater than 1 grows faster than any polynomial function
c) n = Ω(lg n), n = O(n lg n), and n^2 = Ω(n lg n)
d) n! = o(n^n)
e) n! = ω(2^n)
f) lg(n!) = Θ(n lg n) (by Stirling's approximation)
g) The base of a logarithm doesn't matter asymptotically, but the base of an exponential function and the degree of a polynomial do matter asymptotically
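Facts (d) and (e) can be illustrated with the limit test by computing the ratios for growing n (a finite sample that only suggests the limits, not a proof):

```python
import math

# n!/n^n should head to 0 (so n! = o(n^n)),
# n!/2^n should grow without bound (so n! = omega(2^n)).
small = [math.factorial(n) / n**n for n in (5, 10, 15)]
big = [math.factorial(n) / 2**n for n in (5, 10, 15)]

assert small[0] > small[1] > small[2]  # ratio shrinking toward 0
assert big[0] < big[1] < big[2]        # ratio growing without bound
print("n! = o(n^n) and n! = omega(2^n), empirically")
```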

Iterated logarithm function (lg* n), "log star of n":
- the number of times the log function must be iteratively applied before the result is less than or equal to 1
- very slow growing, e.g., lg*(2^65536) = 5 (and 2^65536 is much larger than the number of atoms in the observable universe!)
- examples: lg* 2 = 1, lg* 4 = 2, lg* 16 = 3, lg* 65536 = 4
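A minimal implementation of lg*, following the definition above (repeatedly apply log2 until the value drops to 1 or below):

```python
import math

def log_star(n):
    """Iterated logarithm: how many times log2 must be applied
    before the result is <= 1."""
    count = 0
    while n > 1:
        n = math.log2(n)
        count += 1
    return count

print([log_star(n) for n in (2, 4, 16, 65536)])  # [1, 2, 3, 4]
```

Note that the slide's lg*(2^65536) = 5 follows by definition (one more application than lg* 65536 = 4), even though 2^65536 is far too large to evaluate as a float.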

Basic Asymptotic Efficiency Classes

Class   Name          Comments
1       Constant      Algorithm ignores input (i.e., can't even scan input)
lg n    Logarithmic   Cuts problem size by a constant fraction on each iteration
n       Linear        Algorithm scans its input (at least)
n lg n  n-log-n       Some divide and conquer
n^2     Quadratic     Loop inside loop = nested loop
n^3     Cubic         Loop inside nested loop
2^n     Exponential   Algorithm generates all subsets of an n-element set
n!      Factorial     Algorithm generates all permutations of an n-element set
