Cost of selection sort:
T(n) = Σ_{i=1}^{n-1} (find minimum + swap element)
     = (n-1) + (n-2) + (n-3) + ... + 1 = n(n-1)/2
or:  Σ_{i=1}^{n-1} (n - i) = n(n-1)/2 = O(n^2)
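The derivation above matches the comparison count of selection sort; a minimal Python sketch (the comparison counter is added for illustration):

```python
def selection_sort(a):
    """Sort the list a in place; return the number of comparisons made."""
    comparisons = 0
    n = len(a)
    for i in range(n - 1):
        # Find the index of the minimum element in a[i..n-1]
        min_idx = i
        for j in range(i + 1, n):
            comparisons += 1
            if a[j] < a[min_idx]:
                min_idx = j
        # Swap the minimum into position i
        a[i], a[min_idx] = a[min_idx], a[i]
    return comparisons

data = [5, 2, 9, 1, 7, 3]
count = selection_sort(data)
print(data)   # the sorted list
print(count)  # 15 comparisons for n = 6, i.e. n(n-1)/2
```

The inner loop performs n - i comparisons on pass i, so the total is exactly n(n-1)/2 regardless of the input order.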
Binary search
sorted sequence (search for 9):
1 4 5 7 9 10 12 15
step 1: compare 9 with the middle element, 7; 9 > 7, so discard the left half
step 2: compare 9 with 10; 9 < 10, so discard the right portion
step 3: compare 9 with 9; found
best case: 1 step = O(1)
worst case: (log2 n + 1) steps = O(log n)
average case: O(log n) steps
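The steps above can be sketched as an iterative binary search (a minimal Python version; the step counter is illustrative):

```python
def binary_search(seq, target):
    """Return (index, steps); index is -1 if target is not in seq."""
    lo, hi = 0, len(seq) - 1
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if seq[mid] == target:
            return mid, steps
        elif seq[mid] < target:
            lo = mid + 1   # discard the left half
        else:
            hi = mid - 1   # discard the right half
    return -1, steps

print(binary_search([1, 4, 5, 7, 9, 10, 12, 15], 9))  # finds 9 at index 4 in 3 steps
```

Each step halves the remaining range, which is where the log2 n bound comes from.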
Growth Rate of an Algorithm (Order of Growth)
We often want to compare the performance of algorithms
When doing so we generally want to know how they
perform when the problem size (n) is large
Since cost functions are complex, and may be difficult to
compute, we approximate them using Asymptotic
notation
Example of a Cost Function
Cost Function: tA(n) = n^2 + 20n + 100
Which term dominates?
It depends on the size of n
n = 2: tA(n) = 4 + 40 + 100; the constant, 100, is the dominating term
n = 10: tA(n) = 100 + 200 + 100; 20n is the dominating term
n = 100: tA(n) = 10,000 + 2,000 + 100; n^2 is the dominating term
n = 1000: tA(n) = 1,000,000 + 20,000 + 100; n^2 is the dominating term
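The table above can be reproduced by evaluating each term of tA(n) separately (a quick Python sketch; the term labels are just for display):

```python
def cost_terms(n):
    """Evaluate the three terms of tA(n) = n^2 + 20n + 100 separately."""
    return {"n^2": n * n, "20n": 20 * n, "100": 100}

for n in (2, 10, 100, 1000):
    terms = cost_terms(n)
    dominant = max(terms, key=terms.get)  # largest term at this n
    print(n, terms, "->", dominant)
```

Running it shows the dominating term shifting from the constant to 20n to n^2 as n grows.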
Asymptotic Notation: Introduction
Exact counting of operations is often difficult (and tedious), even for simple algorithms
Often, exact counts are not useful due to other factors, e.g. the language/machine used, or the implementation of the algorithm (different types of operations do not take the same time anyway)
Asymptotic notation is a mathematical language for evaluating the running time (and memory usage) of algorithms.
Big O Notation
O notation approximates the cost function of an algorithm
The approximation is usually good enough, especially when considering the efficiency of an algorithm as n gets very large
It allows us to estimate the rate of growth of the function
Instead of computing the entire cost function we only need to count the number of times that an algorithm executes its barometer instruction(s)
the instruction that is executed the greatest number of times in an algorithm (the highest order term)
The general idea:
when using Big-O notation, rather than giving a precise figure for the cost function at a specific data size n,
express the behaviour of the algorithm as its data size n grows very large,
so ignore
lower order terms and
constants
O Notation Examples
All these expressions are O(n):
n, 3n, 61n + 5, 22n - 5, ...
All these expressions are O(n^2):
n^2, 9n^2, 18n^2 + 4n - 53, ...
All these expressions are O(n log n):
n log n, 5n log 99n, 18 + (4n - 2) log(5n + 3), ...
Asymptotic Notation (cont.)
Note: Even though it is correct to say 7n - 3 is O(n^3), a better statement is 7n - 3 is O(n); that is, one should make the approximation as tight as possible
Simple Rule: Drop lower order terms and constant factors
7n - 3 is O(n)
8n^2 log n + 5n^2 + n is O(n^2 log n)
Worst/best/average cases
Worst case is the longest running time for any input of size n
O-notation represents an upper bound, i.e. an upper bound for the worst case.
Best case is the time taken for the input data set that results in the best possible performance. You cannot do better. This is a lower bound.
Average case is the average performance over all inputs of size n.
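Linear search makes the three cases concrete (a Python sketch; the comparison count is returned for illustration):

```python
def linear_search(seq, target):
    """Return the number of comparisons made; equals len(seq) if target is absent."""
    for count, value in enumerate(seq, start=1):
        if value == target:
            return count
    return len(seq)

data = [3, 8, 1, 9, 4]
print(linear_search(data, 3))  # best case: target is first, 1 comparison
print(linear_search(data, 4))  # worst case: target is last, n comparisons
print(linear_search(data, 7))  # target absent: still n comparisons
```

Best case is O(1), worst and absent cases are O(n), and on average (target equally likely in each position) about n/2 comparisons, still O(n).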
The Big-Oh Notation:
given functions f(n) and g(n), we say that f(n) is O(g(n)) if and only if there are positive constants c and n0 such that f(n) ≤ c·g(n) for n ≥ n0
Example
f(n) = 2n + 6
For f(n) = 2n + 6 and g(n) = n there are positive constants c and n0 such that f(n) ≤ c·g(n) for n ≥ n0
for example, c = 4 and n0 = 3 work, since 2n + 6 ≤ 4n whenever n ≥ 3
conclusion:
2n + 6 is O(n).
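The witnesses can be checked numerically; c = 4 and n0 = 3 are one valid choice used here for illustration (other constants also work):

```python
f = lambda n: 2 * n + 6  # the example cost function
g = lambda n: n          # the candidate bound
c, n0 = 4, 3             # candidate witnesses for f(n) = O(g(n))

# The definition requires f(n) <= c*g(n) for every n >= n0
# (verified here over a finite range only)
print(all(f(n) <= c * g(n) for n in range(n0, 10**5)))  # → True
```

Note that f(2) = 10 > 8 = c·g(2), which is why the threshold n0 is part of the definition.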
Asymptotic Notation (terminology)
Special classes of algorithms:
constant: O(1)
logarithmic: O(log n)
linear: O(n)
quadratic: O(n^2)
polynomial: O(n^k), k ≥ 1
exponential: O(a^n), a > 1
EXAMPLE
Consider (1/3)n^2 - 5n
The dominating term is n^2
Therefore it should be O(n^2)
Given a positive constant c, a positive integer n0 is to be found such that (1/3)n^2 - 5n ≤ c·n^2 for n ≥ n0
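For this example c = 1/3 and n0 = 1 already work, since subtracting 5n only makes the left-hand side smaller (a quick numerical check):

```python
f = lambda n: n * n / 3 - 5 * n  # f(n) = (1/3)n^2 - 5n
g = lambda n: n * n              # g(n) = n^2
c, n0 = 1 / 3, 1                 # candidate witnesses for f(n) = O(n^2)

# (1/3)n^2 - 5n <= (1/3)n^2 holds for every n >= 1
print(all(f(n) <= c * g(n) for n in range(n0, 10**5)))  # → True
```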
Asymptotic Analysis of The Running Time
Use the Big-Oh notation to express the
number of primitive operations executed as a
function of the input size.
Comparing the asymptotic running time
- an algorithm that runs in O(n) time is better than one that runs in O(n^2) time
- similarly, O(log n) is better than O(n)
- hierarchy of functions: log n << n << n^2 << n^3 << 2^n
Categories of algorithm efficiency
Efficiency          Big O
Constant            O(1)
Logarithmic         O(log n)
Linear              O(n)
Linear logarithmic  O(n log n)
Quadratic           O(n^2)
Polynomial          O(n^k)
Exponential         O(c^n)
Factorial           O(n!)
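The categories can be compared by tabulating each growth function at a few sizes (a Python sketch; 2^n and n! grow too fast for large n, so n is kept small):

```python
import math

growth = {
    "O(1)":       lambda n: 1,
    "O(log n)":   lambda n: math.log2(n),
    "O(n)":       lambda n: n,
    "O(n log n)": lambda n: n * math.log2(n),
    "O(n^2)":     lambda n: n ** 2,
    "O(2^n)":     lambda n: 2 ** n,
    "O(n!)":      lambda n: math.factorial(n),
}

# Tabulate each growth function at a few small sizes
for name, f in growth.items():
    print(f"{name:11s}", [round(f(n)) for n in (2, 4, 8, 16)])
```

Even at n = 16 the gap is dramatic: n^2 gives 256 while 2^n already gives 65,536.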
Θ-notation
Θ(g(n)) = {f(n): there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0}
g(n) is an asymptotically tight bound for f(n).
Intuitively: the set of all functions that have the same rate of growth as g(n).
O-notation
O(g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0}
g(n) is an asymptotic upper bound for f(n).
Intuitively: the set of all functions whose rate of growth is the same as or lower than that of g(n).
f(n) = Θ(g(n)) implies f(n) = O(g(n)).
Θ(g(n)) ⊂ O(g(n)).
Ω-notation
Ω(g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0}
g(n) is an asymptotic lower bound for f(n).
Intuitively: the set of all functions whose rate of growth is the same as or higher than that of g(n).
f(n) = Θ(g(n)) implies f(n) = Ω(g(n)).
Θ(g(n)) ⊂ Ω(g(n)).
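The Ω definition can be checked numerically for the earlier example f(n) = (1/3)n^2 - 5n, using c = 1/6 and n0 = 30 as candidate witnesses (an illustrative choice, not the only one):

```python
f = lambda n: n * n / 3 - 5 * n  # f(n) = (1/3)n^2 - 5n, as before
g = lambda n: n * n              # g(n) = n^2
c, n0 = 1 / 6, 30                # candidate witnesses for f(n) = Ω(n^2)

# The definition requires 0 <= c*g(n) <= f(n) for every n >= n0
# (verified here over a finite range only)
print(all(0 <= c * g(n) <= f(n) for n in range(n0, 10**4)))  # → True
```

At small n the bound fails (f(10) is negative while c·g(10) is not), which is again why the threshold n0 is needed.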
Relations Between Θ, O, Ω
For any two functions f(n) and g(n): f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).
Practical Complexity
[Plot comparing f(n) = log(n), n, n log(n), n^2, n^3, and 2^n for n = 1 to 20, with values shown up to 250]