Introduction
The methods of algorithm design form one of the core
practical technologies of computer science.
The main aim of this lecture is to familiarize the student
with the framework we shall use throughout the course
for the design and analysis of algorithms.
We start with a discussion of the algorithms needed to
solve computational problems. The problem of sorting is
used as a running example.
We introduce pseudocode to show how we shall
specify algorithms.
Algorithms
The word algorithm comes from the name of the Persian
mathematician Abu Ja'far Mohammed ibn Musa
al-Khwarizmi.
In computer science, this word refers to a precise
method usable by a computer for the solution of a problem.
The statement of the problem specifies in general terms
the desired input/output relationship.
For example, sorting a given sequence of numbers into
nondecreasing order provides fertile ground for
introducing many standard design techniques and
analysis tools.
Insertion Sort
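As a concrete rendering, the algorithm can be written, for example, in Python (a sketch of ours; note the 0-based indexing, so the outer loop starts at index 1 rather than at 2 as in the pseudocode analyzed later):

```python
def insertion_sort(a):
    """Sort list a into nondecreasing order, in place."""
    for j in range(1, len(a)):   # a[j] is the next element to insert
        key = a[j]
        i = j - 1
        # Shift elements larger than key one slot to the right.
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return a

insertion_sort([5, 2, 4, 6, 1, 3])  # -> [1, 2, 3, 4, 5, 6]
```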
Analysis of algorithms
The theoretical study of computer-program
performance and resource usage.
What's more important than performance?
modularity
correctness
maintainability
functionality
robustness
user-friendliness
programmer time
simplicity
extensibility
reliability
Analysis of algorithms
Why study algorithms and performance?
Algorithms help us to understand scalability.
Performance often draws the line between what is feasible
and what is impossible.
Algorithmic mathematics provides a language for talking
about program behavior.
The lessons of program performance generalize to other
computing resources.
Speed is fun!
Running Time
The running time depends on the input: an already
sorted sequence is easier to sort.
Parameterize the running time by the size of the input,
since short sequences are easier to sort than long ones.
Generally, we seek upper bounds on the running time,
because everybody likes a guarantee.
Kinds of analyses
Worst-case: (usually)
T(n) = maximum time of algorithm on any input of
size n.
Average-case: (sometimes)
T(n) = expected time of algorithm over all inputs of
size n.
Need assumption of statistical distribution of inputs.
Best-case:
Cheat with a slow algorithm that works fast on some
input.
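To make the worst-case/best-case distinction concrete, here is a sketch (ours, in Python) that counts the key comparisons insertion sort performs on a sorted input versus a reverse-sorted input of size n = 100:

```python
def insertion_sort_comparisons(a):
    """Sort a copy of a; return the number of key comparisons made."""
    a = list(a)
    comparisons = 0
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        while i >= 0:
            comparisons += 1          # one comparison of key vs a[i]
            if a[i] <= key:
                break
            a[i + 1] = a[i]
            i -= 1
        a[i + 1] = key
    return comparisons

n = 100
best = insertion_sort_comparisons(range(n))          # already sorted: n - 1
worst = insertion_sort_comparisons(range(n, 0, -1))  # reversed: n(n-1)/2
```

Best case does one comparison per outer iteration (99 total); worst case does j comparisons for each j (4950 total), which previews the quadratic analysis below.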
Machine-Independent time
The RAM Model
Machine-independent algorithm design depends on a
hypothetical computer called the Random Access Machine (RAM).
Assumptions:
Each simple operation such as +, -, if ... etc. takes exactly
one time step.
Loops and subroutines are not considered simple
operations; they are composed of many single-step operations.
Each memory access takes exactly one time step.
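As an illustration of the unit-cost assumption, the following sketch (ours; the charging scheme is a deliberate simplification) tallies a crude RAM step count for summing an array, charging one step per memory access and one per addition:

```python
def sum_array_steps(a):
    """Sum an array while tallying RAM steps: one step per memory
    access and one per addition (a crude unit-cost tally)."""
    steps = 0
    total = 0
    for x in a:       # n iterations of the loop body
        steps += 1    # charge the memory access of a[i]
        total += x
        steps += 1    # charge the addition
    return total, steps

total, steps = sum_array_steps([3, 1, 4, 1, 5])  # steps grows as 2n
```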
Machine-independent time
Analysis of INSERTION-SORT

INSERTION-SORT(A)                              cost   times
1  for j <- 2 to length[A]                     c1     n
2      do key <- A[j]                          c2     n - 1
3         (insert A[j] into the sorted
           sequence A[1 .. j-1])               0      n - 1
4         i <- j - 1                           c4     n - 1
5         while i > 0 and A[i] > key           c5     sum_{j=2..n} t_j
6             do A[i+1] <- A[i]                c6     sum_{j=2..n} (t_j - 1)
7                i <- i - 1                    c7     sum_{j=2..n} (t_j - 1)
8         A[i+1] <- key                        c8     n - 1

Here t_j denotes the number of times the while-loop test in
line 5 is executed for that value of j.
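The quantity t_j can be checked empirically. The following Python sketch (our instrumentation, not part of the original pseudocode) records t_j for each outer-loop iteration; on a reverse-sorted input, t_j = j for j = 2, ..., n:

```python
def insertion_sort_counts(a):
    """Insertion sort that records t_j, the number of times the
    while-loop test runs for each j (1-based j as in the pseudocode,
    so the list t corresponds to j = 2, 3, ..., n)."""
    a = list(a)
    t = []
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        tests = 0
        while True:
            tests += 1                       # the while test runs
            if not (i >= 0 and a[i] > key):  # test fails: loop exits
                break
            a[i + 1] = a[i]
            i -= 1
        t.append(tests)
        a[i + 1] = key
    return a, t

# Reverse-sorted input of size 5: t_j = j for j = 2, ..., 5
sorted_a, t = insertion_sort_counts([5, 4, 3, 2, 1])  # t == [2, 3, 4, 5]
```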
Analysis of INSERTION-SORT (contd.)

T(n) = c1*n + c2*(n-1) + c4*(n-1) + c5 * sum_{j=2..n} t_j
     + c6 * sum_{j=2..n} (t_j - 1) + c7 * sum_{j=2..n} (t_j - 1)
     + c8*(n-1).
Analysis of INSERTION-SORT (contd.)

The best case: the array is already sorted (t_j = 1 for
j = 2, 3, ..., n):

T(n) = c1*n + c2*(n-1) + c4*(n-1) + c5*(n-1) + c8*(n-1)
     = (c1 + c2 + c4 + c5 + c8)*n - (c2 + c4 + c5 + c8),

a linear function of n.
Analysis of INSERTION-SORT (contd.)

The worst case: the array is reverse sorted
(t_j = j for j = 2, 3, ..., n).

Since sum_{j=2..n} j = n(n+1)/2 - 1 and
sum_{j=2..n} (j - 1) = n(n-1)/2,

T(n) = c1*n + c2*(n-1) + c4*(n-1) + c5*(n(n+1)/2 - 1)
     + c6*(n(n-1)/2) + c7*(n(n-1)/2) + c8*(n-1)
     = (c5/2 + c6/2 + c7/2)*n^2
     + (c1 + c2 + c4 + c5/2 - c6/2 - c7/2 + c8)*n
     - (c2 + c4 + c5 + c8).

Thus T(n) = a*n^2 + b*n + c, a quadratic function of n.
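One way to see the quadratic behavior directly is to count the inner-loop shifts on reverse-sorted inputs; in this sketch (ours), the count equals n(n-1)/2, the dominant term of T(n), so doubling n roughly quadruples the work:

```python
def worst_case_steps(n):
    """Count inner-loop shifts of insertion sort on a reverse-sorted
    array of size n; this equals n(n-1)/2, the quadratic term of T(n)."""
    a = list(range(n, 0, -1))
    shifts = 0
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            i -= 1
            shifts += 1
        a[i + 1] = key
    return shifts

# worst_case_steps(10) -> 45, worst_case_steps(20) -> 190: ~4x per doubling
```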
Growth of Functions
Although we can sometimes determine the exact
running time of an algorithm, the extra precision is not
usually worth the effort of computing it.
For large inputs, the multiplicative constants and lower
order terms of an exact running time are dominated by
the effects of the input size itself.
Asymptotic Notation
The notations we use to describe the asymptotic running
time of an algorithm are defined in terms of functions
whose domains are the set of natural numbers
N = {0, 1, 2, ...}.
O-notation
For a given function g(n), we denote by O(g(n)) the
set of functions

O(g(n)) = { f(n) : there exist positive constants c and n0 s.t.
            0 <= f(n) <= c*g(n) for all n >= n0 }.
Ω-notation

For a given function g(n), we denote by Ω(g(n)) the
set of functions

Ω(g(n)) = { f(n) : there exist positive constants c and n0 s.t.
            0 <= c*g(n) <= f(n) for all n >= n0 }.
Θ-notation

For a given function g(n), we denote by Θ(g(n)) the
set of functions

Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 s.t.
            0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0 }.
Asymptotic notation
Graphic examples of Θ, O, and Ω.
Example 1.

Show that f(n) = (1/2)*n^2 - 3n = Θ(n^2).
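One way to verify this is to exhibit the witnesses c1, c2, n0 required by the definition of Θ; a sketch of the standard argument:

```latex
We need $c_1 n^2 \le \tfrac{1}{2}n^2 - 3n \le c_2 n^2$ for all
$n \ge n_0$. Dividing through by $n^2$:
\[
c_1 \;\le\; \tfrac{1}{2} - \tfrac{3}{n} \;\le\; c_2 .
\]
% Right inequality: holds for all n >= 1 with c_2 = 1/2.
% Left inequality: for n >= 7, 1/2 - 3/n >= 1/2 - 3/7 = 1/14.
Choosing $c_1 = \tfrac{1}{14}$, $c_2 = \tfrac{1}{2}$, $n_0 = 7$
satisfies the definition, so $\tfrac{1}{2}n^2 - 3n = \Theta(n^2)$.
```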
Theorem

For any two functions f(n) and g(n), we have
f(n) = Θ(g(n))
if and only if
f(n) = O(g(n)) and f(n) = Ω(g(n)).
Example 2.

f(n) = 3n^2 + 2n + 5 = Θ(n^2)

Because:
3n^2 + 2n + 5 = Ω(n^2)
3n^2 + 2n + 5 = O(n^2)
Example 3.

3n^2 - 100n + 6 = O(n^2), since for c = 3, 3n^2 > 3n^2 - 100n + 6.
3n^2 - 100n + 6 = O(n^3), since for c = 1, n^3 > 3n^2 - 100n + 6 when n > 3.
3n^2 - 100n + 6 != O(n), since for any c, c*n < 3n^2 when n > c.
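These claims can be spot-checked numerically; a small Python sketch (the function name f is ours):

```python
def f(n):
    return 3 * n * n - 100 * n + 6

# O(n^2) witness: c = 3, n0 = 1 (since -100n + 6 < 0 for n >= 1)
assert all(f(n) <= 3 * n * n for n in range(1, 1000))

# O(n^3) witness: c = 1, n0 = 4 (n^3 dominates once n > 3)
assert all(f(n) <= n ** 3 for n in range(4, 1000))

# Not O(n): f(n)/n = 3n - 100 + 6/n grows without bound
assert f(1000) / 1000 > 2900
```

Such checks do not prove the asymptotic claims, but they make the choice of witnesses c and n0 concrete.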
lg n = log_2 n        (binary logarithm)
ln n = log_e n        (natural logarithm)
log^k n = (log n)^k   (exponentiation)
lg lg n = lg(lg n)    (composition)

log_c(ab) = log_c a + log_c b
log_b a^n = n log_b a
log_b a = log_c a / log_c b
log_b (1/a) = -log_b a
log_b a = 1 / log_a b
a^(log_b c) = c^(log_b a)
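These identities can be sanity-checked numerically with Python's math module (the sample values a, b, c, n below are arbitrary):

```python
import math

a, b, c, n = 8.0, 2.0, 10.0, 5

# log_c(ab) = log_c a + log_c b
assert math.isclose(math.log(a * b, c), math.log(a, c) + math.log(b, c))
# log_b(a^n) = n log_b a
assert math.isclose(math.log(a ** n, b), n * math.log(a, b))
# change of base: log_b a = log_c a / log_c b
assert math.isclose(math.log(a, b), math.log(a, c) / math.log(b, c))
# log_b(1/a) = -log_b a
assert math.isclose(math.log(1 / a, b), -math.log(a, b))
# log_b a = 1 / log_a b
assert math.isclose(math.log(a, b), 1 / math.log(b, a))
# a^(log_b c) = c^(log_b a)
assert math.isclose(a ** math.log(c, b), c ** math.log(a, b))
```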
Stirling's approximation:

n! = sqrt(2*pi*n) * (n/e)^n * (1 + Θ(1/n))

n! = o(n^n)
n! = ω(2^n)
lg(n!) = Θ(n lg n)
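Stirling's approximation and the lg(n!) bound can be illustrated numerically; a rough check of ours at n = 100 using math.factorial:

```python
import math

def stirling(n):
    """The leading factor of Stirling's approximation to n!."""
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

# The ratio n! / stirling(n) tends to 1; the error is Theta(1/n),
# so at n = 100 it is already below a tenth of a percent.
ratio = math.factorial(100) / stirling(100)

# lg(n!) = Theta(n lg n): at n = 100, lg(100!) sits between
# (n lg n)/2 and n lg n.
exact = math.log2(math.factorial(100))
bound = 100 * math.log2(100)
```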