
Discrete Mathematics

Summer 2004
By Dan Barrish-Flood, originally for Fundamental Algorithms.
Adapted for use by Harper Langston in Discrete Mathematics.

What is an Algorithm?
A problem is a precise specification of input and desired output.
An algorithm is a precise specification of a method for solving a problem.

What We Want in an Algorithm
Correctness:
  Always halts
  Always gives the right answer
Efficiency:
  Time
  Space
Simplicity

A Typical Problem
Sorting
Input:
  A sequence b1, b2, ..., bn
Output:
  A permutation b'1, b'2, ..., b'n of that sequence such that
  b'1 ≤ b'2 ≤ b'3 ≤ ... ≤ b'n

A Typical Algorithm

Insertion-Sort(A)
1  for j ← 2 to length[A]
2      do key ← A[j]
3         ▹ Insert A[j] into the sorted sequence A[1 .. j-1]
4         i ← j - 1
5         while i > 0 and A[i] > key
6             do A[i + 1] ← A[i]
7                i ← i - 1
8         A[i + 1] ← key
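As a concrete companion to the pseudocode, here is a minimal Python sketch, assuming 0-based list indexing (so the bounds shift by one relative to the numbered lines above):

def insertion_sort(A):
    """Sort list A in place; mirrors the pseudocode with 0-based indexing."""
    for j in range(1, len(A)):            # pseudocode line 1: j = 2 .. length[A]
        key = A[j]                        # line 2
        i = j - 1                         # line 4
        while i >= 0 and A[i] > key:      # line 5 (i > 0 becomes i >= 0)
            A[i + 1] = A[i]               # line 6
            i -= 1                        # line 7
        A[i + 1] = key                    # line 8
    return A

print(insertion_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]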

Running Time

Insertion-Sort(A)                                              Cost   Times
1  for j ← 2 to length[A]                                      c1     n
2      do key ← A[j]                                           c2     n-1
3         ▹ Insert A[j] into the sorted sequence A[1 .. j-1]   0      n-1
4         i ← j - 1                                            c4     n-1
5         while i > 0 and A[i] > key                           c5     Σ_{j=2..n} t_j
6             do A[i + 1] ← A[i]                               c6     Σ_{j=2..n} (t_j - 1)
7                i ← i - 1                                     c7     Σ_{j=2..n} (t_j - 1)
8         A[i + 1] ← key                                       c8     n-1

t_j is the number of times the while-loop test in line 5 is executed for that value of j.
Total running time is T(n) = c1·n + c2·(n-1) + ... + c8·(n-1).
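To make the t_j column concrete, a small instrumented Python sketch (an illustration, not part of the original analysis) can count the line-5 tests for each j:

def while_test_counts(A):
    """Return t_j for each j: how often the line-5 test runs, including the final failing test."""
    counts = []
    for j in range(1, len(A)):
        key, i, shifts = A[j], j - 1, 0
        while i >= 0 and A[i] > key:
            A[i + 1] = A[i]
            i -= 1
            shifts += 1
        A[i + 1] = key
        counts.append(shifts + 1)         # +1 for the test that fails and exits the loop
    return counts

print(while_test_counts([5, 4, 3, 2, 1]))   # reverse-sorted input: [2, 3, 4, 5]

On a reverse-sorted input, t_j takes its maximum value j, so Σ_{j=2..n} t_j = n(n+1)/2 - 1, which is where the quadratic worst case comes from.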

A Better Way to Estimate Running Time
Trying to determine the exact running time is tedious, complicated, and platform-specific.
Instead, we'll bound the worst-case running time to within a constant factor for large inputs.
Why worst-case?
  It's easier.
  It's a guarantee.
  It's well-defined.
  Average case is often no better.

Insertion Sort, Redux
Outer loop happens about n times.
Inner loop is done j times for each j, worst case.
j is always < n.
So running time is upper-bounded by about n · n = n^2.
Insertion sort runs in O(n^2) time, worst-case.

We'll make all the "abouts" and "roughs" precise.
These so-called asymptotic running times dominate constant factors for large enough inputs.

Functions
constants:                       0, 1
polynomials:                     n, 2n^3 - 5
exponentials:                    2^n, 3^(n/2)
logarithms and polylogarithms:   log_2 n, log^2 n
factorial:                       n! = n·(n-1)·...·1

Polynomials
Σ_{i=0..d} a_i·n^i
E.g. 2n^3 - 5 = -5n^0 + 0n^1 + 0n^2 + 2n^3
(More generally, a sum of terms a·n^c where c is some real number, e.g. n^(1/2) = √n.)
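A quick sanity check of the coefficient form (a sketch; the coefficient list is just the example above written out):

def poly(coeffs, n):
    """Evaluate the sum of a_i * n**i, where coeffs[i] = a_i."""
    return sum(a * n**i for i, a in enumerate(coeffs))

# 2n^3 - 5 has coefficients (a0, a1, a2, a3) = (-5, 0, 0, 2)
assert poly([-5, 0, 0, 2], 10) == 2 * 10**3 - 5   # both are 1995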

f(n) is monotonically increasing iff m ≤ n implies f(m) ≤ f(n).
I.e., it never goes down; it always increases or stays the same.
For strictly increasing, replace ≤ in the above with <.
I.e., the function is always going up.

Are constant functions monotonically increasing?
Yes! (But not strictly increasing.)
If c > 0, then n^c is strictly increasing.
So even n^0.001 (= the 1000th root of n) increases forever to ∞.

Exponentials
f(n) = a^n
Facts:
  a^0 = 1
  a^1 = a
  a^(-1) = 1/a
  a^(1/n) = the n-th root of a
  a^(m+n) = a^m · a^n
    e.g. 2^(n+1) = 2^n · 2^1 = 2 · 2^n
  (a^m)^n = a^(mn) = (a^n)^m
    e.g. 2^(n/2) = (2^(1/2))^n = (√2)^n
If a > 1, a^n is strictly increasing to ∞.
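These rules are easy to spot-check numerically; a short Python sketch with illustrative values of my own choosing:

import math

n = 10
assert 2 ** (n + 1) == 2 * 2 ** n                            # a^(m+n) = a^m * a^n
assert (2 ** 3) ** n == 2 ** (3 * n) == (2 ** n) ** 3        # (a^m)^n = a^(mn) = (a^n)^m
assert math.isclose(math.sqrt(2 ** n), math.sqrt(2) ** n)    # 2^(n/2) = (sqrt 2)^n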

Logarithms
log_b n = that number x such that b^x = n.
Some notation:
  lg n = log_2 n
  ln n = log_e n
  log^k n = (log n)^k
  log log n = log(log n)
Recall:
  log_b b = 1
  log_b 1 = 0
  For 0 < x < 1, log_b x < 0.

Log Facts
1. n = b^(log_b n)   (definition)
2. log_b(xy) = log_b x + log_b y
3. log_b a^n = n·log_b a
4. log_b x = log_c x / log_c b
5. log_b(1/a) = -log_b a
6. log_b a = 1/(log_a b)
7. a^(log_b c) = c^(log_b a)
If b > 1, then log_b n is strictly increasing.
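All seven facts can be spot-checked with math.log; a sketch using arbitrary positive test values:

import math

def log(b, x):
    """Log base b of x."""
    return math.log(x) / math.log(b)

b, x, y, a, c = 2.0, 8.0, 32.0, 3.0, 5.0
assert math.isclose(b ** log(b, x), x)                        # fact 1
assert math.isclose(log(b, x * y), log(b, x) + log(b, y))     # fact 2
assert math.isclose(log(b, a ** 7), 7 * log(b, a))            # fact 3
assert math.isclose(log(b, x), log(10, x) / log(10, b))       # fact 4
assert math.isclose(log(b, 1 / a), -log(b, a))                # fact 5
assert math.isclose(log(b, a), 1 / log(a, b))                 # fact 6
assert math.isclose(a ** log(b, c), c ** log(b, a))           # fact 7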

More About Logs
A polylogarithm is just log^k n.
For k > 0 and n > 0, log^k n is strictly increasing to ∞.

Factorial
n! = n·(n-1)·(n-2)·...·1 = Π_{i=1..n} i
0! = 1 (by definition)
n! is strictly increasing to ∞.
Stirling's approximation:
  n! = √(2πn) · (n/e)^n · (1 + Θ(1/n))
Even without Stirling, we can see:
  For n > 3, 2^n < n! < n^n.
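A sketch comparing the exact factorial to Stirling's leading term (small n only, to stay in floating-point range):

import math

def stirling(n):
    """Leading term of Stirling's approximation: sqrt(2*pi*n) * (n/e)**n."""
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

for n in (1, 5, 10, 20):
    print(n, math.factorial(n) / stirling(n))   # ratio approaches 1 as n grows

The ratio behaves like 1 + 1/(12n), which is exactly the (1 + Θ(1/n)) correction factor in the statement above.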

More About Monotonically Increasing Functions
If f(n) and g(n) are monotonically increasing, then so are
  f(n) + g(n)
  f(g(n))
and if f(n) and g(n) are also > 0, then
  f(n)·g(n)
is monotonically increasing, too.
Examples: n log n, log log n, 2^n + n^3

Bounding Functions
We are more interested in finding upper and lower
bounds on functions that describe the running
times of algorithms, rather than finding the exact
function.
Also, we are only interested in considering large
inputs: everything is fast for small inputs.
So we want asymptotic bounds.
Finally, we are willing to neglect constant factors
when bounding. They are tedious to compute, and
are often hardware-dependent anyway.

Example
[Plot: 3n + 8]
3n + 8 is certainly not upper-bounded by n ...
[Plot: 3n + 8 versus 10n]
... but when n ≥ 2, 3n + 8 is upper-bounded by 10n.

Notation for Function Bounds
(The most important definitions in the course. We'll use them every day.)
Upper Bound: O ("big-oh")
We say f(n) = O(g(n)) iff there exist two positive constants c and n0 such that
  f(n) ≤ c·g(n) for all n ≥ n0.
E.g.
  3n + 8 = O(n)
  [Proof: choose c = 10 and n0 = 2; there are many other choices.]
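A finite spot-check of that witness pair; this is evidence rather than a proof, since it can only test finitely many n (the helper name is my own):

def holds(f, g, c, n0, limit=10**5):
    """Check f(n) <= c*g(n) for n0 <= n < limit."""
    return all(f(n) <= c * g(n) for n in range(n0, limit))

print(holds(lambda n: 3 * n + 8, lambda n: n, c=10, n0=2))    # True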

Big-Oh Facts
The "=" is misleading. E.g.,
  O(g(n)) = f(n)
is meaningless.
It denotes an upper bound, not necessarily a tight bound. Thus,
  n = O(n)
  n = O(n^2)
  n = O(2^n), etc.
Transitivity:
  If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)).

Examples
We've seen that 3n + 8 = O(n).
Is 10n = O(3n + 8)?
Yes: choose c = 10, n0 = 1:
  10n ≤ 30n + 80.

More Examples
2n^2 + 5n - 8 = O(n^2)?
Yes:
  2n^2 + 5n - 8 ≤ c·n^2
Divide by n^2:
  2 + 5/n - 8/n^2 ≤ c
Choose c = 3; subtracting 2:
  5/n - 8/n^2 ≤ 1
Choose n0 = 5.
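The same kind of finite spot-check works here too (again evidence, not a proof):

assert all(2 * n * n + 5 * n - 8 <= 3 * n * n for n in range(5, 10**5))   # c = 3, n0 = 5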

More Examples
Is 2^n = O(3^n)?
Yes: in fact, 2^n ≤ 3^n (c = 1, n0 = 1).
Is 3^n = O(2^n)?
  3^n ≤ c·2^n
Divide by 2^n:
  (3/2)^n ≤ c
But since 3/2 > 1, (3/2)^n is strictly increasing to ∞;
it cannot be upper-bounded by any constant.
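A small sketch showing why no constant works: (3/2)^n overtakes any candidate bound c, and quickly.

for c in (10, 10**3, 10**9):
    n = 0
    while 1.5 ** n <= c:
        n += 1
    print(f"(3/2)^n first exceeds {c} at n = {n}")   # n = 6, 18, 52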

Lower Bound Notation
f(n) = Ω(g(n)) iff there are positive constants c and n0
such that f(n) ≥ c·g(n) for all n ≥ n0.
(Same as the definition of O, with ≤ replaced by ≥.)
E.g.
  3^n = Ω(2^n)
  n^2 = Ω(n)
In fact:
  If f(n) = O(g(n)), then g(n) = Ω(f(n)), and vice versa.

Tight Bound Notation
(Both upper and lower)
f(n) = Θ(g(n)) iff there are positive constants c1, c2, and n0 such that
  c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0.
[Plot: f(n) lies between c1·g(n) and c2·g(n) to the right of n0]

More Facts
Θ is transitive, just like O and Ω.
f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)).
f(n) = Θ(g(n)) iff f(n) = O(g(n)) and g(n) = O(f(n)).
f(n) = Θ(g(n)) iff g(n) = Θ(f(n)).

Examples
We've seen that 3n + 8 = O(n) and n = O(3n + 8), so 3n + 8 = Θ(n).
2n^2 + 5n - 8 = Θ(n^2). This is because:
  We saw earlier that 2n^2 + 5n - 8 = O(n^2).
  Also, 2n^2 + 5n - 8 = Ω(n^2) [c = 1, n0 = 2].
n^2 ≠ Θ(n^3). This is because:
  n^2 = O(n^3), but not vice versa.
log_b1 n = Θ(log_b2 n), since log functions differ by a constant factor.
Therefore, we can write Θ(lg n) for all logarithmic growth.

Another Upper-Bound Notation
We say f(n) = o(g(n)) ["little-oh"] iff
  lim_{n→∞} f(n)/g(n) = 0.
E.g.
  2n = o(n^2)
  2n^2 is not o(n^2).
Think of f(n) = o(g(n)) as
"f(n) is (asymptotically) negligible with respect to g(n)."
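The limit is easy to watch numerically; a sketch:

for n in (10, 10**3, 10**6, 10**9):
    print(n, (2 * n) / n**2, (2 * n**2) / n**2)
# The first ratio tends to 0, so 2n = o(n^2);
# the second stays at 2, so 2n^2 is not o(n^2).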

Functions and Bounds
Constant functions:
All constant functions are Θ(1). (Also Θ(2), Θ(3), etc., but we use 1 by convention.)
If f(n) is a constant function and g(n) is strictly increasing to ∞, then f(n) = o(g(n)).
(Proof: g(n) → ∞, so lim_{n→∞} f(n)/g(n) = 0.)

Logarithms
lg n = o(n)
In fact,
  log_b n = o(n^a) for any constants a, b where a > 0.   (*)
So, perhaps surprisingly,
  lg^100 n = o(n^0.01)
in spite of the fact that for n = 4,
  lg^100 n = 2^100 (huge!) while n^0.01 ≈ 1.014.
Substituting lg n for n in (*), we see (lg lg n)^b = o(lg^a n), and so on.

Polynomials
Any polynomial of degree d whose leading coefficient is positive is Θ(n^d).
Proof: homework for next class.
Substituting 2^n for n in lg^b n = o(n^a), a > 0, we have:
  (lg(2^n))^b = o((2^n)^a)
  n^b = o((2^a)^n)   (recall lg n means log base 2)
Renaming 2^a (where a > 0) to a (where a > 1):
  n^b = o(a^n), a > 1, i.e., polynomials = o(exponentials).
Thus, n^100 = o(1.001^n).
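Since n^100 and 1.001^n overflow floating point almost immediately, a sketch can compare their logarithms instead and scan for the crossover point (the coarse 1000-step scan is my choice):

import math

n = 2
while 100 * math.log(n) >= n * math.log(1.001):   # compare ln(n^100) vs. ln(1.001^n)
    n += 1000
print(n)   # roughly 1.4 million: beyond this, 1.001^n dominates n^100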

Comparing Functions
n vs. √n
√n vs. lg^2 n
2^n vs. 2^(n/2)
n lg n vs. n√n
lg n vs. ln n
n / lg n vs. lg n
√(lg n) vs. lg(√n)
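A numeric sketch for building intuition about these pairs as reconstructed above (the winner at one large n suggests, but does not prove, the asymptotic answer):

import math

n = 2.0 ** 40                       # one large sample point

def lg(x):
    return math.log(x, 2)

pairs = [
    ("n",          n,                "sqrt(n)",    math.sqrt(n)),
    ("sqrt(n)",    math.sqrt(n),     "lg^2 n",     lg(n) ** 2),
    ("2^n",        2.0 ** 50,        "2^(n/2)",    2.0 ** 25),   # smaller exponent so floats stay finite
    ("n lg n",     n * lg(n),        "n sqrt(n)",  n * math.sqrt(n)),
    ("lg n",       lg(n),            "ln n",       math.log(n)),
    ("n / lg n",   n / lg(n),        "lg n",       lg(n)),
    ("sqrt(lg n)", math.sqrt(lg(n)), "lg sqrt(n)", lg(math.sqrt(n))),
]
for name1, v1, name2, v2 in pairs:
    print(f"{name1} = {v1:.3g}   {name2} = {v2:.3g}")

Note that lg n and ln n differ only by the constant factor ln 2, so each is Θ of the other.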
