
COSC 3101A - Design and Analysis of Algorithms

Lecture 3
Recurrences
Master Method
Heapsort and Priority Queues

Many of the slides are taken from Monica Nicolescu’s slides, Univ. Nevada, Reno, monica@cs.unr.edu
Fibonacci numbers

Leonardo Pisano
Born: 1170 in (probably) Pisa (now in Italy)
Died: 1250 in (possibly) Pisa (now in Italy)

1, 1, 2, 3, 5, 8, 13, 21, 34, 55, ...

"A certain man put a pair of rabbits in a place surrounded on all sides by a
wall. How many pairs of rabbits can be produced from that pair in a year if it
is supposed that every month each pair begets a new pair which from the second
month on becomes productive?"

• F(n) = F(n-1) + F(n-2)
• F(1) = 0, F(2) = 1
• F(3) = 1, F(4) = 2, F(5) = 3, and so on

5/18/2004 Lecture 3 COSC3101A 2
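The recurrence above can be transcribed directly into code. A minimal sketch in C (the function name `fib` is ours), using the slide's boundary conditions F(1) = 0, F(2) = 1:

```c
#include <assert.h>

/* Fibonacci via the recurrence F(n) = F(n-1) + F(n-2),
   with the slide's boundary conditions F(1) = 0, F(2) = 1. */
int fib(int n) {
    if (n == 1) return 0;   /* base case F(1) */
    if (n == 2) return 1;   /* base case F(2) */
    return fib(n - 1) + fib(n - 2);
}
```

Each call spawns two further calls, so this direct transcription runs in exponential time; analyzing running times like this is exactly what the recurrence methods in this lecture are for.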


Review: Recursive Algorithms
• A recursive algorithm is an algorithm that solves a problem by repeatedly
reducing it to a smaller version of itself, until it reaches a form for which
a solution exists.

• Compared with iterative algorithms:


– More memory
– More computation
– Simpler, natural way of thinking about the problem



Review: Recursive Example (1)
N! (N factorial) is defined for non-negative values as:

N! = 1 if N = 0
N! = N * (N-1) * (N-2) * (N-3) * … * 1 if N > 0
 
For example:
5! = 5*4*3*2*1 = 120 3! = 3*2*1 = 6
 
The definition of N! can be restated as:
 
N! = N * (N-1)!, where 0! = 1
 
This is the same N!, but it is defined by reducing the problem to a
smaller version of the original (hence, recursively). 



Review: Recursive Example (2)
int factorial(int num){
  if (num == 0) return 1;
  else return num * factorial(num - 1);
}

Assume that factorial is called with the argument 4: 


Call to factorial(4) will return  4 * value returned by factorial(3) 

Call to factorial(3) will return  3 * value returned by factorial(2)

Call to factorial(2) will return  2 * value returned by factorial(1)

Call to factorial(1) will return  1 * value returned by factorial(0)

  Call to factorial(0) returns  1


Review: Recursive Example (3)
The call to factorial(0) has returned, so now factorial(1) can finish:
Call to factorial(1) returns  1 * value returned by factorial(0) 1 => 1

The call to factorial(1) has returned, so now factorial(2) can finish:


Call to factorial(2) returns  2 * value returned by factorial(1) 1 => 2

The call to factorial(2) has returned, so now factorial(3) can finish:


Call to factorial(3) returns  3 * value returned by factorial(2) 2 => 6

The call to factorial(3) has returned, so now factorial(4) can finish:


Call to factorial(4) returns  4 * value returned by factorial(3) 6=>24

Recursion results in a large number of 'activation records' (one per method
call) being placed on the system stack.



Recurrences
Def.: Recurrence = an equation or inequality that
describes a function in terms of its value on smaller
inputs, and one or more base cases

E.g.: Fibonacci numbers:


• Recurrence: F(n) = F(n-1) + F(n-2)

• Boundary conditions: F(1) = 0, F(2) = 1

• Compute: F(3) = 1, F(4) = 2, F(5) = 3, and so on

In many cases, the running time of an algorithm is


expressed as a recurrence!
Recurrences and Running Time
• Recurrences arise when an algorithm contains
recursive calls to itself

• What is the actual running time of the algorithm?


• Need to solve the recurrence
– Find an explicit formula of the expression (the generic
term of the sequence)



Typical Recurrences and Their Running Times
• T(n) = T(n-1) + n            Θ(n^2)
  – Recursive algorithm that loops through the input to eliminate one item
• T(n) = T(n/2) + c            Θ(lgn)
  – Recursive algorithm that halves the input in one step
• T(n) = T(n/2) + n            Θ(n)
  – Recursive algorithm that halves the input but must examine every item
    in the input
• T(n) = 2T(n/2) + 1           Θ(n)
  – Recursive algorithm that splits the input into 2 halves and does a
    constant amount of other work
Recurrences - Intuition
• For a recurrence of the type:

      T(n) = aT(n/b) + f(n)

• It takes f(n) to process the problem of size n
• The algorithm divides the problem into a subproblems, each of size n/b
• T(n) = number of subproblems * running time on a subproblem of size n/b +
  cost f(n) of processing the problem of size n
Methods for Solving Recurrences
• Iteration Method
• Substitution Method
• Recursion Tree Method
• Master Method



Iteration Method

1. Expand (iterate) the recurrence

2. Express the function as a summation of terms that depend only on n and
   the initial condition


Iteration Method – Example(1)
T(n) = c + T(n/2)
T(n) = c + T(n/2)           T(n/2) = c + T(n/4)
     = c + c + T(n/4)       T(n/4) = c + T(n/8)
     = c + c + c + T(n/8)

Assume n = 2^k ⇒ k = lgn
T(n) = c + c + … + c + T(1)     (k times)
     = c·lgn + T(1)
     = Θ(lgn)
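The closed form can be sanity-checked numerically. A sketch in C, with c = 1 and T(1) = 1 as our assumed constants: evaluating the recurrence directly on powers of two gives exactly c·lgn + T(1).

```c
#include <assert.h>

/* The recurrence T(n) = c + T(n/2) with T(1) = 1, evaluated directly.
   (c = 1 and T(1) = 1 are our choices for the sketch.) */
int T(int n) {
    if (n == 1) return 1;
    return 1 + T(n / 2);      /* c = 1 */
}

/* lg n for exact powers of two */
int lg(int n) {
    int k = 0;
    while (n > 1) { n /= 2; k++; }
    return k;
}
```

For n = 2^k the expansion unrolls k times, so T(n) = lgn + T(1), matching the closed form above.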


Iteration Method – Example(2)
T(n) = n + 2T(n/2)              Assume: n = 2^k
T(n) = n + 2T(n/2)              T(n/2) = n/2 + 2T(n/4)
     = n + 2(n/2 + 2T(n/4))
     = n + n + 4T(n/4)
     = n + n + 4(n/4 + 2T(n/8))
     = n + n + n + 8T(n/8)
     …
     = i·n + 2^i·T(n/2^i)
     = k·n + 2^k·T(1)
     = n·lgn + n·T(1) = Θ(nlgn)
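This closed form can also be checked numerically. A sketch in C, with T(1) = 1 as our assumed base value: on powers of two the recurrence evaluates to exactly n·lgn + n·T(1).

```c
#include <assert.h>

/* T(n) = n + 2T(n/2) with T(1) = 1, evaluated on powers of two. */
long T2(long n) {
    if (n == 1) return 1;
    return n + 2 * T2(n / 2);
}

/* lg n for exact powers of two */
long lg2(long n) {
    long k = 0;
    while (n > 1) { n /= 2; k++; }
    return k;
}
```

For example, T2(8) = 8 + 2·12 = 32 and 8·lg(8) + 8 = 32, so the two agree term by term.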


Substitution method (1)

1. Guess a solution
   • Experience, creativity
   • Iteration method, recursion-tree method

2. Use induction to prove that the solution works


Substitution method (2)
• Guess a solution
– T(n) = O(g(n))
– Induction goal: apply the definition of the asymptotic notation

• T(n) ≤ c g(n), for some c > 0 and n ≥ n0

– Induction hypothesis: T(k) ≤ c g(k) for all k < n

• Prove the induction goal


– Use the induction hypothesis to find some values of the
constants c and n0 for which the induction goal holds



Substitution Method – Example 1-1
T(n) = T(n-1) + n
• Guess: T(n) = O(n^2)
  – Induction goal: T(n) ≤ c·n^2, for some c and n ≥ n0
  – Induction hypothesis: T(k) ≤ c·k^2 for all k < n

• Proof of induction goal:
  T(n) = T(n-1) + n ≤ c(n-1)^2 + n
       = c·n^2 – (2cn – c – n) ≤ c·n^2
  if: 2cn – c – n ≥ 0 ⇒ c ≥ n/(2n-1) ⇒ c ≥ 1/(2 – 1/n)
  – For n ≥ 1 ⇒ 2 – 1/n ≥ 1 ⇒ any c ≥ 1 will work


Substitution Method – Example 1-2
T(n) = T(n-1) + n
• Boundary conditions:
  – Base case: n0 = 1 ⇒ T(1) = 1 has to verify the condition:
    T(1) ≤ c·(1)^2 ⇒ 1 ≤ c ⇒ OK!

• We can similarly prove that T(n) = Ω(n^2) <your practice>
• And therefore: T(n) = Θ(n^2)


Substitution Method – Example 2-1
T(n) = 2T(n/2) + n
• Guess: T(n) = O(nlgn)
  – Induction goal: T(n) ≤ cn·lgn, for some c and n ≥ n0
  – Induction hypothesis: T(n/2) ≤ c(n/2)lg(n/2)

• Proof of induction goal:
  T(n) = 2T(n/2) + n ≤ 2c(n/2)lg(n/2) + n
       = cn·lgn – cn + n ≤ cn·lgn
  if: –cn + n ≤ 0 ⇒ c ≥ 1
Substitution Method – Example 2-2
T(n) = 2T(n/2) + n
• Boundary conditions:
  – Base case: n0 = 1 ⇒ T(1) = 1 has to verify the condition:
    T(1) ≤ c·n0·lgn0 ⇒ 1 ≤ c · 1 · lg1 = 0 – contradiction
  – Choose n0 = 2 ⇒ T(2) = 4 has to verify the condition:
    T(2) ≤ c · 2 · lg2 ⇒ 4 ≤ 2c ⇒ choose c = 2

• We can similarly prove that T(n) = Ω(nlgn) <your practice>
• And therefore: T(n) = Θ(nlgn)
Changing variables
T(n) = 2T(√n) + lgn
  – Rename: m = lgn ⇒ n = 2^m
    T(2^m) = 2T(2^(m/2)) + m
  – Rename: S(m) = T(2^m)
    S(m) = 2S(m/2) + m ⇒ S(m) = O(m·lgm) (demonstrated before)
  T(n) = T(2^m) = S(m) = O(m·lgm) = O(lgn·lglgn)

Idea: transform the recurrence into one that you have seen before
Recursion-tree method

Convert the recurrence into a tree:

1. Each node represents the cost incurred at


various levels of recursion

2. Sum up the costs of all levels

Used to “guess” a solution for the recurrence



Recursion-tree Example 1
W(n) = 2W(n/2) + n^2

• Subproblem size at level i is: n/2^i
• Subproblem size hits 1 when 1 = n/2^i ⇒ i = lgn
• Cost of the problem at level i = n^2/2^i     No. of nodes at level i = 2^i
• Total cost:

  W(n) = Σ(i=0 to lgn-1) n^2/2^i + 2^lgn·W(1)
       = n^2·Σ(i=0 to lgn-1) (1/2)^i + n·W(1)
       ≤ n^2·Σ(i=0 to ∞) (1/2)^i + O(n)
       = n^2·(1/(1 – 1/2)) + O(n) = 2n^2 + O(n)

⇒ W(n) = O(n^2)
Recursion-tree Example 2
E.g.: T(n) = 3T(n/4) + cn^2

• Subproblem size at level i is: n/4^i
• Subproblem size hits 1 when 1 = n/4^i ⇒ i = log4(n)
• Cost of a node at level i = c(n/4^i)^2
• Number of nodes at level i = 3^i ⇒ last level has 3^log4(n) = n^log4(3) nodes
• Total cost:

  T(n) = Σ(i=0 to log4(n)-1) (3/16)^i·cn^2 + Θ(n^log4(3))
       ≤ Σ(i=0 to ∞) (3/16)^i·cn^2 + Θ(n^log4(3))
       = cn^2·(1/(1 – 3/16)) + Θ(n^log4(3)) = O(n^2)

⇒ T(n) = O(n^2)
Master method
• “Cookbook” for solving recurrences of the form:

      T(n) = aT(n/b) + f(n)

  where a ≥ 1, b > 1, and f(n) > 0

Case 1: if f(n) = O(n^(logb(a) – ε)) for some ε > 0, then: T(n) = Θ(n^logb(a))

Case 2: if f(n) = Θ(n^logb(a)), then: T(n) = Θ(n^logb(a)·lgn)

Case 3: if f(n) = Ω(n^(logb(a) + ε)) for some ε > 0, and if
        a·f(n/b) ≤ c·f(n) for some c < 1 and all sufficiently large n
        (the regularity condition), then: T(n) = Θ(f(n))


Why n^logb(a)?

Iterate T(n) = aT(n/b):
  T(n) = aT(n/b)
       = a^2·T(n/b^2)
       = a^3·T(n/b^3)
       …
       = a^i·T(n/b^i)

• Assume n = b^k ⇒ k = logb(n)
• At the end of iteration i = k:
  T(n) = a^logb(n)·T(1) = Θ(a^logb(n)) = Θ(n^logb(a))

So for T(n) = aT(n/b) + f(n):
• Case 1: if f(n) is dominated by n^logb(a): T(n) = Θ(n^logb(a))
• Case 2: if f(n) = Θ(n^logb(a)): T(n) = Θ(n^logb(a)·lgn)
• Case 3: if f(n) dominates n^logb(a): T(n) = Θ(f(n))
Examples (1)

T(n) = 2T(n/2) + n

a = 2, b = 2, log22 = 1

Compare nlog22 with f(n) = n

 f(n) = (n)  Case 2

 T(n) = (nlgn)
5/18/2004 Lecture 3 COSC3101A 27
Examples (2)
T(n) = 2T(n/2) + n^2
a = 2, b = 2, log2(2) = 1
Compare n with f(n) = n^2
⇒ f(n) = Ω(n^(1+ε)) ⇒ Case 3 ⇒ verify regularity condition:
   a·f(n/b) ≤ c·f(n)
⇒ 2·(n^2/4) ≤ c·n^2 ⇒ c = ½ is a solution (c < 1)
⇒ T(n) = Θ(n^2)


Examples (3)

T(n) = 2T(n/2) + √n

a = 2, b = 2, log2(2) = 1

Compare n with f(n) = n^(1/2)

⇒ f(n) = O(n^(1-ε)) ⇒ Case 1

⇒ T(n) = Θ(n)


Examples (4)

T(n) = 3T(n/4) + nlgn

a = 3, b = 4, log4(3) ≈ 0.793

Compare n^0.793 with f(n) = nlgn

⇒ f(n) = Ω(n^(log4(3)+ε)) ⇒ Case 3

Check the regularity condition:
  3·(n/4)·lg(n/4) ≤ (3/4)·nlgn = c·f(n), with c = 3/4 < 1

⇒ T(n) = Θ(nlgn)
Examples (5)

T(n) = 2T(n/2) + nlgn

a = 2, b = 2, log2(2) = 1

• Compare n with f(n) = nlgn
  – seems like Case 3 should apply
• But f(n) must be polynomially larger, i.e., larger by a factor of n^ε
• In this case it is only larger by a factor of lgn ⇒ the master method
  does not apply


A Job Scheduling Application
• Job scheduling
– The key is the priority of the jobs in the queue
– The job with the highest priority needs to be executed
next

• Operations
– Insert, remove maximum

• Data structures
– Priority queues
– Ordered array/list, unordered array/list


PQ Implementations & Cost
Worst-case asymptotic costs for a PQ with N items

Insert Remove max

ordered array N 1
ordered list N 1
unordered array 1 N
unordered list 1 N

Can we implement both operations efficiently?


Background on Trees
• Def: Binary tree = structure composed of a finite
set of nodes that either:
– Contains no nodes, or
– Is composed of three disjoint sets of nodes: a root
node, a left subtree and a right subtree
[figure: a binary tree with root node 4, a left subtree, and a right subtree]


Special Types of Trees
• Def: Full binary tree = a binary tree in which each node is either a leaf
  or has degree exactly 2.  [figure: full binary tree]

• Def: Complete binary tree = a binary tree in which all leaves have the same
  depth and all internal nodes have degree 2.  [figure: complete binary tree]


The Heap Data Structure
• Def: A heap is a nearly complete binary tree with the following two
  properties:
  – Structural property: all levels are full, except possibly the last one,
    which is filled from left to right
  – Order (heap) property: for any node x, Parent(x) ≥ x

[figure: the heap 7 / 5, 4 / 2 – it doesn't matter that 4 in level 1 is
 smaller than 5 in level 2]
Definitions
• Height of a node = the number of edges on a longest simple path from the
  node down to a leaf
• Depth of a node = the length of the path from the root to the node
• Height of tree = height of root node = ⌊lgn⌋, for a heap of n elements

[figure: example heap – height of root = 3, height of node (2) = 1,
 depth of node (10) = 2]


Array Representation of Heaps
• A heap can be stored as an array A:
  – Root of tree is A[1]
  – Left child of A[i] = A[2i]
  – Right child of A[i] = A[2i + 1]
  – Parent of A[i] = A[⌊i/2⌋]
  – heap-size[A] ≤ length[A]
• The elements in the subarray A[(⌊n/2⌋+1) .. n] are leaves
• The root is the maximum element of the heap

A heap is a binary tree that is filled in order
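The index arithmetic above maps directly to code. A minimal sketch in C for the 1-based indices used on the slide (the function names are ours):

```c
#include <assert.h>

/* 1-based heap index arithmetic, as on the slide. */
int left(int i)   { return 2 * i; }
int right(int i)  { return 2 * i + 1; }
int parent(int i) { return i / 2; }   /* integer division computes floor(i/2) */
```

With 1-based indices both children of node i share the same parent i, because integer division discards the remainder of 2i+1.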
Heap Types
• Max-heaps (largest element at root), have the
max-heap property:
– for all nodes i, excluding the root:
A[PARENT(i)] ≥ A[i]

• Min-heaps (smallest element at root), have the


min-heap property:
– for all nodes i, excluding the root:
A[PARENT(i)] ≤ A[i]



Operations on Heaps
• Maintain the max-heap property
– MAX-HEAPIFY
• Create a max-heap from an unordered array
– BUILD-MAX-HEAP
• Sort an array in place
– HEAPSORT
• Priority queue operations



Operations on Priority Queues

• Max-priority queues support the following


operations:
– INSERT(S, x): inserts element x into set S

– EXTRACT-MAX(S): removes and returns element of


S with largest key
– MAXIMUM(S): returns element of S with largest key

– INCREASE-KEY(S, x, k): increases value of element


x’s key to k (Assume k ≥ x’s current key value)
Maintaining the Heap Property
• Suppose a node is smaller than a
child
– Left and Right subtrees of i are max-heaps
• Invariant:
– the heap condition is violated only at that
node
• To eliminate the violation:
– Exchange with larger child
– Move down the tree
– Continue until node is not smaller than
children



Maintaining the Heap Property
• Assumptions:
  – Left and Right subtrees of i are max-heaps
  – A[i] may be smaller than its children

Alg: MAX-HEAPIFY(A, i, n)
1. l ← LEFT(i)
2. r ← RIGHT(i)
3. if l ≤ n and A[l] > A[i]
4.    then largest ← l
5.    else largest ← i
6. if r ≤ n and A[r] > A[largest]
7.    then largest ← r
8. if largest ≠ i
9.    then exchange A[i] ↔ A[largest]
10.        MAX-HEAPIFY(A, largest, n)
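The pseudocode above transcribes directly to C. A sketch using 1-based indexing with A[0] unused (our convention, chosen to match the slide):

```c
#include <assert.h>

/* MAX-HEAPIFY in C. The heap occupies A[1..n]; A[0] is unused,
   matching the slide's 1-based indexing. */
void max_heapify(int A[], int i, int n) {
    int l = 2 * i, r = 2 * i + 1;
    int largest = i;
    if (l <= n && A[l] > A[i])       largest = l;
    if (r <= n && A[r] > A[largest]) largest = r;
    if (largest != i) {
        int tmp = A[i]; A[i] = A[largest]; A[largest] = tmp;
        max_heapify(A, largest, n);   /* float the offending value down */
    }
}
```

On the ten-element heap 16 / 4, 10 / 14, 7, 9, 3 / 2, 8, 1 (our assumed example data), MAX-HEAPIFY(A, 2, 10) exchanges A[2] with A[4] and then A[4] with A[9], as in the example on the next slide.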
Example
MAX-HEAPIFY(A, 2, 10)

A[2] violates the heap property ⇒ exchange A[2] ↔ A[4]
A[4] now violates the heap property ⇒ exchange A[4] ↔ A[9]
Heap property restored
MAX-HEAPIFY Running Time
• Intuitively:
– A heap is an almost complete binary tree  must
process O(lgn) levels, with constant work at each
level

• Running time of MAX-HEAPIFY is O(lgn)

• Can be written in terms of the height of the heap,


as being O(h)
– Since the height of the heap is lgn
Building a Heap
• Convert an array A[1 … n] into a max-heap (n = length[A])
• The elements in the subarray A[(⌊n/2⌋+1) .. n] are leaves
• Apply MAX-HEAPIFY on elements from ⌊n/2⌋ downto 1

Alg: BUILD-MAX-HEAP(A)
1. n = length[A]
2. for i ← ⌊n/2⌋ downto 1
3.    do MAX-HEAPIFY(A, i, n)

A: 4 1 3 2 16 9 10 14 8 7
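The same loop in C, with MAX-HEAPIFY repeated so the sketch is self-contained (1-based, A[0] unused):

```c
#include <assert.h>

/* MAX-HEAPIFY, repeated here so the sketch is self-contained. */
void max_heapify(int A[], int i, int n) {
    int l = 2 * i, r = 2 * i + 1, largest = i;
    if (l <= n && A[l] > A[i])       largest = l;
    if (r <= n && A[r] > A[largest]) largest = r;
    if (largest != i) {
        int tmp = A[i]; A[i] = A[largest]; A[largest] = tmp;
        max_heapify(A, largest, n);
    }
}

/* BUILD-MAX-HEAP: heapify every internal node, bottom-up. */
void build_max_heap(int A[], int n) {
    /* A[n/2+1 .. n] are leaves, i.e. already trivial max-heaps */
    for (int i = n / 2; i >= 1; i--)
        max_heapify(A, i, n);
}
```

Running it on the slide's array A = [4 1 3 2 16 9 10 14 8 7] produces a max-heap with 16 at the root.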


Example: A = [4 1 3 2 16 9 10 14 8 7]

[figure: the heap after each call MAX-HEAPIFY(A, i, 10), for
 i = 5, 4, 3, 2, 1; the final max-heap has root 16]
Correctness of BUILD-MAX-HEAP
• Loop invariant:
  – At the start of each iteration of the for loop, each node
    i + 1, i + 2, …, n is the root of a max-heap
• Initialization:
  – i = ⌊n/2⌋: nodes ⌊n/2⌋+1, ⌊n/2⌋+2, …, n are leaves ⇒
    they are the roots of trivial max-heaps


Correctness of BUILD-MAX-HEAP
• Maintenance:
  – MAX-HEAPIFY makes node i a max-heap root and preserves the property
    that nodes i + 1, i + 2, …, n are roots of max-heaps
  – Decrementing i in the for loop reestablishes the loop invariant
• Termination:
  – i = 0 ⇒ each node 1, 2, …, n is the root of a max-heap
    (by the loop invariant)


Running Time of BUILD-MAX-HEAP

Alg: BUILD-MAX-HEAP(A)
1. n = length[A]
2. for i ← ⌊n/2⌋ downto 1         O(n) iterations
3.    do MAX-HEAPIFY(A, i, n)     O(lgn) each

⇒ Running time: O(nlgn)

• This is not an asymptotically tight upper bound


Running Time of BUILD-MAX-HEAP
• HEAPIFY takes O(h) ⇒ the cost of HEAPIFY on a node i is proportional
  to the height of node i in the tree

  Height            Level       No. of nodes
  h0 = 3 (= lgn)    i = 0       2^0
  h1 = 2            i = 1       2^1
  h2 = 1            i = 2       2^2
  h3 = 0            i = 3       2^3

  hi = h – i   height of the nodes at level i
  ni = 2^i     number of nodes at level i       ⇒ T(n) = Σ(i=0 to h) ni·hi


Running Time of BUILD-MAX-HEAP
T(n) = Σ(i=0 to h) ni·hi                 Cost of HEAPIFY at level i × number of nodes at that level

     = Σ(i=0 to h) 2^i·(h – i)           Replace the values of ni and hi computed before

     = Σ(i=0 to h) (h – i)·2^h/2^(h–i)   Multiply and divide by 2^h, writing 2^i as 2^h/2^(h–i)

     = 2^h·Σ(k=0 to h) k/2^k             Change variables: k = h – i

     ≤ n·Σ(k=0 to ∞) k/2^k               2^h ≤ n, and extending the sum to ∞ only increases it

     = O(n)                              The sum Σ(k=0 to ∞) k/2^k converges to 2

Running time of BUILD-MAX-HEAP: T(n) = O(n)
Heapsort
• Goal:
– Sort an array using heap representations

• Idea:
– Build a max-heap from the array
– Swap the root (the maximum element) with the last
element in the array
– “Discard” this last node by decreasing the heap size
– Call MAX-HEAPIFY on the new root
– Repeat this process until only one node remains



Alg: HEAPSORT(A)

1. BUILD-MAX-HEAP(A)                O(n)
2. for i ← length[A] downto 2       n-1 times
3.    do exchange A[1] ↔ A[i]
4.       MAX-HEAPIFY(A, 1, i - 1)   O(lgn)

• Running time: O(nlgn)
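The whole algorithm fits in a few lines of C. A self-contained sketch (1-based, A[0] unused; `heap_sort` is our name):

```c
#include <assert.h>

/* MAX-HEAPIFY, repeated so the sketch is self-contained. */
void max_heapify(int A[], int i, int n) {
    int l = 2 * i, r = 2 * i + 1, largest = i;
    if (l <= n && A[l] > A[i])       largest = l;
    if (r <= n && A[r] > A[largest]) largest = r;
    if (largest != i) {
        int tmp = A[i]; A[i] = A[largest]; A[largest] = tmp;
        max_heapify(A, largest, n);
    }
}

void heap_sort(int A[], int n) {
    for (int i = n / 2; i >= 1; i--)        /* BUILD-MAX-HEAP: O(n) */
        max_heapify(A, i, n);
    for (int i = n; i >= 2; i--) {          /* n-1 extractions */
        int tmp = A[1]; A[1] = A[i]; A[i] = tmp;  /* move max to the end */
        max_heapify(A, 1, i - 1);           /* restore heap on the prefix */
    }
}
```

Sorting is in place: the sorted suffix grows from the right while the heap shrinks from the left, which is the "discard the last node" step of the slide.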


Example: A = [7, 4, 3, 1, 2]

[figure: after each swap of the root with the last heap element, the calls
 MAX-HEAPIFY(A, 1, 4), MAX-HEAPIFY(A, 1, 3), MAX-HEAPIFY(A, 1, 2),
 MAX-HEAPIFY(A, 1, 1) restore the heap on the shrinking prefix]


HEAP-MAXIMUM
Goal:
– Return the largest element of the heap

Running time: O(1)


Alg: HEAP-MAXIMUM(A)
1. return A[1]
Heap A: [figure: heap with root 7]

HEAP-MAXIMUM(A) returns 7
HEAP-EXTRACT-MAX
Goal:
  – Extract the largest element of the heap (i.e., return the max value
    and also remove that element from the heap)
Idea:
  – Exchange the root element with the last
  – Decrease the size of the heap by 1 element
  – Call MAX-HEAPIFY on the new root, on a heap of size n-1

Heap A: [figure] The root is the largest element


HEAP-EXTRACT-MAX

Alg: HEAP-EXTRACT-MAX(A, n)

1. if n < 1
2. then error “heap underflow”

3. max ← A[1]

4. A[1] ← A[n]

5. MAX-HEAPIFY(A, 1, n-1) remakes heap

6. return max
Running time: O(lgn)
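In C this looks as follows. A self-contained sketch; passing the heap size through a pointer so it can be decremented is our convention, not the slide's:

```c
#include <assert.h>

/* MAX-HEAPIFY, repeated so the sketch is self-contained. */
void max_heapify(int A[], int i, int n) {
    int l = 2 * i, r = 2 * i + 1, largest = i;
    if (l <= n && A[l] > A[i])       largest = l;
    if (r <= n && A[r] > A[largest]) largest = r;
    if (largest != i) {
        int tmp = A[i]; A[i] = A[largest]; A[largest] = tmp;
        max_heapify(A, largest, n);
    }
}

/* HEAP-EXTRACT-MAX: return the root and remake the heap on n-1 elements. */
int heap_extract_max(int A[], int *n) {
    assert(*n >= 1 && "heap underflow");
    int max = A[1];
    A[1] = A[*n];            /* move the last element to the root */
    (*n)--;                  /* shrink the heap */
    max_heapify(A, 1, *n);   /* remake the heap */
    return max;
}
```

On the heap 16 / 14, 10 / 8, 7, 9, 3 / 2, 4, 1 from the example that follows, this returns 16 and leaves 14 at the root.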
Example: HEAP-EXTRACT-MAX
[figure: from the heap 16 / 14, 10 / 8, 7, 9, 3 / 2, 4, 1:
 max = 16 is returned, the last element 1 moves to the root, and the heap
 size decreases by 1; after MAX-HEAPIFY(A, 1, n-1) the root is 14]


HEAP-INCREASE-KEY
• Goal:
  – Increase the key of an element i in the heap
• Idea:
  – Increment the key of A[i] to its new value
  – If the max-heap property does not hold anymore: traverse a path toward
    the root to find the proper place for the newly increased key

[figure: Key[i] ← 15 applied to the node with key 4]


HEAP-INCREASE-KEY
Alg: HEAP-INCREASE-KEY(A, i, key)

1. if key < A[i]
2.    then error “new key is smaller than current key”
3. A[i] ← key
4. while i > 1 and A[PARENT(i)] < A[i]
5.    do exchange A[i] ↔ A[PARENT(i)]
6.       i ← PARENT(i)

• Running time: O(lgn)
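A C transcription of the algorithm (1-based, A[0] unused; using an assertion in place of the error step is our choice):

```c
#include <assert.h>

/* HEAP-INCREASE-KEY: bubble the increased key up toward the root
   until the max-heap property holds again. */
void heap_increase_key(int A[], int i, int key) {
    assert(key >= A[i] && "new key is smaller than current key");
    A[i] = key;
    while (i > 1 && A[i / 2] < A[i]) {   /* PARENT(i) = i/2 */
        int tmp = A[i]; A[i] = A[i / 2]; A[i / 2] = tmp;
        i /= 2;
    }
}
```

On the slide's heap, increasing the key 4 (at index 9, our assumed layout) to 15 exchanges 15 first with its parent 8, then with its parent 14, and stops below the root 16, matching the example that follows.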


Example: HEAP-INCREASE-KEY
[figure: Key[i] ← 15 on the node with key 4; 15 is exchanged with its parent
 8, then with its parent 14, and stops below the root 16]


MAX-HEAP-INSERT
• Goal:
  – Insert a new element into a max-heap
• Idea:
  – Expand the max-heap with a new element whose key is -∞
  – Call HEAP-INCREASE-KEY to set the key of the new node to its correct
    value and maintain the max-heap property
MAX-HEAP-INSERT

Alg: MAX-HEAP-INSERT(A, key, n)

1. heap-size[A] ← n + 1
2. A[n + 1] ← -∞
3. HEAP-INCREASE-KEY(A, n + 1, key)

Running time: O(lgn)
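In C, with INT_MIN standing in for -∞ and the heap size passed through a pointer (both are our conventions for the sketch; the caller's array must have room for the new slot):

```c
#include <assert.h>
#include <limits.h>

/* HEAP-INCREASE-KEY, repeated so the sketch is self-contained. */
void heap_increase_key(int A[], int i, int key) {
    assert(key >= A[i]);
    A[i] = key;
    while (i > 1 && A[i / 2] < A[i]) {
        int tmp = A[i]; A[i] = A[i / 2]; A[i / 2] = tmp;
        i /= 2;
    }
}

/* MAX-HEAP-INSERT: append a -infinity sentinel leaf, then raise its key. */
void max_heap_insert(int A[], int key, int *n) {
    (*n)++;                  /* heap-size <- n + 1 */
    A[*n] = INT_MIN;         /* new leaf with key -infinity */
    heap_increase_key(A, *n, key);
}
```

Inserting 15 into the ten-element heap of the following example bubbles it past 7 and 14, leaving it as a child of the root 16.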


Example: MAX-HEAP-INSERT
Insert value 15:
- Start by inserting a new leaf with key -∞
- Call HEAP-INCREASE-KEY on A[11] = 15

[figure: 15 is exchanged with its parent 7, then with its parent 14;
 the restored heap contains the newly added element below the root 16]
Summary
• We can perform the following operations on
heaps:
– MAX-HEAPIFY O(lgn)
– BUILD-MAX-HEAP O(n)
– HEAP-SORT O(nlgn)
– MAX-HEAP-INSERT O(lgn)
– HEAP-EXTRACT-MAX O(lgn)
– HEAP-INCREASE-KEY O(lgn)
– HEAP-MAXIMUM O(1)
Readings
• Chapters 8, 6

