
CS 306 DAA, IIITDM Jabalpur

Greedy Algorithms
Atul Gupta

A Greedy Algorithm
An algorithm that follows the problem-solving heuristic of making the locally optimal choice at each stage, with the hope of finding a global optimum. For some problems it finds an optimal (best) solution, while for other (hard) problems it may yield only a locally optimal solution that approximates the global optimum in reasonable time. For example, a greedy strategy for the traveling salesman problem is: at each step of the journey, visit the nearest unvisited city.
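As a concrete illustration, here is a minimal Python sketch of the nearest-neighbour heuristic on a small distance matrix; the matrix values and city count are illustrative assumptions, not data from the slides.

    # Nearest-neighbour heuristic for TSP (illustrative sketch).
    # Greedy choice: from the current city, always move to the
    # closest city that has not been visited yet.

    def nearest_neighbour_tour(dist, start=0):
        """dist[i][j] is the distance from city i to city j."""
        n = len(dist)
        unvisited = set(range(n)) - {start}
        tour = [start]
        current = start
        while unvisited:
            # locally optimal choice: nearest unvisited city
            nxt = min(unvisited, key=lambda c: dist[current][c])
            unvisited.remove(nxt)
            tour.append(nxt)
            current = nxt
        return tour

    # Example with a hypothetical 4-city distance matrix:
    dist = [[0, 2, 9, 10],
            [2, 0, 6, 4],
            [9, 6, 0, 8],
            [10, 4, 8, 0]]
    print(nearest_neighbour_tour(dist))   # [0, 1, 3, 2]

The tour it returns is not guaranteed to be optimal; it is only the sequence of locally best moves.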


Making Change
A greedy strategy determines the minimum number of coins to give while making change: repeatedly pick the largest coin that does not exceed the remaining amount. For 36 cents with coin values {1, 5, 10, 20}, this yields 20 + 10 + 5 + 1, i.e., four coins.

Making Change
In general, the change-making problem requires dynamic programming or integer programming to find an optimal solution (for instance, try the same amount with the coin set {10, 9, 2, 1}). However, most currency systems, including the Euro and the US Dollar, are special cases where the greedy strategy does find an optimal solution.
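A short sketch of both cases, assuming the amount 36 from the previous slide; the coin sets are the two quoted above.

    # Greedy change-making: repeatedly take the largest coin that
    # still fits into the remaining amount.

    def greedy_change(amount, coins):
        coins = sorted(coins, reverse=True)
        result = []
        for c in coins:
            while amount >= c:
                result.append(c)
                amount -= c
        return result

    # Canonical coin system: greedy is optimal here.
    print(greedy_change(36, [1, 5, 10, 20]))   # [20, 10, 5, 1] -> 4 coins

    # Non-canonical system: greedy uses 6 coins ...
    print(greedy_change(36, [10, 9, 2, 1]))    # [10, 10, 10, 2, 2, 2]
    # ... although 4 coins (9 + 9 + 9 + 9) would suffice.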


Where the Greedy Approach Is Useful

The greedy approach works for problems with the following two properties:
Greedy choice property - the algorithm iteratively makes one greedy choice after another, reducing each given problem to a smaller one.
Optimal substructure - an optimal solution to the problem contains optimal solutions to the subproblems.

Greedy Algorithms
Greedy algorithms can be characterized as 'short-sighted' and 'non-recoverable': a greedy algorithm never reconsiders its choices. Nevertheless, they are useful because they are quick to design and often give good approximations to the optimum.


The Greedy Approach May Fail


Consider a tree of numbers in which the goal is to find the root-to-leaf path with the largest sum. At each step the greedy algorithm chooses what appears to be the optimal immediate choice, so it picks 12 instead of 3 at the second step and never reaches the best path, which contains 99.
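A minimal sketch of this failure, assuming a small three-level tree consistent with the numbers quoted above (the exact tree in the original figure is not reproduced here):

    # Greedy vs. exhaustive search for the largest-sum root-to-leaf path.
    # The tree below is an assumed example: the greedy walk prefers
    # 12 over 3 and therefore never sees the 99.

    tree = (7, [(3, [(99, []), (1, [])]),
                (12, [(5, []), (6, [])])])

    def greedy_path(node):
        value, children = node
        path = [value]
        while children:
            value, children = max(children, key=lambda c: c[0])  # locally best child only
            path.append(value)
        return path

    def best_path(node):
        value, children = node
        if not children:
            return [value]
        best = max((best_path(c) for c in children), key=sum)
        return [value] + best

    print(greedy_path(tree))  # [7, 12, 6]  -> sum 25
    print(best_path(tree))    # [7, 3, 99]  -> sum 109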

Greedy Algorithm
Problems that greedy algorithms solve optimally include:
Finding minimum spanning trees (Kruskal's algorithm and Prim's algorithm)
Finding single-source shortest paths (Dijkstra's algorithm)
Finding optimum codes (Huffman trees)
The fractional knapsack problem
The activity selection problem


The Activity Selection Problem


A problem concerning the selection of non-conflicting activities to be performed by a single person or machine within a given time frame.
Given a set of activities, each marked by a start time (si) and a finish time (fi). Two activities i and j are said to be non-conflicting if si >= fj or sj >= fi.

The goal is to find a maximum-size set S of mutually non-conflicting activities.

The Activity Selection Problem


Algorithm:
  Sort the activities by finish time f[i]
  S = {1}
  f = f[1]
  for i = 2 to n do
      if s[i] >= f then
          S = S U {i}
          f = f[i]
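A runnable Python version of this greedy selection, as a sketch; the activity list used in the demo is an assumed example.

    # Greedy activity selection: sort by finish time, then repeatedly
    # take the next activity whose start time is not earlier than the
    # finish time of the last activity selected.

    def select_activities(activities):
        """activities: list of (start, finish) pairs."""
        ordered = sorted(activities, key=lambda a: a[1])   # sort by finish time
        selected = [ordered[0]]
        last_finish = ordered[0][1]
        for start, finish in ordered[1:]:
            if start >= last_finish:        # non-conflicting with the last choice
                selected.append((start, finish))
                last_finish = finish
        return selected

    # Hypothetical instance:
    acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11)]
    print(select_activities(acts))  # [(1, 4), (5, 7), (8, 11)]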


The Activity Selection Problem


Proof of optimality: Let S = {1, 2, ..., n} be the set of activities ordered by finish time, so activity 1 has the earliest finish time. Suppose A, a subset of S, is an optimal solution, and let the activities in A also be ordered by finish time. Suppose the first activity in A is k.
If k = 1, then A begins with the greedy choice and we are done (to be precise, there is nothing to prove here). If k != 1, we want to show that there is another optimal solution B that begins with the greedy choice, activity 1. Let B = (A - {k}) U {1}. Because f[1] <= f[k], the activities in B are disjoint, and since B has the same number of activities as A, i.e., |A| = |B|, B is also optimal.

If A is an optimal solution to the original problem S, then A' = A - {1} is an optimal solution to the activity-selection problem S' = {i in S : s[i] >= f[1]}. (Otherwise, a larger solution B' to S' together with activity 1 would give a solution to S larger than A, contradicting the optimality of A.)

Huffman Coding
Huffman coding is an encoding algorithm used for lossless data compression. The technique works by creating a binary tree of nodes.


The Simple Algorithm


The simplest construction algorithm uses a priority queue in which the node with the lowest probability is given the highest priority:
1. Create a leaf node for each symbol and add it to the priority queue.
2. While there is more than one node in the queue:
   - Remove the two nodes of highest priority (lowest probability) from the queue.
   - Create a new internal node with these two nodes as children and with probability equal to the sum of the two nodes' probabilities.
   - Add the new node to the queue.
3. The remaining node is the root node and the tree is complete.
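A short Python sketch of this construction using the standard-library heapq module as the priority queue; the tie-breaking order, and hence the exact codes, may differ from the table on the next slide.

    import heapq
    from collections import Counter

    def huffman_codes(text):
        """Build a Huffman code table for the symbols in `text`."""
        freq = Counter(text)
        # Each heap entry: (frequency, tie-breaker, node), where a node is
        # either a symbol (leaf) or a (left, right) pair (internal node).
        heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
        heapq.heapify(heap)
        counter = len(heap)
        while len(heap) > 1:
            # Remove the two nodes of lowest frequency ...
            f1, _, left = heapq.heappop(heap)
            f2, _, right = heapq.heappop(heap)
            # ... and merge them under a new internal node.
            heapq.heappush(heap, (f1 + f2, counter, (left, right)))
            counter += 1
        _, _, root = heap[0]

        codes = {}
        def walk(node, prefix):
            if isinstance(node, tuple):      # internal node: recurse left/right
                walk(node[0], prefix + "0")
                walk(node[1], prefix + "1")
            else:                            # leaf: a symbol
                codes[node] = prefix or "0"
        walk(root, "")
        return codes

    print(huffman_codes("this is an example of a huffman tree"))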

Huffman Coding: An Example


text "this is an example of a huffman tree"
Char l o p r u x h i m n s t f a e space Freq 1 1 1 1 1 1 2 2 2 2 2 2 3 4 4 7 Code 11001 00110 10011 11000 00111 10010 1010 1000 0111 0010 1011 0110 1101 010 000 111


Huffman tree generated from the exact frequencies of the text "this is an example of a huffman tree".
