CH. 5 ADA PDF
Weightage: 15%
Introduction
• In short, at each step of making a choice, the algorithm greedily picks the option that looks best at that moment, in the hope that this leads to an optimum overall solution.
General Characteristics of Greedy Algorithm
Comparison between Greedy and Dynamic Programming
Problem Solving using Greedy Algorithm
1. Knapsack problem
2. Prim’s algorithm for minimum spanning tree
3. Kruskal’s algorithm for minimum spanning tree
4. Finding shortest path
5. Job sequencing with deadlines
6. Activity selection problem
To solve each of the above problems, a set of feasible solutions is obtained. From these
feasible solutions an optimum solution is selected. This optimum solution then becomes the
final solution for the given problem.
Activity Selection Problem
• Given a resource, such as a CPU or a lecture hall, and a set of n activities, such as
tasks or lectures, that want to utilize the resource.
• Each activity i has a start time and a finish time, denoted by Si and Fi.
• The activity selection problem is to find a maximum-size subset A of mutually
compatible activities [i.e., maximize the number of lectures held in the lecture hall, or
maximize the number of tasks assigned to the CPU].
Activity Selection Problem
• Consider the following set of activities, denoted by S. Select a maximum set of
compatible activities.
Step 1: Sort the activities by finish time into non-decreasing order, so that after sorting F1 <= F2 <= F3 <= … <= Fn.
Step 2: Select the next activity i for the solution set if i is compatible with every
activity already in the solution set.
Step 3: Repeat Step 2 until all activities have been examined.
Select a1; its finish time F1 = 4. Hence we search for the next activity with Si >= 4 and select a4
(because S4 = 5). F4 = 7, so we look for the next activity with Si >= 7 and get a8, with F8 = 11. Hence we
select the next activity with Si >= 11 and obtain a11.
Hence the solution set A = {a1, a4, a8, a11}.
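The greedy selection just described can be sketched in a few lines of Python. This is a minimal illustration rather than the slides' own code; the activity list at the bottom is a hypothetical data set chosen to be consistent with the walk-through above (a1 finishes at 4, a4 starts at 5, and so on), with finish times already sorted as required by Step 1.

```python
def activity_selection(activities):
    """Greedy activity selection.

    activities: list of (start, finish) pairs, assumed sorted by finish time.
    Returns the 1-based indices of a maximum-size set of mutually
    compatible activities.
    """
    selected = []
    last_finish = float("-inf")
    # Take an activity whenever it starts no earlier than the finish
    # time of the previously selected activity (Steps 2 and 3).
    for i, (start, finish) in enumerate(activities):
        if start >= last_finish:
            selected.append(i + 1)
            last_finish = finish
    return selected


# Hypothetical activities consistent with the walk-through above.
acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 8), (5, 9),
        (6, 10), (8, 11), (8, 12), (2, 13), (12, 14)]
print(activity_selection(acts))   # -> [1, 4, 8, 11]
```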
Minimum Spanning Tree
• Spanning Tree: A spanning tree of a graph G is a sub-graph that contains all the
vertices of G and contains no circuit; in other words, it is a tree that spans G.
• Minimum Spanning Tree: A minimum spanning tree of a weighted connected
graph G is a spanning tree with the minimum (smallest) total weight.
• Weight of the tree: The weight of a tree is defined as the sum of the weights of all its
edges.
Prim’s Algorithm
[Figure: the example graph, with the first selected edge highlighted.]
The edge (1, 6) with minimum weight is considered first.
Prim’s Algorithm
Continue until all vertices are visited.
[Figure: successive snapshots of the growing spanning tree on the example graph.]
Prim’s Algorithm
• If the input graph is represented using an adjacency list, then the time complexity of
Prim's algorithm can be reduced to O(E log V) by using a binary heap as the priority queue.
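As a rough illustration of this bound, the following Python sketch implements Prim's algorithm with an adjacency list and a binary heap (heapq). It is an illustrative implementation, not the slides' own pseudocode, and the small graph at the bottom is a hypothetical example rather than the one in the figures.

```python
import heapq

def prim_mst(adj, start=0):
    """Prim's algorithm on an adjacency-list graph.

    adj: dict mapping vertex -> list of (neighbor, weight) pairs.
    Returns (total_weight, list_of_tree_edges).
    With a binary heap this runs in O(E log V) time.
    """
    visited = {start}
    tree_edges, total = [], 0
    # Heap of candidate edges leaving the visited set: (weight, u, v).
    heap = [(w, start, v) for v, w in adj[start]]
    heapq.heapify(heap)

    while heap and len(visited) < len(adj):
        w, u, v = heapq.heappop(heap)
        if v in visited:
            continue                    # this edge would form a circuit
        visited.add(v)
        tree_edges.append((u, v, w))
        total += w
        for x, wx in adj[v]:
            if x not in visited:
                heapq.heappush(heap, (wx, v, x))
    return total, tree_edges


# Hypothetical undirected weighted graph (each edge listed both ways).
g = {0: [(1, 4), (2, 1)],
     1: [(0, 4), (2, 2), (3, 5)],
     2: [(0, 1), (1, 2), (3, 8)],
     3: [(1, 5), (2, 8)]}
print(prim_mst(g))   # -> (8, [(0, 2, 1), (2, 1, 2), (1, 3, 5)])
```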
Example (Prim’s)
Example (Prim’s)
Kruskal’s Algorithm
[Figure: Steps 1 to 4 of Kruskal's algorithm on the example graph.]
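Kruskal's greedy rule (repeatedly add the lightest remaining edge that does not form a circuit) is commonly implemented with a union-find structure, as in the Python sketch below. The implementation and the small edge list are illustrative assumptions, not taken from the slides.

```python
def kruskal_mst(n, edges):
    """Kruskal's algorithm.

    n: number of vertices, labelled 0..n-1.
    edges: list of (weight, u, v) tuples.
    Returns (total_weight, list_of_chosen_edges).
    """
    parent = list(range(n))

    def find(x):                        # union-find with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total, chosen = 0, []
    # Consider edges in non-decreasing order of weight.
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                    # the edge joins two different trees
            parent[ru] = rv
            chosen.append((u, v, w))
            total += w
    return total, chosen


# Hypothetical graph with 4 vertices and 5 edges.
es = [(4, 0, 1), (1, 0, 2), (2, 1, 2), (5, 1, 3), (8, 2, 3)]
print(kruskal_mst(4, es))   # -> (8, [(0, 2, 1), (1, 2, 2), (1, 3, 5)])
```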
Example (Kruskal’s Algo)
Example (Kruskal’s Algo)
Total weight = 56
Knapsack Problem (Fractional)
• Each object i has a positive weight wi and an associated profit value, denoted pi. The
knapsack can carry at most a total weight W.
• While solving the knapsack problem we therefore have a capacity constraint. When we
solve this problem using the greedy approach, our goals are:
1. Choose only those objects that give maximum profit.
2. The total weight of the selected objects should be <= W.
• From this we obtain the set of feasible solutions. In other words, maximize
p1·x1 + p2·x2 + … + pn·xn subject to w1·x1 + w2·x2 + … + wn·xn <= W,
• where the knapsack can carry a fraction xi of object i such that 0 <= xi <= 1 and 1 <= i <= n.
Knapsack problem
Three approaches for solving the problem:
1. Select the object with maximum profit.
2. Select the object with minimum weight.
3. Select the object with maximum p/w ratio.
For example, using the first approach, item 1 (weight 10, profit 60) is chosen first. Out of the
50 kg capacity, 10 kg is now used, and 40 kg of capacity still remains to be filled.
Example (Knapsack)
• Solve the following problem using the greedy method. Number of items n = 5, capacity
W = 100, weight vector w = {50, 40, 30, 20, 10} and profit vector p = {1, 2, 3, 4, 5}.
Solution: (x1, x2, x3, x4, x5) = (0, 1, 1, 1, 1)
Total profit = 2 + 3 + 4 + 5 = 14
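A minimal Python sketch of the greedy maximum p/w-ratio strategy is given below; running it on the data above reproduces x = (0, 1, 1, 1, 1) with total profit 14. The function name and structure are illustrative, not taken from the slides.

```python
def fractional_knapsack(weights, profits, capacity):
    """Greedy fractional knapsack using the maximum profit/weight ratio.

    Returns (total_profit, x) where x[i] is the fraction of item i taken.
    """
    n = len(weights)
    x = [0.0] * n
    # Consider items in decreasing order of profit/weight ratio.
    order = sorted(range(n), key=lambda i: profits[i] / weights[i], reverse=True)

    total_profit, remaining = 0.0, capacity
    for i in order:
        if remaining == 0:
            break
        take = min(weights[i], remaining)   # whole item, or the fraction that fits
        x[i] = take / weights[i]
        total_profit += profits[i] * x[i]
        remaining -= take
    return total_profit, x


w = [50, 40, 30, 20, 10]
p = [1, 2, 3, 4, 5]
print(fractional_knapsack(w, p, 100))   # -> (14.0, [0.0, 1.0, 1.0, 1.0, 1.0])
```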
Job Scheduling Problem
• Consider that there are n jobs to be executed. At any time t = 1, 2, 3, … exactly one
job can be executed. The profits pi are given; profit pi is gained only if the corresponding
job is completed by its deadline. For a feasible solution we must therefore take care
that the selected jobs are completed within their given deadlines.
Example
• Using the greedy algorithm, find an optimal schedule for the following jobs with n = 7,
profits (P1, P2, P3, P4, P5, P6, P7) = (3, 5, 18, 20, 6, 1, 38) and deadlines (d1, d2,
d3, d4, d5, d6, d7) = (1, 3, 3, 4, 1, 2, 1).
Step 1: Arrange the jobs in descending order of profit; the corresponding deadlines follow the same order.
Example
Step 3: Add the i-th job to array J[ ] at the index given by its deadline. The first job is p7; the deadline for
this job is 1, hence insert p7 in array J[ ] at index 1.
Step 5: The next job is p5, which also has deadline 1. But slot 1 is already occupied and there is no empty
slot at or before index 1, hence job p5 is simply discarded.
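The whole greedy procedure can be sketched in Python as below. The slot-filling rule assumed here is the standard one (place each job in the latest free slot at or before its deadline, otherwise discard it); on the data above it yields the schedule J = [p7, p2, p3, p4] with total profit 81.

```python
def job_sequencing(profits, deadlines):
    """Greedy job sequencing with deadlines.

    profits, deadlines: lists indexed 0..n-1 (job i+1 on the slides).
    Returns (total_profit, schedule), where schedule[t] holds the
    1-based job number executed in time slot t+1 (None = empty slot).
    """
    n = len(profits)
    max_d = max(deadlines)
    schedule = [None] * max_d           # the array J[ ] from the slides

    # Consider jobs in descending order of profit (Step 1).
    order = sorted(range(n), key=lambda i: profits[i], reverse=True)

    total = 0
    for i in order:
        # Try the slot given by the deadline, then earlier slots.
        for t in range(deadlines[i] - 1, -1, -1):
            if schedule[t] is None:
                schedule[t] = i + 1
                total += profits[i]
                break                   # placed; otherwise the job is discarded
    return total, schedule


p = [3, 5, 18, 20, 6, 1, 38]
d = [1, 3, 3, 4, 1, 2, 1]
print(job_sequencing(p, d))   # -> (81, [7, 2, 3, 4])
```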
Example
• N = 6, profits: (P1, P2, P3, P4, P5, P6) = (20, 15, 10, 7, 5, 3)
and deadlines: (d1, d2, d3, d4, d5, d6) = (3, 1, 1, 3, 1, 3)
Slot: 1  2  3  4  5  6
Job:  p2 p4 p1 –  –  –
Total profit = 15 + 7 + 20 = 42
Graphs: Shortest Path
Example
Example
D (a, b) = 3
D (a, c) = 7
D (a, d) = 5
D (a, e) = 9
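The distances above are the kind of result produced by the greedy single-source shortest-path method (Dijkstra's algorithm). The Python sketch below is illustrative; since the slides' graph is not reproduced here, the graph at the bottom is a hypothetical one chosen so that the computed distances match D (a, b) = 3, D (a, c) = 7, D (a, d) = 5 and D (a, e) = 9.

```python
import heapq

def dijkstra(adj, source):
    """Dijkstra's single-source shortest paths (greedy).

    adj: dict mapping vertex -> list of (neighbor, weight), weights >= 0.
    Returns a dict of shortest distances from source.
    """
    dist = {v: float("inf") for v in adj}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue                    # stale heap entry
        for v, w in adj[u]:
            if d + w < dist[v]:         # relax the edge greedily
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist


# Hypothetical undirected graph consistent with the distances above.
g = {'a': [('b', 3), ('d', 6)],
     'b': [('a', 3), ('d', 2), ('c', 5)],
     'c': [('b', 5), ('d', 2), ('e', 2)],
     'd': [('a', 6), ('b', 2), ('c', 2), ('e', 7)],
     'e': [('c', 2), ('d', 7)]}
print(dijkstra(g, 'a'))   # -> {'a': 0, 'b': 3, 'c': 7, 'd': 5, 'e': 9}
```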
Example
Example
Huffman Code
• This algorithm is basically a coding technique for encoding data; such encoded data
is used in data compression techniques.
• In Huffman's coding method, the data is input as a sequence of characters.
Then a table of the frequency of occurrence of each character in the data is built.
• From this table of frequencies, the Huffman tree is constructed.
• Basically there are two types of coding: fixed-length coding and variable-length
coding. If we use fixed-length coding, we need a fixed number of bits to
represent every character.
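A minimal sketch of building the Huffman tree from such a frequency table and reading off the variable-length codes is shown below, using the usual branch-labelling convention (left branch = "0", right branch = "1") that also appears in the examples that follow. The sample frequency table is a hypothetical stand-in for the slides' table.

```python
import heapq

def huffman_codes(freq):
    """Build a Huffman tree from a character-frequency table and
    return the variable-length code of each character.

    freq: dict mapping character -> frequency.
    Left branches are labelled '0' and right branches '1'.
    """
    # Heap entries: (frequency, tie-breaker, node); a node is either a
    # single character (leaf) or a (left, right) pair (internal node).
    heap = [(f, i, ch) for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)

    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)    # the two least frequent nodes
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, counter, (left, right)))
        counter += 1

    codes = {}
    def walk(node, code):
        if isinstance(node, tuple):
            walk(node[0], code + "0")        # left branch -> '0'
            walk(node[1], code + "1")        # right branch -> '1'
        else:
            codes[node] = code or "0"        # single-character edge case
    walk(heap[0][2], "")
    return codes


# Hypothetical frequency table.
print(huffman_codes({'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}))
```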
Example (Fixed)
Step 2: The encoding proceeds from the top of the tree downwards. If we follow a left branch we
encode it as "0", and if we follow a right branch we encode
it as "1".
Example (Fixed)
Example (Fixed)
Example (Variable length)
Example (Variable length)
Example (Variable length)
Thank You..!!