
CHAPTER 5: GREEDY ALGORITHM

Prepared By: Ekta Unagar

Gyanmanjari Institute of Technology, Bhavnagar


Course Content

• General Characteristics of greedy algorithms


• Problem solving using Greedy Algorithm - Activity selection problem
• Minimum Spanning trees (Kruskal’s algorithm, Prim’s algorithm)
• Graphs: Shortest paths
• The Knapsack Problem
• Job Scheduling Problem
• Huffman code

Weightage: 15%

Introduction

• In the greedy technique, the solution is constructed through a sequence of steps, each expanding the partially constructed solution obtained so far, until a complete solution to the problem is reached.
• Feasible solution: it should satisfy the problem's constraints.
• Optimal solution: each choice made is locally optimal.

• In short, while making each choice the algorithm greedily aims for the optimum solution.

General Characteristics of Greedy Algorithm

1. First we select some solution from the input domain.
2. Then we check whether the solution is feasible or not.
3. From the set of feasible solutions, the particular solution that satisfies (or nearly satisfies) the objective function is chosen. Such a solution is called the optimal solution.
4. The greedy method works in stages: at each stage only one input is considered at a time, and based on this input it is decided whether that particular input contributes to the optimal solution or not. (A generic sketch of these steps follows below.)
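
To make these four steps concrete, a minimal generic sketch in Python is given below; the names candidates, is_feasible and select_best are placeholders (they are not from the slides) that a concrete problem would supply.

def greedy(candidates, is_feasible, select_best):
    """Generic greedy skeleton: build a solution one locally optimal choice at a time."""
    solution = []
    remaining = list(candidates)
    while remaining:
        best = select_best(remaining)        # locally optimal (greedy) choice
        remaining.remove(best)
        if is_feasible(solution + [best]):   # keep it only if the constraints still hold
            solution.append(best)
    return solution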

Comparison between Greedy Method and Dynamic Programming

Greedy Method:
• It is used for obtaining an optimum solution.
• A set of feasible solutions is generated and the optimum solution is picked from it.
• The optimum selection is made without revising previously generated solutions.
• There is no guarantee of obtaining the optimum solution.

Dynamic Programming:
• It is also used for obtaining an optimum solution.
• There is no special set of feasible solutions in this method.
• All possible sequences are considered in order to obtain the optimum solution.
• It is guaranteed to generate an optimal solution, using the principle of optimality.

Problem Solving using Greedy Algorithm

1. Knapsack problem
2. Prim’s algorithm for minimum spanning tree
3. Kruskal’s algorithm for minimum spanning tree
4. Finding shortest path
5. Job sequencing with deadlines
6. Activity selection problem

For solving all of the above problems, a set of feasible solutions is obtained. From these solutions the optimum solution is selected, and this optimum solution becomes the final solution for the given problem.

Activity Selection Problem

• Given a resource such as a CPU or a lecture hall, and a set of n activities, such as tasks or lectures, that want to utilize the resource.
• Each activity i has a start time and a finish time, denoted by Si and Fi.
• The activity selection problem is to find a maximum-size subset A of mutually compatible activities [i.e. maximize the number of lectures held in the lecture hall, or the number of tasks assigned to the CPU].

Activity Selection Problem
• Consider the following set of activities, denoted by S, and select a compatible set of activities. [The activity table from the slide is not reproduced here.]

Step 1: Sort the Fi into non-decreasing order, so that after sorting F1 <= F2 <= F3 <= ... <= Fn.
Step 2: Add the next activity i to the solution set if i is compatible with each activity already in the solution set.
Step 3: Repeat step 2 until all activities have been examined.
Select a1; the finish time of a1 is 4, hence we search for an activity with Si >= 4 and select a4 (because S4 = 5). F4 is 7, hence we look for the next activity with Si >= 7 and get a8, with F8 = 11. We then select the next activity with Si >= 11 and obtain a11.
Hence the solution set = {1, 4, 8, 11} (see the sketch below).
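
A minimal Python sketch of this procedure is given below. The activity list used here is an assumption: the slide's table is not reproduced, so the (start, finish) pairs are the standard example consistent with the values quoted above (F1 = 4, S4 = 5, F4 = 7, F8 = 11, and so on).

def activity_selection(activities):
    """activities: list of (start, finish) pairs.
    Returns the indices of a maximum-size set of mutually compatible activities."""
    # Sort activity indices by finish time (non-decreasing).
    order = sorted(range(len(activities)), key=lambda i: activities[i][1])
    selected = []
    last_finish = float("-inf")
    for i in order:
        start, finish = activities[i]
        if start >= last_finish:        # compatible with everything chosen so far
            selected.append(i)
            last_finish = finish
    return selected

# Assumed (start, finish) data consistent with the slide's worked example.
acts = [(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9),
        (6, 10), (8, 11), (8, 12), (2, 14), (12, 16)]
print(activity_selection(acts))   # [0, 3, 7, 10], i.e. activities 1, 4, 8, 11 in 1-based numbering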
Minimum Spanning Tree
• Spanning Tree: A spanning tree of a graph G is a subgraph that is a tree: it contains all the vertices of G and no circuit.
• Minimum Spanning Tree: A minimum spanning tree of a weighted connected graph G is a spanning tree with minimum (smallest) weight.
• Weight of the tree: The weight of a tree is defined as the sum of the weights of all its edges.

[Figure: (b) is a minimum spanning tree of the given graph.]


Applications of spanning trees

• Spanning trees are very important in designing efficient routing algorithms.
• Spanning trees have wide applications in many areas of network design, including computer networks, telecommunication networks, and water supply networks.
• Image segmentation.

Prim’s Algorithm

• Start with any vertex.
• Repeatedly select an adjacent edge with minimum weight (keep including edges connected to the tree built so far).
• Care should be taken not to form a circuit.

[Figure: starting vertex 1 with candidate edges of weights 25 (to vertex 2) and 10 (to vertex 6).]
The edge (1-6) with minimum weight is considered first.

Prim’s Algorithm
Continue till all vertices are visited.
[Figure: two intermediate stages of the growing tree over vertices 1, 2, 4, 5, 6, 7, with edge weights 10, 20, 22, 23, 25 visible.]
From the available edges (weights 25, 22, 20), edge 5-4 is selected, which has the minimum weight.
Total weight of the resulting spanning tree: 10 + 23 + 20 + 11 + 14 + 12 = 90

Prim’s Algorithm

• Time complexity is O(V²) when the graph is represented by an adjacency matrix.

• If the input graph is represented using an adjacency list, the time complexity of Prim's algorithm can be reduced to O(E log V) (a heap-based sketch follows below).
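
A minimal sketch of Prim's algorithm in Python using a binary heap over an adjacency list, which is the representation that gives the O(E log V) behaviour mentioned above. The input format (a dict mapping each vertex to a list of (neighbour, weight) pairs) is an assumption, not something fixed by the slides.

import heapq

def prims_mst(graph, start):
    """graph: dict mapping vertex -> list of (neighbour, weight) pairs.
    Returns (total_weight, list of tree edges)."""
    visited = {start}
    tree_edges = []
    total = 0
    # Heap of candidate edges leaving the current tree, keyed by weight.
    heap = [(w, start, v) for v, w in graph[start]]
    heapq.heapify(heap)
    while heap and len(visited) < len(graph):
        w, u, v = heapq.heappop(heap)
        if v in visited:                # this edge would form a circuit, skip it
            continue
        visited.add(v)
        tree_edges.append((u, v, w))
        total += w
        for nxt, nw in graph[v]:
            if nxt not in visited:
                heapq.heappush(heap, (nw, v, nxt))
    return total, tree_edges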

Example (Prim's): [worked-example figures not reproduced]
Kruskal’s Algorithm

• Keep including the edge with minimum weight that does not form a cycle.

Steps 1-4: [the step-by-step figures are not reproduced] At each step the edge with minimum weight is selected, taking care not to form a circuit (a sketch follows below).
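
A minimal sketch of Kruskal's algorithm in Python: edges are processed in non-decreasing order of weight, and a union-find structure detects whether adding an edge would form a circuit. The edge-list input format is an assumption.

def kruskal_mst(num_vertices, edges):
    """edges: list of (weight, u, v) with vertices numbered 0 .. num_vertices-1.
    Returns (total_weight, list of chosen edges)."""
    parent = list(range(num_vertices))

    def find(x):
        # Find the set representative, compressing the path as we go.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total, chosen = 0, []
    for w, u, v in sorted(edges):        # consider edges by non-decreasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                     # different components, so no circuit is formed
            parent[ru] = rv
            chosen.append((u, v, w))
            total += w
    return total, chosen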

Example (Kruskal's Algo): [worked-example figures not reproduced]

Total weight = 56

Knapsack Problem (Fractional)
• Each object i has a positive weight wi and an associated profit pi. The knapsack can carry at most a total weight W.
• While solving the knapsack problem we have this capacity constraint. When we solve the problem using the greedy approach, our goals are:
1. Choose the objects that give maximum total profit.
2. Keep the total weight of the selected objects <= W.

• We then obtain the set of feasible solutions. In other words, maximize Σ pi·xi subject to Σ wi·xi <= W,

• where the knapsack carries a fraction xi of object i such that 0 <= xi <= 1 and 1 <= i <= n.

Knapsack problem
Three approaches for solving the problem:
1. Select the object with maximum profit.
2. Select the object with minimum weight.
3. Select the object with maximum p/w ratio.

W = 50

Object (i)        1    2    3
Profit (pi)       60   80   100
Weight (wi)       10   20   40
Profit / weight   6    4    2.5

Selecting objects by maximum profit/weight ratio:
First, item 1 is chosen, with weight 10 and profit 60: weight = 10, profit = 60. Out of the 50 kg capacity we have used 10 kg, so 40 kg of capacity remains.
Then we choose the item with the second highest ratio, item 2: weight = 10 + 20 = 30, profit = 60 + 80 = 140.
The greedy approach allows dividing an object, so we take half of item 3 (weight 40): weight = 50, profit = 190.

x1 = 1, x2 = 1, x3 = 1/2
The optimum solution is 190 (see the sketch below).
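
A minimal sketch of this fractional knapsack greedy in Python, using the third strategy (maximum profit/weight ratio), which is the one that reaches the optimum of 190 here.

def fractional_knapsack(profits, weights, capacity):
    """Greedy by profit/weight ratio. Returns (total_profit, list of fractions x_i)."""
    n = len(profits)
    x = [0.0] * n
    # Process objects in decreasing order of profit/weight ratio.
    order = sorted(range(n), key=lambda i: profits[i] / weights[i], reverse=True)
    total_profit = 0.0
    remaining = capacity
    for i in order:
        if remaining == 0:
            break
        take = min(weights[i], remaining)   # whole object, or only the fraction that fits
        x[i] = take / weights[i]
        total_profit += profits[i] * x[i]
        remaining -= take
    return total_profit, x

print(fractional_knapsack([60, 80, 100], [10, 20, 40], 50))   # (190.0, [1.0, 1.0, 0.5])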

Example (Knapsack)

• Solve the following problem using the greedy method. Number of items = 5, capacity W = 100, weight vector w = {50, 40, 30, 20, 10} and profit vector p = {1, 2, 3, 4, 5}.

Example (Knapsack)

• Solve the following problem using the greedy method. Number of items = 5, capacity W = 100, weight vector w = {50, 40, 30, 20, 10} and profit vector p = {1, 2, 3, 4, 5}.

x1  x2  x3  x4  x5
0   1   1   1   1

Total profit = 2 + 3 + 4 + 5 = 14
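
Running the same fractional_knapsack sketch from the previous slide on this instance reproduces the result above:

profit, x = fractional_knapsack([1, 2, 3, 4, 5], [50, 40, 30, 20, 10], 100)
print(profit, x)   # 14.0 [0.0, 1.0, 1.0, 1.0, 1.0]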

Job Scheduling Problem

• Consider that there are n jobs to be executed. At any time t = 1, 2, 3, ... exactly one job is executed. The profits pi are given; a profit is gained when the corresponding job is completed. For obtaining a feasible solution we must take care that the jobs are completed within their given deadlines.

Rules to obtain a feasible solution

• Each job takes one unit of time.
• If a job starts before or at its deadline, its profit is obtained; otherwise no profit.
• The goal is to schedule jobs so as to maximize the total profit.

Example

• Using the greedy algorithm, find an optimal schedule for the following jobs with n = 7: profits (P1, P2, P3, P4, P5, P6, P7) = (3, 5, 18, 20, 6, 1, 38) and deadlines (d1, d2, d3, d4, d5, d6, d7) = (1, 3, 3, 4, 1, 2, 1).

Step 1: Arrange the jobs in descending order of profit; their corresponding deadlines are carried along.

Step 2: Create an array J[ ] with one slot per time unit, which stores the scheduled jobs. Initially J[ ] is empty.

Example

Step 3: Add the ith job to array J[ ] at the index denoted by its deadline. The first job is p7; its deadline is 1, hence insert p7 in array J[ ] at index 1.

Step 4: The next job is p4; insert it in array J[ ] at index 4.

Step 5: The next job is p3, with deadline 3; insert it at index 3. The following job is p5, with deadline 1. But index 1 is already occupied and there is no empty slot at an index <= its deadline, hence job p5 is discarded.

Step 6: Continuing in the same way, p2 (deadline 3) is placed in the empty slot at index 2, while p1 and p6 are discarded.

This gives the job sequence 7-2-3-4 with profit 38 + 5 + 18 + 20 = 81 (see the sketch below).
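
A minimal sketch of this job sequencing greedy in Python: jobs are taken in decreasing order of profit and each one is placed in the latest free slot at or before its deadline, or discarded if no such slot exists.

def job_sequencing(profits, deadlines):
    """Greedy job sequencing with deadlines.
    Returns (scheduled job numbers in slot order, total profit)."""
    n = len(profits)
    max_deadline = max(deadlines)
    slots = [None] * (max_deadline + 1)         # slots[1..max_deadline]; index 0 unused
    order = sorted(range(n), key=lambda j: profits[j], reverse=True)
    total = 0
    for j in order:
        for t in range(deadlines[j], 0, -1):    # latest free slot <= deadline, else discard
            if slots[t] is None:
                slots[t] = j + 1                # store the 1-based job number
                total += profits[j]
                break
    return [s for s in slots[1:] if s is not None], total

print(job_sequencing([3, 5, 18, 20, 6, 1, 38], [1, 3, 3, 4, 1, 2, 1]))
# ([7, 2, 3, 4], 81)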

Example

• N =6, Profits: (P1, P2, P3, P4, P5, P6) = (20, 15, 10, 7, 5, 3)
deadlines: (d1, d2, d3, d4, d5, d6) = (3, 1, 1, 3, 1, 3)

Example

• N =6, Profits: (P1, P2, P3, P4, P5, P6) = (20, 15, 10, 7, 5, 3)
deadlines: (d1, d2, d3, d4, d5, d6) = (3, 1, 1, 3, 1, 3)

Slot:  1    2    3    4    5    6
Job:   p2   p4   p1   -    -    -

Answer: 2 - 4 - 1 with profit 15 + 7 + 20 = 42

Graphs: Shortest Path

• A graph can be used to represent the distances between cities.
• In the single-source shortest path problem, the shortest distance from a single vertex, called the source, to the other vertices is obtained. The shortest path algorithm used here is Dijkstra's shortest path algorithm.
• Let G(V, E) be a graph; in single-source shortest path, the shortest path from a vertex v0 to all remaining vertices is determined. The vertex v0 is called the source and the vertex at the end of a path is called the destination.
• It is assumed that all the distances are positive; this algorithm does not work for negative distances. (A sketch follows below.)
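
A minimal sketch of Dijkstra's algorithm in Python using a binary heap; the adjacency-list format (a dict mapping each vertex to a list of (neighbour, distance) pairs) is an assumption.

import heapq

def dijkstra(graph, source):
    """Single-source shortest paths for non-negative edge weights.
    Returns a dict of shortest distances from the source."""
    dist = {v: float("inf") for v in graph}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:                 # stale heap entry, skip it
            continue
        for v, w in graph[u]:
            if d + w < dist[v]:         # found a shorter path to v through u
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist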

Example

• Find the single-source shortest path from a to e using Dijkstra's algorithm. [The graph from the slide is not reproduced here.]

Example

D (a, b) = 3
D (a, c) = 7
D (a, d) = 5
D (a, e) = 9

The shortest path from a to e is a-b-d-e, with distance D(a, e) = 9.

Example: [further worked-example figures not reproduced]
Huffman Code

• This algorithm is basically a coding technique for encoding data. Such encoded data is used in data compression techniques.
• In Huffman's coding method, the data is input as a sequence of characters. Then a table of the frequency of occurrence of each character in the data is built.
• From the table of frequencies, the Huffman tree is constructed.

• Basically there are two types of coding: fixed-length coding and variable-length coding. With fixed-length coding we need a fixed number of bits to represent every character. (A sketch of building variable-length Huffman codes follows below.)
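
A minimal sketch of Huffman's greedy construction in Python using a heap of (frequency, tree) entries: the two lowest-frequency trees are repeatedly merged, and codes are read off by labelling left branches "0" and right branches "1". The exact bit strings depend on how ties are broken, so they may differ from the tree drawn in the slides.

import heapq
from itertools import count

def huffman_codes(freq):
    """freq: dict mapping symbol -> frequency. Returns dict mapping symbol -> code string."""
    tiebreak = count()                      # avoids comparing trees when frequencies tie
    heap = [(f, next(tiebreak), sym) for sym, f in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)   # two smallest frequencies
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, next(tiebreak), (left, right)))
    codes = {}
    def walk(node, code):
        if isinstance(node, tuple):         # internal node: (left, right)
            walk(node[0], code + "0")
            walk(node[1], code + "1")
        else:
            codes[node] = code or "0"
    walk(heap[0][2], "")
    return codes

print(huffman_codes({"a": 39, "b": 10, "c": 9, "d": 25, "e": 7, "f": 3}))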

Example (Fixed)

• a: 39, b: 10, c: 9, d: 25, e: 7, f: 3


Step 1: The symbols are arranged in ascending order of frequency.

Step 2: The encoding proceeds from the top of the tree downwards: if we follow the left branch we encode it as "0", and if we follow the right branch we encode it as "1". With six symbols, a fixed-length code needs 3 bits per symbol.

Example (Fixed)

Encoding "aeed" we get 111001001110 (4 symbols × 3 bits = 12 bits).

Example (Variable length)

Step 1: The symbols are arranged in ascending order of frequency. [The tree-construction and encoding figures are not reproduced.]
Thank You..!!

