3 Greedy
The greedy method
• An optimization problem is one in which we want to find, not just a solution, but the best solution
– A “greedy algorithm” works well for many optimization problems
– Out of the many available options, it always chooses the one that currently looks best
• The greedy method suggests that one can devise an algorithm that works in phases:
– At each phase, take one input at a time (from the input ordered according to the selection criterion) and decide whether to include it in the solution.
Feasible vs. optimal solution
• The greedy method solves a problem by making a sequence of decisions.
• Decisions are made one by one in some order.
• Each decision is made using a greedy criterion.
• A decision, once made, is (usually) not changed later.
• Given n inputs we are required to obtain a subset that satisfies
some constraints
– Any subset that satisfies the given constraints is called a
feasible solution
– A feasible solution that either maximizes or minimizes a given
objective function is called an optimal solution.
Greedy algorithm
To apply a greedy algorithm:
• Decide the optimization measure (maximization of profit or minimization of cost)
– Sort the input in increasing or decreasing order based on the optimization measure selected for the given problem
Greedy Choice Property
• Greedy algorithm always makes the choice that looks
best at the moment
– With the hope that a locally optimal choice will
lead to a globally optimal solution
The Problem of Making Coin Change
• Assume the coin denominations are: 50, 25, 10, 5, and 1.
• Problem: make change for a given amount using the smallest possible number of coins
• Example: make change for x = 92.
– Mathematically this is written as
x = 50a + 25b + 10c + 5d + 1e
such that a + b + c + d + e is minimized and a, b, c, d, e ≥ 0.
• Greedy algorithm for coin changing
– Order the coins in decreasing order of denomination
– Select coins one at a time (divide the remaining amount by the denomination; the quotient is the coin count, the remainder carries over)
– Solution: a = 1, b = 1, c = 1, d = 1, e = 2, i.e., 6 coins (50 + 25 + 10 + 5 + 1 + 1 = 92).
Characteristics of Greedy Algorithms:
• Local Optimum Choice: At each step, the algorithm
picks the best option available without considering the
overall problem.
• Feasibility: The chosen option must satisfy the problem's
constraints.
• Irrevocability: Once an option is chosen, it cannot be
undone.
Algorithm
procedure greedy(A, n)
    Solution ← { }                        // set that will hold the solution
    for i = 1 to n do
        x = SELECT(A)
        if FEASIBLE(Solution, x) then
            Solution = UNION(Solution, x)
        end if
    end for
    return Solution
end procedure
• SELECT function:
– selects the most promising candidate from A[ ] and removes it from the list.
• FEASIBLE function:
– a Boolean-valued function that determines whether x can be included in the solution vector.
• UNION function:
– combines x with the solution.
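A minimal Python sketch of this skeleton, with SELECT and FEASIBLE supplied as functions (all names here are illustrative):

def greedy(A, select, feasible):
    # Generic greedy skeleton: repeatedly pick the most promising
    # candidate and keep it only if the solution stays feasible.
    solution = []
    candidates = list(A)            # work on a copy of the input
    while candidates:
        x = select(candidates)      # most promising remaining candidate
        candidates.remove(x)        # SELECT also removes it from the list
        if feasible(solution, x):   # constraint check
            solution.append(x)      # UNION(Solution, x)
    return solution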
Algorithm for Coin Change
• Make change for n units using the least possible number of
coins.
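A minimal Python sketch of this greedy coin-changing procedure (the function name and output format are illustrative):

def coin_change(denominations, amount):
    # Greedy coin change: repeatedly take as many of the largest
    # remaining coin as fit into the amount still owed.
    counts = {}
    for coin in sorted(denominations, reverse=True):   # decreasing order
        counts[coin], amount = divmod(amount, coin)    # divide by denomination
    return counts

print(coin_change([50, 25, 10, 5, 1], 92))
# {50: 1, 25: 1, 10: 1, 5: 1, 1: 2} -> 6 coins, matching a = b = c = d = 1, e = 2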
A failure of the greedy algorithm
In some (fictional) monetary system, coins come in
denominations of 1, 7, and 10
Using a greedy algorithm to make change for 15, we
would get a 10 piece and five 1 pieces (6 coins), whereas
the optimal solution is two 7 pieces and one 1 piece (3 coins)
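Reusing the coin_change sketch from above makes the failure concrete:

print(coin_change([10, 7, 1], 15))
# {10: 1, 7: 0, 1: 5} -> 6 coins, while the optimum 7 + 7 + 1 uses only 3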
Minimum Spanning Trees
Problem: Laying Telephone Wire
[Figure omitted: houses to be wired to a central office]
Minimum Spanning Tree (MST)
• Assume we have an undirected graph G = (V,E) with weights
assigned to edges.
• The objective is to “use the smallest set of edges of the given graph to connect everything together”. How?
•A minimum spanning tree is a least-cost subset of the edges of a
graph that connects all the nodes
• An MST is a subgraph of an undirected weighted graph G such that:
– it is a tree (i.e., it is acyclic)
– it covers all the vertices V
– it contains |V| − 1 edges
– the total cost associated with the tree edges is the minimum among all possible spanning trees
Applications of MST
• Network design: road planning, hydraulic, electric, and telecommunication networks, etc.
How can we generate a MST?
• A MST is a least-cost subset of the edges of a graph that connects all
the nodes
• A greedy method to obtain a minimum-cost spanning tree builds this
tree edge by edge.
–The next edge to include is chosen according to some optimization
criterion.
• Criterion: choose the edge that results in the minimum increase in the sum of the costs of the edges so far included.
• General procedure:
– Start by picking any node and adding it to the tree
– Repeatedly: pick any least-cost edge from a node in the tree to a node not in the tree, and add the edge and new node to the tree
– Stop when all nodes have been added to the tree
[Figure omitted: a small example graph with edge weights]
• Two techniques: Prim’s and Kruskal’s algorithms
Kruskal’s algorithm
• Kruskal’s algorithm always tries the lowest-cost remaining edge
• It considers the edges of the graph in increasing order of cost.
• In this approach, the set T of edges so far selected for the spanning
tree may not be a tree at all stages in the algorithm. But it is possible
to complete T into a tree.
• Create a forest of trees from the vertices
• Repeatedly merge trees by adding “safe edges” until only one tree
remains. A “safe edge” is an edge of minimum weight which does
not create a cycle
• Example: [figure omitted: a weighted graph on vertices a, b, c, d, e]
– Initially there is a forest: V = {a}, {b}, {c}, {d}, {e}
– E = {(a,d), (c,d), (d,e), (a,c), (b,e), (c,e), (b,d), (a,b)}
Cont..
[Figures omitted: the step-by-step edge selections on a worked example]
• MST cost = 2 + 2 + 3 + 3 + 7 = 17
Kruskal Algorithm
procedure kruskalMST(G, cost, n, T)
    i = 0                                  // number of edges in the MST so far
    mincost = 0                            // total cost of the MST
    T = { }                                // stores the MST edges
    while i < n - 1 and edges remain do
        (u, v) = delete_min_cost_edge()    // edge with the minimum remaining cost
        j = find(u)                        // root of the tree containing u
        k = find(v)                        // root of the tree containing v
        if j ≠ k then                      // including this edge doesn't form a cycle
            i = i + 1
            T(i,1) = u                     // add edge (u, v) to the MST
            T(i,2) = v
            mincost = mincost + cost(u,v)  // update total cost
            union(j, k)                    // merge the trees containing u and v
        end if
    end while
    if i ≠ n - 1 then
        return "No spanning tree"          // the graph is not connected
    end if
    return mincost                         // total cost of the MST
end procedure
• After each iteration, every tree in the forest is an MST of the vertices it connects
• The algorithm terminates when all vertices are connected into one tree
• Running time is dominated by sorting the edges (or repeated findMin): O(E log E)
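A compact Python sketch of this procedure with a simple union-find (the edge-list format, (cost, u, v) triples over vertices 0..n-1, is an assumption):

def kruskal_mst(n, edges):
    # n = number of vertices, edges = list of (cost, u, v) triples.
    parent = list(range(n))            # each vertex starts as its own tree

    def find(x):                       # root of x's tree, with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst, mincost = [], 0
    for cost, u, v in sorted(edges):   # edges in increasing order of cost
        ru, rv = find(u), find(v)
        if ru != rv:                   # safe edge: does not create a cycle
            parent[ru] = rv            # merge the two trees
            mst.append((u, v, cost))
            mincost += cost
    if len(mst) != n - 1:
        return None                    # not connected: no spanning tree
    return mincost, mst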
Prim’s algorithm
• Prim’s: Always takes the lowest-cost edge between nodes in the
spanning tree and nodes not yet in the spanning tree
• If A is the set of edges selected so far, then A forms a tree.
– The next edge (u,v) to be included in A is a minimum-cost edge not in A such that A ∪ {(u,v)} is also a tree, where u is in the tree and v is not.
• Property: at each step we add an edge (u,v) whose weight is minimum among all edges with exactly one endpoint in the tree
–This spanning tree grows by one new node and edge at each
iteration.
• Each step maintains a minimum spanning tree of the vertices that have been included thus far
Prim’s algorithm
• Example: find the minimum spanning tree using Prim’s algorithm
[Figures omitted: the example graph and the tree grown step by step]
Prim’s Algorithm
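A minimal Python sketch of Prim’s algorithm as described above, using a priority queue of crossing edges (the adjacency-list format, adj[u] as a list of (cost, v) pairs, is an assumption):

import heapq

def prim_mst(adj, start=0):
    # Grows the tree one lowest-cost crossing edge at a time.
    in_tree = {start}
    frontier = [(c, start, v) for c, v in adj[start]]
    heapq.heapify(frontier)                # lowest-cost crossing edge first
    mincost, tree = 0, []
    while frontier and len(in_tree) < len(adj):
        cost, u, v = heapq.heappop(frontier)
        if v in in_tree:
            continue                       # edge no longer crosses the cut
        in_tree.add(v)                     # tree grows by one node and one edge
        mincost += cost
        tree.append((u, v, cost))
        for c, w in adj[v]:
            if w not in in_tree:
                heapq.heappush(frontier, (c, v, w))
    return mincost, tree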
Single-Source Shortest Paths
• Given: A single source vertex in a
weighted, directed graph.
• Want to compute a shortest path for each
possible destination.
– Similar to BFS.
• We will assume either
– no negative-weight edges, or
– no reachable negative-weight cycles.
• Algorithm will compute a shortest-path tree.
Single source shortest path
• The length of a path is the sum of the weights of the edges on that path.
– The starting vertex of the path is referred to as the source
– The last vertex is the destination
• To formulate a greedy approach that generates the shortest paths from a starting source vertex v0, think of:
– A multi-stage solution: build the shortest paths one by one
– An optimization measure: minimize the sum of the lengths of all paths generated so far. For this measure to be minimized, each individual path must be of minimum length.
Relaxation
Algorithms keep track of d[v] and π[v], initialized as follows:
Initialize(G, s)
    for each v ∈ V[G] do
        d[v] := ∞
        π[v] := NIL
    od
    d[s] := 0
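As a runnable companion, here is a minimal Python sketch of this initialization together with the relaxation step the slide’s title refers to (the dictionary-based representation and function names are assumptions; the relax rule follows the standard textbook formulation):

import math

def initialize(vertices, s):
    # d[v] := infinity and pi[v] := NIL for every vertex; d[s] := 0.
    d = {v: math.inf for v in vertices}
    pi = {v: None for v in vertices}
    d[s] = 0
    return d, pi

def relax(u, v, w, d, pi):
    # Relax edge (u, v) of weight w[(u, v)]: if the path through u is
    # shorter than the current estimate for v, update d[v] and pi[v].
    if d[v] > d[u] + w[(u, v)]:
        d[v] = d[u] + w[(u, v)]
        pi[v] = u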
Dijkstra’s shortest-path algorithm
• Dijkstra’s algorithm finds the shortest paths from a given node to
all other nodes in a graph
– Always takes the shortest edge connecting a known node to an
unknown node
• Initially,
– Mark the given node as known (path length is zero)
– For each out-edge, set the distance in each neighboring node
equal to the cost (length) of the out-edge, and set its predecessor
to the initially given node
• Repeatedly (until all nodes are known),
– Find an unknown node containing the smallest distance
– Mark the new node as known
– For each node adjacent to the new node, check whether its estimated distance can be reduced (distance to the new node plus cost of the out-edge)
• If so, update the distance and reset that neighbor’s predecessor to the new node
Example
• The six figures that followed traced Dijkstra’s algorithm on a directed graph with source s and vertices u, v, x, y [figures omitted]. The distance estimates evolve as follows:
– Initially: d(s) = 0, all other estimates ∞
– After settling s: d(u) = 10, d(x) = 5
– After settling x: d(u) = 8, d(v) = 14, d(y) = 7
– After settling y: d(v) = 13
– After settling u: d(v) = 9
– Finally, after settling v: d(s) = 0, d(u) = 8, d(v) = 9, d(x) = 5, d(y) = 7
Dijkstra’s shortest-path algorithm
1. Initialize:
    for each vertex v in V:
        currentDistance(v) = ∞        // set initial distance to infinity
        predecessor(v) = undefined    // no predecessor initially
    currentDistance(s) = 0            // distance to the source is zero
    T = V                             // set of vertices not yet processed
2. While T is not empty:
    a. Select the vertex v in T with the smallest currentDistance(v).
    b. Remove v from T.
    c. For each neighbor u of v:
        if currentDistance(u) > currentDistance(v) + weight(v, u):
            currentDistance(u) = currentDistance(v) + weight(v, u)
            predecessor(u) = v
3. End.
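A runnable Python version of this procedure, replacing the linear scan in step 2a with a heap (the representation of adj as a dict mapping each vertex to a list of (u, weight) pairs is an assumption):

import heapq, math

def dijkstra(adj, s):
    # adj[v] = list of (u, weight) pairs; all weights non-negative.
    d = {v: math.inf for v in adj}      # currentDistance
    pred = {v: None for v in adj}       # predecessor
    d[s] = 0
    pq = [(0, s)]                       # (currentDistance, vertex)
    done = set()                        # vertices already removed from T
    while pq:
        dist, v = heapq.heappop(pq)     # smallest currentDistance still in T
        if v in done:
            continue                    # stale queue entry
        done.add(v)
        for u, w in adj[v]:             # relax every out-edge of v
            if d[u] > dist + w:
                d[u] = dist + w
                pred[u] = v
                heapq.heappush(pq, (d[u], u))
    return d, pred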
Scheduling
• Greedy scheduling is a heuristic approach to scheduling problems: the algorithm makes the locally optimal choice at each step, hoping to achieve a globally optimal solution.
• The basic idea is to sort the jobs by some criterion, such as earliest deadline or shortest processing time, and then assign the jobs to the machines one by one in that order.
• Each job is assigned to the machine that becomes available earliest.
• For example, let's say we have three jobs J1, J2, and J3, with processing
times p1=2, p2=5, and p3=7, and deadlines d1=3, d2=5, and d3=8,
respectively. We also have two machines M1 and M2, and we want to
assign the jobs to the machines in such a way that the maximum lateness is
minimized.
Cont..
• The greedy algorithm scheduling can be applied in the following way:
• Step 1: Sort the jobs by the earliest deadline. In this case, the order is J1,
J2, and J3.
• Step 2: Assign job J1 to machine M1, since it has the earliest deadline. The job completes at time t=2.
• Step 3: Assign job J2, which has the earliest deadline among the remaining jobs, to machine M2, which is available at t=0. The job completes at time t=5.
• Step 4: Assign job J3 to machine M1, which becomes available at t=2, earlier than M2 at t=5. The job completes at time t=9.
• The lateness of a job is the amount by which its completion time exceeds its deadline. Here the lateness of job J1 is 0, of job J2 is 0, and of job J3 is 9 − 8 = 1. The maximum lateness is therefore 1, achieved when job J3 is completed.
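A minimal Python sketch of this earliest-deadline-first rule (the job-tuple format and function name are illustrative):

def schedule_min_lateness(jobs, machines=2):
    # jobs = list of (name, processing_time, deadline) tuples.
    free_at = [0] * machines                # time at which each machine is free
    max_lateness = 0
    for name, p, d in sorted(jobs, key=lambda j: j[2]):  # earliest deadline first
        m = free_at.index(min(free_at))     # machine that becomes available first
        free_at[m] += p                     # run the job to completion
        max_lateness = max(max_lateness, free_at[m] - d, 0)
    return max_lateness

jobs = [("J1", 2, 3), ("J2", 5, 5), ("J3", 7, 8)]
print(schedule_min_lateness(jobs))          # 1 (job J3 finishes at t=9)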
Cont..
• Another example where the greedy algorithm scheduling can be
applied is the interval scheduling problem. Suppose we have n tasks,
each with a start time and an end time. We want to find the maximum
number of non-overlapping tasks that can be scheduled.
• The greedy algorithm scheduling can be applied in the following way:
• Step 1: Sort the tasks by their end time.
• Step 2: Select the task with the earliest end time, and schedule it.
• Step 3: For each remaining task, if its start time is no earlier than the end time of the previously scheduled task, then schedule it.
• Repeat steps 2 and 3 until no more tasks remain.
Cont..
• For example, suppose we have the following four tasks:
• Task 1: (1, 3)
• Task 2: (2, 4)
• Task 3: (3, 6)
• Task 4: (5, 7)
• The greedy algorithm scheduling can be applied in the following way:
• Step 1: Sort the tasks by their end time: Task 1, Task 2, Task 3, Task 4.
• Step 2: Select Task 1 and schedule it.
• Step 3: Task 2 cannot be scheduled since it overlaps with Task 1.
• Step 4: Task 3 can be scheduled since its start time is no earlier than the end time of Task 1.
• Step 5: Task 4 cannot be scheduled since it overlaps with Task 3.
• Thus, the maximum number of non-overlapping tasks that can be scheduled is 2,
which are Task 1 and Task 3.
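A minimal Python sketch of this end-time-first rule (the (start, end) task format follows the example above):

def interval_schedule(tasks):
    # Greedy interval scheduling: sort by end time, then take every task
    # that starts no earlier than the last scheduled task ends.
    scheduled = []
    last_end = float("-inf")
    for start, end in sorted(tasks, key=lambda t: t[1]):  # by end time
        if start >= last_end:          # no overlap with the last chosen task
            scheduled.append((start, end))
            last_end = end
    return scheduled

tasks = [(1, 3), (2, 4), (3, 6), (5, 7)]
print(interval_schedule(tasks))        # [(1, 3), (3, 6)] -> 2 tasks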
Important Dates
• Presentation assignment
– As per the due date: (i) presentation PPT; (ii) report in DOC; and (iii) major e-resources in PDF
• Algorithm
– DC: Matrix Multiplication, Merge sort
– Greedy: Task scheduling, Dijkstra, Job sequencing
– DP: String matching, All Pair Shortest Path
• Project
– Select a problem and implement it with (i) divide and conquer; (ii) greedy; (iii) dynamic programming
• Final exam