Lecture 3: Greedy Method (Full)

The document discusses greedy algorithms, which make locally optimal choices in the hope of achieving a globally optimal solution, though this approach does not always guarantee optimality. It covers specific applications such as the Knapsack Problem, Job Sequencing with Deadlines, Minimum-Cost Spanning Trees (with Prim's and Kruskal's algorithms), and Single-Source Shortest Paths (with Dijkstra's algorithm). It also outlines the analysis of these algorithms, including their running times and operational characteristics.


GREEDY ALGORITHM

Outline

 Introduction
 Knapsack Problem
 Job Sequencing with Deadlines
 Minimum-Cost Spanning Trees
 Prim’s Algorithm
 Kruskal’s Algorithm
 Single-Source Shortest Paths
 Dijkstra’s Algorithm
Introduction
 A greedy algorithm always makes the choice that looks best
at the moment.
 That is, it makes a locally optimal choice in the hope that
this choice will lead to a globally optimal solution.
 The greedy approach does not always lead to an optimal
solution.
 The problems that have a greedy solution are said to possess the greedy-choice property.
 The greedy approach is also used in the context of hard
(difficult to solve) problems in order to generate an
approximate solution.
Knapsack Problem

 We are given n objects and a knapsack of capacity W. Object i has weight wi and profit pi, and a fraction xi (0 ≤ xi ≤ 1) of it may be placed in the knapsack. The objective is to maximize Σ pi·xi subject to the constraint Σ wi·xi ≤ W.
 A feasible solution is any set (x1,…,xn) satisfying the constraints. An optimal solution is a feasible solution for which the objective is maximized.
Knapsack Problem (Cont..)
 Example: Consider the instance of the knapsack problem with n=3, W=20, (p1,p2,p3)=(25,24,15) and (w1,w2,w3)=(18,15,10). Four feasible solutions are:

   (x1, x2, x3)         Σ wi·xi   Σ pi·xi
   1. (1/2, 1/3, 1/4)     16.5      24.25
   2. (1, 2/15, 0)        20        28.2
   3. (0, 2/3, 1)         20        31
   4. (0, 1, 1/2)         20        31.5

 Of these four feasible solutions, solution 4 yields the maximum profit.
Knapsack Problem (Cont..)
Greedy_Knapsack(W, n)
// p[1:n] and w[1:n] contain the profits and weights respectively of the
// n objects, ordered so that p[i]/w[i] ≥ p[i+1]/w[i+1]. W is the knapsack
// size and x[1:n] is the solution vector.
  for i = 1 to n do
    x[i] = 0
  end for
  M = W
  for i = 1 to n do
    if w[i] > M then
      break
    end if
    x[i] = 1.0; M = M - w[i]
  end for
  if i ≤ n then x[i] = M/w[i] end if
End Greedy_Knapsack
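
The same greedy strategy can be written as a short runnable Python sketch. This is a minimal sketch of the fractional knapsack rule above; the function name and the sorting step (instead of assuming pre-sorted input) are illustrative choices, not part of the lecture:

def greedy_knapsack(profits, weights, capacity):
    # Consider objects in non-increasing profit/weight ratio.
    order = sorted(range(len(profits)),
                   key=lambda i: profits[i] / weights[i], reverse=True)
    x = [0.0] * len(profits)            # solution vector, each x[i] in [0, 1]
    remaining = capacity
    for i in order:
        if weights[i] <= remaining:     # the whole object still fits
            x[i] = 1.0
            remaining -= weights[i]
        else:                           # take only the fraction that fits, then stop
            x[i] = remaining / weights[i]
            break
    return x, sum(p * xi for p, xi in zip(profits, x))

# The instance from the example: n = 3, W = 20
print(greedy_knapsack([25, 24, 15], [18, 15, 10], 20))
# -> ([0.0, 1.0, 0.5], 31.5), i.e. solution 4 above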
Job Sequencing with Deadlines
 We are given n jobs; each job i has a deadline di ≥ 1 and a profit pi > 0, and takes one unit of time on a single machine. The profit pi is earned only if job i is completed by its deadline.
 A feasible solution is a subset of jobs that can all be completed by their deadlines; its value is the sum of their profits. The goal is to find a feasible solution with maximum value.
Job Sequencing with Deadlines (Cont..)
 Example: let n=4, (p1,p2,p3,p4)=(100,10,15,27) and (d1,d2,d3,d4)=(2,1,2,1). The feasible solutions and their values are:

   Feasible solution   Processing sequence   Value
   1. {1, 2}           2, 1                  110
   2. {1, 3}           1, 3 or 3, 1          115
   3. {1, 4}           4, 1                  127
   4. {2, 3}           2, 3                   25
   5. {3, 4}           4, 3                   42
   6. {1}              1                     100
   7. {2}              2                      10
   8. {3}              3                      15
   9. {4}              4                      27

Solution 3 is optimal.
Job Sequencing with Deadlines (Cont..)
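The standard greedy rule for this problem considers jobs in non-increasing order of profit and places each job in the latest still-free unit-time slot on or before its deadline, skipping jobs that no longer fit. A minimal Python sketch of that rule (the function name and the slot representation are illustrative assumptions, not the lecture's code):

def job_sequencing(profits, deadlines):
    n = len(profits)
    order = sorted(range(n), key=lambda j: profits[j], reverse=True)  # most profitable first
    max_slots = max(deadlines)
    slot = [None] * (max_slots + 1)     # slot[t] = job occupying unit-time slot t (1-based)
    for j in order:
        t = deadlines[j]
        while t >= 1 and slot[t] is not None:   # latest free slot at or before the deadline
            t -= 1
        if t >= 1:
            slot[t] = j
    scheduled = [j for j in slot[1:] if j is not None]
    return scheduled, sum(profits[j] for j in scheduled)

# The instance from the example (jobs 1..4 correspond to indices 0..3):
print(job_sequencing([100, 10, 15, 27], [2, 1, 2, 1]))
# -> ([3, 0], 127): jobs 4 and 1 are scheduled, total profit 127 (solution 3 above)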
Minimum cost spanning tree (MCST)
 Tree: No cycles; equivalently, for each pair of nodes u
and v, there is only one path from u to v
 Spanning: Contains every node in the graph

 Minimum cost: Smallest possible total weight of any spanning tree

 [Figure: a small example graph on nodes a, b, c, d with weighted edges]
MCST (Cont..)
 [Figures: two copies of the small example graph, each with a subset T of nodes and edges shown in black]
 Left: Black edges and nodes are in T. Is T a MCST? No: not spanning; d is not in T.
 Right: Black edges and nodes are in T. Is T a MCST? No: not minimum cost; we can swap edge 4 for edge 2.
MCST (Cont..)

 Which edges form a MCST?

 [Figures: the small example graph shown twice; the second copy highlights a set of edges forming a minimum-cost spanning tree]
Application of MCST
 Electronic circuit designs
 Planning how to lay network cable to connect several
locations to the internet
 Planning how to efficiently bounce data from router to
router to reach its internet destination
 Creating a 2D maze (to print on cereal boxes, etc.)
Prim's Algorithm
 Prim’s algorithm takes a graph G=(V, E) and builds an
MCST T
 Major steps of the algorithm:
 Pick an arbitrary node r from V
 Add r to T
 While T contains < |V| nodes
 Find a minimum-weight edge (u, v) where u ∈ T and v ∉ T
 Add node v to T
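
A minimal heap-based Python sketch of these steps follows; the adjacency-list representation, the function name, and the small test graph are illustrative assumptions, not the lecture's code:

import heapq

def prim_mst(graph, root):
    # graph: dict mapping node -> list of (weight, neighbour) pairs
    in_tree = {root}
    mst = []
    heap = [(w, root, v) for w, v in graph[root]]   # candidate edges leaving the tree
    heapq.heapify(heap)
    while heap and len(in_tree) < len(graph):
        w, u, v = heapq.heappop(heap)   # minimum-weight edge crossing out of the tree
        if v in in_tree:
            continue                    # stale entry: both endpoints already in T
        in_tree.add(v)
        mst.append((u, v, w))
        for w2, x in graph[v]:
            if x not in in_tree:
                heapq.heappush(heap, (w2, v, x))
    return mst

# A small illustrative graph (hypothetical, not one of the lecture's figures):
g = {'a': [(1, 'b'), (2, 'c')],
     'b': [(1, 'a'), (4, 'd')],
     'c': [(2, 'a'), (3, 'd')],
     'd': [(4, 'b'), (3, 'c')]}
print(prim_mst(g, 'a'))   # -> [('a', 'b', 1), ('a', 'c', 2), ('c', 'd', 3)]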
Complete Graph

[Figure: the example graph with 10 nodes A-J and 16 weighted edges; the same graph and edge list appear in the Kruskal section below]
Old Graph / New Graph
 [Figures: a sequence of snapshots of Prim's algorithm on the example graph; at each step the old graph highlights the minimum-weight edge leaving the current tree, and the new graph shows the tree after that edge and node have been added]
Complete Graph / Minimum Spanning Tree
 [Figure: the complete example graph shown next to the resulting minimum spanning tree]
Analysis of Prim's Algorithm
 Running time = O(m + n log n) with a Fibonacci heap (m = edges, n = nodes).
 If a heap is not used, the running time is O(n^2) instead of O(m + n log n). However, using a heap complicates the code, since it adds another data structure to maintain; a Fibonacci heap gives the best bound, but complicates the code further.
 Unlike Kruskal's algorithm, Prim's does not need to see all of the graph at once; it can deal with it one piece at a time. It also does not need to check whether adding an edge will create a cycle, since the algorithm deals primarily with the nodes rather than the edges.
 For this algorithm the number of nodes needs to be kept small in addition to the number of edges. For small graphs the edges matter more, while for large graphs the number of nodes matters more.
Kruskal's Algorithm
 This algorithm creates a forest of trees.
 Initially the forest consists of n single node trees (and
no edges).
 At each step, we add one edge (the cheapest one) so
that it joins two trees together.
 If it were to form a cycle, it would simply link two
nodes that were already part of a single connected
tree, so that this edge would not be needed.
Kruskal's Algorithm (Steps)
 The forest is constructed - with each node in a separate tree.
 The edges are placed in a priority queue.
 Until we've added n-1 edges,
 Extract the cheapest edge from the queue,
 If it forms a cycle, reject it,
 Else add it to the forest. Adding it to the forest will join two
trees together.
 Every step will have joined two trees in the forest together, so that
at the end, there will only be one tree in T
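
A minimal Python sketch of these steps, using a simple union-find (disjoint-set) structure to detect cycles; the function names are illustrative, and the edge list is the one shown on the example slides that follow:

def kruskal_mst(nodes, edges):
    # edges: list of (weight, u, v) tuples
    parent = {v: v for v in nodes}          # union-find: every node starts as its own tree

    def find(v):                            # root of the tree containing v (with path compression)
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    mst = []
    for w, u, v in sorted(edges):           # cheapest edge first (a priority queue also works)
        ru, rv = find(u), find(v)
        if ru == rv:
            continue                        # u and v are already in one tree: reject (cycle)
        parent[ru] = rv                     # join the two trees
        mst.append((u, v, w))
        if len(mst) == len(nodes) - 1:      # n - 1 edges added: the forest is a single tree
            break
    return mst

# The example graph from the following slides (10 nodes A-J, 16 weighted edges):
edges = [(4, 'A', 'B'), (1, 'A', 'D'), (4, 'B', 'C'), (4, 'B', 'D'), (10, 'B', 'J'),
         (2, 'C', 'E'), (1, 'C', 'F'), (5, 'D', 'H'), (6, 'D', 'J'), (2, 'E', 'G'),
         (3, 'F', 'G'), (5, 'F', 'I'), (3, 'G', 'I'), (4, 'G', 'J'), (2, 'H', 'J'),
         (3, 'I', 'J')]
mst = kruskal_mst('ABCDEFGHIJ', edges)
print(mst, sum(w for _, _, w in mst))       # 9 edges, total weight 22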
Complete Graph

 [Figure: the example graph with 10 nodes A-J and 16 weighted edges, listed below]
The edges of the graph, with their weights:
A-B 4, A-D 1, B-C 4, B-D 4, B-J 10, C-E 2, C-F 1, D-H 5,
D-J 6, E-G 2, F-G 3, F-I 5, G-I 3, G-J 4, H-J 2, I-J 3
Sort Edges
(in reality they are placed in a priority queue, not sorted, but sorting them makes the algorithm easier to visualize)
A-D 1, C-F 1, C-E 2, E-G 2, H-J 2, F-G 3, G-I 3, I-J 3,
A-B 4, B-D 4, B-C 4, G-J 4, F-I 5, D-H 5, D-J 6, B-J 10
 [Figure: the example graph, unchanged]
Add Edge / Cycle: Don't Add Edge (step-by-step)
 [Figures: each slide highlights the edge currently being considered, taken in sorted order]
 Add A-D 1
 Add C-F 1
 Add C-E 2
 Add E-G 2
 Add H-J 2
 F-G 3: would form a cycle, don't add
 Add G-I 3
 Add I-J 3
 Add A-B 4
 B-D 4: would form a cycle, don't add
 Add B-C 4 (the ninth edge; the tree now spans all ten nodes)
Minimum Spanning Tree / Complete Graph
 [Figure: the resulting minimum spanning tree (edges A-D, C-F, C-E, E-G, H-J, G-I, I-J, A-B, B-C; total weight 22) shown beside the original complete graph]
Analysis of Kruskal's Algorithm
 Running time = O(m log n) (m = edges, n = nodes).
 Testing whether an edge creates a cycle can be slow unless a data structure called a union-find (disjoint-set) structure is used.
 The algorithm usually only has to check a small fraction of the edges, but in some cases (for example, if a vertex is connected to the rest of the graph by a single edge and that edge happens to be the longest one) it has to check all the edges.
 This algorithm works best, of course, if the number of edges is kept to a minimum, i.e., on sparse graphs.
Single-Source Shortest Paths
 The problem of finding shortest paths from a
source vertex v to all other vertices in the graph.
Application of SSSP
 Maps (MapQuest, Google Maps)
 Routing systems
Dijkstra's Algorithm
 This is a solution to the single-source shortest path problem in graph theory.
 It works on both directed and undirected graphs. However, all edges must have nonnegative weights.
 Input: a weighted graph G = (V, E) and a source vertex v ∈ V, such that all edge weights are nonnegative.
 Output: the lengths of the shortest paths (or the shortest paths themselves) from the given source vertex v ∈ V to all other vertices.
Approach
 The algorithm computes for each vertex u the distance to u
from the start vertex v, that is, the weight of a shortest path
between v and u.
 The algorithm keeps track of the set of vertices for which the
distance has been computed, called the cloud C
 Every vertex has a label D associated with it. For any vertex u,
D[u] stores an approximation of the distance between v and u.
The algorithm will update a D[u] value when it finds a shorter
path from v to u.
 When a vertex u is added to the cloud, its label D[u] is equal
to the actual (final) distance between the starting vertex v and
vertex u.
Example: Initialization
 Distance(source) = 0; Distance(all vertices but source) = ∞
 [Figure: weighted graph on vertices A-G with source A; edge weights used in the updates below: A-B 2, A-D 1, B-D 3, B-E 10, C-A 4, C-F 5, D-C 2, D-E 2, D-F 8, D-G 4, E-G 6, G-F 1]
 Pick the vertex in the list with the minimum distance.

Example: Update neighbors' distance
 Processing A: Distance(B) = 2, Distance(D) = 1

Example: Remove vertex with minimum distance
 Pick the vertex in the list with the minimum distance, i.e., D.

Example: Update neighbors
 Distance(C) = 1 + 2 = 3
 Distance(E) = 1 + 2 = 3
 Distance(F) = 1 + 8 = 9
 Distance(G) = 1 + 4 = 5

Example: Continued...
 Pick the vertex in the list with the minimum distance (B) and update its neighbors.
 Note: Distance(D) is not updated since D is already known, and Distance(E) is not updated since the new value would be larger than the previously computed one.

Example: Continued...
 Pick the vertex in the list with the minimum distance (E) and update its neighbors. No updating.

Example: Continued...
 Pick the vertex in the list with the minimum distance (C) and update its neighbors.
 Distance(F) = 3 + 5 = 8

Example: Continued...
 Pick the vertex in the list with the minimum distance (G) and update its neighbors.
 Distance(F) = min(8, 5 + 1) = 6

Example (end)
 Pick the vertex not in S with the lowest cost (F) and update its neighbors; nothing changes.
 Final distances from A: A = 0, B = 2, C = 3, D = 1, E = 3, F = 6, G = 5
Another Example
 [Figures: a further worked example of Dijkstra's algorithm, given only as diagrams on the original slides]
Dijkstra’s Pseudo Code
 Input: graph G, weight function w, root (source) s
 [Figure: pseudocode that initializes the distance estimates and then repeatedly relaxes edges]
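
A minimal heap-based Python sketch of the algorithm, with the relaxation step marked. The adjacency-list representation and the directed reading of the example graph (edge directions inferred from the walk-through above) are assumptions, not the slide's exact pseudocode:

import heapq

def dijkstra(graph, source):
    # graph: dict mapping vertex -> list of (weight, neighbour); all weights nonnegative
    dist = {v: float('inf') for v in graph}
    dist[source] = 0
    cloud = set()                            # vertices whose distance is final
    heap = [(0, source)]                     # (tentative distance, vertex)
    while heap:
        d, u = heapq.heappop(heap)
        if u in cloud:
            continue                         # stale heap entry; u already finalized
        cloud.add(u)
        for w, v in graph[u]:
            if d + w < dist[v]:              # relax edge (u, v)
                dist[v] = d + w
                heapq.heappush(heap, (dist[v], v))
    return dist

# The worked example, read as a directed graph with source A:
g = {'A': [(2, 'B'), (1, 'D')],
     'B': [(3, 'D'), (10, 'E')],
     'C': [(4, 'A'), (5, 'F')],
     'D': [(2, 'C'), (2, 'E'), (8, 'F'), (4, 'G')],
     'E': [(6, 'G')],
     'F': [],
     'G': [(1, 'F')]}
print(dijkstra(g, 'A'))
# -> {'A': 0, 'B': 2, 'C': 3, 'D': 1, 'E': 3, 'F': 6, 'G': 5}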
Thanks for your Attention

Q&A
