
Topic 7

Greedy Technique
Prof. Hanan Elazhary

Main source: A. Levitin, Introduction to the Design and Analysis of Algorithms, 3rd edition
Greedy Technique
An algorithm design strategy that constructs a solution to an optimization problem piece by piece through a sequence of choices that are
feasible, i.e., satisfying the problem's constraints
locally optimal (with respect to some neighborhood definition)
irrevocable (permanent)
Note: an optimization problem is defined by an objective function and a set of constraints

For some problems, it yields a globally optimal solution for every instance (we are mostly interested in this case)
For most problems, it does not, but it can be useful for fast approximations
Greedy Technique, Contd.
Optimal solutions
change-making problem (for 'normal' coin denominations)
minimum spanning tree (MST) (e.g., Prim's MST Algorithm)
single-source shortest paths problem (e.g., Dijkstra's Single-Source Shortest Paths Algorithm)
simple scheduling problems
coding problem (e.g., Huffman codes)

Approximations/heuristics
traveling salesman problem (TSP)
knapsack problem
other combinatorial optimization problems

In the knapsack problem, for example, we could
a) pick the items in order determined by the highest value
b) pick the items in order determined by the highest value/weight ratio
c) pick the items in order determined by the minimum weight
but we will most probably get only an approximate solution
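The three knapsack heuristics above can be sketched as follows. This is a minimal Python sketch; the item list, capacity, and the function name `greedy_knapsack` are illustrative, not from the slides.

```python
def greedy_knapsack(items, capacity, key):
    """Pick items greedily in the order given by `key`; return (total_value, chosen)."""
    value, weight, chosen = 0, 0, []
    for name, v, w in sorted(items, key=key):
        if weight + w <= capacity:       # feasible: the item still fits
            chosen.append(name)          # irrevocable: never removed later
            weight += w
            value += v
    return value, chosen

items = [("A", 42, 7), ("B", 12, 3), ("C", 40, 4), ("D", 25, 5)]  # (name, value, weight)
cap = 10

by_value  = lambda it: -it[1]          # (a) highest value first
by_ratio  = lambda it: -it[1] / it[2]  # (b) highest value/weight ratio first
by_weight = lambda it: it[2]           # (c) minimum weight first

for label, key in [("value", by_value), ("ratio", by_ratio), ("weight", by_weight)]:
    print(label, greedy_knapsack(items, cap, key))
```

On this particular instance the three orderings pick different item sets; none of the three is guaranteed optimal in general.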
Change-Making Problem
Give change of a specific amount n with the least number of coins, out of an unlimited supply of coins of 'normal' denominations d1 > d2 > . . . > dm
For example, considering the widely used coin denominations in the US
d1 = 25 (quarter), d2 = 10 (dime), d3 = 5 (nickel), and d4 = 1 (penny)
how would you give 48 cents change with the least number of these denominations?
Greedy solution starts with the largest: 1 quarter, 2 dimes, and 3 pennies
feasible, locally optimal, irrevocable
Greedy solution
is optimal for any amount and 'normal' set of denominations (can you prove it?)
may not be optimal for arbitrary coin denominations
For example, with d1 = 25c, d2 = 10c, d3 = 1c, and n = 30c, greedy gives 6 coins (1 quarter and 5 pennies) while 3 dimes suffice
https://math.stackexchange.com/questions/2433735/proving-that-greedy-coin-change-algorithm-gives-optimal-solution-under-certain-c
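The greedy change-making scheme above can be sketched in a few lines of Python (the function name `greedy_change` is illustrative; denominations are assumed to be listed in decreasing order):

```python
def greedy_change(n, denominations):
    """Return a list of coins (largest denomination first) summing to n."""
    coins = []
    for d in denominations:          # denominations in decreasing order
        count, n = divmod(n, d)      # take as many of d as possible
        coins.extend([d] * count)
    return coins

print(greedy_change(48, [25, 10, 5, 1]))  # 1 quarter, 2 dimes, 3 pennies
print(greedy_change(30, [25, 10, 1]))     # 6 coins, although 3 dimes would do
```

The second call reproduces the counterexample above: greedy is not optimal for arbitrary denominations.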
Minimum Spanning Tree (MST)
Spanning tree of an undirected connected graph G
connected acyclic subgraph (tree) of G that includes all of G's vertices
Minimum spanning tree of a weighted undirected connected graph G
spanning tree of G of the minimum total weight

[Figure: a weighted graph T1 on vertices a, b, c, d; T2 and T3 are spanning trees of T1]
Prim’s MST Algorithm
Prim’s algorithm constructs a minimum spanning tree
The idea of Prim’s algorithm is
to start with a tree T formed of any single vertex
and continuously attach to it a fringe vertex such that the edge connecting it to the tree has minimum weight among all candidates
(a fringe vertex is a vertex that is not in the tree but is adjacent to at least one tree vertex)
until all vertices are exhausted
you end up with a minimum spanning tree

It always ends up with a minimum spanning tree (a theorem with a relatively long proof ☺)
Prim’s MST Algorithm, Contd.
In fact, Prim’s algorithm
processes a sequence of expanding trees T1, T2, … , Ti, … , Tn
such that on each iteration, it constructs Ti+1 from Ti by adding a vertex that is not in Ti and is closest to those already in Ti (this is a “greedy” step!)
and stops when all vertices are included
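The expanding-tree process above can be sketched in Python with a heap of candidate crossing edges. This is a minimal sketch; the graph below and the name `prim_mst` are illustrative, not the example graph from the slides.

```python
import heapq

def prim_mst(graph, start):
    """Return the MST edges of a connected undirected weighted graph (adjacency lists)."""
    in_tree = {start}
    mst_edges = []
    heap = [(w, start, v) for v, w in graph[start]]  # (weight, tree vertex, fringe vertex)
    heapq.heapify(heap)
    while heap and len(in_tree) < len(graph):
        w, u, v = heapq.heappop(heap)
        if v in in_tree:
            continue                       # stale entry: v was attached earlier
        in_tree.add(v)
        mst_edges.append((u, v, w))        # greedy step: cheapest edge to a fringe vertex
        for x, wx in graph[v]:
            if x not in in_tree:
                heapq.heappush(heap, (wx, v, x))
    return mst_edges

graph = {
    'a': [('b', 1), ('c', 4), ('d', 6)],
    'b': [('a', 1), ('c', 2), ('d', 3)],
    'c': [('a', 4), ('b', 2)],
    'd': [('a', 6), ('b', 3)],
}
print(prim_mst(graph, 'a'))
```

Each popped edge is a minimum-weight edge crossing the cut between the current tree and the rest of the graph, which is exactly the crucial cut property proved later.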
Prim’s MST Algorithm, Contd.
Example: [figure tracing Prim’s algorithm on a weighted graph with vertices a, b, c, d and edge weights 1, 2, 3, 4, 6, growing the tree one minimum-weight fringe edge at a time]
Prim’s MST Algorithm, Contd.
Example: [figure showing another step-by-step trace of Prim’s algorithm]
Prim’s MST Algorithm, Contd.
A proof by induction that this construction actually yields an MST rests on
The Crucial Property behind Prim’s Algorithm
Claim: Let G = (V,E) be a weighted graph and (X,Y) be a partition of V (called a cut). Suppose e = (x,y) is an edge of E across the cut, where x is in X and y is in Y, and e has the minimum weight among all such crossing edges (such an e is called a light edge). Then there is an MST containing e.

[Figure: a cut (X,Y) with the light edge e = (x,y) crossing it]
https://gtl.csa.iisc.ac.in/dsa/node183.html
Prim’s MST Algorithm, Contd.
Needs a priority queue for locating the closest fringe vertex. The detailed algorithm can be found in Levitin, p. 320.

Efficiency
O(n²) for weight-matrix representation of the graph and array implementation of the priority queue
O(m log n) for adjacency-lists representation of a graph with n vertices and m edges and min-heap implementation of the priority queue
Single Source Shortest Paths Problem
Single-source shortest-paths problem
For a given vertex called the source in a weighted connected graph, find
shortest paths to all its other vertices
transportation planning
packet routing in communication networks, including the Internet
social networks
speech recognition
document formatting
robotics
compilers
airline crew scheduling
pathfinding in video games
finding best solutions to puzzles

Dijkstra’s Shortest Paths Algorithm
Dijkstra’s algorithm finds single-source shortest paths in a weighted connected graph
The idea of Dijkstra’s algorithm is
to start with a boxed source vertex and find the shortest distance from (the source through) the box to each neighboring vertex,
box an unboxed vertex with the shortest distance (from the source)
and repeat until you obtain the single-source shortest paths to all destinations
boxed vertices are final and form the solution tree
Dijkstra’s Shortest Paths Algorithm, Contd.
Example: [a sequence of figures tracing Dijkstra’s algorithm from source A, boxing one vertex per step]
B has the smallest distance from A, and no future path from A to B can be shorter, so it is final
Dijkstra’s Shortest Paths Algorithm, Contd.
Dijkstra’s algorithm
Among vertices not already in the tree, it finds vertex u with the smallest sum dv + w(v,u)
where
v is a vertex for which the shortest path has already been found on preceding iterations (it is in the tree rooted at the source s)
dv is the length of the shortest path from source s to v
w(v,u) is the length (weight) of the edge from v to u

It always ends up with single-source shortest paths to all destinations
because a boxed vertex has the shortest current distance from the source, and no other future such path can be shorter
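The boxing process above can be sketched in Python with a min-heap, where popping a vertex from the heap corresponds to boxing it. This is a minimal sketch; the example graph and the name `dijkstra` are illustrative, not from the slides.

```python
import heapq

def dijkstra(graph, source):
    """Return shortest distances from source to every reachable vertex."""
    dist = {source: 0}
    boxed = set()
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u in boxed:
            continue                 # stale entry: u was boxed with a smaller distance
        boxed.add(u)                 # u's distance is now final (boxed)
        for v, w in graph.get(u, []):
            if v not in boxed and d + w < dist.get(v, float('inf')):
                dist[v] = d + w      # shorter path to v through the box found
                heapq.heappush(heap, (d + w, v))
    return dist

graph = {
    'A': [('B', 3), ('C', 7)],
    'B': [('C', 1), ('D', 5)],
    'C': [('D', 2)],
    'D': [],
}
print(dijkstra(graph, 'A'))
```

On this graph the direct edge A-C of weight 7 is beaten by the path A-B-C of length 4, illustrating why a vertex's distance is only final when it is boxed.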
Dijkstra’s Shortest Paths Algorithm, Contd.
Correctness can be proven by induction on the number of vertices
We prove the invariants: (i) when a vertex is added to the tree, its correct distance is calculated, and (ii) this distance is at least as large as those of the previously added vertices
Applicable to both undirected and directed graphs
Cannot be used for graphs with negative weights (whereas Floyd’s algorithm can, as long as there is no negative cycle)
Can you find a counterexample for Dijkstra’s algorithm?

Efficiency
O(|V|²) for graphs represented by a weight matrix and array implementation of the priority queue
O(|E| log |V|) for graphs represented by adjacency lists and min-heap implementation of the priority queue
https://kgardner.people.amherst.edu/courses/s19/cosc211/handouts/dijkstra_slides.pdf; https://avivt.github.io/avivt/downloads/tutorials/Class3.pdf
Coding Problem
Coding
assignment of bit strings (codewords) to alphabet characters
e.g., we can code {a,b,c,d} as {00,01,10,11} or {0,10,110,111} or {0,01,10,101}

Two types of codes
fixed-length encoding (e.g., ASCII)
variable-length encoding (e.g., Morse code)
e.g., if P(a) = 0.4, P(b) = 0.3, P(c) = 0.2, P(d) = 0.1, then the weighted average length of the code {0,10,110,111} is 0.4 + 2*0.3 + 3*0.2 + 3*0.1 = 1.9 bits

Prefix-free codes (or prefix codes)
no codeword is a prefix of another codeword
This allows for efficient (online) decoding!
e.g., consider the encoded string (msg) 10010110…
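Online decoding with a prefix-free code can be sketched as follows: because no codeword is a prefix of another, a character can be emitted the moment its codeword is matched. The sketch uses the code {a:0, b:10, c:110, d:111} from the example above; the function name `decode` is illustrative.

```python
code = {'a': '0', 'b': '10', 'c': '110', 'd': '111'}   # prefix-free code from the slide
decode_table = {cw: ch for ch, cw in code.items()}

def decode(bits):
    """Decode a bit string online: emit a character as soon as a codeword matches."""
    out, buf = [], ''
    for bit in bits:
        buf += bit
        if buf in decode_table:      # a complete codeword has been read
            out.append(decode_table[buf])
            buf = ''
    return ''.join(out)

print(decode('10010110'))  # → babc
```

The encoded string 10010110 from the slide decodes unambiguously, one codeword at a time, with no backtracking.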
Coding Problem, Contd.
A binary prefix code can be represented by a binary tree
associate the alphabet’s symbols with the leaves of the binary tree
each left edge is labeled by 0 and each right edge is labeled by 1
and the codeword of a symbol can then be obtained by recording the labels on the simple path from the root to the symbol’s leaf

[Figure: a binary tree with 0/1 edge labels whose three leaves represent the prefix code {00, 011, 1}]
Coding Problem, Contd.
If the frequencies of character occurrences are known, what is the best binary prefix-free code?
The one with the shortest average code length
representing, on average, how many bits are required to transmit or store a character

Huffman code
represented by a Huffman tree
that assigns shorter bit strings to high-frequency symbols and longer ones to low-frequency symbols
built by the greedy Huffman algorithm, invented by David Huffman while he was a graduate student at MIT
Huffman Codes
Huffman Algorithm
Initialize n one-node trees and label them with the characters of the alphabet
Record the frequency of each character in its tree’s root to indicate the tree’s weight
(the weight of a tree is equal to the sum of the frequencies in its leaves)
Repeat the following n-1 times, until a single tree is obtained
Find the two trees with the smallest weights
Join them as left and right subtrees of a new tree and record the sum of their weights in the root of the new tree as its weight
Mark the edges leading to the left and right subtrees with 0 and 1 respectively
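The steps above can be sketched in Python using a heap of (weight, id, tree) entries, where the id only breaks ties between equal weights. The slides' example alphabet is not fully recoverable, so the frequencies from the 1.9-bit example earlier are reused here; the name `huffman_codes` is illustrative.

```python
import heapq
from itertools import count

def huffman_codes(freq):
    """Return {char: codeword} built by repeatedly joining the two lightest trees."""
    tiebreak = count()                       # avoids comparing trees on equal weights
    heap = [(w, next(tiebreak), ch) for ch, w in freq.items()]  # n one-node trees
    heapq.heapify(heap)
    while len(heap) > 1:                     # n-1 joins until a single tree remains
        w1, _, left = heapq.heappop(heap)    # the two trees with the smallest weights
        w2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, next(tiebreak), (left, right)))
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):          # internal node: 0 to the left, 1 to the right
            walk(tree[0], prefix + '0')
            walk(tree[1], prefix + '1')
        else:                                # leaf: a character of the alphabet
            codes[tree] = prefix or '0'
    walk(heap[0][2], '')
    return codes

freq = {'a': 0.4, 'b': 0.3, 'c': 0.2, 'd': 0.1}
codes = huffman_codes(freq)
avg = sum(freq[ch] * len(cw) for ch, cw in codes.items())
print(codes, avg)
```

With these frequencies the resulting codeword lengths are 1, 2, 3, 3 and the weighted average is 1.9 bits per character, matching the hand computation earlier (the exact 0/1 labels depend on tie-breaking).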
Huffman Codes, Contd.
Example: [a sequence of figures: after initialization, the two trees with the smallest weights are repeatedly joined into a new tree until a single Huffman tree remains]

weighted average bits per character: 2.25 bits
for fixed-length encoding: we need 3 bits per character
compression ratio: (3-2.25)/3*100% = 25%
Questions?

You might also like

pFad - Phonifier reborn

Pfad - The Proxy pFad of © 2024 Garber Painting. All rights reserved.

Note: This service is not intended for secure transactions such as banking, social media, email, or purchasing. Use at your own risk. We assume no liability whatsoever for broken pages.


Alternative Proxies:

Alternative Proxy

pFad Proxy

pFad v3 Proxy

pFad v4 Proxy