
The Greedy Method



Outline
The Greedy Method Technique
Fractional Knapsack Problem
Task Scheduling (§5.1.2)
Examples



The Greedy Method Technique
The greedy method is a general algorithm design paradigm, built on the following elements:
• configurations: the different choices, collections, or values to find
• objective function: a score assigned to configurations, which we want to either maximize or minimize
It works best when applied to problems with the greedy-choice property:
• a globally optimal solution can always be found by a series of local improvements from a starting configuration.
The Greedy Method
An optimisation problem (OP) is a problem that involves searching through a set of configurations to find one that minimises or maximises an objective function defined on these configurations.
The greedy method solves a given OP by going through a sequence of (feasible) choices.
The sequence starts from a well-understood starting configuration, and then iteratively makes the decision that seems best among all those that are currently possible.
The Greedy Method

The greedy approach does not always


lead to an optimal solution.
The problems that have a greedy
solution are said to posses the greedy-
choice property.
The greedy approach is also used in the
context of hard (difficult to solve)
problems in order to generate an
approximate solution.



Making Change
Problem: A dollar amount to reach and a collection of coin denominations to use to get there.
Configuration: A dollar amount yet to return to a customer plus the coins already returned.
Objective function: Minimize the number of coins returned.
Greedy solution: Always return the largest coin you can.
Example 1: Coins are valued $.32, $.08, $.01. Here the greedy choice works: each denomination is a multiple of the next smaller one, so returning the largest coin possible never hurts.
Example 2: Coins are valued $.30, $.20, $.05, $.01. Here the greedy choice fails: for $.40 the greedy method returns $.30 + $.05 + $.05 (three coins), while the optimal answer is two $.20 coins.
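Below is a minimal runnable sketch (plain Python, not part of the original slides) of the greedy change-making rule described above; the function name make_change_greedy and the use of integer cents are illustrative choices. It reproduces both examples.

# Minimal sketch of the greedy change-making rule from this slide.
# The function name and the use of integer cents are illustrative choices.
def make_change_greedy(amount_cents, coin_values_cents):
    """Always return the largest coin that still fits."""
    coins = []
    for coin in sorted(coin_values_cents, reverse=True):
        while amount_cents >= coin:
            amount_cents -= coin
            coins.append(coin)
    return coins

# Example 1: greedy is optimal for coins valued $.32, $.08, $.01.
print(make_change_greedy(40, [32, 8, 1]))      # [32, 8] -- 2 coins
# Example 2: greedy is not optimal for coins valued $.30, $.20, $.05, $.01.
print(make_change_greedy(40, [30, 20, 5, 1]))  # [30, 5, 5] -- 3 coins; optimal is [20, 20]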



Job Sequencing Problem

A greedy algorithm always makes the choice (the greedy criterion) that looks best at the moment, in order to optimize a given objective. This algorithm doesn't always guarantee the optimal solution; however, it generally produces a solution that is very close in value to the optimal one.
“Greedy” in this context means “always doing the locally optimal thing”.



The Fractional Knapsack Problem
Given: A set S of n items, with each item i having
• b_i - a positive benefit
• w_i - a positive weight
Goal: Choose items with maximum total benefit but with total weight at most W.
If we are allowed to take fractional amounts, then this is the fractional knapsack problem.
• In this case, we let x_i denote the amount we take of item i
• Objective: maximize Σ_{i∈S} b_i (x_i / w_i)
• Constraint: Σ_{i∈S} x_i ≤ W
Example
Given: A set S of n items, with each item i having
• b_i - a positive benefit
• w_i - a positive weight
Goal: Choose items with maximum total benefit but with total weight at most W (here the “knapsack” holds W = 10 ml).

Items:               1      2      3      4      5
Weight:           4 ml   8 ml   2 ml   6 ml   1 ml
Benefit:           $12    $32    $40    $30    $50
Value ($ per ml):    3      4     20      5     50

Solution (10 ml in total):
• 1 ml of item 5
• 2 ml of item 3
• 6 ml of item 4
• 1 ml of item 2
The Fractional Knapsack Algorithm
Greedy choice: Keep taking the item with the highest value (benefit-to-weight ratio).
• Since Σ_{i∈S} b_i (x_i / w_i) = Σ_{i∈S} (b_i / w_i) x_i
• Run time: O(n log n). Why?
Correctness: Suppose there is a better solution.
• Then there is an item i with a higher value than some chosen item j (v_i > v_j), but x_i < w_i and x_j > 0.
• If we substitute some of j with i, we get a better solution.
• How much of i: min{w_i - x_i, x_j}
• Thus, there is no better solution than the greedy one.

Algorithm fractionalKnapsack(S, W)
  Input: set S of items with benefit b_i and weight w_i; maximum total weight W
  Output: amount x_i of each item i that maximizes the benefit with total weight at most W
  for each item i in S
    x_i ← 0
    v_i ← b_i / w_i            {value}
  w ← 0                        {total weight}
  while w < W
    remove the item i with the highest v_i
    x_i ← min{w_i, W - w}
    w ← w + min{w_i, W - w}
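Below is a minimal runnable sketch (plain Python, not part of the original slides) of this greedy rule; the function name fractional_knapsack and the sort-based selection are illustrative choices. It is run on the item data from the example slide above.

# Minimal sketch of the fractional knapsack greedy rule from this slide.
# Items are (benefit, weight) pairs; the function returns the amounts x_i taken.
def fractional_knapsack(items, capacity):
    # Sort item indices by value (benefit-to-weight ratio), highest first -- O(n log n).
    order = sorted(range(len(items)), key=lambda i: items[i][0] / items[i][1], reverse=True)
    amounts = [0.0] * len(items)
    remaining = capacity
    for i in order:
        if remaining <= 0:
            break
        benefit, weight = items[i]
        amounts[i] = min(weight, remaining)   # take as much of item i as still fits
        remaining -= amounts[i]
    return amounts

# Items 1..5 from the example slide: (benefit in $, weight in ml).
items = [(12, 4), (32, 8), (40, 2), (30, 6), (50, 1)]
amounts = fractional_knapsack(items, capacity=10)
print(amounts)   # amounts of items 1..5: 0, 1 ml, 2 ml, 6 ml, 1 ml (matches the slide's solution)
print(sum(b * x / w for (b, w), x in zip(items, amounts)))   # total benefit: 124.0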
Task Scheduling
Given: a set T of n tasks, each having:
• A start time, s_i
• A finish time, f_i (where s_i < f_i)
Goal: Perform all the tasks using a minimum number of “machines.”

[Figure: tasks laid out along a timeline from 1 to 9 and assigned to Machines 1-3.]



The Task Scheduling Algorithm
Greedy choice: consider tasks by their start time and use as few machines as possible with this order.
• Run time: O(n log n). Why?
Correctness: Suppose there is a better schedule.
• The better schedule uses k - 1 machines, while the algorithm uses k.
• Let i be the first task scheduled on machine k.
• Task i must conflict with k - 1 other tasks, one already running on each of the other machines (otherwise the algorithm would not have opened machine k).
• But that means there is no non-conflicting schedule using k - 1 machines, a contradiction.

Algorithm taskSchedule(T)
  Input: set T of tasks with start time s_i and finish time f_i
  Output: non-conflicting schedule with the minimum number of machines
  m ← 0                              {number of machines}
  while T is not empty
    remove the task i with the smallest s_i
    if there is a machine j whose tasks do not conflict with i then
      schedule i on machine j
    else
      m ← m + 1
      schedule i on machine m
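Below is a minimal runnable sketch (plain Python, not part of the original slides) of this greedy rule; the list-based machine bookkeeping and the half-open interval convention (a task may start exactly when another finishes on the same machine) are assumptions. It is run on the task set from the example slide that follows.

# Minimal sketch of the task-scheduling greedy rule from this slide.
# Tasks are (start, finish) pairs; intervals are treated as half-open, so a task
# may start exactly when another finishes on the same machine (an assumption).
def task_schedule(tasks):
    machines = []                                   # machines[j] = tasks assigned to machine j
    for start, finish in sorted(tasks):             # consider tasks by start time
        for assigned in machines:
            if assigned[-1][1] <= start:            # last task on this machine has finished
                assigned.append((start, finish))
                break
        else:                                       # no existing machine is free
            machines.append([(start, finish)])
    return machines

# The seven tasks from the example slide.
tasks = [(1, 4), (1, 3), (2, 5), (3, 7), (4, 7), (6, 9), (7, 8)]
for j, assigned in enumerate(task_schedule(tasks), start=1):
    print("Machine", j, ":", assigned)
# Three machines suffice; three tasks overlap between times 3 and 4, so three are also necessary.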
Example
Given: a set T of n tasks, each having:
• A start time, s_i
• A finish time, f_i (where s_i < f_i)
• [1,4], [1,3], [2,5], [3,7], [4,7], [6,9], [7,8] (ordered by start time)
Goal: Perform all the tasks using a minimum number of machines.

[Figure: the seven tasks scheduled on three machines along a timeline from 1 to 9.]



Greedy Choice Property: from a local optimum we can reach a global optimum, without having to reconsider the decisions already taken.

Optimal Substructure Property: the optimal solution to a problem can be determined from the optimal solutions to its subproblems.

