Daa 3

The document discusses various optimization problems, including the Traveling Salesman Problem, Knapsack Problem, Network Flow Optimization, and Portfolio Optimization, along with techniques like Greedy Algorithms, Dynamic Programming, and Linear Programming. It explains the components of optimization problems, such as objective functions, decision variables, and constraints, and provides detailed examples of the Fractional Knapsack Problem and Job Sequencing Problem using Greedy Algorithms. Additionally, it touches on Huffman Coding for lossless data compression.

17-09-2024

Optimisation Problems

An optimization problem involves finding the best solution from all feasible solutions,
according to a specific criterion or objective function.

1. Traveling Salesman Problem (TSP): Finding the shortest possible route that visits a set of cities and returns to the origin city.

2. Knapsack Problem: Given a set of items, each with a weight and a value, determine the number of each item to include in a collection so that the total weight is less than or equal to a given limit and the total value is maximized.

3. Network Flow Optimization: Finding the maximum flow in a network, such as the maximum data that can be sent from one point to another in a computer network.

4. Portfolio Optimization: Selecting the best combination of investments to maximize returns while minimizing risk.

Common Techniques:

1. Greedy Algorithms: As discussed earlier, these make the locally optimal choice at each step with the hope of finding a global optimum.

2. Dynamic Programming: Solves complex problems by breaking them down into simpler subproblems, which are solved just once and stored for future use.

3. Linear Programming: Uses linear equations to represent the problem and employs methods like the Simplex algorithm to find the optimal solution.

Components of an Optimization Problem:

1. Objective Function: The function that needs to be maximized or minimized. For example, maximizing profit or minimizing cost.

2. Decision Variables: The variables that can be controlled or adjusted to optimize the objective function. For example, the amount of resources allocated to a task.

3. Constraints: The conditions or limitations that the solution must satisfy. For example, budget limits, resource availability, or time constraints.


Greedy Technique
• The Greedy Technique is an algorithmic approach used to solve optimization
problems by making a series of choices, each of which looks the most
immediately beneficial.

• The technique follows the principle of making the best possible decision at each
step (locally optimal choice), with the hope that these choices will lead to a
globally optimal solution.

• Most problems have n inputs and require us to obtain a subset that satisfies some constraints. Any subset that satisfies these constraints is called a feasible solution. We need to find a feasible solution that either maximizes or minimizes a given objective function; such a solution is called an optimal solution.
1. Solution space: the set of all possible solutions (for the given n inputs, a possible solution may or may not satisfy the constraints).
2. Feasible solution: any subset of solutions that satisfies the constraints.
3. Optimal solution: the feasible solution that optimises the objective function, by minimising or maximising it.

• Limitations — Not Always Optimal: The Greedy Technique does not always produce an optimal solution.

Key Characteristics:

1. Local Optimality: At each step, the algorithm picks the best option available without considering the future consequences.

2. Feasibility: The algorithm ensures that the choice made at each step is feasible within the problem’s constraints.

3. Irrevocability: Once a choice is made, it cannot be undone or revisited.


Fractional Knapsack Problem

• It is a classic optimization problem where we are given a set of items, each with a
weight and a profit value, and we need to determine the most profitable combination
of items to include in a knapsack that has a weight capacity limit.

• Given the weights and profits of N items, in the form of {profit, weight} put these
items in a knapsack of capacity W to get the maximum total profit in the knapsack.

• Example 1: Items with (profit, weight) = {(60, 30), (100, 20), (120, 30)}, W = 60

  Item  Profit  Weight  Fraction  Weight taken  Profit earned
  1     60      30      1         30            60
  2     100     20      1         20            100
  3     120     30      1/3       10            40

  Total Profit = 200. Is this optimised?

• Brute-force approach: The brute-force approach tries all the possible solutions
with all the different fractions but it is a time-consuming approach.

• Greedy approach: In the Greedy approach, we calculate the profit/weight ratio of each item and select items accordingly; the item with the highest ratio is selected first.

The objective of the knapsack problem is to put items into the knapsack, up to its capacity, so as to maximise the total profit.

Let Xi be the fraction of the i-th item placed in the knapsack; after placing it, the remaining capacity equals M - Wi * Xi.

After filling the knapsack with objects:

  Total weight = Σ Wi * Xi
  Total profit = Σ Pi * Xi

  Constraint: M >= Σ Wi * Xi
  Objective function: Σ Pi * Xi must be maximum.


Example 1: Weight capacity of the knapsack (W): 15, No. of items (n): 7

  Objects:    1   2   3   4   5   6   7
  Profit (P): 5  10  15   7   8   9   4
  Weight (w): 1   3   5   4   1   3   2

Approach 1: Lowest-weight object taken first (order: 1, 5, 7, 2, 6, 4, 3).

  Obj  w  x    Profit         Remaining
  1    1  1    5              15 - 1 = 14
  5    1  1    8              14 - 1 = 13
  7    2  1    4              13 - 2 = 11
  2    3  1    10             11 - 3 = 8
  6    3  1    9              8 - 3 = 5
  4    4  1    7              5 - 4 = 1
  3    5  1/5  1/5 * 15 = 3   1 - 1 = 0

  Total profit = 46

Approach 2: Highest-weight object taken first (order: 3, 4, 6, 2).

  Obj  w  x  Profit  Remaining
  3    5  1  15      15 - 5 = 10
  4    4  1  7       10 - 4 = 6
  6    3  1  9       6 - 3 = 3
  2    3  1  10      3 - 3 = 0

  Total profit = 41

Example 1 (continued): W = 15, n = 7

Approach 3: Lowest-profit object taken first (order: 7, 1, 4, 5, 6, 2, 3).

  Obj  w  x    Profit         Remaining
  7    2  1    4              15 - 2 = 13
  1    1  1    5              13 - 1 = 12
  4    4  1    7              12 - 4 = 8
  5    1  1    8              8 - 1 = 7
  6    3  1    9              7 - 3 = 4
  2    3  1    10             4 - 3 = 1
  3    5  1/5  1/5 * 15 = 3   1 - 1 = 0

  Total profit = 46

Approach 4: Highest-profit object taken first (order: 3, 2, 6, 5, 4).

  Obj  w  x    Profit          Remaining
  3    5  1    15              15 - 5 = 10
  2    3  1    10              10 - 3 = 7
  6    3  1    9               7 - 3 = 4
  5    1  1    8               4 - 1 = 3
  4    4  3/4  3/4 * 7 = 5.25  3 - 3 = 0

  Total profit = 47.25


Example 1 (continued): W = 15, n = 7

  Objects:    1    2     3    4     5   6   7
  Profit (P): 5    10    15   7     8   9   4
  Weight (w): 1    3     5    4     1   3   2
  P/W:        5    3.33  3    1.75  8   3   2
  Rank:       2    3     4    7     1   5   6

Approach 5: Highest profit/weight ratio (P/W) taken first.

  Obj  w  x  Profit  Remaining
  5    1  1  8       15 - 1 = 14
  1    1  1  5       14 - 1 = 13
  2    3  1  10      13 - 3 = 10
  3    5  1  15      10 - 5 = 5
  6    3  1  9       5 - 3 = 2
  7    2  1  4       2 - 2 = 0

  Total profit = 51

Greedy Algorithm for the Fractional Knapsack Problem:

1. Calculate Profit per Weight: For each item, calculate its profit per unit of weight.
2. Sort Items: Sort all items in decreasing order of their profit per unit of weight.
3. Add to Knapsack: Start adding items to the knapsack, beginning with the item with the highest profit per unit of weight. If a whole item can’t be added due to the weight constraint, add as much of it as possible (i.e., take a fraction of it).
4. Stop when Full: Continue until the knapsack is full.


Job Sequencing with Deadline using Greedy Algorithm


• The Job Sequencing Problem is a classic optimization problem:
• you are given a set of jobs, each with a deadline and a profit, and
• you need to schedule them so as to maximize total profit,
• subject to the constraint that a job can only be scheduled before its deadline, where each job needs only 1 unit of time to process.
• The Greedy approach solves this problem efficiently.

Problem Statement
• You are given n jobs, where each job i has:
  P[i]: profit of job i
  D[i]: deadline of job i
• Each job takes 1 unit of time, and the objective is to find a sequence of jobs that maximizes total profit without violating deadlines.

  Job  Profit  Deadline
  J1   100     2
  J2   19      1
  J3   27      2
  J4   25      1
  J5   15      3

Example 1: Suppose we have 5 jobs with the following profits and deadlines. Find the job execution sequence and the total profit.

  Job  Profit  Deadline
  J1   100     2
  J2   19      1
  J3   27      2
  J4   25      1
  J5   15      3

Solution:

1. Sort jobs based on their profits in descending order:

  Job  Profit  Deadline
  J1   100     2
  J3   27      2
  J4   25      1
  J2   19      1
  J5   15      3

2. Now, start scheduling the jobs based on their deadlines:
• J1 can be scheduled before its deadline (time slot 2).
• J3 can also be scheduled before its deadline (time slot 2).
• J4 cannot be scheduled because its deadline (time slot 1) is already occupied.
• J2 also cannot be scheduled for the same reason.
• J5 can be scheduled before its deadline (time slot 3), after J1 and J3.

3. Thus, the jobs scheduled are J1, J3, and J5.

  Job  Time  Profit
  J1   0-1   100
  J3   1-2   27
  J5   2-3   15

The total profit is 100 + 27 + 15 = 142.
Job sequence: < J1, J3, J5 >


Example 2: Suppose we have 4 jobs with the following profits and deadlines:

  Job  Profit  Deadline
  J1   100     2
  J2   10      1
  J3   15      2
  J4   27      1

Solution:

1. Sort jobs based on their profits in descending order:

  Job  Profit  Deadline
  J1   100     2
  J4   27      1
  J3   15      2
  J2   10      1

2. Now, start scheduling the jobs based on their deadlines:
• J1 can be scheduled up to its deadline (time slot 2).
• J4 can be scheduled before its deadline (time slot 1).
• J3 cannot be scheduled because its deadline (time slot 2) is already occupied.
• J2 also cannot be scheduled for the same reason.

3. Thus, the jobs scheduled are J1 and J4.

  Job  Time  Profit
  J4   0-1   27
  J1   1-2   100

The total profit is 100 + 27 = 127.
Job sequence: < J4, J1 >

Greedy Algorithm Steps for Job Sequencing Problem:

1. Sort the jobs in decreasing order of profit.
2. Schedule each job in a time slot such that the job is placed before its deadline, if possible.
3. Maximize the total profit by adding the profit of the jobs scheduled within their deadlines.
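The greedy steps above can be sketched in Python; the helper name is illustrative. Placing each job in the latest free slot at or before its deadline is one common way to implement step 2:

```python
def job_sequencing(jobs):
    # jobs: list of (name, profit, deadline); each job takes 1 unit of time.
    jobs = sorted(jobs, key=lambda j: j[1], reverse=True)  # step 1: by profit, descending
    max_deadline = max(d for _, _, d in jobs)
    slots = [None] * max_deadline          # slots[t] holds the job run in (t, t+1]
    for name, _, deadline in jobs:
        # step 2: latest free slot before the deadline, if any
        for t in range(deadline - 1, -1, -1):
            if slots[t] is None:
                slots[t] = name
                break
    scheduled = [s for s in slots if s is not None]
    total = sum(p for n, p, _ in jobs if n in scheduled)   # step 3
    return scheduled, total

# Example 1 from above: jobs J1, J3, J5 are scheduled, total profit 142
seq, profit = job_sequencing([("J1", 100, 2), ("J2", 19, 1),
                              ("J3", 27, 2), ("J4", 25, 1), ("J5", 15, 3)])
print(seq, profit)
```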


Huffman Coding
 It is used for lossless data compression.
 It efficiently reduces the average number of bits per character in a message by assigning variable-length codes to input characters based on their frequencies.

A message of 100 characters in total, containing only four distinct characters:

  Fixed-length code    Frequency    Huffman code
  A: 00                A: 50        A: 0
  B: 01                B: 40        B: 10
  C: 10                C: 5         C: 110
  D: 11                D: 5         D: 111

  Fixed-length: total bits = 100 x 2 = 200
  Huffman: total bits = 50 x 1 + 40 x 2 + 5 x 3 + 5 x 3 = 160

 Given N messages and their frequencies, the objective is to find a code for each message such that the total number of bits transmitted is minimized.
 The constraint is to assign short codes to frequent messages and longer codes to infrequent messages.

Huffman Coding Algorithm:


1. Frequency Analysis: Count the frequency of each character in the input
data.
2. Build a Priority Queue: Insert each character and its frequency into a
priority queue (or min-heap). The priority queue is used to efficiently extract
the least frequent characters.
3. Build the Huffman Tree:
a. While there is more than one node in the priority queue:
i. Extract the two nodes with the lowest frequencies.
ii. Create a new internal node with these two nodes as children and
the frequency equal to the sum of the two nodes' frequencies.
iii. Insert the new node back into the priority queue.
b. The remaining node is the root of the Huffman tree.
4. Generate Codes: Traverse the Huffman tree to assign binary codes to each
character. The path from the root to a character determines its code: a left
branch might be '0', and a right branch '1', or vice versa.
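The four steps map naturally onto Python's heapq module. This is a minimal sketch, with the left branch fixed as '0' and the right as '1'; step 1 (frequency analysis) is assumed done, so the function takes a character-to-frequency dictionary directly:

```python
import heapq

def huffman_codes(freq):
    # Step 2: priority queue of (frequency, tie-breaker, subtree) entries.
    # The tie-breaker keeps heap comparisons well-defined for equal frequencies.
    heap = [(f, i, ch) for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    # Step 3: repeatedly merge the two lowest-frequency nodes.
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, counter, (left, right)))
        counter += 1
    # Step 4: walk the tree; a left edge appends '0', a right edge '1'.
    codes = {}
    def walk(node, code):
        if isinstance(node, tuple):
            walk(node[0], code + "0")
            walk(node[1], code + "1")
        else:
            codes[node] = code or "0"   # single-symbol edge case
    walk(heap[0][2], "")
    return codes

# Frequencies from Example 1 below: A gets a 1-bit code, rare E and F get 4 bits.
print(huffman_codes({"A": 45, "B": 13, "C": 12, "D": 16, "E": 9, "F": 5}))
```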


Time Complexity Analysis


1. Frequency Analysis:
• This step takes O(n) time, where n is the number of characters in the input.
2. Building the Priority Queue:
• Building the priority queue takes O(n log n) time.
• Insertion into the priority queue (or heap) and extracting the minimum each take O(log n) time.
3. Building the Huffman Tree:
• Each extraction and insertion operation on the priority queue takes O(log n) time.
• Since there are n - 1 merge operations, the total time for building the Huffman tree is O(n log n).
4. Generating Codes:
• Traversing the Huffman tree takes O(n) time, where n is the number of characters.

Total Time Complexity: O(n log n)

• The overall time complexity of the Huffman coding algorithm is O(n log n), where n is the number of unique characters or symbols in the input data.
• This time complexity arises primarily from building the priority queue and the Huffman tree.

Example 1: Suppose a message has a total length of 100 characters and six distinct characters with the following frequencies:

  Message Characters:  A   B   C   D   E  F
  Frequency:           45  13  12  16  9  5

Huffman(m)
{
    n = |m|
    Build a min-heap Q from the n nodes of m.
    for i = 1 to n - 1
        allocate a new node Z
        Z.left  = x = Extract-Min(Q)
        Z.right = y = Extract-Min(Q)
        Z.freq  = x.freq + y.freq
        Insert(Q, Z)
    return Extract-Min(Q)
}


 Example 2 : for seven message, there are relative frequencies given below
Message M1 M2 M3 M4 M5 M6 M7
Frequency 4 5 7 8 10 12 20



Example 3: Suppose the letters A, B, C, D, E, F have probabilities 1/2, 1/4, 1/8, 1/16, 1/32, 1/32.
i. Find the Huffman code for the given letters.
ii. Find the average code length per letter.  A. 1.8579  B. 1.9375  C. 2.087  D. None

  Letters:                 A    B    C    D     E     F
  Probability:             1/2  1/4  1/8  1/16  1/32  1/32
  Scaled frequency (x32):  16   8    4    2     1     1
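Because these probabilities are all powers of 1/2, the Huffman code length of each letter equals -log2(p) exactly (1, 2, 3, 4, 5, 5 bits), so the average code length can be checked with a few lines of arithmetic:

```python
import math

probs = [1/2, 1/4, 1/8, 1/16, 1/32, 1/32]
lengths = [int(-math.log2(p)) for p in probs]   # code lengths 1, 2, 3, 4, 5, 5
avg = sum(p * l for p, l in zip(probs, lengths))
print(avg)  # 1.9375, i.e. answer B
```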

 Example 4 : for seven message, there are relative frequencies given below

Message A E I O U S T

Frequency 10 15 12 3 4 13 1


Spanning Tree
 Let G = (V, E) be an undirected graph.
 A subgraph T = (V, E') of G is a spanning tree of G iff T is a tree.

 Any connected subgraph on all n vertices must have at least n - 1 edges, and any connected subgraph containing all n vertices with exactly n - 1 edges is a spanning tree.

Minimum Cost Spanning Tree: Prim's Algorithm: G(V, E) with |V| = 7

  Edge:   1-6  1-2  2-7  2-3  3-4  4-5  4-7  5-7  5-6
  Weight: 10   28   14   16   12   22   18   24   25

Step 1: Choose an arbitrary vertex as the starting vertex of the MST.
Step 2: Follow steps 3 to 5 while there are vertices not yet included in the MST (known as fringe vertices).
Step 3: Find the edges connecting any tree vertex with the fringe vertices.
Step 4: Find the minimum among these edges.
Step 5: Add the chosen edge to the MST if it does not form any cycle.
Step 6: Return the MST and exit.
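The steps above can be sketched with a heap of fringe edges; the function name is illustrative, and the edge list is the one from the table above:

```python
import heapq

def prim_mst(n, edges, start=1):
    # Build an adjacency list from the weighted edge list (u, v, w).
    adj = {v: [] for v in range(1, n + 1)}
    for u, v, w in edges:
        adj[u].append((w, v))
        adj[v].append((w, u))
    in_mst = {start}                   # step 1: arbitrary starting vertex
    heap = list(adj[start])            # candidate edges leaving the tree
    heapq.heapify(heap)
    total = 0
    while heap and len(in_mst) < n:    # steps 2-5
        w, v = heapq.heappop(heap)     # minimum edge to a fringe vertex
        if v in in_mst:
            continue                   # would form a cycle, discard it
        in_mst.add(v)
        total += w
        for edge in adj[v]:
            heapq.heappush(heap, edge)
    return total                       # step 6: total cost of the MST

edges = [(1, 6, 10), (1, 2, 28), (2, 7, 14), (2, 3, 16), (3, 4, 12),
         (4, 5, 22), (4, 7, 18), (5, 7, 24), (5, 6, 25)]
print(prim_mst(7, edges))  # 99
```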


[Figure: Steps 1.1 through 6 illustrate the step-by-step growth of the MST as Prim's algorithm adds one minimum-weight fringe edge at a time.]


Spanning Tree: Kruskal's Algorithm: G(V, E) with |V| = 7

  Edge:   1-6  1-2  2-7  2-3  3-4  4-5  4-7  5-7  5-6
  Weight: 10   28   14   16   12   22   18   24   25

1. Sort all the edges in non-decreasing order of their weight:

  Edge:   1-6  3-4  2-7  2-3  4-7  4-5  5-7  5-6  1-2
  Weight: 10   12   14   16   18   22   24   25   28

2. Pick the smallest edge. Check if it forms a cycle with the spanning tree formed so far. If no cycle is formed, include this edge; else, discard it.

3. Repeat step 2 until there are (V-1) edges in the spanning tree.

• Overall time complexity: O(E log E), dominated by sorting the edges (or the equivalent heap operations).
• In the worst case, E = O(V²) for a dense graph, giving O(V² log V).
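Kruskal's steps can be sketched with a simple union-find structure for the cycle check; the function name is illustrative, and the graph is the one from the table above:

```python
def kruskal_mst(n, edges):
    parent = list(range(n + 1))        # union-find forest over vertices 1..n
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    total, used = 0, 0
    for u, v, w in sorted(edges, key=lambda e: e[2]):  # step 1: sort by weight
        ru, rv = find(u), find(v)
        if ru != rv:                   # step 2: no cycle, so include the edge
            parent[ru] = rv
            total += w
            used += 1
            if used == n - 1:          # step 3: stop at (V-1) edges
                break
    return total

edges = [(1, 6, 10), (1, 2, 28), (2, 7, 14), (2, 3, 16), (3, 4, 12),
         (4, 5, 22), (4, 7, 18), (5, 7, 24), (5, 6, 25)]
print(kruskal_mst(7, edges))  # 99, the same MST cost Prim's algorithm finds
```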

Single Source Shortest Paths : Dijkstra’s Algorithm

