Daa Mid 2

Uploaded by vijaykrishna2k24

TIME COMPLEXITY OF OBST ALGORITHM

To demonstrate that the computing time of the Optimal Binary Search Tree (OBST)
algorithm is O(n³), we analyze the dynamic programming approach used in the
algorithm. The OBST algorithm constructs a binary search tree with the minimum
expected search cost given the probabilities of searching for each key.

Steps in OBST Algorithm:

1. Input:

o A sequence of n keys k_1, k_2, …, k_n.

o A probability array p[i] for the keys and q[i] for dummy keys.

2. Goal:

o Minimize the expected cost C[i, j] of a binary search tree containing keys k_i, k_{i+1}, …, k_j.

3. Recurrence Relation: The cost C[i, j] of the OBST is given by:

C[i, j] = w[i, j] + min_{r ∈ [i, j]} ( C[i, r−1] + C[r+1, j] ),

where w[i, j] = Σ_{k=i}^{j} p[k] + Σ_{k=i}^{j+1} q[k] is the total weight (probabilities) of keys and dummy keys in the range.

4. Dynamic Programming Approach:

o Compute w[i, j] for all i ≤ j in O(n²) time.

o Solve C[i, j] for all pairs (i, j) by iterating over all possible roots r ∈ [i, j].

Time Complexity Analysis:

1. Computation of w[i, j]:

o Computing w[i, j] involves summing probabilities p[k] and q[k].

o There are O(n²) pairs (i, j), and each sum takes O(1) time (using prefix sums).

o Total time: O(n²).

2. Computing C[i, j]:

o For each pair (i, j), the algorithm iterates over all possible roots r ∈ [i, j], which takes O(n) time.

o There are O(n²) such pairs (i, j).

o Total time: O(n² · n) = O(n³).

3. Overall Time Complexity:

o The dominant term comes from computing C[i, j], which is O(n³).

o Other steps, like computing w[i, j], are O(n²) and do not affect the overall asymptotic complexity.

Conclusion:

The time complexity of the OBST algorithm is O(n³). This is due to the nested loops:

1. Two loops over all pairs (i, j): O(n²).

2. One inner loop over all roots r ∈ [i, j]: O(n).

Hence, the total running time of the algorithm is O(n³).
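The recurrence above translates directly into code. The following is a minimal Python sketch (not from the original notes; the indexing is an assumption: 0-indexed arrays with p[0..n−1] for keys, q[0..n] for dummy keys, and C[i][j] covering the half-open key range i..j−1):

```python
def obst_cost(p, q):
    """O(n^3) dynamic program for the minimum expected search cost.

    p[k] is the probability of key k (0-indexed); q[k] is the probability
    of the dummy key just before key k, so len(q) == len(p) + 1.
    """
    n = len(p)
    # C[i][j]: optimal cost over keys i..j-1; W[i][j]: total weight of that range
    C = [[0.0] * (n + 1) for _ in range(n + 1)]
    W = [[0.0] * (n + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        C[i][i] = W[i][i] = q[i]          # empty range: only a dummy key
    for length in range(1, n + 1):        # solve ranges in increasing size
        for i in range(n - length + 1):
            j = i + length
            W[i][j] = W[i][j - 1] + p[j - 1] + q[j]   # O(1) weight update
            # Try every key in i..j-1 as the root (the O(n) inner loop)
            C[i][j] = W[i][j] + min(C[i][r] + C[r + 1][j] for r in range(i, j))
    return C[0][n]
```

On the classic textbook example (p = [0.15, 0.10, 0.05, 0.10, 0.20], q = [0.05, 0.10, 0.05, 0.05, 0.05, 0.10]) this evaluates to an expected cost of 2.75.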

Using a dynamic programming approach coupled with the set generation approach, show how to obtain an O(2^(n/2)) algorithm for the 0/1 knapsack problem.
Problem Statement

Given:

 n items, each with a weight w[i] and a value v[i].

 A knapsack with a capacity W.

Objective:

 Maximize the total value of items in the knapsack such that the total weight does not exceed W.

Meet-in-the-Middle Algorithm

1. Divide the Items:

o Split the n items into two halves:

 Group 1: First n/2 items.

 Group 2: Last n/2 items.

2. Generate Subsets:

o For Group 1, generate all possible subsets. For each subset:

 Calculate the total weight and value.

 Store the results as pairs (weight, value) in a list L1.

o Similarly, for Group 2, generate all subsets and store them in a list L2.

3. Prune Lists (Efficiently Store Dominant Solutions):


o Sort L1 by weight and discard dominated pairs: any subset that is heavier than, but no more valuable than, another subset can be removed, leaving values strictly increasing with weight.

o Do the same for L2.

4. Combine Results:

o For each pair (w1, v1) ∈ L1, find the best match (w2, v2) ∈ L2 such that: w1 + w2 ≤ W.

 Use a binary search or two-pointer technique to efficiently find the maximum v2 for a given w1.

5. Compute the Maximum Value:

o Iterate over all valid combinations of subsets from L1 and L2 and find the combination with the maximum total value.

Time Complexity

1. Subset Generation:

o Each group has n/2 items.

o Generating all subsets for each group takes O(2^(n/2)) time.

2. Pruning (Sorting):

o Sorting L1 and L2 takes O(2^(n/2) · log(2^(n/2))) = O(2^(n/2) · n/2).

3. Combining Results:

o For each element in L1, a binary search on L2 takes O(log(2^(n/2))) = O(n/2).

o Total: O(2^(n/2) · n/2).

Total Complexity:

The exponential factor 2^(n/2) dominates, so the overall complexity is O(2^(n/2) · n), commonly quoted as:

O(2^(n/2))

Example

Suppose we have n = 4, items with weights [2, 3, 4, 5], values [3, 4, 5, 6], and W = 5.

1. Divide Items:

o Group 1: Items {1, 2}.

o Group 2: Items {3, 4}.

2. Generate Subsets:

o L1: Subsets {}, {1}, {2}, {1, 2}, with corresponding (weight, value) pairs.

o L2: Subsets {}, {3}, {4}, {3, 4}, with (weight, value) pairs.

3. Prune Lists:

o Remove dominated subsets.

4. Combine Results:

o Match subsets from L1 and L2 to maximize the value while keeping weight ≤ W.

This approach is efficient for moderately large n and is much faster than the brute-force O(2^n) approach for the 0/1 Knapsack Problem.
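The five steps above can be sketched in Python as follows (a minimal illustration, not part of the original notes; itertools.combinations does the subset generation and bisect the binary search):

```python
from bisect import bisect_right
from itertools import combinations

def knapsack_mitm(weights, values, W):
    """Meet-in-the-middle 0/1 knapsack, O(2^(n/2) * n) time."""
    n = len(weights)
    half = n // 2

    def subsets(indices):
        # All (weight, value) pairs over subsets of the given item indices.
        pairs = []
        for r in range(len(indices) + 1):
            for combo in combinations(indices, r):
                pairs.append((sum(weights[i] for i in combo),
                              sum(values[i] for i in combo)))
        return pairs

    L1 = subsets(range(half))
    L2 = sorted(subsets(range(half, n)))

    # Prune L2: keep only pairs whose value strictly increases with weight,
    # so a binary search by weight always lands on the best value.
    pruned, best = [], -1
    for w, v in L2:
        if v > best:
            pruned.append((w, v))
            best = v
    ws = [w for w, _ in pruned]
    vs = [v for _, v in pruned]

    answer = 0
    for w1, v1 in L1:
        if w1 > W:
            continue
        k = bisect_right(ws, W - w1) - 1   # heaviest L2 entry that still fits
        answer = max(answer, v1 + vs[k])   # k >= 0: the empty subset always fits
    return answer
```

On the worked example above (weights [2, 3, 4, 5], values [3, 4, 5, 6], W = 5), the best achievable value is 7, from items 1 and 2.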

Algorithm: LC Branch and Bound

Inputs:

 Problem instance.

 Initial state S_0.

 A cost function f(x) that evaluates the cost of a solution x.

 A set of feasible states S (partially or fully expanded nodes).

 Termination condition for reaching an answer node.

Outputs:

 The least-cost solution x (answer node).

Steps:

1. Initialization:

o Create a priority queue PQ (min-heap) to store nodes based on their cost.

o Insert the root node S_0 into PQ with its cost f(S_0).

2. Branching:

o While PQ is not empty:

1. Dequeue the least-cost node N from PQ.

2. Check if N is an answer node:

 If yes, return N as the least-cost solution and terminate.

3. Expand node N:

 Generate all child nodes N_1, N_2, …, N_k by applying valid actions to N.

4. Evaluate children:

 For each child node N_i:

 Compute its cost f(N_i) using the cost function.

 If N_i is feasible (i.e., satisfies problem constraints), add it to PQ.

3. Termination:

o If PQ becomes empty and no answer node is found, report failure (no solution exists).

Details of the Cost Function f(x):

 f(x) = g(x) + h(x), where:

o g(x): Cost from the root node to the current node x.

o h(x): Heuristic estimate of the cost from x to the nearest answer node.

Algorithm LC_Branch_and_Bound(S_0):
    Initialize priority queue PQ
    Insert S_0 into PQ with cost f(S_0)
    while PQ is not empty:
        N = PQ.dequeue()                  # Get the least-cost node
        if N is an answer node:
            return N                      # Return the least-cost solution
        for each child N_i of N:
            Compute cost f(N_i) = g(N_i) + h(N_i)
            if N_i is feasible:
                PQ.enqueue(N_i, f(N_i))   # Add to the priority queue
    return "No solution found"

Explanation

1. Priority Queue:

o Maintains the list of nodes sorted by their cost f(x).

o Ensures the algorithm always explores the node with the least cost next.

2. Branching:

o Each node represents a partial solution, and its children represent further
exploration of that solution space.

3. Pruning:

o Nodes that are infeasible or dominated by better solutions are not added to PQ.

4. Heuristic h(x):

o Guides the search towards the most promising nodes.

o Provided h(x) is a lower bound on the true remaining cost, its quality affects the efficiency of the algorithm but not its correctness.

Complexity

 Time Complexity: Depends on the branching factor b and the depth d of the solution.

o Worst case: O(b^d).

o Pruning and heuristic selection reduce the effective number of nodes explored.

 Space Complexity: O(b^d) for storing all nodes in the priority queue.

Example: Solving a Traveling Salesman Problem (TSP)

Problem:

 S_0: Start city.

 f(x) = g(x) + h(x):

o g(x): Total distance traveled so far.

o h(x): Minimum spanning tree heuristic for the remaining cities.

Using LC Branch and Bound, the algorithm explores the partial tours and finds the
shortest tour satisfying the TSP constraints.
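The pseudocode above maps naturally onto Python's heapq. The sketch below is a generic illustration (the callback names is_answer, children, and h are assumptions for this example, not part of the original algorithm); with h = 0 it degenerates to uniform-cost search:

```python
import heapq

def lc_branch_and_bound(start, is_answer, children, h):
    """Least-cost branch and bound over a state space.

    children(state) yields (next_state, step_cost) pairs;
    h(state) is a lower-bound estimate of the remaining cost.
    """
    tie = 0                                 # tie-breaker: heapq never compares states
    pq = [(h(start), 0, tie, start)]        # entries are (f = g + h, g, tie, state)
    while pq:
        f, g, _, state = heapq.heappop(pq)  # dequeue the least-cost node
        if is_answer(state):
            return g, state                 # least-cost answer node
        for nxt, step in children(state):   # expand: generate child nodes
            tie += 1
            heapq.heappush(pq, (g + step + h(nxt), g + step, tie, nxt))
    return None                             # PQ empty: no solution exists

# Toy example: cheapest route from A to C with a zero heuristic.
graph = {"A": [("B", 1), ("C", 4)], "B": [("C", 1)], "C": []}
cost, goal = lc_branch_and_bound("A", lambda s: s == "C",
                                 lambda s: graph[s], lambda s: 0)
```

Here the search dequeues A, then B (f = 1), and returns cost 2 via A → B → C rather than the direct edge of cost 4.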
BACKTRACKING ~

The recursive control abstraction of backtracking is a general framework that outlines how backtracking algorithms systematically explore all possible solutions to a problem. Backtracking is particularly useful for solving problems with constraints, where the solution space can be represented as a tree.

Below is the recursive control abstraction of backtracking, along with an explanation.

Recursive Control Abstraction

The recursive structure of a backtracking algorithm can be outlined as follows:

Pseudocode


Backtrack(X):
    if X is a solution:
        Process the solution               # Perform the desired operation on the solution
    else:
        for each option in Options(X):     # Generate all possible options
            if IsValid(option, X):         # Check if adding the option is feasible
                Add option to X            # Make the move
                Backtrack(X)               # Recur to explore further
                Remove option from X       # Undo the move (backtrack)

Explanation

Key Components:

1. Input:

o X: A partial solution or state that evolves during the recursion.

o The algorithm starts with an initial state (usually an empty or minimal configuration).

2. Base Case:

o If X is a complete and valid solution:


 Perform an action on the solution (e.g., print it, count it, store it,
etc.).

 Terminate the recursion for this branch.

3. Recursive Exploration:

o For the current state X, generate all feasible options (choices).

o Each option represents a potential move or extension of the current partial solution.

4. Feasibility Check:

o Before extending X with an option, check if the option satisfies the problem's constraints.

o Prune infeasible options early to reduce unnecessary computations.

5. Recursive Call:

o If the option is valid, extend X and recursively call the backtracking function to explore further possibilities.

6. Backtracking (Undo the Move):

o After exploring one branch, undo the move by removing the option from X.

o This ensures the algorithm can explore other branches of the solution
tree.

Example: N-Queens Problem

Problem:

Place n queens on an n×n chessboard such that no two queens attack each other.

Pseudocode:


def Backtrack(row, board):
    if row == n:                      # Base case: All queens are placed
        ProcessSolution(board)        # Print or store the board
        return
    for col in range(n):              # Explore all columns in the current row
        if IsValid(row, col, board):  # Check if placing a queen is valid
            board[row][col] = 'Q'     # Make a move
            Backtrack(row + 1, board) # Recur for the next row
            board[row][col] = '.'     # Undo the move (backtrack)

Key Elements for N-Queens:

 X: The current state is represented by the board and the current row.

 Base Case: When all rows are filled with valid queen placements, X is a solution.

 Options: The columns in the current row.

 Feasibility Check: Ensure no queen in the same column or diagonal.
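The pseudocode calls IsValid without defining it. A runnable version with the helper written out might look like this (a sketch; the names is_valid and count_solutions are illustrative, not from the original notes):

```python
def is_valid(row, col, board):
    """No queen in an earlier row shares col or a diagonal with (row, col)."""
    for r in range(row):
        for c in range(len(board)):
            if board[r][c] == 'Q' and (c == col or abs(c - col) == abs(r - row)):
                return False
    return True

def count_solutions(n):
    """Count the placements of n non-attacking queens via backtracking."""
    board = [['.'] * n for _ in range(n)]
    count = 0

    def backtrack(row):
        nonlocal count
        if row == n:                  # Base case: all queens are placed
            count += 1
            return
        for col in range(n):          # Try every column in this row
            if is_valid(row, col, board):
                board[row][col] = 'Q' # Make the move
                backtrack(row + 1)    # Recur for the next row
                board[row][col] = '.' # Undo the move (backtrack)

    backtrack(0)
    return count
```

count_solutions(4) finds the 2 classic solutions, and count_solutions(8) the well-known 92.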

Characteristics of Backtracking

1. Recursive Nature:

o The algorithm breaks the problem into smaller subproblems, recursively solving each.

2. Search Tree:

o The solution space can be visualized as a tree where each node represents a state, and branches represent options.

3. Pruning:

o By checking constraints early, backtracking avoids exploring invalid or unnecessary branches, making it more efficient than brute force.

4. Reversibility:

o Backtracking relies on the ability to undo moves to explore alternative options.

Applications

1. Constraint Satisfaction Problems:

o Sudoku

o N-Queens

o Crossword puzzles

2. Combinatorial Problems:

o Subset generation

o Permutations

o Graph coloring

3. Optimization Problems:

o Knapsack problem

o Traveling Salesman Problem (TSP)

Backtracking is a general and powerful approach for systematically exploring solution spaces, particularly when constraints significantly limit feasible solutions.

GRAPH COLORING ~ ALGORITHM

The M-Coloring Problem is a graph-coloring problem where the goal is to determine whether the vertices of a graph can be colored using at most M colors such that no two adjacent vertices share the same color.

To implement the solution, we use a backtracking approach in which generating the "next color" for a vertex is a critical step.

Algorithm to Generate the Next Color

The algorithm ensures that a vertex is assigned the next valid color that satisfies the
graph's constraints.

Inputs:

1. Graph: An adjacency matrix or list representing the graph.

2. Colors: An array where Colors[i] is the color assigned to vertex i.

3. Vertex: The current vertex for which a color is to be assigned.

4. M: The maximum number of colors allowed.

Output:

 The next valid color for the vertex, or None if no color is valid.

Algorithm:


def GenerateNextColor(Graph, Colors, Vertex, M):
    for color in range(1, M + 1):    # Colors are labeled from 1 to M
        if IsValidColor(Graph, Colors, Vertex, color):
            return color             # Return the first valid color
    return None                      # No valid color available

Helper Function:


def IsValidColor(Graph, Colors, Vertex, color):
    for neighbor in range(len(Graph)):
        if Graph[Vertex][neighbor] == 1 and Colors[neighbor] == color:
            # There is an edge and the neighbor already has this color
            return False
    return True

Explanation

1. Color Iteration:

o The algorithm iterates through all possible colors from 1 to M.

2. Validity Check:

o For each color, it checks if the color can be assigned to the current vertex
without violating the coloring constraints:

 No two adjacent vertices should have the same color.

3. Return Value:

o If a valid color is found, it is returned immediately.

o If no valid color exists, the algorithm returns None, indicating failure to assign a color under the given constraints.
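Combining these helpers into a complete backtracking solver might look like this (a sketch; the recursive wrapper m_coloring is an assumption built on the routines above, not part of the original notes):

```python
def is_valid_color(Graph, Colors, Vertex, color):
    # Reject the color if any adjacent vertex already uses it.
    return all(not (Graph[Vertex][nb] == 1 and Colors[nb] == color)
               for nb in range(len(Graph)))

def m_coloring(Graph, M):
    """Return a list of colors (1..M) per vertex, or None if impossible."""
    n = len(Graph)
    Colors = [0] * n                     # 0 means "not yet colored"

    def backtrack(vertex):
        if vertex == n:                  # Every vertex has a color
            return True
        for color in range(1, M + 1):    # Generate the next candidate color
            if is_valid_color(Graph, Colors, vertex, color):
                Colors[vertex] = color   # Assign
                if backtrack(vertex + 1):
                    return True
                Colors[vertex] = 0       # Undo (backtrack)
        return False

    return Colors if backtrack(0) else None

# A triangle needs 3 colors; 2 are not enough.
K3 = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
```

For the triangle K3 above, m_coloring(K3, 3) succeeds while m_coloring(K3, 2) returns None.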

SUM OF SUBSETS ~ ALGORITHM

Recursive Backtracking Algorithm

Problem Statement:

 Given:

o A set of n positive integers S = {s_1, s_2, …, s_n}.

o A target sum T.

 Find all subsets of S whose elements sum up to T.

Pseudocode:


def SubsetSum(S, T):
    n = len(S)
    result = []                  # To store all valid subsets

    def Backtrack(index, current_sum, subset):
        if current_sum == T:     # Base case: Found a valid subset
            result.append(list(subset))  # Add a copy of the subset to the result
            return
        if current_sum > T or index == n:  # Prune if the sum exceeds T or no more elements
            return
        # Include the current element
        subset.append(S[index])
        Backtrack(index + 1, current_sum + S[index], subset)
        # Exclude the current element (backtrack)
        subset.pop()
        Backtrack(index + 1, current_sum, subset)

    Backtrack(0, 0, [])
    return result

Explanation:

1. Input Parameters:

o S: The set of integers.

o T: The target sum.

2. Recursive Function (Backtrack):

o Arguments:

 index: The current position in the set.


 current_sum: The sum of elements in the current subset.

 subset: The current subset being considered.

o Base Cases:

 If current_sum == T: Add the subset to the result.

 If current_sum > T or index == n: Stop exploring this branch.

o Recursive Calls:

 Include the current element in the subset and recurse.

 Exclude the current element from the subset and recurse.

3. Result:

o The result list contains all subsets whose sum equals T.

Example

Input:


S = [3, 5, 6, 7]

T = 10

Execution:


result = SubsetSum(S, T)

print(result)

Output:


[[3, 7]]

Time Complexity:

1. Subset Generation:

o In the worst case, all 2^n subsets are explored.

o Each subset is checked for feasibility.

2. Pruning:
o The algorithm prunes branches when current_sum > T.

3. Overall Complexity:

o O(2^n) in the worst case, though pruning reduces this in practice.

Space Complexity:

 The maximum depth of the recursion is O(n) (due to the subset stack).

 Space for storing results depends on the number of valid subsets.

This algorithm is a clear and efficient recursive solution to the Sum of Subsets
Problem, utilizing backtracking to explore and prune the solution space.

Q5 (a): Algorithm to Generate a Hamiltonian Cycle

A Hamiltonian cycle is a cycle in a graph that visits each vertex exactly once and
returns to the starting vertex. Finding a Hamiltonian cycle can be done using
backtracking.

Algorithm: Hamiltonian Cycle

Inputs:

1. Graph: An adjacency matrix Graph[u][v] where 1 indicates an edge between vertices u and v, and 0 otherwise.

2. n: Number of vertices in the graph.

Outputs:

 A Hamiltonian cycle if it exists, otherwise report failure.

Pseudocode:


def HamiltonianCycle(Graph):
    n = len(Graph)               # Number of vertices
    path = [-1] * n              # Stores the Hamiltonian path
    path[0] = 0                  # Start at vertex 0

    def IsSafe(v, pos):
        # Check if v can be added to the path at position pos
        if Graph[path[pos - 1]][v] == 0:  # No edge from the last vertex
            return False
        if v in path:            # Vertex already in the path
            return False
        return True

    def Backtrack(pos):
        if pos == n:             # All vertices are in the path
            # Check if there is an edge from the last vertex to the first
            return Graph[path[pos - 1]][path[0]] == 1
        for v in range(1, n):    # Try all vertices except the starting vertex
            if IsSafe(v, pos):
                path[pos] = v
                if Backtrack(pos + 1):
                    return True
                path[pos] = -1   # Backtrack: Remove the vertex
        return False

    if Backtrack(1):
        path.append(path[0])     # Complete the cycle by adding the starting vertex
        return path
    else:
        return "No Hamiltonian Cycle exists"

Steps:

1. Path Initialization:

o Start the cycle at vertex 0 and initialize all other positions in the path as -1.

2. Recursive Backtracking:

o Place vertices in the path one by one, checking feasibility:

 The vertex must be adjacent to the previous vertex in the path.


 The vertex must not already be in the path.

o Recursively attempt to construct the cycle by placing the next vertex.

3. Cycle Completion Check:

o If all vertices are included in the path, check if the last vertex connects
back to the starting vertex to complete the cycle.

4. Backtracking:

o If no valid vertex can be added, remove the last vertex (backtrack) and
try the next possibility.

5. Output:

o If a Hamiltonian cycle is found, return the path.

o Otherwise, return a message indicating no cycle exists.

Example

Input:


Graph = [
    [0, 1, 1, 1],
    [1, 0, 1, 0],
    [1, 1, 0, 1],
    [1, 0, 1, 0]
]

Execution:


result = HamiltonianCycle(Graph)

print(result)

Output:


[0, 1, 2, 3, 0] # Hamiltonian cycle found

Q5 (b): Time Complexity


1. Number of Recursive Calls:

o The algorithm attempts to place n−1 vertices after the first vertex.

o For each position, it tries n−1 possible choices in the worst case.

2. Pruning:

o Feasibility checks using IsSafe prune some branches of the recursion tree,
reducing the actual number of calls.

3. Worst-Case Complexity:

o The recursion tree has a maximum branching factor of n−1 at n−1 levels: O((n−1)!).

o This simplifies to O(n!), as the first vertex is fixed.

4. Space Complexity:

o Space for the path array: O(n).

o Recursion depth: O(n).

Conclusion

 Time Complexity: O(n!) in the worst case.

 Space Complexity: O(n).

 Practical Improvements:

o Use techniques like branch and bound or dynamic programming (e.g., the Held-Karp algorithm) for better efficiency on specific graph types.
