BCS 042
An algorithm is a step-by-step finite sequence of well-defined instructions used to perform a task or solve a problem. It takes input, processes it, and produces the desired output.
Importance in Computer Science:
-Problem Solving: Algorithms provide a logical method to solve problems efficiently.
-Optimization: They help in choosing the best solution among many.
-Reusability: Well-designed algorithms can be reused in different programs.
-Foundation of Programming: Algorithms form the core of every computer program and software.
-Performance Analysis: They allow measurement of time and space complexity.
Concept of Recursion:
Recursion is a programming technique where a function calls itself directly or indirectly to solve a problem.
In recursion:
-There is always a base case that stops the recursion.
-The recursive case reduces the problem into smaller sub-problems.
Example of Recursive Algorithm:
Factorial of a Number (n!)
int factorial(int n) {
    if (n == 0 || n == 1) // base case
        return 1;
    else
        return n * factorial(n - 1); // recursive case
}

Linear Search is a simple searching algorithm where each element in the list is checked one by one from the beginning to the end until the desired element is found or the list ends.
Time Complexity:
-Best Case: O(1) → When the element is found at the first position.
-Worst Case: O(n) → When the element is at the last position or not present at all.
-Average Case: O(n)
Here, n is the number of elements in the array.
Example:
Let's say we have an array:
int arr[] = {10, 25, 30, 45, 50};
int key = 30;
Linear Search Code:
int linearSearch(int arr[], int n, int key) {
    for (int i = 0; i < n; i++) {
        if (arr[i] == key)
            return i; // Element found at index i
    }
    return -1; // Element not found
}
Explanation:
• In this example, the key 30 is found at index 2.
• Number of comparisons made = 3
So, in the worst case, if the element is not present, the algorithm makes n comparisons.
Q1 c - (i) Backtracking
Backtracking is a problem-solving technique used to find all (or some) solutions by trying out different possibilities and eliminating the ones that fail to satisfy the problem constraints.
-It builds the solution step by step and removes a partial solution (backtracks) when it determines that it cannot be completed.
-Commonly used in puzzles, combinatorial problems, and optimization.
Example: Solving the N-Queens problem, Sudoku, or generating permutations.
(ii) Fractional Knapsack Problem
The Fractional Knapsack problem is a greedy algorithm problem where:
-You can break items into smaller parts (fractions).
-You aim to maximize the total value in the knapsack within a given weight limit.
Approach:
1. Calculate the value/weight ratio for each item.
2. Sort items by this ratio in descending order.
3. Pick items starting from the highest ratio.
4. If the whole item can't fit, take the fractional part that fits.
Time Complexity: O(n log n)
(iii) Connected Graph
A Connected Graph is a graph in which there is a path between every pair of vertices.
• Undirected Graph: It is connected if there is at least one path between every pair of vertices.
• Directed Graph: It is strongly connected if every vertex is reachable from every other vertex via a directed path.
Example:
A—B
|  |
C—D

Q1 d - Given currency denominations {2, 5, 10, 15, 20, 100, 500}, we need to make ₹437 using the minimum number of notes with a Greedy Approach.
Greedy Approach:
The greedy strategy selects the highest possible denomination first, then proceeds with the next lower denomination for the remaining amount, continuing this process until the target amount is 0.
Step-by-Step Solution:
Start with amount = ₹437

Step | Chosen Denomination | No. of Notes | Remaining Amount
1    | ₹100                | 4            | ₹37
2    | ₹20                 | 1            | ₹17
3    | ₹15                 | 1            | ₹2
4    | ₹2                  | 1            | ₹0

Final Result:
• ₹100 × 4 = ₹400
• ₹20 × 1 = ₹20
• ₹15 × 1 = ₹15
• ₹2 × 1 = ₹2
Total = ₹437 using 7 notes
Conclusion:
Using the greedy algorithm, the minimum number of notes required to make ₹437 is 7, and the sequence of notes used is:
₹100, ₹100, ₹100, ₹100, ₹20, ₹15, ₹2
3(a) Analyze the time complexity of the Bubble Sort algorithm
Bubble Sort is a simple comparison-based sorting algorithm that works by repeatedly swapping adjacent elements if they are in the wrong order.
Time Complexity Analysis:
Let the array size be n.
Worst Case:
• Occurs when the array is sorted in reverse order.
• Bubble Sort performs n−1 passes.
• In each pass, it makes multiple comparisons and swaps.
• Number of comparisons = (n−1) + (n−2) + … + 1 = n(n−1)/2
Worst Case Time Complexity: O(n²)
Best Case:
• Occurs when the array is already sorted.
• Optimized Bubble Sort can detect that no swaps were made in the first pass and terminate early.
• So, only one pass is needed and no swaps are performed.
Best Case Time Complexity: O(n)

Case       | Time Complexity
Best Case  | O(n)
Worst Case | O(n²)

Dijkstra's Algorithm: shortest paths from vertex A
Step 1: Initialization
We start from vertex A.

Vertex | Distance from A | Previous
A      | 0               | —
B      | ∞               | —
C      | ∞               | —
D      | ∞               | —

Visited Set: {}
Step 2: Visit A (distance = 0)
From A:
• A → B = 10 → update distance of B to 10
• A → C = 3 → update distance of C to 3

Vertex | Distance | Previous
A      | 0        | —
B      | 10       | A
C      | 3        | A
D      | ∞        | —

Visited: {A}
Step 3: Visit C (distance = 3)
From C:
• C → B = 4 → total = 3+4 = 7 → update B to 7
• C → D = 8 → total = 3+8 = 11 → update D to 11

Vertex | Distance | Previous
A      | 0        | —
B      | 7        | C
C      | 3        | A
D      | 11       | C

Visited: {A, C}
Step 4: Visit B (distance = 7)
From B:
• B → C = 1 → 7+1 = 8 → but C is already at 3, so no update
• B → D = 2 → 7+2 = 9 → update D to 9

Vertex | Distance | Previous
A      | 0        | —
B      | 7        | C
C      | 3        | A
D      | 9        | B

Visited: {A, C, B}
Step 5: Visit D (distance = 9)
From D:
• D → C = 3 → total = 9+3 = 12 > 3 → no update
Visited: {A, B, C, D}. Done.
Final Shortest Distances from A:

Vertex | Distance from A | Path
A      | 0               | A
B      | 7               | A→C→B
C      | 3               | A→C
D      | 9               | A→C→B→D
Q2 a - Step 1: Understand the recurrence structure
At each level:
-The problem divides into two subproblems of sizes ≈ n/3 and ≈ 2n/3.
-It does n work at the current level.
Step 2: Build recursion tree levels
Let's estimate the cost at each level of recursion:
• Level 0:
  Work = n
  Subproblems: T(n/3), T(2n/3)
• Level 1:
  o T(n/3) does ≈ n/3 work
  o T(2n/3) does ≈ 2n/3 work
  Total work = n/3 + 2n/3 = n
• Level 2:
  Each of T(n/3) and T(2n/3) splits further into two subproblems, continuing the same pattern.
  Total work at level 2 = n
• And so on…
At every level, the total work remains n.
Step 3: Determine the height of the recursion tree
Since the problem is split into T(n/3) and T(2n/3) at each level, the largest subproblem is of size 2n/3.
At each level, the size of the largest subproblem reduces by a factor of 2/3.
Let's compute the number of levels L until the size becomes 1:
(2/3)^L · n = 1, so L = log₃⁄₂ n
So, the height of the tree is log₃⁄₂ n = O(log n).
So:
Total work = (n work per level) × (log₃⁄₂ n levels)
So:
T(n) = Θ(n log n)

Q2 b - We are given:
-Algorithm A has recurrence: T(n) = 7T(n/2) + n²
-Algorithm A′ has recurrence: T′(n) = aT′(n/4) + n²
We are asked: What is the largest integer value of a such that A′ is asymptotically faster than A?
Step 1: Solve each recurrence using the Master Theorem
For A:
T(n) = 7T(n/2) + n²
Compare with the Master Theorem form T(n) = aT(n/b) + f(n), where:
• a = 7,
• b = 2,
• f(n) = n²
Now compute:
n^(log₂ 7) ≈ n^2.81
Since f(n) = n² = O(n^(log₂ 7 − ε)) for some ε > 0, this is Case 1 of the Master Theorem:
So, T(n) = Θ(n^(log₂ 7))
So: by the same argument, A′ runs in T′(n) = Θ(n^(log₄ a)) (for a > 16, so that Case 1 applies).
So: A′ is asymptotically faster than A when log₄ a < log₂ 7, i.e. when a < 4^(log₂ 7) = (2^(log₂ 7))² = 7² = 49.
Final Answer: 48
Because the largest integer value of a such that A′ is asymptotically faster than A is: a = 48.