
ODA BULTUM UNIVERSITY

COLLEGE OF NATURAL AND COMPUTATIONAL SCIENCE


DEPARTMENT OF COMPUTER SCIENCE
Course: Design and Analysis of Algorithms
Assignment Type: Individual
Target Students: 3rd year Computer Science
STUDENT NAME ……………………...ID NO
ABDETA BESHANA…………………………..0873
SUBMITTED TO: MR. HADI H.
SUBMISSION DATE: SEPTEMBER 12, 2024

1. Binary Search

Binary Search is an efficient algorithm for finding an item from a sorted list of items. It works by
repeatedly dividing the search interval in half.

Algorithm:

1. Initialize:
   o Set low to 0 (the start index of the list).
   o Set high to the length of the list minus 1 (the end index).
2. While loop:
   o While low is less than or equal to high:
      1. Compute the middle index: mid = low + (high - low) // 2 (integer division).
      2. Compare the target value with the middle element:
         - If the target is equal to the middle element, return mid.
         - If the target is less than the middle element, update high to mid - 1.
         - If the target is greater than the middle element, update low to mid + 1.
3. Return:
   o If the loop exits without finding the target, return -1 (indicating the target is not in the list).
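
A minimal Python sketch of these steps (the function name binary_search and the use of a Python list are illustrative choices, not part of the assignment's specification):

    def binary_search(arr, target):
        """Return the index of target in the sorted list arr, or -1 if it is absent."""
        low, high = 0, len(arr) - 1
        while low <= high:
            mid = low + (high - low) // 2        # middle index of the current interval
            if arr[mid] == target:
                return mid
            elif target < arr[mid]:
                high = mid - 1                   # target lies in the left half
            else:
                low = mid + 1                    # target lies in the right half
        return -1                                # target is not in the list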

2. Merge Sort

Merge Sort is a divide-and-conquer algorithm that divides the array into smaller subarrays, sorts
those subarrays, and then merges them back together.

Algorithm:

1. Divide:
   o If the array has one element or is empty, return it (base case).
   o Find the middle index: mid = length of array // 2 (integer division).
   o Recursively sort the two halves: left_half and right_half.
2. Merge:
   o Create two pointers: one for each half (i for left_half, j for right_half).
   o Compare elements from both halves and build a sorted array:
      - While both pointers have elements:
         - Compare the elements pointed to by i and j.
         - Append the smaller element to the merged array and move that pointer forward.
      - Append any remaining elements from left_half and right_half to the merged array.
3. Return:
o Return the merged and sorted array.
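
A minimal Python sketch of the divide and merge steps above (the names left_half, right_half, and merged mirror the description; the function returns a new sorted list rather than sorting in place):

    def merge_sort(arr):
        """Return a new sorted list built with the divide-and-merge steps above."""
        if len(arr) <= 1:                        # base case: empty or single element
            return arr[:]
        mid = len(arr) // 2
        left_half = merge_sort(arr[:mid])        # recursively sort the two halves
        right_half = merge_sort(arr[mid:])
        merged, i, j = [], 0, 0                  # i and j are the two merge pointers
        while i < len(left_half) and j < len(right_half):
            if left_half[i] <= right_half[j]:
                merged.append(left_half[i])
                i += 1
            else:
                merged.append(right_half[j])
                j += 1
        merged.extend(left_half[i:])             # append any remaining elements
        merged.extend(right_half[j:])
        return merged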

Explanation (Binary Search trace)

The following steps trace a binary search for the target value 15 in a sorted array of 10 elements (indices 0 through 9), where arr[4] = 9 and arr[7] = 15:

1. Initialization:
   o low is set to 0, and high is set to the last index of the array (9 in this case).
2. First Iteration:
   o Compute mid: mid = 0 + (9 - 0) // 2 = 4.
   o arr[4] is 9, which is less than 15, so we update low to mid + 1 (5).
3. Second Iteration:
   o Compute mid: mid = 5 + (9 - 5) // 2 = 7.
   o arr[7] is 15, which matches the target.
4. Return:
   o The function returns the index 7, where the target value 15 is found.

3. Quick Sort

Quick Sort is another divide-and-conquer algorithm that picks an element as a pivot and
partitions the array around the pivot.

Algorithm:

1. Partitioning:
o Choose a pivot element (often the last element in the array).
o Reorder the array so that elements less than the pivot are on the left, elements
greater than the pivot are on the right.
o Place the pivot in its correct position.
2. Recursively Sort:
   o Recursively apply quick sort to the subarrays formed by partitioning:
      - The subarray to the left of the pivot.
      - The subarray to the right of the pivot.
3. Base Case:
   o If the array has one or zero elements, it is already sorted.
4. Return:
   o Return the sorted array.
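
A minimal Python sketch using the last element as the pivot (a Lomuto-style partition); the names quick_sort and partition are illustrative:

    def partition(arr, low, high):
        """Place the pivot (last element) in its correct position and return that index."""
        pivot = arr[high]
        i = low - 1
        for j in range(low, high):
            if arr[j] <= pivot:                  # element belongs to the left of the pivot
                i += 1
                arr[i], arr[j] = arr[j], arr[i]
        arr[i + 1], arr[high] = arr[high], arr[i + 1]
        return i + 1

    def quick_sort(arr, low=0, high=None):
        """Sort arr in place between indices low and high."""
        if high is None:
            high = len(arr) - 1
        if low < high:                           # base case: one or zero elements
            p = partition(arr, low, high)
            quick_sort(arr, low, p - 1)          # sort the subarray left of the pivot
            quick_sort(arr, p + 1, high)         # sort the subarray right of the pivot
        return arr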

4. Selection Sort

Selection Sort is a simple comparison-based sorting algorithm that repeatedly selects the smallest
(or largest) element from the unsorted part and moves it to the sorted part.

Algorithm:

1. Iterate:
o For each index i from 0 to length of array - 2:
1. Assume the smallest element is at index i.
      2. Iterate through the unsorted portion of the array (from i+1 to the end):
         - Find the index of the smallest element.
      3. Swap the smallest found element with the element at index i.
2. Repeat:
o Continue this process until the entire array is sorted.
3. Return:
o Return the sorted array.
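
A minimal Python sketch of the selection-sort steps above (sorts the list in place; the name selection_sort is illustrative):

    def selection_sort(arr):
        """Sort arr in place by repeatedly selecting the smallest remaining element."""
        n = len(arr)
        for i in range(n - 1):                   # indices 0 .. n-2
            min_index = i                        # assume the smallest element is at i
            for j in range(i + 1, n):            # scan the unsorted portion
                if arr[j] < arr[min_index]:
                    min_index = j
            arr[i], arr[min_index] = arr[min_index], arr[i]   # swap into place
        return arr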

1. Binary Search

Time Complexity: O(log n)

Example:

Let's consider a sorted array with 16 elements:

Array = [1, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29, 31]

Suppose we want to find the element 19.

Steps:

1. Initial Array: [1, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29, 31]
   o Middle index: 7 (Element: 15)
2. Search Interval Update: Since 19 > 15, update the interval to [17, 19, 21, 23, 25, 27, 29, 31]
3. Updated Interval: [17, 19, 21, 23, 25, 27, 29, 31]
   o Middle index within this interval: 3 (Element: 23)
4. Search Interval Update: Since 19 < 23, update the interval to [17, 19]
5. Updated Interval: [17, 19]
   o Middle index within this interval: 1 (Element: 19)
6. Element Found: 19 is found at index 1 of the current interval, which is index 9 of the original array.
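
For reference, running the binary_search sketch from section 1 on this array reports the position of 19 in the full array (index 9); the trace above instead numbers positions within the shrinking interval:

    arr = [1, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29, 31]
    print(binary_search(arr, 19))                # 9: element 19 sits at index 9 of the full array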

2. Merge Sort
Time Complexity: O(n log n)

Example:

Let's consider an array with 8 elements:

Array = [38, 27, 43, 3, 9, 82, 10, 44]

Steps:

1. Initial Array: [38, 27, 43, 3, 9, 82, 10, 44]
2. Divide: Split into:
   o [38, 27, 43, 3]
   o [9, 82, 10, 44]
3. Divide Further:
   o [38, 27] and [43, 3]
   o [9, 82] and [10, 44]
4. Divide to Base Cases:
   o [38] and [27]
   o [43] and [3]
   o [9] and [82]
   o [10] and [44]
5. Merge:
   o [27, 38] and [3, 43]
   o [9, 82] and [10, 44]
6. Final Merge:
   o Merge the sorted subarrays: [3, 27, 38, 43] and [9, 10, 44, 82]
   o Resulting in: [3, 9, 10, 27, 38, 43, 44, 82]
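
The same result can be reproduced with the merge_sort sketch from section 2:

    print(merge_sort([38, 27, 43, 3, 9, 82, 10, 44]))   # [3, 9, 10, 27, 38, 43, 44, 82]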

3. Quick Sort

In the average case, the partitioning operation takes linear time, and there are log n levels of recursion, resulting in O(n log n). In the worst case (e.g., an already sorted array), the time complexity becomes O(n^2).

Time Complexity: Average Case: O(n log n); Worst Case: O(n^2)

Example:

Let's consider an array with 5 elements:

Array = [4, 2, 7, 1, 3]

Steps:

1. Initial Array: [4, 2, 7, 1, 3]
   o Choose pivot (e.g., 3).
2. Partitioning:
   o Rearrange: [2, 1, 3, 7, 4] (elements less than the pivot on the left, greater on the right).
3. Recursive Sort:
   o Left of pivot: [2, 1]
      - Pivot: 1, resulting in [1, 2]
   o Right of pivot: [7, 4]
      - Pivot: 4, resulting in [4, 7]
4. Combining:
   o Final sorted array: [1, 2, 3, 4, 7]

4. Selection Sort

Selection Sort performs n passes through the array, with each pass scanning the remaining unsorted elements, leading to O(n^2) time complexity.

Time Complexity: O(n^2)

Example:

Let's consider an array with 5 elements:

Array = [64, 25, 12, 22, 11]

Steps:

1. Initial Array: [64, 25, 12, 22, 11]
2. Find Minimum and Swap:
   o Minimum: 11. Swap with the first element: [11, 25, 12, 22, 64]
3. Next Iteration:
   o Minimum in [25, 12, 22, 64]: 12. Swap with the second element: [11, 12, 25, 22, 64]
4. Continue:
   o Minimum in [25, 22, 64]: 22. Swap with the third element: [11, 12, 22, 25, 64]
5. Final Array: [11, 12, 22, 25, 64]

Summary Table

Algorithm       | Best Case Time Complexity | Average Case Time Complexity | Worst Case Time Complexity
Binary Search   | O(1)                      | O(log n)                     | O(log n)
Merge Sort      | O(n log n)                | O(n log n)                   | O(n log n)
Quick Sort      | O(n log n)                | O(n log n)                   | O(n^2)
Selection Sort  | O(n^2)                    | O(n^2)                       | O(n^2)

1. Prim's Algorithm

Prim's Algorithm is a greedy algorithm that builds the MST by starting from an arbitrary node
and growing the MST one edge at a time.

Algorithm:

1. Initialization:
o Start with an arbitrary vertex and add it to the MST.
o Create a priority queue (min-heap) to store edges with their weights.
o Create a set to keep track of vertices included in the MST.
o Initialize the MST set with the starting vertex.
2. Process:
o While there are vertices not yet included in the MST:
1. Extract the edge with the smallest weight from the priority queue.
      2. If the edge connects a vertex in the MST to a vertex not yet included:
         - Add this vertex to the MST.
         - Add the edge to the MST.
         - Add all edges connecting this new vertex to vertices not in the MST to the priority queue.
      3. Otherwise, discard the edge (both endpoints are already in the MST).
3. Return:
o The MST formed by the edges added during the process.
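
A minimal Python sketch using heapq as the min-heap priority queue; the adjacency-list format (vertex mapped to a list of (weight, neighbour) pairs) and the function name prim_mst are illustrative assumptions, and the demo at the end uses the example graph discussed below:

    import heapq

    def prim_mst(graph, start):
        """Return a list of (weight, u, v) edges forming an MST of a connected graph.

        graph: dict mapping each vertex to a list of (weight, neighbour) pairs.
        """
        in_mst = {start}                         # vertices already included in the MST
        candidate_edges = [(w, start, v) for w, v in graph[start]]
        heapq.heapify(candidate_edges)           # min-heap keyed on edge weight
        mst = []
        while candidate_edges and len(in_mst) < len(graph):
            w, u, v = heapq.heappop(candidate_edges)   # smallest-weight candidate edge
            if v in in_mst:
                continue                         # both endpoints already in the MST: discard
            in_mst.add(v)
            mst.append((w, u, v))
            for w2, nxt in graph[v]:             # new candidate edges leaving the MST
                if nxt not in in_mst:
                    heapq.heappush(candidate_edges, (w2, v, nxt))
        return mst

    graph = {                                    # the example graph used below
        'A': [(1, 'B'), (4, 'C')],
        'B': [(1, 'A'), (2, 'C'), (5, 'D')],
        'C': [(4, 'A'), (2, 'B'), (3, 'D')],
        'D': [(5, 'B'), (3, 'C')],
    }
    print(prim_mst(graph, 'A'))                  # [(1, 'A', 'B'), (2, 'B', 'C'), (3, 'C', 'D')]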

Example:

Consider a graph with vertices V = {A, B, C, D} and edges with weights:

 - A-B: 1
 - A-C: 4
 - B-C: 2
 - B-D: 5
 - C-D: 3

1. Start from vertex A.
2. Priority Queue: (A-B: 1), (A-C: 4)
3. Choose Edge A-B:
   o Add B to the MST.
   o Add edges from B: (B-C: 2), (B-D: 5)
4. Priority Queue: (B-C: 2), (A-C: 4), (B-D: 5)
5. Choose Edge B-C:
   o Add C to the MST.
   o Add edges from C: (C-D: 3)
6. Priority Queue: (C-D: 3), (A-C: 4), (B-D: 5)
7. Choose Edge C-D:
   o Add D to the MST.
8. MST Complete with edges (A-B), (B-C), (C-D).

2. Kruskal's Algorithm

Kruskal's Algorithm is another greedy algorithm that finds the MST by sorting all the edges and
adding them one by one, ensuring no cycles are formed.

Algorithm:

1. Initialization:
o Sort all edges in non-decreasing order of their weights.
o Initialize a union-find data structure to manage and find cycles.
o Create an empty list to store the MST edges.
2. Process:
o For each edge in the sorted list:
      1. Check if adding the edge to the MST would form a cycle using the union-find data structure.
      2. If no cycle is formed:
         - Add the edge to the MST.
         - Union the sets of the two vertices of the edge.
3. Return:
o The MST formed by the edges added during the process.
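
A minimal Python sketch; the edge format (weight, u, v), the simplified union-find (find with path compression, union by reassigning a root), and the name kruskal_mst are illustrative assumptions. The demo uses the example graph discussed below:

    def kruskal_mst(vertices, edges):
        """Return MST edges; edges is a list of (weight, u, v) tuples."""
        parent = {v: v for v in vertices}        # union-find: each vertex starts in its own set

        def find(x):                             # representative of x's set, with path compression
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        mst = []
        for w, u, v in sorted(edges):            # non-decreasing order of weight
            root_u, root_v = find(u), find(v)
            if root_u != root_v:                 # adding the edge does not form a cycle
                mst.append((w, u, v))
                parent[root_u] = root_v          # union the two sets
        return mst

    edges = [(1, 'A', 'B'), (4, 'A', 'C'), (2, 'B', 'C'), (5, 'B', 'D'), (3, 'C', 'D')]
    print(kruskal_mst('ABCD', edges))            # [(1, 'A', 'B'), (2, 'B', 'C'), (3, 'C', 'D')]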

Example:

Consider the same graph with vertices V = {A, B, C, D} and edges with weights:

 - A-B: 1
 - A-C: 4
 - B-C: 2
 - B-D: 5
 - C-D: 3

1. Sort Edges: (A-B: 1), (B-C: 2), (C-D: 3), (A-C: 4), (B-D: 5)
2. Process Edges:
   o Add A-B to the MST.
   o Add B-C to the MST.
   o Add C-D to the MST.
   o Skip A-C (would form a cycle).
   o Skip B-D (would form a cycle).
3. MST Complete with edges (A-B), (B-C), (C-D).

Summary

 - Prim's Algorithm: Starts from a vertex and grows the MST by adding the smallest edge connecting the MST to an outside vertex.
 - Kruskal's Algorithm: Starts with all edges and adds the smallest edge to the MST, ensuring no cycles are formed using a union-find structure.

Parallel algorithms are designed to execute tasks concurrently, which helps in solving problems
faster by utilizing multiple processors or cores. Here’s a look into the functionality, applications,
and an example algorithm for parallel processing:

1. Functionality of Parallel Algorithms

Functionality:

 - Concurrency: Parallel algorithms divide a problem into smaller sub-tasks that can be executed simultaneously on multiple processors or cores.
 - Speedup: By executing multiple tasks at once, these algorithms can significantly reduce computation time compared to sequential algorithms.
 - Scalability: Efficient parallel algorithms can scale with the number of processors or cores, improving performance as more computational resources are added.
 - Synchronization: They include mechanisms to handle dependencies between tasks, ensuring correct results despite simultaneous executions.

2. Applications of Parallel Algorithms

Parallel algorithms are widely used in various fields due to their ability to handle large-scale and complex problems efficiently:

 - Scientific Computing: Simulation of physical systems, climate modeling, and molecular dynamics.
 - Data Analysis: Big data processing, real-time analytics, and machine learning model training.
 - Computer Graphics: Rendering images, video processing, and simulations in graphics-intensive applications.
 - Engineering: Computational fluid dynamics, structural analysis, and optimization problems.
 - Database Systems: Query processing, indexing, and transaction management.
 - Search Engines: Indexing web pages, ranking results, and processing search queries.

3. Example Parallel Algorithm: Parallel Merge Sort

Parallel Merge Sort is an extension of the traditional Merge Sort algorithm, designed to run
efficiently on multi-core processors by dividing the sorting task across multiple threads.

Algorithm:

1. Divide:
   o Recursively divide the array into two halves until each sub-array contains a single element or no elements.
2. Parallel Sort:
   o Use multiple threads to sort the two halves of each sub-array in parallel.
   o Once sorted, merge the two halves concurrently, dividing the merging process into smaller tasks that can be performed in parallel.
3. Merge:
   o Merge the sorted sub-arrays into a single sorted array, using multiple threads if necessary.
4. Combine:
   o Continue merging sorted sub-arrays in parallel until the entire array is sorted.

Detailed Steps:

1. Divide the Array:
   o Split the array into two halves.
   o Create two threads to sort each half.
2. Sort in Parallel:
   o Each thread recursively sorts its assigned half.
   o This recursion continues until the base case is reached (a sub-array with one or zero elements).
3. Merge Sorted Halves:
   o Once the sub-arrays are sorted, merge them.
   o Use multiple threads to perform merging operations concurrently.
   o Implement a parallel merging strategy to reduce the overall merge time.
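
A minimal Python sketch of this structure, using concurrent.futures threads to sort the two halves concurrently. In CPython the global interpreter lock limits true CPU parallelism for pure-Python code, so a process pool (or a compiled sorting routine) would be used in practice; the sketch also falls back to the built-in sort below a fixed recursion depth and merges sequentially:

    from concurrent.futures import ThreadPoolExecutor

    def merge(left_half, right_half):
        """Sequentially merge two sorted lists (a parallel merge step is omitted for brevity)."""
        merged, i, j = [], 0, 0
        while i < len(left_half) and j < len(right_half):
            if left_half[i] <= right_half[j]:
                merged.append(left_half[i])
                i += 1
            else:
                merged.append(right_half[j])
                j += 1
        merged.extend(left_half[i:])
        merged.extend(right_half[j:])
        return merged

    def parallel_merge_sort(arr, depth=2):
        """Sort the two halves in separate threads down to `depth` recursion levels."""
        if len(arr) <= 1:
            return arr[:]
        if depth <= 0:
            return sorted(arr)                   # fall back to a sequential sort
        mid = len(arr) // 2
        with ThreadPoolExecutor(max_workers=2) as pool:
            left = pool.submit(parallel_merge_sort, arr[:mid], depth - 1)
            right = pool.submit(parallel_merge_sort, arr[mid:], depth - 1)
            left_half, right_half = left.result(), right.result()
        return merge(left_half, right_half)

    print(parallel_merge_sort([38, 27, 43, 3, 9, 82, 10, 44]))   # [3, 9, 10, 27, 38, 43, 44, 82]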

Explanation:

 - Divide Step: The array is divided into halves recursively.
 - Parallel Sorting: Sorting of the two halves happens concurrently using separate threads.
 - Merge Step: The sorted halves are merged concurrently, using threads to speed up the merging process.
 - Combine: The sorted halves are combined to form the final sorted array.

Summary

 - Functionality: Parallel algorithms enhance performance by executing tasks concurrently, improving speed and scalability.
 - Applications: Used in various fields including scientific computing, data analysis, and computer graphics.
 - Example: Parallel Merge Sort showcases how to utilize multiple processors to perform sorting more efficiently.

1. Parallel Matrix Multiplication

Example:

Matrix multiplication is a common operation in many scientific and engineering applications. Given two matrices A and B, the goal is to compute their product C.

Algorithm:

1. Initialization:
   o Let A be an m × n matrix.
   o Let B be an n × p matrix.
   o Let C be an m × p matrix to store the result.
2. Parallel Computation:
   o Divide the computation of the elements of C across multiple processors.
   o Each processor calculates one or more elements of C by performing the dot product of the corresponding row of A and column of B.
3. Combine Results:
   o Gather the results from all processors to form the final matrix C.

Detailed Steps:

1. Matrix Multiplication Formula:

   C[i][j] = Σ_{k=1..n} A[i][k] × B[k][j]

2. Parallel Execution:
   o Distribute the calculation of the different elements C[i][j] to different processors.
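
A minimal Python sketch that distributes the rows of C across a small thread pool; the matrix sizes, worker count, and helper names are illustrative, and in practice a process pool or NumPy would be used for real speedups:

    from concurrent.futures import ThreadPoolExecutor

    def matmul_row(A, B, i):
        """Compute row i of C = A x B, i.e. C[i][j] = sum over k of A[i][k] * B[k][j]."""
        n, p = len(B), len(B[0])
        return [sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]

    def parallel_matmul(A, B, workers=4):
        """Compute C = A x B, distributing the rows of C across a pool of workers."""
        with ThreadPoolExecutor(max_workers=workers) as pool:
            rows = pool.map(lambda i: matmul_row(A, B, i), range(len(A)))
        return list(rows)                        # gather the rows into the result matrix

    A = [[1, 2], [3, 4]]                         # small illustrative 2 x 2 matrices
    B = [[5, 6], [7, 8]]
    print(parallel_matmul(A, B))                 # [[19, 22], [43, 50]]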

2. Parallel Quick Sort

Example:

Quick Sort is a popular sorting algorithm. Parallel Quick Sort enhances the performance of
sorting large datasets by sorting sub-arrays in parallel.

Algorithm:

1. Partition:
o Choose a pivot element.
o Partition the array into elements less than the pivot and elements greater than the
pivot.
2. Parallel Sort:
o Recursively sort the sub-arrays on either side of the pivot in parallel.
3. Combine:
o Merge the sorted sub-arrays into a single sorted array.

Detailed Steps:

1. Partition Step:
o Rearrange elements around the pivot.
2. Parallel Recursive Sort:
o Use multiple threads to sort the left and right sub-arrays.
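
A minimal Python sketch that sorts the two partitions in separate threads down to a fixed depth, then falls back to the built-in sort; as with the other sketches, this illustrates the structure rather than delivering real CPU parallelism in pure Python:

    from concurrent.futures import ThreadPoolExecutor

    def parallel_quick_sort(arr, depth=2):
        """Recursively sort the two partitions in separate threads down to `depth` levels."""
        if len(arr) <= 1:
            return arr[:]
        if depth <= 0:
            return sorted(arr)                   # fall back to a sequential sort
        pivot = arr[-1]                          # choose the last element as the pivot
        less = [x for x in arr[:-1] if x <= pivot]
        greater = [x for x in arr[:-1] if x > pivot]
        with ThreadPoolExecutor(max_workers=2) as pool:
            left = pool.submit(parallel_quick_sort, less, depth - 1)
            right = pool.submit(parallel_quick_sort, greater, depth - 1)
            return left.result() + [pivot] + right.result()

    print(parallel_quick_sort([4, 2, 7, 1, 3]))  # [1, 2, 3, 4, 7]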

3. Parallel Breadth-First Search (BFS)

Example:

BFS is used for traversing or searching tree or graph data structures. Parallel BFS can be used to
explore multiple levels of a graph simultaneously.

Algorithm:

1. Initialization:
o Start from a source node.
o Initialize a queue and a visited set.
2. Parallel Level Processing:
   o While there are nodes in the queue:
      - Process all nodes at the current level in parallel.
      - Enqueue their unvisited neighbors to be processed in the next level.
3. Combine Results:
o Collect results from all parallel executions.

Detailed Steps:

1. Initialization:
o Mark the source node as visited and enqueue it.
2. Level-wise Processing:
o Use multiple threads to process nodes at the current level.
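
A minimal Python sketch of level-synchronous BFS: the nodes of the current level are looked up concurrently, and their neighbour lists are combined sequentially to avoid races when marking nodes visited. The graph format and function name are illustrative assumptions:

    from concurrent.futures import ThreadPoolExecutor

    def parallel_bfs(graph, source, workers=4):
        """Return the BFS levels of graph, expanding each level (frontier) in parallel.

        graph: dict mapping each node to a list of neighbours.
        """
        visited = {source}
        frontier = [source]
        levels = [frontier]
        with ThreadPoolExecutor(max_workers=workers) as pool:
            while frontier:
                # Look up the neighbours of every node in the current level concurrently.
                neighbour_lists = pool.map(lambda u: graph.get(u, []), frontier)
                next_frontier = []
                for neighbours in neighbour_lists:   # combine results sequentially
                    for v in neighbours:
                        if v not in visited:
                            visited.add(v)
                            next_frontier.append(v)
                frontier = next_frontier
                if frontier:
                    levels.append(frontier)
        return levels

    graph = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}   # small example graph
    print(parallel_bfs(graph, 'A'))              # [['A'], ['B', 'C'], ['D']]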

Summary

 - Parallel Matrix Multiplication: Efficiently computes the product of two matrices by dividing the task among multiple processors.
 - Parallel Quick Sort: Enhances the Quick Sort algorithm by sorting sub-arrays concurrently, improving performance on large datasets.
 - Parallel BFS: Speeds up the traversal of a graph by processing nodes at each level in parallel.

