DSA I Week 1 Lecture 2

The document outlines a lecture on Data Structures and Algorithms, focusing on algorithms, their correctness, and complexity analysis. It discusses the concept of algorithms, the importance of analyzing running time and space, and provides an example of Insertion Sort with its complexity analysis. The document also introduces asymptotic notations like Big-O, Omega, and Theta to describe algorithm performance.


CSE 2215: Data Structures and Algorithms I

Sherajul Arifin
Lecturer, Department of Computer Science and Engineering,
United International University
Today’s Goals

• Discuss runtime of programs.
• Compute and classify growth of functions.
• Analyze complexity classes for algorithms.
What is an Algorithm?

An algorithm is a sequence of computational steps that solves a well-specified computational problem.
• An algorithm is said to be correct if, for every input instance, it halts with the correct output.
• An incorrect algorithm might not halt at all on some input instances, or it might halt with output other than the desired one.
What is a Program?

A program is the expression of an algorithm in a programming language:
a set of instructions that the computer will follow to solve a problem.
Define a Problem, and Solve It

Problem: a description of the input-output relationship.
Algorithm: a sequence of computational steps that transforms the input into the output.
Data Structure: an organized method of storing and retrieving data.

Our Task:
Given a problem, design a correct and good algorithm that solves it.
Define a Problem, and Solve It

Problem: Input is a sequence of integers stored in an array. Output the minimum.

INPUT (instance): 25, 90, 53, 23, 11, 34
OUTPUT: 11

Algorithm:
    m := a[1];
    for i := 2 to size of input
        if m > a[i] then
            m := a[i];
    return m

Data structure: m, a[i]
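As a quick check of the idea, here is a minimal runnable version of the minimum-finding algorithm (Python is used only for illustration; the slide's pseudocode is language-neutral):

    def find_minimum(a):
        """Return the minimum of a non-empty list by a single left-to-right scan."""
        m = a[0]                     # pseudocode uses 1-based a[1]; Python lists are 0-based
        for i in range(1, len(a)):
            if m > a[i]:             # keep the smaller of the current minimum and a[i]
                m = a[i]
        return m

    print(find_minimum([25, 90, 53, 23, 11, 34]))   # prints 11, matching the slide's instance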
What do we Analyze?

o Correctness
    o Does the input/output relation match the algorithm requirement?
o Amount of work done (complexity)
    o Basic operations to do the task
o Amount of space used
    o Memory used
o Simplicity, clarity
    o Ease of verification and implementation
o Optimality
    o Is it impossible to do better?
Running Time

- Number of primitive steps that are executed
- Except for the time of executing a function call, most statements roughly require the same amount of time:
    ● y = m * x + b
    ● c = 5 / 9 * (t - 32)
    ● z = f(x) + g(y)

We can be more exact if needed.
An Example: Insertion Sort

A = {5, 2, 4, 6, 1, 3}
An Example: Insertion Sort

InsertionSort(A, n) {
    for i = 2 to n {
        key = A[i]
        j = i - 1
        while (j > 0) and (A[j] > key) {
            A[j+1] = A[j]
            j = j - 1
        }
        A[j+1] = key
    }
}
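For readers who want to run the algorithm, a direct Python translation of the pseudocode might look as follows (0-based indexing replaces the slide's 1-based indexing):

    def insertion_sort(A):
        """Sort the list A in place, mirroring the slide's pseudocode."""
        for i in range(1, len(A)):            # pseudocode: for i = 2 to n
            key = A[i]
            j = i - 1
            while j >= 0 and A[j] > key:      # shift larger elements one slot to the right
                A[j + 1] = A[j]
                j = j - 1
            A[j + 1] = key                    # drop key into the gap
        return A

    print(insertion_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]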
An Example: Insertion Sort

How many times will this loop execute?

InsertionSort(A, n) {
    for i = 2 to n {
        key = A[i]
        j = i - 1
        while (j > 0) and (A[j] > key) {
            A[j+1] = A[j]
            j = j - 1
        }
        A[j+1] = key
    }
}
Analyzing Insertion Sort

Statement                              Cost    Times
InsertionSort(A, n) {
  for i = 2 to n {                     c1      n
    key = A[i]                         c2      n - 1
    j = i - 1                          c3      n - 1
    while (j > 0) and (A[j] > key) {   c4      T
      A[j+1] = A[j]                    c5      T - (n - 1)
      j = j - 1 }                      c6      T - (n - 1)
    A[j+1] = key                       c7      n - 1
  }
}
Analyzing Insertion Sort

T = t_2 + t_3 + … + t_n, where t_i is the number of while-expression evaluations in the i-th iteration of the for loop.
Analyzing Insertion Sort

T(n) = c1·n + c2·(n-1) + c3·(n-1) + c4·T + c5·(T - (n-1)) + c6·(T - (n-1)) + c7·(n-1)
     = c8·T + c9·n + c10

What can T be?

Best case: the array is already sorted (the inner loop body is never executed).
t_i = 1 for every i, so T = n - 1
T(n) = an + b, a linear function of n
Worst case: the array is reverse sorted (the inner loop body is executed for all previous elements).
t_i = i, so T = n(n + 1)/2 - 1
T(n) = an^2 + bn + c, a quadratic function of n

Average case: ???
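To make these counts concrete, the following instrumented version of the sort (an illustrative sketch, not part of the slides) counts how many times the while-loop test is evaluated, i.e. the quantity T:

    def insertion_sort_count(A):
        """Insertion sort that counts evaluations of the while-loop test (the quantity T)."""
        A = list(A)
        tests = 0
        for i in range(1, len(A)):
            key = A[i]
            j = i - 1
            while True:
                tests += 1                     # one evaluation of the while condition
                if j >= 0 and A[j] > key:
                    A[j + 1] = A[j]
                    j -= 1
                else:
                    break
            A[j + 1] = key
        return tests

    n = 6
    print(insertion_sort_count(range(n)))          # already sorted: T = n - 1 = 5
    print(insertion_sort_count(range(n, 0, -1)))   # reverse sorted: T = n(n+1)//2 - 1 = 20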
Asymptotic Performance

We care most about asymptotic performance:
• How does the algorithm behave as the problem size gets very large?
    o Running time
    o Memory/storage requirements
    o Bandwidth/power requirements/logic gates/etc.
Asymptotic Analysis

Worst case
    o Provides an upper bound on running time
    o An absolute guarantee of required resources
Average case
    o Provides the expected running time
    o Very useful, but treat with care: what is “average”?
        o Random (equally likely) inputs
        o Real-life inputs
Best case
    o Provides a lower bound on running time
Upper Bound Notation

We say Insertion Sort’s run time is O(n^2).
Properly, we should say the run time is in O(n^2).
Read O as “Big-O” (you’ll also hear it as “order”).
In general, a function f(n) is O(g(n)) if there exist positive constants c and n0
such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0.
Formally,
O(g(n)) = { f(n) : ∃ positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) ∀ n ≥ n0 }
Upper Bound Notation

(Figure: plot of running time versus n, with f(n) lying below c·g(n) for all n ≥ n0.)

We say g(n) is an asymptotic upper bound for f(n).
Insertion Sort is O(n^2)

Proof:
The run time is an^2 + bn + c.
If any of a, b, and c are less than 0, replace that constant with its absolute value. Then
an^2 + bn + c ≤ (a + b + c)n^2 + (a + b + c)n + (a + b + c)
             ≤ 3(a + b + c)n^2 for n ≥ 1

Let c’ = 3(a + b + c) and let n0 = 1. Then
an^2 + bn + c ≤ c’·n^2 for n ≥ 1
Thus an^2 + bn + c = O(n^2).

Question:
Is InsertionSort O(n^3)?
Is InsertionSort O(n)?
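As a quick numerical sanity check of this bound, pick hypothetical constants, say a = 2, b = 3, c = 5 (so c’ = 3(a + b + c) = 30), and spot-check the inequality:

    a, b, c = 2, 3, 5              # hypothetical positive constants for the run time an^2 + bn + c
    c_prime = 3 * (a + b + c)      # the witness constant c' = 3(a + b + c) from the proof, with n0 = 1
    assert all(a*n*n + b*n + c <= c_prime*n*n for n in range(1, 10001))
    print("an^2 + bn + c <= c'*n^2 holds for every checked n >= 1")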
Lower Bound Notation

We say InsertionSort’s run time is Ω(n).
In general, a function f(n) is Ω(g(n)) if ∃ positive constants c and n0 such that
0 ≤ c·g(n) ≤ f(n) ∀ n ≥ n0

Proof:
Suppose the run time is an + b, and assume a and b are positive.
Then an ≤ an + b for all n, so the definition is satisfied with c = a and n0 = 1.
Lower Bound Notation

(Figure: plot of running time versus n, with f(n) lying above c·g(n) for all n ≥ n0.)

We say g(n) is an asymptotic lower bound for f(n).
Asymptotic Tight Bound

A function f(n) is Θ(g(n)) if ∃ positive constants c1, c2, and n0 such that
0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) ∀ n ≥ n0

Theorem:
f(n) is Θ(g(n)) iff f(n) is both O(g(n)) and Ω(g(n))
Proof:
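The slide leaves the proof open; a short sketch that follows directly from the definitions above could run like this:

    % Proof sketch (not from the original slide), using only the definitions above.
    \begin{itemize}
      \item[($\Rightarrow$)] If $f(n) = \Theta(g(n))$, there exist $c_1, c_2 > 0$ and $n_0$ with
            $0 \le c_1 g(n) \le f(n) \le c_2 g(n)$ for all $n \ge n_0$.
            The right-hand inequality is the $O$ definition (witness $c_2, n_0$);
            the left-hand inequality is the $\Omega$ definition (witness $c_1, n_0$).
      \item[($\Leftarrow$)] If $f(n) = O(g(n))$ with witnesses $c_2, n_0'$ and
            $f(n) = \Omega(g(n))$ with witnesses $c_1, n_0''$, set $n_0 = \max(n_0', n_0'')$.
            Then $0 \le c_1 g(n) \le f(n) \le c_2 g(n)$ for all $n \ge n_0$, which is the $\Theta$ definition.
    \end{itemize}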
Asymptotic Tight Bound

(Figure: plot of running time versus n, with f(n) sandwiched between c1·g(n) and c2·g(n) for all n ≥ n0.)

We say g(n) is an asymptotic tight bound for f(n).
Practical Complexity

For large input sizes, constant terms are insignificant.
Program A with running time TA(n) = 100n
Program B with running time TB(n) = 2n^2

(Figure: plot of TP(n) versus input size n; TB(n) = 2n^2 crosses above TA(n) = 100n at n = 50, where both equal 5000.)
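A tiny script (illustrative only) confirms where the quadratic program overtakes the linear one:

    # Compare the two running times from the slide: TA(n) = 100n and TB(n) = 2n^2.
    TA = lambda n: 100 * n
    TB = lambda n: 2 * n * n

    crossover = next(n for n in range(1, 1000) if TB(n) > TA(n))
    print(crossover, TA(crossover - 1), TB(crossover - 1))   # 51 5000 5000: B overtakes A just past n = 50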
Practical Complexity

Function    Descriptor     Big-Oh
c           Constant       O(1)
log n       Logarithmic    O(log n)
n           Linear         O(n)
n log n     n log n        O(n log n)
n^2         Quadratic      O(n^2)
n^3         Cubic          O(n^3)
n^k         Polynomial     O(n^k)
2^n         Exponential    O(2^n)
n!          Factorial      O(n!)
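To see how quickly these functions separate, a short script (an illustrative sketch; n^k is omitted because k is a free parameter) can tabulate each growth rate for a few input sizes:

    import math

    # Growth functions from the table above; the constant c is taken as 1 for illustration.
    growth = {
        "1":       lambda n: 1,
        "log n":   lambda n: math.log2(n),
        "n":       lambda n: n,
        "n log n": lambda n: n * math.log2(n),
        "n^2":     lambda n: n ** 2,
        "n^3":     lambda n: n ** 3,
        "2^n":     lambda n: 2 ** n,
        "n!":      lambda n: math.factorial(n),
    }

    for n in (4, 8, 16, 32):
        row = ", ".join(f"{name}={f(n):.0f}" for name, f in growth.items())
        print(f"n={n}: {row}")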
Other Asymptotic Notations

A function f(n) is o(g(n)) if for every positive constant c there exists an n0 such that
f(n) < c·g(n) ∀ n ≥ n0
A function f(n) is ω(g(n)) if for every positive constant c there exists an n0 such that
c·g(n) < f(n) ∀ n ≥ n0

Intuitively,
■ o( ) is like <    ■ ω( ) is like >    ■ Θ( ) is like =
■ O( ) is like ≤    ■ Ω( ) is like ≥
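A couple of standard examples (added here for illustration) show how the strict bounds differ from O and Ω:

    % Illustrative examples, not from the original slide.
    \begin{align*}
      2n   &= o(n^2)        && \text{for any } c > 0,\ 2n < c\,n^2 \text{ once } n > 2/c,\\
      2n^2 &\ne o(n^2)      && \text{with } c = 1,\ 2n^2 < n^2 \text{ never holds},\\
      2n^2 &= \omega(n)     && \text{for any } c > 0,\ c\,n < 2n^2 \text{ once } n > c/2,\\
      2n^2 &\ne \omega(n^2) && \text{with } c = 3,\ 3n^2 < 2n^2 \text{ never holds}.
    \end{align*}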
Thank you!
