
ALGORITHM TIME COMPLEXITY

& LOOP INVARIANT

1
ALGORITHM
The word "algorithm" refers to a step-by-step method for performing some action.

OR

It’s a recipe to solve a particular problem.

A computer program is, similarly, a set of instructions that are executed step-by-step for performing
some specific task.

2
INFORMATION ABOUT
ALGORITHM
The name of the algorithm, together with a list of input and output variables.

The input variable names, labeled by data type.

The statements that make the body of the algorithm, with explanatory comments.

The output variable names, labeled by data type.

An end statement.

3
ANALYSIS OF ALGORITHM
Efficiency of an algorithm can be measured in terms of:
Execution time (time complexity)
The amount of memory required (space complexity)

4
TIME COMPLEXITY
The time complexity of an algorithm can be expressed in terms of the number of operations the algorithm uses when the input has a particular size.

Time complexity is described in terms of the number of operations required rather than actual computer time.

5
SPACE COMPLEXITY
The analysis of the computer memory an algorithm requires is the concern of space complexity.

Considerations of space complexity are tied in with the particular data structures used to implement
the algorithm.

We will restrict our attention to time complexity.

6
Tricks to Measure Time
Complexity
Drop the non-dominant terms
Drop the constant terms
Break the code into fragments (a short worked example follows)
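For instance (an illustrative count, not taken from a specific slide): if a fragment-by-fragment count gives T(n) = 3n^2 + 5n + 7, dropping the constant 7 and the non-dominant term 5n, and ignoring the coefficient 3, leaves O(n^2).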

7
MEASURING TIME
COMPLEXITY
Count the number of operations the algorithm performs to handle n items.

Objectives of time complexity analysis:

To determine the feasibility of an algorithm by estimating an upper bound on the amount of work
performed.

To compare different algorithms before deciding on which one to implement.

8
To simplify analysis, we sometimes ignore work that takes a constant amount of time, independent
of the problem input size

When comparing two algorithms that perform the same task, we often just concentrate on the
differences between algorithms.

Big-O notation is used to measure the time complexity.


To deal with n items, time complexity can be O(1), O(log n), O(n), O(n log n), O(n^2), O(n^3), O(2^n), even O(n^n).

9
Simplified analysis can be based on:

Number of arithmetic operations performed


Number of comparisons made
Number of times through a critical loop
Number of array elements accessed
etc

10
EXAMPLE: POLYNOMIAL EVALUATION
Suppose that exponentiation is carried out using multiplications.

p(x) = 4x^4 + 7x^3 - 2x^2 + 3x + 6

Written out with explicit multiplications, the operations required to evaluate it are:

p(x) = 4*x*x*x*x + 7*x*x*x - 2*x*x + 3*x + 6

Method of Analysis:
Basic operations are multiplication, addition, and subtraction

11
We’ll only consider the number of multiplications, since the number of additions and subtractions is the same in each approach.

We’ll examine the general form of a polynomial of degree n, and express our result in terms of n.

We’ll look at the worst case (the maximum number of multiplications) to get an upper bound on the work.

12
GENERAL FORM OF
POLYNOMIAL
General form of polynomial is:

p(x) = a_n x^n + a_(n-1) x^(n-1) + a_(n-2) x^(n-2) + … + a_1 x + a_0

where the leading coefficient a_n is non-zero and n ≥ 0

13
Analysis:
p(x) = a_n * x * x * … * x * x          n multiplications
     + a_(n-1) * x * x * … * x * x      n-1 multiplications
     + a_(n-2) * x * x * … * x * x      n-2 multiplications
     + …
     + a_2 * x * x                      2 multiplications
     + a_1 * x                          1 multiplication
     + a_0                              0 multiplications

The number of multiplications needed in the worst case is

T(n) = n + (n-1) + (n-2) + … + 3 + 2 + 1
     = n(n + 1)/2
     = n^2/2 + n/2
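A minimal C++ sketch of this brute-force evaluation (illustrative; the coefficient array and function name are assumptions, not from the slides):

#include <iostream>

// Brute-force evaluation: the term coef[i] * x^i is computed with i separate
// multiplications, so the total is n + (n-1) + ... + 1 = n(n+1)/2.
double evaluate(const double coef[], int n, double x) {
    double result = 0.0;
    for (int i = 0; i <= n; ++i) {     // one term per coefficient
        double term = coef[i];
        for (int k = 0; k < i; ++k)    // i multiplications to build x^i
            term *= x;
        result += term;
    }
    return result;
}

int main() {
    // p(x) = 4x^4 + 7x^3 - 2x^2 + 3x + 6 evaluated at x = 2
    double coef[] = {6, 3, -2, 7, 4};  // coef[i] is the coefficient of x^i
    std::cout << evaluate(coef, 4, 2.0) << std::endl;   // prints 124
    return 0;
}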

14
BIG – O Notation
Formally, the time complexity T(n) of an algorithm is O(f(n)) (read "of the order f(n)") if, for all sufficiently large n, T(n) is bounded above by a constant multiple of f(n).

Example:
Q: What is the Big-O notation for polynomial evaluation?

We choose the highest-order term of the expression:

T(n) = n^2/2 + n/2

so T(n) is O(n^2)

15
BIG – O Analysis in General
With independent nested loops: The number of iterations of the inner loop is independent of the
number of iterations of the outer loop.

Example:
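A representative sketch of independent nested loops (not the slide's original figure; variable names are assumed):

for (int i = 0; i < n; i++)        // outer loop: n iterations
    for (int j = 0; j < m; j++)    // inner loop: m iterations, the same for every i
        count++;                   // executed n * m times, so O(n*m), or O(n^2) when m = n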

16
BIG – O Analysis in General
With dependent nested loops: The number of iterations of the inner loop depends on a value from the outer loop.

Example (a sketch follows), as we discussed earlier for polynomial evaluation:
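A representative sketch of dependent nested loops (variable names are assumed):

for (int i = 1; i <= n; i++)       // outer loop: n iterations
    for (int j = 1; j <= i; j++)   // inner loop: i iterations, which depends on i
        count++;                   // executed 1 + 2 + … + n = n(n+1)/2 times, so O(n^2)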


1 + 2 + 3 + 4 + … + n = n(n+1)/2 = n^2/2 + n/2

17
BIG – O Analysis in General
Assume that a computer executes a million instructions a second. This chart summarizes the
amount of time required to execute f(n) instructions on this machine for various values of n.
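The chart itself is not reproduced here; as an illustrative calculation at 10^6 instructions per second with n = 1000: f(n) = n takes 0.001 s, f(n) = n^2 takes 1 s, f(n) = n^3 takes about 17 minutes, and f(n) = 2^n is far beyond any practical amount of time.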

18
To determine the time complexity of an algorithm:

Express the amount of work done as a sum:

f_1(n) + f_2(n) + … + f_k(n)

Identify the dominant term: the f_j such that every other f_i is O(f_j), i.e. for each i different from j, f_i(n) < f_j(n) for all sufficiently large n.

Then the time complexity is O(f_j)

19
INTRACTABLE PROBLEM
An intractable problem is a problem that is so complex that it would take an impractically long time to
solve, even with powerful computers.
Example:

Algorithms with time complexity O(2^n) take too long to run even for moderate values of n; a machine that executes 100 million (10 crore) instructions per second needs about 365 years to execute 2^60 instructions.

This is a very large number: 2^60 = 1,152,921,504,606,846,976.
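A quick check of that figure: 2^60 / 10^8 ≈ 1.15 × 10^10 seconds, and dividing by roughly 3.15 × 10^7 seconds per year gives about 365 years.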

20
CONSTANT TIME: O (1)
An algorithm is said to run in constant time if it requires the same amount of time regardless of the
input size.

Constant time complexity is denoted as O(1).
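A minimal illustration (the function name is made up for this sketch):

int firstElement(const int a[]) {
    return a[0];   // a single array access, regardless of how many elements the array holds, so O(1)
}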

21
LINEAR TIME: O (n)
An algorithm is said to run in linear time if its time execution is directly proportional to the input size,
i.e. time grows linearly as input size increases.
Example:
Imagine you're reading a list of names one by one. If there are 10 names, it might take you 10
seconds. If there are 20 names, it might take you 20 seconds. Here, the time is directly proportional to
the number of names (input size), so this would be linear time.
Linear time complexity is denoted as O(n).

Examples:

Array operations: linear search, traversal, finding the minimum (a sketch of the last one follows)
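A minimal sketch of finding the minimum of an array (illustrative; a similar search_min function appears later in the deck):

int findMinimum(const int a[], int n) {
    int min = a[0];                // assume n >= 1
    for (int i = 1; i < n; ++i)    // visits each remaining element once
        if (a[i] < min)
            min = a[i];
    return min;                    // about n comparisons, so O(n)
}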

22
LOGARITHMIC TIME: O (log n)
An algorithm is said to run in logarithmic time if its time execution is proportional to the logarithm of
the input size.
In simpler terms, as the input size n increases, the execution time grows much more slowly. For example, if an algorithm has a time complexity of O(log n), doubling the input size only adds a small constant amount to the running time.
For example:
If n = 8, then log_2 8 = 3.
If n = 16, then log_2 16 = 4.

Example:
binary search
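A minimal sketch of why repeated halving gives log n (binary search itself is shown later in the deck):

int halvingSteps(int n) {
    int steps = 0;
    while (n > 1) {    // the remaining problem size is cut in half on each pass
        n = n / 2;
        ++steps;
    }
    return steps;      // roughly log_2(n) iterations, so O(log n)
}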

23
QUADRATIC TIME: O (n^2)
An algorithm is said to run in quadratic time if its time execution is proportional to the square of the
input size.
In simpler terms, as the input size n increases, the execution time increases much more rapidly,
specifically as the square of n.
For example:
If n = 10, then the time would be proportional to 10^2 = 100.
If n = 20, then the time would be proportional to 20^2 = 400.

Examples:

bubble sort, selection sort, insertion sort (a bubble sort sketch follows)
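A minimal bubble sort sketch (illustrative), whose dependent nested loops perform about n(n-1)/2 comparisons:

void bubbleSort(int a[], int n) {
    for (int i = 0; i < n - 1; ++i)           // n-1 passes over the array
        for (int j = 0; j < n - 1 - i; ++j)   // each pass compares adjacent pairs
            if (a[j] > a[j + 1]) {            // swap if out of order
                int tmp = a[j];
                a[j] = a[j + 1];
                a[j + 1] = tmp;
            }
}                                             // roughly n^2/2 comparisons, so O(n^2)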

24
EXAMPLE
Calculate the time complexity of the algorithm.

int main()
{
    int a = 0, b = 0, c = 0, d = 0, i;   // initialized here so the sketch is well defined
    for (i = 0; i <= 9; i++)             // initialization: 1, loop tests: 11
    {
        a = a + b;                       // 10
        b = b + 1;                       // 10
    }
    d = c + d;                           // 1
    return 0;
}

T() = 1 + 11 + 10 + 10 + 1

Since the total is a constant that does not depend on any input size, T() = O(1).

25
EXAMPLE
Calculate the time complexity of the algorithm.

int main()
{
    int i = 0, j = 0, k = 0, n = 10;   // n stands for the input size; a value is assumed so the sketch compiles
    i = i + 1;                         // 1
    for (i = 1; i <= n; i++)           // initialization: 1, loop tests: n+1
    {
        j = j + 1;                     // n
    }
    k = k + 1;                         // 1
    return 0;
}

T() = 1 + 1 + (n + 1) + n + 1 = 2n + 4

T() = O(n)

26
EXAMPLE
Describe the time complexity of an algorithm.

for (i = 0; i < n; i++)            // outer loop: n iterations
    for (j = 0; j < n; j++)        // inner loop: n iterations, independent of i
        sum[i] += entry[i][j];

Inner loop:  I() = n
Outer loop:  J() = n

Total time:  T() = I() · J() = n · n = n^2

T() = O(n^2)

27
EXAMPLE
Calculate the time complexity of the algorithm.

int main()
{
    int i, j, a = 0, k = 0, n = 10;   // n stands for the input size; a value is assumed so the sketch compiles
    for (i = 1; i <= n; i++)          // loop tests: n+1
    {
        for (j = 1; j <= n; j++)      // loop tests: n+1 per outer pass
        {
            a = a + 1;                // n per outer pass
        }
        k = k + 1;                    // n
    }
    return 0;
}

Inner loop:  I() = 2n + 1   (n+1 tests plus n executions of a = a + 1)
Outer loop:  J() = 2n + 1   (n+1 tests plus n executions of k = k + 1)

Total time:  T() = I() · J() = (2n + 1)(2n + 1) + 1 = 4n^2 + 4n + 2

T() = O(n^2)

28
EXAMPLE
Describe the time complexity of an algorithm.

for (i = 0; i < n; i++)   // loop tests: n+1
    m += i;               // executed n times

It takes 2n + 1 operations. The time complexity of this algorithm is O(n).

29
EXAMPLE
Describe the time complexity of an algorithm.

for (i = 1; i <= n; i++)
    for (j = 0; j < i; j++)
        m += j;

When i = 1, the inner loop executes 1 time; when i = 2, the inner loop executes 2 times; … when i = n, the inner loop executes n times. In all, the inner loop body executes:

1 + 2 + 3 + … + n = n(n+1)/2 = n^2/2 + n/2

So, T() = O(n^2)

30
EXAMPLE
Describe the time complexity of an algorithm.

for (i = 0; i <= n; i++)
    for (j = 0; j <= n; j++)
        for (k = 0; k <= n; k++)
            sum[i][j] += entry[i][j][k];

Inner loop:   I() = 2n + 1
Middle loop:  M() = n + 1
Outer loop:   J() = n + 1

Total time:  T() = I() · M() · J()
                 = (2n + 1)(n + 1)(n + 1)
                 = 2n^3 + 5n^2 + 4n + 1

T() = O(n^3)

31
EXAMPLE
Describe the time complexity of the algorithm for finding the maximum element in a finite set of integers.

int maximum(int B[], int n)
{
    int max = B[0];                  // 1
    for (int j = 0; j < n; j++)      // loop tests: n+1
    {
        if (B[j] > max)              // n comparisons
            max = B[j];              // up to n assignments
    }
    return max;                      // 1
}

It takes 3n + 3 operations in the worst case. The time complexity is O(n).

32
EXAMPLE
Describe the time complexity of the linear search algorithm.

int search(int B[], int size, int value)
{
    for (int j = 0; j < size; j++)   // loop tests: n+1 (with n = size)
    {
        if (B[j] == value)           // n comparisons in the worst case
            return 1;                // 1
    }
    return 0;                        // 1
}

It takes 2n + 3 operations in the worst case. The time complexity is O(n).

33
BINARY SEARCH
ALGORITHM
found = false; low = 0; high = N - 1;
while ((!found) && (low <= high))
{
    mid = (low + high) / 2;
    if (A[mid] == key)
        found = true;
    else if (A[mid] > key)
        high = mid - 1;
    else
        low = mid + 1;
}

In binary search the array must already be in sorted order.

34
found = false; low = 0; high = N - 1;
while ((!found) && (low <= high))
{
    mid = (low + high) / 2;
    if (A[mid] == key)      found = true;
    else if (A[mid] > key)  high = mid - 1;
    else                    low = mid + 1;
}

index: 0  1  2  3  4   5   6   7   8   9   10  11  12  13  14  15
value: 2  3  5  7  10  12  15  22  28  29  32  47  48  50  55  73

key = 29

Average number of comparisons? On the order of log(N), as compared to N for linear search.
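A brief trace for key = 29 on this array (worked out here; the slide shows only the data): low = 0, high = 15, mid = 7, A[7] = 22 < 29, so low = 8; then mid = 11, A[11] = 47 > 29, so high = 10; then mid = 9, A[9] = 29 and the key is found after inspecting only 3 of the 16 elements.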
35
Performance Comparison

NADRA database: ~80,000,000 records

A computer that can perform 10,000 comparisons per second:
Linear search: ~2.22 hours
Binary search: ~0.005 seconds

Roughly 1.6 million times less time
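A back-of-the-envelope check of these figures: linear search may examine all 80,000,000 records, and 80,000,000 / 10,000 = 8,000 seconds ≈ 2.22 hours, whereas binary search needs only about log_2(80,000,000) ≈ 27 comparisons, i.e. a few thousandths of a second.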

36
EXAMPLE
Describe the time complexity of the following algorithm.

grandTotal = 0;                                   // 1
for (k = 0; k < n-1; ++k)                         // loop tests: n
{
    rows[k] = 0;                                  // n-1
    for (j = 0; j < n-1; ++j)                     // loop tests: n per outer pass
    {
        rows[k] = rows[k] + matrix[k][j];         // n-1 per outer pass
        grandTotal = grandTotal + matrix[k][j];   // n-1 per outer pass
    }
}

Inner loop:  I() = 3n - 2
Outer loop:  J() = 2n - 1

Total time:  T() = I() · J() = (3n - 2)(2n - 1) + 1 = 6n^2 - 7n + 3

T() = O(n^2)

37
EXAMPLE (SUMMING A LIST OF NUMBERS)

#include <iostream>
using namespace std;

int main()                             // 1
{
    const int size = 10;               // 1
    int a[size];                       // 1
    for (int n = 0; n < size; n++)     // loop tests: n+1
    {
        cin >> a[n];                   // n
    }

    int sum = 0;                       // 1
    for (int i = 0; i < size; i++)     // loop tests: n+1
    {
        sum = sum + a[i];              // n
    }

    cout << sum << endl;               // 1
    return 0;                          // 1
}

It takes 4n + 8 operations. The time complexity is O(n).

38
EXAMPLE
int search_min(int list[], int from, int to)
{
    int i;
    int min = from;
    for (i = from; i <= to; i++)
        if (list[i] < list[min])
            min = i;
    return min;
}

O(n), where n = to - from + 1 is the number of elements examined.

39
EXAMPLE
void swap(int &a, int &b)
{
int temp=a;
a = b;
b = temp;
}

O(1)
40
EXAMPLE
#include <iostream>
using namespace std;

int main()
{
    int i, j, k;
    int n = 10;   // n is not declared on the slide; a value is assumed here so the sketch compiles
    for (k = 1; k <= n; k++)
    {
        for (i = k; i <= n-2; i++)
            cout << "*";
        for (j = 2; j <= k; j++)
            cout << " ";
        for (j = 2; j <= k; j++)
            cout << " ";
        for (i = k; i <= n-2; i++)
            cout << "*";
        cout << endl;
    }
    return 0;
}

O(n^2)

41
EXAMPLE
// j, num and n are assumed to be declared earlier in the program
for (int i = 1; i <= n; i++)
{
    int sum = 0;
    if (i != n)
    {
        for (j = 1; j <= n; j++)
        {
            cin >> num;
            cout << num << " ";
            sum = sum + num;
        }
        cout << "sum = " << sum << endl;
    }
}

O(n^2)

42
Big Oh (O)
O (1) - constant
O (log n) - logarithmic
O (n) - linear
O (n log n) - log linear

O (n^2) - quadratic

O (n^3) - cubic

O (2^n) - exponential

43
COMPARE RUNNING TIME
GROWTH RATES

44
EFFICIENCY OF
ALGORITHMS
Summary: count the number of instructions, ignore constants, and compare orders of magnitude.

O(1) << O(log n) << O(n) << O(n²)

Big-O alone cannot rank two algorithms in the same class.

But we know that a running time of O(n) is faster than O(n²) for large enough values of n.
If algorithm1 has a worst-case efficiency of O(n) and algorithm2 has a worst-case efficiency of O(n²), then algorithm1 is faster (more efficient) in the worst case.

45
LOOP INVARIANT
The loop invariant is a method for proving the correctness of while loops. We develop a rule of inference for program segments of the type

while (condition)
    S

S is repeatedly executed until the condition becomes false.

An assertion that remains true each time S is executed must be chosen. Such an assertion is called a loop invariant.

46
The loop body S is executed as long as the loop condition holds.

A loop invariant is a condition that is necessarily true immediately before and immediately after each iteration of a loop.

47
THEOREM
Let a loop with guard G be given, together with pre- and post-conditions that are predicates in the algorithm variables.

Also let a predicate I(n), called the loop invariant, be given.

If the following four properties are true, then the loop is correct with respect to its pre- and post-conditions.

48
I. Basis Property:
The pre-condition for the loop implies that I(0) is true before the first iteration of the loop.

II. Inductive property:


If the guard G and the loop invariant I(k) are both true for an integer k ≥ 0 before an iteration of the loop, then I(k + 1) is true after that iteration of the loop.

49
III. Eventual Falsity of Guard:
After a finite number of iterations of the loop, the guard
becomes false.

IV. Correctness of the Post-Condition:


If N is the least number of iterations after which G is
false and I(N) is true, then the values of the algorithm
variables will be as specified in the post-condition of the
loop.

50
EXAMPLE
A loop invariant is needed to verify that the following program segment terminates with factorial = n! for an integer n ≥ 1.

int i = 1;
int factorial = 1;
while (i < n)
{
    i = i + 1;
    factorial = factorial * i;
}

51
52
SOLUTION
Let the loop invariant be:
p : “factorial = i! and i ≤ n”

Suppose n = 5, i.e. we want to find 5!.

Basis Property:
At i = 1,
p : “factorial = 1! and 1 ≤ 5”

These are exactly the pre-conditions, so they are already satisfied: the invariant is true before the first iteration of the loop.

53
Inductive property:
We assume that before an iteration
i = k
p : “factorial = k! and k ≤ n”

We have to show that after one more execution of the loop body
i = k + 1
p : “factorial = (k+1)! and k+1 ≤ n”

54
Before statement 1 (i = i + 1) is executed,
i_old = k

Thus the execution of statement 1 has the following effect:
i_new = i_old + 1 = k + 1

Similarly, before statement 2 (factorial = factorial * i) is executed,
factorial_old = k!

55
So after execution of statement 2,
factorial_new = factorial_old * i_new
              = k! * (k + 1)
              = (k + 1) * k!
              = (k + 1)!

Hence after the loop iteration, the loop invariant is true for i = k + 1.

This is what we needed to show.

56
EXAMPLE
A loop invariant is needed to verify that the following program segment terminates with power = x^n for an integer n ≥ 0.

int power = 1;
int i = 1;
while (i <= n)
{
    power = power * x;
    i = i + 1;
}

57
SOLUTION
Let the loop invariant be:
p : i ≤ n + 1 and power = x^(i-1)
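A brief check of this invariant (worked out here; the slide states only the invariant): Basis: before the first iteration, i = 1 ≤ n + 1 and power = 1 = x^0 = x^(i-1). Inductive step: if i = k with k ≤ n (so the guard holds) and power = x^(k-1), then after the body power = x^(k-1) * x = x^k and i = k + 1 ≤ n + 1, so the invariant still holds. When the guard finally fails, i = n + 1 and therefore power = x^n.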

58
Matrix Multiplication
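The slide's content is not reproduced in this extraction; a minimal C++ sketch of square matrix multiplication (the constant N and function name are assumptions) shows why it is another O(n^3) example:

const int N = 3;   // assumed matrix size for this sketch

void multiply(int A[N][N], int B[N][N], int C[N][N])
{
    for (int i = 0; i < N; ++i)                  // N iterations
        for (int j = 0; j < N; ++j)              // N iterations
        {
            C[i][j] = 0;
            for (int k = 0; k < N; ++k)          // N iterations
                C[i][j] += A[i][k] * B[k][j];    // the innermost statement runs N^3 times, so O(N^3)
        }
}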

59
60
