1. Complexity

The document outlines guidelines for analyzing computational complexity, including the running time of various programming constructs such as loops, conditionals, and function invocations. It introduces asymptotic complexity as a measure of algorithm efficiency as input size increases and explains Big-Oh notation for upper bounding functions. Examples illustrate how to approximate running times and compare growth rates of different functions.

Uploaded by Anas Aqeel

Computational Complexity

Guideline 1

The running times of comments, declarative statements, and function
declarations count as zero.

    template <class Type>
    class List;                   // declarative statement
    int age;                      // declarative statement
    // a variable to store age       comment
    void Print();                 // function declaration

Total time = zero
Guideline 2

Expressions, memory management, and assignment statements for
primitive data types have constant running time, i.e. one step.

Expressions and assignment statements:
    int a = 4 + 5;
    char b = 'a';

Memory management statements:
    int * a = new int;
    delete a;

Total time = constant
Guideline 3

Memory management and assignment statements for objects have running
time proportional to the size of the object, which is a constant once
the type is fixed.

Memory management statements for objects:
    Student a, b;
    Student * p = new Student;
    delete p;

Assignment statement for objects:
    a = b;

Total time = constant
Guideline 4

A function invocation itself counts as one step; the time spent
executing the function body is analysed separately and added on.

Total time (for the call itself) = constant
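One way to see this guideline in action is to charge one step per call with an instrumented counter. This is a small sketch, not part of the original slides; the global `steps` counter and the `square` function are hypothetical names introduced only for illustration.

```cpp
#include <cstddef>

// Hypothetical step counter, used only to illustrate the guideline.
long steps = 0;

// The body is constant time; the invocation itself is charged one step.
int square(int x) {
    steps++;            // count the invocation as a single step
    return x * x;
}

// Invoke square() n times and report how many steps were charged.
long countInvocations(int n) {
    steps = 0;
    for (int i = 0; i < n; i++)
        square(i);
    return steps;       // exactly n invocation steps
}
```

Calling a function n times therefore contributes n invocation steps, plus whatever the body itself costs.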
Guideline 5: Loops

The running time of a loop is, at most, the running time of the
statements inside the loop (including tests) multiplied by the number
of iterations.

    for (i = 1; i <= n; i++)    // executed n times
    {
        m = m + 2;              // constant time
    }

Total time = (constant c) * n = cn
Guideline 6: Nested loops

Analyse inside out. Total running time is the product of the
sizes of all the loops.

    for (i = 1; i <= n; i++) {          // outer loop executed n times
        for (j = 1; j <= n; j++) {      // inner loop executed n times
            k = k + 1;                  // constant time
        }
    }

Total time = c * n * n = cn²
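The claim that the inner statement runs exactly n * n times can be verified directly with a counter. This is an illustrative sketch (the function name is an assumption, not from the slides):

```cpp
// Count how many times the innermost statement of the nested loop executes.
long nestedLoopSteps(int n) {
    long count = 0;
    for (int i = 1; i <= n; i++) {
        for (int j = 1; j <= n; j++) {
            count++;        // stands in for the constant-time k = k + 1
        }
    }
    return count;           // executes exactly n * n times
}
```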
Guideline 7: Consecutive statements

Add the time complexities of each statement.

    x = x + 1;                          // constant time c0

    for (i = 1; i <= n; i++) {          // executed n times
        m = m + 2;                      // constant time c1
    }

    for (i = 1; i <= n; i++) {          // outer loop executed n times
        for (j = 1; j <= n; j++) {      // inner loop executed n times
            k = k + 1;                  // constant time c2
        }
    }

Total time = c0 + c1n + c2n²
Guideline 8: If-then-else statements

    if (exp)
        statements1;
    else
        statements2;

Cost is the number of steps corresponding to exp, statements1, and
statements2.

    if (depth() != otherStack.depth()) {              // exp: constant c0
        return false;                                 // statements1: constant c1
    }
    else {                                            // else part: (c2 + c3) * n
        for (int n = 0; n < depth(); n++) {
            if (!list[n].equals(otherStack.list[n]))  // inner if: constant c2
                return false;                         // constant c3 (no else part)
        }
    }

Total time = c0 + c1 + (c2 + c3) * n
Problems with T(n)
• T(n) is difficult to calculate.
• T(n) is also not very meaningful, as the step size is not exactly defined.
• T(n) is usually very complicated, so we need an approximation that stays close to T(n).
• This measure of efficiency, or approximation of T(n), is called ASYMPTOTIC COMPLEXITY or ASYMPTOTIC ALGORITHM ANALYSIS.

Asymptotic complexity studies the efficiency of an algorithm as the
input size becomes large.
Example
► If T(n) = 7n + 100,
► what is T(n) for different values of n?

n        T(n)       Comment
1        107        Contributing factor is 100
5        135        Contributing factors are 7n and 100
10       170        Contributing factors are 7n and 100
100      800        Contribution of 100 is small
1000     7100       Contributing factor is 7n
10000    70100      Contributing factor is 7n
10^6     7000100    Contributing factor is 7n

When approximating T(n) we can IGNORE the 100 term for very
large values of n and say that T(n) can be approximated by 7n.
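The dominance of the 7n term can be checked numerically. A minimal sketch (the helper names are assumptions, not from the slides):

```cpp
// T(n) = 7n + 100, as in the example above.
long T(long n) {
    return 7 * n + 100;
}

// Share of T(n) contributed by the 7n term, as an integer percentage.
long linearTermPercent(long n) {
    return (7 * n * 100) / T(n);
}
```

For n = 10^6 the 7n term already accounts for more than 99% of T(n), which is why the constant 100 can be ignored asymptotically.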
Example 2

• T(n) = n² + 100n + log10(n) + 1000

n       T(n)             n²                 100n              log10(n)      1000
1       1101             1       (0.1%)     100     (9.1%)    0   (0%)      1000 (90.8%)
10      2101             100     (4.76%)    1000    (47.6%)   1   (0.05%)   1000 (47.6%)
100     21002            10000   (47.6%)    10000   (47.6%)   2   (0.01%)   1000 (4.76%)
10^5    10,010,001,005   10^10   (99.9%)    10^7    (0.099%)  5   (0.0%)    1000 (0.00%)

When approximating T(n) we can IGNORE the last three terms and
say that T(n) can be approximated by n².
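The percentages in the table can be reproduced with a few lines of code. A sketch under the same definition of T(n); the function names are illustrative assumptions:

```cpp
#include <cmath>

// T(n) = n^2 + 100n + log10(n) + 1000, as in Example 2.
double T2(double n) {
    return n * n + 100.0 * n + std::log10(n) + 1000.0;
}

// Percentage of T2(n) contributed by the dominant n^2 term.
double squareTermPercent(double n) {
    return 100.0 * (n * n) / T2(n);
}
```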
Big-Oh
► Definition:

f(n) is O(g(n)) if there exist positive numbers c and N such that
f(n) <= c·g(n) for all n >= N.

g(n) is called the upper bound on f(n), OR
f(n) grows at most as fast as g(n).

Example:
T(n) = n² + 3n + 4
n² + 3n + 4 <= 2n² for all n >= 10

so we can say that T(n) is O(n²), OR
T(n) is in the order of n²:
T(n) is bounded above by a positive real multiple of n².
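The witness pair c = 2, N = 10 from the example can be checked mechanically over a finite range. A small sketch (the checker function is an assumption introduced for illustration; a finite scan is evidence, not a proof):

```cpp
// Check f(n) <= c * g(n) for all n in [N, limit], where
// f(n) = n^2 + 3n + 4 and g(n) = n^2, as in the example.
bool bigOhHolds(long c, long N, long limit) {
    for (long n = N; n <= limit; n++) {
        long f = n * n + 3 * n + 4;
        long g = n * n;
        if (f > c * g)
            return false;   // witness pair fails at this n
    }
    return true;
}
```

With c = 2 and N = 10 the inequality holds, while c = 1 fails for every n, since n² + 3n + 4 > n² always.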


Properties of Big-Oh
• If f(n) is O(g(n)) and g(n) is O(h(n)), then f(n) is O(h(n)).
• If f(n) is O(h(n)) and g(n) is O(h(n)), then f(n) + g(n) is O(h(n)).
• a·n^k is O(n^k), where a is a constant.
• n^k is O(n^(k+j)) for any positive j.
• If f(n) = c·g(n), then f(n) is O(g(n)).
Growth rates
Which growth rate is better?

Graph from Adam Drozdek's Comparison of Growth Rates.

N     log2(N)   N·log2(N)   N²        N³          2^N
1     0         0           1         1           2
2     1         2           4         8           4
8     3         24          64        512         256
64    6         384         4096      262,144     18,446,744,073,709,551,616
128   7         896         16,384    2,097,152   ≈ 3.4 × 10^38

At 10^-11 seconds per operation, 2^128 operations would take roughly
10^20 years, billions of times the age of the universe.
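The smaller entries of this table can be reproduced with 64-bit arithmetic. An illustrative sketch (function names are assumptions; 2^N overflows 64-bit integers for N >= 63, so the last column cannot be computed this way for N = 128):

```cpp
#include <cmath>

// Columns of the growth-rate table, computed with 64-bit integers.
long long nLogN(long long N)  { return N * (long long)std::log2((double)N); }
long long square(long long N) { return N * N; }
long long cube(long long N)   { return N * N * N; }
long long pow2(long long N)   { return 1LL << N; }   // valid only for N < 63
```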
Big-O?

    void print(int * array, int size){                // O(n)
        cout << "Array: ";
        for(int i = 0; i < size; i++)
            cout << array[i] << " ";                  // executed size times
        cout << endl;
    }

    int smallest(int array[], int size, int start){   // O(n)
        if(start == size-1)
            return start;                             // O(1)
        else {
            int small = start;
            for(int i = start+1; i < size; i++){      // O(n) scan
                if(array[small] > array[i])
                    small = i;
            }
            return small;
        }
    }

    void main(){
        const int n = 5;                              // O(1)
        int array[n] = {3,5,9,6,1};
        cout << "Smallest: " << array[smallest(array,n,0)] << endl;  // O(n)
        print(array,n);                               // O(n)
    }

Running Time: O(1) + O(n) + O(n) = O(n)
Big-O?

    void sortArray(int * a, int size){
        for(int i = 0; i < size; i++){        // O(n) iterations
            int small = smallest(a,size,i);   // O(n) per call -> O(n²) overall
            int temp = a[i];                  // O(1)
            a[i] = a[small];
            a[small] = temp;
        }
    }

    void main(){
        const int n = 5;
        int array[n] = {3,5,9,6,1};
        sortArray(array,n);                   // O(n²)
        print(array,n);                       // O(n)
    }

Running Time: O(1) + O(n²) + O(n) = O(n²)
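Putting the pieces together: the following self-contained sketch repeats `smallest` and `sortArray` from the slides (with the helper `sortsExample` added as an assumption for testing) so the selection-sort behaviour can be verified.

```cpp
// smallest and sortArray as on the slides, reproduced so this compiles alone.
int smallest(int array[], int size, int start) {
    int small = start;
    for (int i = start + 1; i < size; i++)
        if (array[small] > array[i])
            small = i;
    return small;
}

void sortArray(int * a, int size) {
    for (int i = 0; i < size; i++) {
        int small = smallest(a, size, i);   // O(n) scan inside an O(n) loop -> O(n^2)
        int temp = a[i];
        a[i] = a[small];
        a[small] = temp;
    }
}

// Returns true if sortArray leaves the slides' 5-element example sorted.
bool sortsExample() {
    int a[5] = {3, 5, 9, 6, 1};
    sortArray(a, 5);
    for (int i = 1; i < 5; i++)
        if (a[i - 1] > a[i])
            return false;
    return true;
}
```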


Big-O?

    void main(){
        const int n = 5;
        int array[n] = {3,5,9,6,1};
        for(int i = 0; i < n; i++)    // O(n) iterations
            sortArray(array,n);       // O(n²) per call -> O(n³)
        print(array,n);               // O(n)
    }

Running Time: O(1) + O(n³) + O(n) = O(n³)


Big-O?

    bool BinarySearch(int * a, int size, int search){
        int high = size-1;
        int low = 0;
        int mid;
        while (low <= high){
            mid = (high+low)/2;
            if(a[mid] == search)
                return true;
            else if(search < a[mid])
                high = mid-1;
            else
                low = mid+1;
        }
        return false;
    }
Trace on a sorted array of 16 elements (indices 0–15): each iteration
halves the range [low, high].

Iteration 1: low = 0, high = 15, mid = 7
Iteration 2: range halved, mid recomputed
Iteration 3: range halved again
Iteration 4: range reduced to a single element

n = 16 = 2^4 and at most 4 iterations are needed; log2(2^4) = 4, so
BinarySearch is O(log2 n).