01 - Complexity Analysis of Algorithms
Algorithm Analysis
❑ We have a computational problem that needs to be solved.
❑ Emre and Selin have different ideas for how to solve it.
❑ How do we know which idea is better?
❑ Easy! Have them both write the code, run it, and see which is faster.
Motivations for Complexity Analysis
❑ There are often many different algorithms which can be used to solve
the same problem.
❑ Develop some techniques that allow us to:
▪ compare different algorithms with respect to their “efficiency”
▪ choose the most efficient algorithm for the problem
Motivations for Complexity Analysis
❑ Complexity analysis helps evaluate the
efficiency of an algorithm in terms of
time and space.
▪ time efficiency: the time it takes to execute
▪ space efficiency: the memory space it uses
Machine independence
❑ The evaluation of efficiency should be as machine independent as possible.
❑ It is not useful to measure how fast the algorithm runs in practice, as this depends on the
particular computer, OS, programming language, compiler, and kind of inputs used in testing.
❑ Instead,
▪ we count the number of basic operations the algorithm performs.
▪ we calculate how this number depends on the size of the input.
Example of Basic Operations:
❑ Arithmetic operations: *, /, %, +, -
❑ Assignment statements of simple data types x=5
❑ Reading of primitive types x = scanner.nextInt()
❑ Writing of a primitive type System.out.print(x)
❑ Simple conditional tests if (x < 12) ...
❑ Method call (the execution time of the method itself may depend on the value of
parameter and it may not be constant) c = calculate(a,b)
❑ A method's return statement return x
❑ We consider operations such as ++ , += , and *=
as consisting of two basic operations
❑ Memory Access
Note: To simplify complexity analysis we will not consider memory access (fetch or
store) operations.
Example of Basic Operations:
❑ Consider an algorithm that finds the maximum value in an unsorted array of
integers.
❑ The algorithm simply scans through the entire array and keeps track of the
largest number found so far.
Algorithm:
❑ Input: an array arr[] of size n.
❑ Process: start by assuming the first element is the maximum,
then iterate through the rest of the array, updating the maximum if a larger
element is found.
❑ Output: the maximum value in the array.
Example of Basic Operations:
Pseudocode:
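The pseudocode on the original slide was an image that did not survive extraction; a minimal Java sketch of the scan described above (the class and method names are our own) is:

```java
public class MaxFinder {
    // Returns the maximum value in an unsorted array of integers.
    public static int findMax(int[] arr) {
        int maxValue = arr[0];           // assume the first element is the maximum
        for (int i = 1; i < arr.length; i++) {
            if (arr[i] > maxValue) {     // comparison: a basic operation
                maxValue = arr[i];       // assignment: a basic operation
            }
        }
        return maxValue;
    }
}
```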
Example of Basic Operations:
Step 1: Identify Basic Operations
❑ In this algorithm, the basic operations can be considered as:
1. Comparison: arr[i] > max_value
2. Assignment: max_value = arr[i]
❑ These are constant-time operations because their execution time does not depend on
the size of the input.
Example of Basic Operations:
Step 2: Count Basic Operations
❑ For an array of size n, the loop runs from index 1 to n-1, so it iterates n-1 times.
❑ In each iteration, two basic operations are performed:
• one comparison arr[i] > max_value
• one assignment max_value = arr[i] (when the comparison is true)
❑ Thus, in the worst case, the total number of basic operations will be:
• comparison: n-1 comparisons.
• assignment: In the worst case, an assignment will occur in each of the n-1 iterations, so
there are n-1 assignments.
Example of Basic Operations:
Step 3: Calculate Efficiency
The total number of basic operations in the algorithm is:
❑ Total operations = (n−1) comparisons + (n−1) assignments = 2(n−1)
❑ This result gives us a linear function of the input size n.
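The 2(n−1) worst-case count can be checked by instrumenting the scan with counters (a sketch with our own names; the worst case for assignments is a strictly increasing array):

```java
public class OpCounter {
    static int comparisons;
    static int assignments;

    // Finds the maximum while counting the two basic operations.
    public static int findMax(int[] arr) {
        comparisons = 0;
        assignments = 0;
        int maxValue = arr[0];
        for (int i = 1; i < arr.length; i++) {
            comparisons++;               // arr[i] > maxValue
            if (arr[i] > maxValue) {
                assignments++;           // maxValue = arr[i]
                maxValue = arr[i];
            }
        }
        return maxValue;
    }
}
```

For a strictly increasing array of size n, both counters reach n−1, for 2(n−1) operations in total.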
Best (Ω), Average (Θ) and Worst (O) case complexities
❑ We are usually interested in the worst case complexity ➔ what is the maximum
number of operations that might be performed for a given problem size.
❑ We will not discuss the other cases -- best case and average case.
❑ Why we focus on worst case analysis:
▪ easier to compute
▪ usually close to the actual running time
▪ crucial for real-time systems
(e.g. air-traffic control)
Asymptotic Complexity
❑ Finding the exact complexity of an algorithm, i.e. the exact number of basic
operations f(n), is difficult.
❑ Instead, we use a simpler function g(n) that is sufficiently close to f(n) for large
values of the input size n.
Asymptotic Complexity
❑ Asymptotic complexity measure does not give the exact number
of operations of an algorithm, but it shows how that number grows
with the size of the input.
❑ This allows us to understand the efficiency
of an algorithm in the long term,
independent of specific machine
parameters (such as CPU speed or
memory size).
[Figure: running time versus input size n]
Big-O (asymptotic) Notation
❑ Defn: f(n) = O(g(n)) ➔ if there exist positive constants c and n0 such that:
f(n) ≤ c*g(n) for all n ≥ n0
Big-O (asymptotic) Notation
Implication of the definition:
❑ For all sufficiently large n, c*g(n) is an upper bound of f(n)
Note: By the definition of Big-O:
f(n) = 3n + 4 is O(n)
it is also O(n²),
it is also O(n³),
...
it is also O(nⁿ)
❑ However, when Big-O notation is used, the function g(n) is CHOSEN
TO BE AS SMALL AS POSSIBLE
❑ We call such a function g(n) a tight asymptotic bound of f(n)
Big-O (asymptotic) Notation
❑ Some Big-O complexity classes in order of magnitude from smallest to highest:
O(1)                                  Constant
O(log(n))                             Logarithmic
O(n)                                  Linear
O(n log(n))                           Linearithmic
O(nˣ) {e.g., O(n²), O(n³), etc.}      Polynomial
Examples of Algorithms and their big-O complexity
Warnings about O-Notation
❑ Big-O notation cannot compare algorithms within the same complexity class.
❑ It gives sensible comparisons of algorithms in different complexity classes
only when n is large.
❑ Consider two algorithms for the same task:
Linear: f(n) = 1000 n
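The second algorithm in this comparison was lost in extraction; a common companion in this classic example is a quadratic one such as g(n) = n²/1000 (our assumption, not from the slide). A quick sketch shows why constant factors let the "slower" class win for small n:

```java
public class CrossoverDemo {
    // Hypothetical cost models: a linear algorithm with a large constant
    // factor versus a quadratic algorithm with a small one.
    static double linearCost(long n)    { return 1000.0 * n; }
    static double quadraticCost(long n) { return (double) n * n / 1000.0; }

    public static void main(String[] args) {
        // For small n the quadratic algorithm is cheaper...
        System.out.println(quadraticCost(1000) < linearCost(1000));
        // ...but past the crossover (n = 10^6 here) the linear one wins,
        // exactly as the Big-O classes predict for large n.
        System.out.println(linearCost(10_000_000) < quadraticCost(10_000_000));
    }
}
```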
Rules for using big-O
For large values of input n , the constants and terms with lower degree of n are ignored.
1.Multiplicative Constants Rule: Ignoring constant factors
O(c f(n)) = O(f(n)), where c is a constant;
Example: O(20 n³) = O(n³)
Proving Big-O Complexity
To prove that f(n) is O(g(n)), we find any pair of values n0 and c that satisfy:
f(n) ≤ c*g(n) for n ≥ n0
Note: The pair (n0, c) is not unique. If such a pair exists, then there are an infinite
number of such pairs.
Example: Prove that f(n) = 3n² + 5 is O(n²)
We try to find some values of n0 and c by solving the following inequality:
3n² + 5 ≤ cn²  OR  3 + 5/n² ≤ c
(By putting different values for n0, we get corresponding values for c)
n0:  1      2      3      4
c:   8      4.25   3.56   3.3125
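Any of these pairs can be checked mechanically. A small sketch (a finite spot check over a range, not a proof, but useful for catching a bad constant):

```java
public class BigOCheck {
    // Checks f(n) <= c*g(n) for all n in [n0, limit],
    // with f(n) = 3n^2 + 5 and g(n) = n^2.
    static boolean holds(double c, int n0, int limit) {
        for (int n = n0; n <= limit; n++) {
            double f = 3.0 * n * n + 5;
            double g = (double) n * n;
            if (f > c * g) return false;
        }
        return true;
    }
}
```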
Proving Big-O Complexity
Example:
Prove that f(n) = 3n² + 4n log n + 10 is O(n²) by finding appropriate values for c and n0
(We used log of base 2, but another base can be used as well)
Solving 3n² + 4n log n + 10 ≤ cn², i.e. 3 + 4(log n)/n + 10/n² ≤ c:
n0:  1      2      3      4
c:   13     7.5    6.23   5.625
More Big-O Examples
◼ 7n - 2
7n - 2 is O(n)
need c > 0 and n0 ≥ 1 such that 7n - 2 ≤ c*n for n ≥ n0
this is true for c = 7 and n0 = 1
◼ 3n³ + 20n² + 5
3n³ + 20n² + 5 is O(n³)
need c > 0 and n0 ≥ 1 such that 3n³ + 20n² + 5 ≤ c*n³ for n ≥ n0
this is true for c = 4 and n0 = 21
◼ 3 log n + 5
3 log n + 5 is O(log n)
need c > 0 and n0 ≥ 1 such that 3 log n + 5 ≤ c*log n for n ≥ n0
this is true for c = 8 and n0 = 2
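These constants can be spot-checked numerically over a finite range (using log base 2, as elsewhere in the slides):

```java
public class BoundChecks {
    // Checks 3n^3 + 20n^2 + 5 <= 4n^3 for all n in [n0, limit].
    static boolean cubicBound(int n0, int limit) {
        for (long n = n0; n <= limit; n++) {
            if (3 * n * n * n + 20 * n * n + 5 > 4 * n * n * n) return false;
        }
        return true;
    }

    // Checks 3 log2(n) + 5 <= 8 log2(n) for all n in [n0, limit].
    static boolean logBound(int n0, int limit) {
        for (int n = n0; n <= limit; n++) {
            double log2n = Math.log(n) / Math.log(2);
            if (3 * log2n + 5 > 8 * log2n) return false;
        }
        return true;
    }
}
```

Note that the cubic bound genuinely needs n0 = 21: at n = 20, 3n³ + 20n² + 5 = 32005 exceeds 4n³ = 32000.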
How to determine complexity of code structures
i = 1;
while (i < n) {
    sum = sum + i;        O(log n)
    i = i * 2;
}
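Counting the iterations of this doubling loop confirms the logarithmic bound (a sketch; the method name is our own):

```java
public class DoublingLoop {
    // Counts iterations of: i = 1; while (i < n) i = i * 2;
    static int iterations(int n) {
        int count = 0;
        for (int i = 1; i < n; i *= 2) {
            count++;       // i takes the values 1, 2, 4, ..., so about log2(n) steps
        }
        return count;
    }
}
```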
How to determine complexity of code structures
Nested Loops:
Complexity of inner loop * complexity of outer loop.
Examples:
sum = 0
for(int i = 0; i < n; i++)
for(int j = 0; j < n; j++) O(n²)
sum += i * j ;
i = 1;
while(i <= n) {
j = 1;
while(j <= n){
statements of constant complexity O(n log n)
j = j*2;
}
i = i+1;
}
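The iteration count of the second example (linear outer loop, doubling inner loop) can be verified directly; each of the n outer iterations runs the inner loop about log₂ n times:

```java
public class NestedLoopCount {
    // Counts inner-loop executions of the while/while example above.
    static long count(int n) {
        long ops = 0;
        for (int i = 1; i <= n; i++) {        // outer: n iterations
            for (int j = 1; j <= n; j *= 2) { // inner: floor(log2 n) + 1 iterations
                ops++;
            }
        }
        return ops;
    }
}
```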
How to determine complexity of code structures
Example: the complexity of consecutive code fragments is their sum, so the
largest term dominates.
for (int j = 0; j < n*n; j++)
    sum = sum + j;                               O(n²)
for (int k = 0; k < n; k++)
    sum = sum - k;                               O(n)
System.out.print("sum is now " + sum);           O(1)
Overall Complexity: O(n²)
How to determine complexity of code structures
Switch: Take the complexity of the most expensive case
char key;
int[] X = new int[n];
int[][] Y = new int[n][n];
........
switch(key) {
case 'a':
    for(int i = 0; i < X.length; i++)
        sum += X[i];                             O(n)
    break;
case 'b':
    for(int i = 0; i < Y.length; i++)
        for(int j = 0; j < Y[0].length; j++)     O(n²)
            sum += Y[i][j];
    break;
} // end of switch block
Overall Complexity: O(n²)
How to determine complexity of code structures
If-else: take the complexity of the most expensive branch.
else
    System.out.println("Error! Enter '+' or 'x'!");      O(1)
How to determine complexity of code structures
Note: Sometimes a loop may cause the if-else rule not to be applicable.
• Consider the following loop:
while (n > 0) {
if (n % 2 == 0) {
System.out.println(n);
n = n / 2;
} else{
System.out.println(n);
n = n - 1;
}
}
• Since the else-branch has more basic operations, one may conclude that the loop is O(n).
• However, the if-branch dominates. For example, if n is 61, then the sequence of values n takes is:
60, 30, 15, 14, 7, 6, 3, 2, 1, and 0. Hence the loop is logarithmic and its complexity is O(log n).
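Counting the iterations confirms the logarithmic behavior: n is at least halved every two iterations (a sketch, with the printing omitted):

```java
public class HalvingLoop {
    // Counts iterations of the while loop above.
    static int iterations(int n) {
        int count = 0;
        while (n > 0) {
            if (n % 2 == 0) n = n / 2;   // even: halve
            else            n = n - 1;   // odd: decrement (next step halves)
            count++;
        }
        return count;
    }
}
```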
O - Omega - Theta
❑ Big-O is an upper bound
▪ code will run up to this maximum limit
❑ Big-Omega is a lower bound
▪ code will take at least this much time
Graph Examples: Omega (Ω)
Graph Examples: Theta (Θ)
Omega () notation example
❑ f(n) = 5n2-3n, g(n) = n2 → is f(n) = (g(n)) ? How?
▪ A function f(n) is Ω(g(n)), if there exist positive constants c and n0 such that
for all n≥n0, the following inequality holds: f(n)≥c⋅g(n)
▪ Applying the Definition: f(n)=5n2−3n g(n)=n2
▪ We need to find constants c>0 and n0 such that for all n≥n0: 5n2−3n ≥ c⋅n2
▪ Simplifying the Inequality: Divide both sides by n2 ➔ 5−3/n ≥ c
▪ As n increases, the term 3/n decreases and approaches 0
▪ Therefore, for sufficiently large n, 3/n becomes negligible
▪ Choosing Constants: Let's choose c=4, then we need ➔ 5−3/n≥4
▪ Subtract 4 from both sides ➔ 1≥3/n
▪ Multiply both sides by n ➔ n≥3
▪ Therefore, for n≥3 and c=4 ➔ f(n) = (g(n))
38
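The chosen constants can be spot-checked over a finite range (a check, not a proof):

```java
public class OmegaCheck {
    // Checks f(n) >= c*g(n) for all n in [n0, limit],
    // with f(n) = 5n^2 - 3n and g(n) = n^2.
    static boolean holds(double c, int n0, int limit) {
        for (long n = n0; n <= limit; n++) {
            double f = 5.0 * n * n - 3 * n;
            double g = (double) n * n;
            if (f < c * g) return false;
        }
        return true;
    }
}
```

At n = 2, 5n² − 3n = 14 falls below 4n² = 16, which is why n0 = 3 is needed with c = 4.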
Theta (Θ) notation example
Big-Theta: f(n) = Θ(g(n)) if there exist positive constants c1, c2 and n0 such that
c1*g(n) ≤ f(n) ≤ c2*g(n) for all n ≥ n0
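As a worked instance (our own example, not from the slide): f(n) = 3n² + 5 is Θ(n²) with c1 = 3, c2 = 4 and n0 = 3, since 3n² ≤ 3n² + 5 always, and 3n² + 5 ≤ 4n² once n² ≥ 5. A finite check:

```java
public class ThetaCheck {
    // Checks c1*g(n) <= f(n) <= c2*g(n) for all n in [n0, limit],
    // with f(n) = 3n^2 + 5 and g(n) = n^2.
    static boolean holds(double c1, double c2, int n0, int limit) {
        for (long n = n0; n <= limit; n++) {
            double f = 3.0 * n * n + 5;
            double g = (double) n * n;
            if (f < c1 * g || f > c2 * g) return false;
        }
        return true;
    }
}
```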