Time and Space Complexity
Time Complexity
The total number of steps a solution takes to solve a problem, expressed as a function of the size of the
problem, is the measure of that problem's time complexity.
O(c) < O(log n) < O(n) < O(n log n) < O(n^c) < O(c^n) < O(n!), where c is some constant.
Space Complexity
In complexity theory, space complexity is often studied by allowing a polynomial amount of memory while placing no limit on running time.
One difference between space complexity and time complexity is that space can be reused, whereas time cannot.
Space complexity is not affected by determinism or nondeterminism.
Space complexity is the amount of computer memory required during program execution, expressed as a function of the
input size.
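As a rough illustration, consider the same task solved with linear and with constant auxiliary space; the Python sketch below uses helper names invented for this example.

def sum_of_squares_with_copy(values):
    # Builds a second list of n elements first: auxiliary space grows with the input, O(n).
    squares = [v * v for v in values]
    return sum(squares)

def sum_of_squares_in_place(values):
    # Keeps only a single running total: auxiliary space is constant, O(1).
    total = 0
    for v in values:
        total += v * v
    return total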
Estimation/Prediction
When you write or run a program, you need to be able to predict its resource requirements.
Usual requirements
- execution time
- memory space
Quantities to estimate:
- execution time → time complexity
- memory space → space complexity
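One rough way to check such an estimate empirically is to time the program on inputs of increasing size; the Python sketch below (with a deliberately quadratic helper, invented for this illustration) shows the idea.

import time

def count_equal_pairs(values):
    # Two nested loops: roughly n*(n-1)/2 comparisons, so quadratic time.
    n = len(values)
    pairs = 0
    for i in range(n):
        for j in range(i + 1, n):
            if values[i] == values[j]:
                pairs += 1
    return pairs

for n in (1000, 2000, 4000):
    data = [i % 10 for i in range(n)]
    start = time.perf_counter()
    count_equal_pairs(data)
    elapsed = time.perf_counter() - start
    print(f"n={n}: {elapsed:.3f} s")  # expect the time roughly to quadruple as n doubles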
(i) If m ≥ n, then m is replaced by m − n, which is smaller than the previous value of m and still
non-negative.
(ii) If n > m, m and n are exchanged, and at the next iteration case (i) will apply.
So at each iteration, max(m, n) either remains unchanged (for just one iteration) or it decreases.
This cannot go on forever because m and n are integers (this fact is important), and eventually a
lower limit is reached, when m = 0 and n = g, the greatest common divisor.
So the algorithm does terminate.
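To make the argument concrete, here is a minimal Python sketch of this subtraction-based GCD algorithm (assuming positive integer inputs):

def gcd_by_subtraction(m, n):
    # gcd(m, n) is preserved by each step; max(m, n) never increases and strictly
    # decreases at least every other iteration, so the loop must terminate.
    while m != 0:
        if m >= n:
            m = m - n       # case (i): replace m by m - n
        else:
            m, n = n, m     # case (ii): exchange m and n; case (i) applies next
    return n                # loop ends with m = 0 and n = g, the gcd

print(gcd_by_subtraction(48, 18))  # prints 6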
The objective in testing is to "exercise" all paths through the code, in different combinations.
Big-O notation is a convenient way of describing the growth rate of a function and hence the time complexity
of an algorithm.
Let n be the size of the input and f(n), g(n) be positive functions of n. Then f(n) is O(g(n)) if there is a
constant c > 0 such that f(n) ≤ c · g(n) for all sufficiently large n.
The time efficiency of almost all of the algorithms we have discussed can be characterized by
only a few growth rate functions:
O(1) (constant time)
This means that the algorithm requires the same fixed number of steps regardless of the size
of the task.
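For instance, the following Python operations take the same number of steps no matter how large the list is (a small illustrative sketch):

def first_element(items):
    # One bounds check and one index operation, regardless of len(items): O(1).
    return items[0] if items else None

def push(stack, value):
    # Appending to the end of a Python list is an (amortized) single step: O(1).
    stack.append(value)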
Examples (O(n^2), quadratic time):
A. Some simpler sorting algorithms, for instance a selection sort of n elements;
B. Comparing two two-dimensional arrays of size n by n;
C. Finding duplicates in an unsorted list of n elements, implemented with two nested loops (a sketch of this case follows the list).
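A minimal Python sketch of example C, the nested-loop duplicate check (names chosen for this illustration):

def has_duplicates(items):
    # Two nested loops over n elements: about n*(n-1)/2 comparisons, i.e. O(n^2) time.
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):
            if items[i] == items[j]:
                return True
    return False

print(has_duplicates([3, 1, 4, 1, 5]))  # True
print(has_duplicates([2, 7, 1, 8]))     # False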
Examples (O(log n), logarithmic time):
A. Binary search in a sorted list of n elements (a sketch follows the list);
B. Insert and Find operations for a binary search tree with n nodes;
C. Insert and Remove operations for a heap with n nodes.
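A minimal Python sketch of example A, iterative binary search on a sorted list:

def binary_search(sorted_items, target):
    # Each comparison halves the remaining range, so at most about log_2(n) + 1
    # iterations are needed: O(log n) time.
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1  # target not present

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # prints 3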
Examples (O(n log n)):
A. More advanced sorting algorithms, such as quicksort and mergesort (a mergesort sketch follows below).
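As a rough illustration, a compact Python mergesort: the list is halved about log_2(n) times and each level does O(n) merging work, giving O(n log n) overall.

def merge_sort(items):
    # Recursively split the list, then merge the two sorted halves.
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # at most one of these two extends adds elements
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # prints [1, 2, 5, 5, 6, 9]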
The best time in the above list is obviously constant time, and the worst is exponential time which, as
we have seen, quickly overwhelms even the fastest computers for relatively small n.
Polynomial growth (linear, quadratic, cubic, etc.) is considered manageable as compared to
exponential growth.
O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(a^n)
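A small numeric illustration of the gap between polynomial and exponential growth (the numbers are exact counts of hypothetical steps):

for n in (10, 20, 40, 60):
    print(f"n={n}: n^3 = {n**3:,}   2^n = {2**n:,}")
# At n = 60, n^3 is only 216,000 steps, while 2^n already exceeds 10^18 steps.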
By the change-of-base formula, log_a n = log_b n / log_b a. Therefore,
log_a n = C · log_b n, where C = 1 / log_b a is a constant.
Since functions that differ only by a constant factor have the same order of growth, O(log_2 n) is the
same as O(log n).
Therefore, when we talk about logarithmic growth, the base of the logarithm is not important, and we
can say simply O(log n).
If a function (which describes the order of growth of an algorithm) is a sum of several terms, its order
of growth is determined by the fastest-growing term. In particular, if we have a polynomial of degree k,
p(n) = a_k n^k + a_(k-1) n^(k-1) + ... + a_1 n + a_0, then p(n) = O(n^k).
Example: 5n^3 + 2n^2 + 7 is O(n^3), because the n^3 term grows fastest.
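The general bound behind this rule can be written in one line (assuming n ≥ 1 and fixed coefficients a_0, ..., a_k):

\[
|p(n)| = |a_k n^k + a_{k-1} n^{k-1} + \dots + a_1 n + a_0|
\le \bigl(|a_k| + |a_{k-1}| + \dots + |a_1| + |a_0|\bigr)\, n^k
\quad \text{for all } n \ge 1,
\]

so p(n) = O(n^k) with the constant c = |a_k| + ... + |a_0|.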