Analysis of algorithms provides tools to analyze the efficiency of different methods of solution. The efficiency of a computer algorithm can be measured in terms of two qualities: time complexity and space complexity.
The answer often depends on the limitations of the technology available at the time of analysis.
Time complexity is usually considered more important and interesting than space complexity.
Growth Function   Elementary Operations (time requirement)
O(1)              Time requirement is constant, and it is independent of the problem's size.
O(n)              Time requirement for a linear algorithm increases directly with the size of the problem.
O(n^2)            Time requirement for a quadratic algorithm increases rapidly with the size of the problem.
O(n^3)            Time requirement for a cubic algorithm increases more rapidly with the size of the problem than the time requirement for a quadratic algorithm.
O(2^n)            As the size of the problem increases, the time requirement for an exponential algorithm increases too rapidly to be practical.
O(n!)             Time requirement for a factorial algorithm grows even faster than that of an exponential algorithm.
O(log2 n)         Time requirement for a logarithmic algorithm increases slowly as the problem size increases.
O(n*log2 n)       Time requirement for an n*log2 n algorithm increases more rapidly than that of a linear algorithm.
Generally, the time complexity is nothing but the running time of a piece of code, but the code may contain different types of statements. So we find the time complexity of each type of statement and then count the number of times each statement is performed by the code.
1. Sequence of statements :
The total time is found by adding the times for all statements:
Total time = time(statement 1) + time(statement 2) + ... + time(statement k)
If each statement is simple (it involves only basic operations), then the time for each statement is constant, and the total time is also constant: O(1).
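As an illustrative sketch (this function is not from the original text), a fixed sequence of constant-time statements costs O(1) no matter how large the inputs are:

```c
/* Each statement below is a basic operation, so the total time is
   time(stmt 1) + time(stmt 2) + time(stmt 3), a constant: O(1). */
int sum_of_three(int a, int b, int c)
{
    int s = a + b;   /* statement 1: O(1) */
    s = s + c;       /* statement 2: O(1) */
    return s;        /* statement 3: O(1) */
}
```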
2. if-then-else statements:
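The body of this item did not survive in this copy; the usual rule, stated here as a hedged reconstruction, is that the cost is the condition test plus the more expensive of the two branches: time = time(test) + max(time(then-branch), time(else-branch)). A hypothetical example where one branch is O(1) and the other is O(n), giving a worst case of O(n):

```c
#include <stddef.h>

/* The condition test is O(1). The then-branch is O(1) and the
   else-branch is O(n), so the worst-case cost of the whole
   statement is O(1) + max(O(1), O(n)) = O(n). */
int first_or_sum(const int *a, size_t n, int want_first)
{
    if (want_first) {
        return a[0];                     /* then-branch: O(1) */
    } else {
        int s = 0;
        for (size_t i = 0; i < n; i++)   /* else-branch: O(n) */
            s += a[i];
        return s;
    }
}
```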
3. for loops:
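The body of this item is also missing here; the standard rule is that a loop costs the number of iterations times the cost of its body, so a loop that runs n times with an O(1) body is O(n). A minimal sketch (the function name is illustrative):

```c
/* The loop body is O(1) and executes n times, so the total
   time is n * O(1) = O(n). */
long sum_first_n(int n)
{
    long total = 0;
    for (int i = 1; i <= n; i++)  /* runs n times */
        total += i;               /* O(1) body */
    return total;
}
```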
4. Nested loops:
Consider loops where the number of iterations of the inner loop is independent of the value of
the outer loop's index.
The outer loop executes N times. Every time the outer loop executes, the inner loop executes M
times. As a result, the statements in the inner loop execute a total of N*M times. Thus, the
complexity is O(N*M). In a common special case, the stopping condition of the inner
loop is j < N instead of j < M; that is, the inner loop also executes N times. Therefore, the total
complexity for the two loops is O(N^2).
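The counting argument above can be sketched as follows (an illustrative function, not from the original text):

```c
/* The outer loop runs n times; on each outer iteration the inner
   loop runs m times, so the innermost statement executes exactly
   n * m times: O(n*m). When m == n, this becomes O(n^2). */
long count_inner_executions(int n, int m)
{
    long count = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < m; j++)
            count++;   /* executed n * m times in total */
    return count;
}
```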
Example 1: If the progression of the loop is logarithmic, such as the following:
count = 1;
while (count < n)
{
    count *= 2;
    /* Some sequence of O(1) steps */
}
This loop's running time is O(log n). Note that when we use a logarithm in an algorithm's complexity, we
almost always mean log base 2; this can be explicitly written as O(log2 n). Since each time through the
loop the value of count is multiplied by 2, the number of times the loop is executed is O(log2 n).
Example 2: Consider the following code and find the time complexity.
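The code for this example was not preserved in this copy. A hypothetical stand-in consistent with the stated O(n^2) conclusion is a snippet with some constant work, a single loop, and a nested pair of loops:

```c
/* Hypothetical snippet: counting statement executions gives
   2 + n + n*n operations. Eliminating the constants and all but
   the dominant term leaves O(n^2). */
long example2_operation_count(int n)
{
    long ops = 0;
    ops++;                          /* O(1) statement */
    ops++;                          /* O(1) statement */
    for (int i = 0; i < n; i++)     /* O(n) loop */
        ops++;
    for (int i = 0; i < n; i++)     /* O(n^2): the dominant term */
        for (int j = 0; j < n; j++)
            ops++;
    return ops;
}
```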
Now, eliminate the constants and all but the dominant term; the time complexity is
O(n^2).
Asymptotic Bounds:
Computational Complexity:
a) Best case: The best case complexity of the algorithm is the function defined by the minimum
number of steps taken on any instance of size n.
b) Average case: The average-case complexity of the algorithm is the function defined by the
average number of steps taken over all instances of size n.
c) Worst case: The worst case complexity of the algorithm is the function defined by the maximum
number of steps taken on any instance of size n.
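Linear search is a standard illustration of these three cases (an example added here, not from the original text): the best case is O(1) when the key is the first element, the worst case is O(n) when the key is last or absent, and the average case for a present key is about n/2 comparisons, which is still O(n).

```c
/* Linear search:
   best case  O(1) - key is at index 0,
   worst case O(n) - key is last or not present,
   average    O(n) - roughly n/2 comparisons for a present key. */
int linear_search(const int *a, int n, int key)
{
    for (int i = 0; i < n; i++)
        if (a[i] == key)
            return i;   /* found after i+1 comparisons */
    return -1;          /* not found: n comparisons made */
}
```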