
Analyzing the Algorithm

Chapter 1

Introduction
An algorithm is an effective method for solving a problem, expressed as a finite sequence of instructions. In computer science the word algorithm has great significance. An algorithm is developed first, before the program is written, and the efficiency of a program is improved by using a good algorithm.

Algorithm
An algorithm is a finite set of instructions that, if followed, accomplishes a particular task in a finite amount of time. Every algorithm must satisfy the following criteria: 1) Input 2) Output 3) Definiteness 4) Finiteness 5) Effectiveness

Characteristics
1) Input: Zero or more quantities are externally supplied.
2) Output: At least one quantity is produced.
3) Definiteness: Each instruction is clear and unambiguous.
4) Finiteness: The algorithm terminates in a finite number of steps.
5) Effectiveness: Every step in the algorithm is basic enough to be understood and implemented in any programming language.

Analysis Of algorithm
Analysis is the process of breaking a complex topic into smaller parts to gain a better understanding of it. To analyze an algorithm is to determine the amount of resources (such as time and storage) necessary to execute it. Analyzing an algorithm means predicting how fast it runs as a function of the problem size.

Algorithms Performance
An algorithm's performance depends on:
1) Internal factors: a) time required to run b) space required
2) External factors: a) speed of the computer on which it is run b) quality of the compiler

Performance of Program
To develop a program, we need two major components: an algorithm and a data structure.

The choice of good data structures and algorithm design methods impacts the performance of programs.

The performance of a program can be analyzed through its complexity, i.e. the amount of computer memory and time needed to run it: 1) Space complexity 2) Time complexity

Complexity
Computational complexity is a characterization of the time or space requirements for solving a problem by a particular algorithm. The analysis of a program requires two main considerations: 1) Time complexity 2) Space complexity

Time Complexity
The time complexity of a program/algorithm is the amount of computer time that it needs to run to completion; in other words, the total amount of time taken by a program for execution. For measuring the time complexity of an algorithm, we concentrate on developing the frequency count for all key statements (i.e. statements that are important and form the basic instructions of the algorithm). The complexity of a program is represented using Big-O notation.

Time Complexity contd..


Frequency count is defined as the total number of times a statement is executed. Execution time is therefore proportional to the frequency count of the active operations.
Eg. 1) Algorithm A contains the statement a = a + 1. The statement is independent and executes exactly once, so the frequency count of algorithm A is 1.
Eg. 2) for(j = 1; j <= n; j++) b = b * c; The statement b = b * c executes n times, so the frequency count is n.
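The two examples above can be instrumented directly. This is an illustrative sketch (the function names and counter variable are my own, not from the slides): each function returns the frequency count of its key statement.

```c
#include <stddef.h>

/* Example 1: a single independent statement executes exactly once,
   so its frequency count is 1. */
size_t frequency_single(void) {
    size_t count = 0;
    int a = 0;
    a = a + 1; count++;   /* the key statement a = a + 1 */
    (void)a;
    return count;
}

/* Example 2: the key statement b = b * c sits inside a loop that
   runs n times, so its frequency count is n. */
size_t frequency_loop(size_t n) {
    size_t count = 0;
    int b = 1, c = 2;
    for (size_t j = 1; j <= n; j++) {
        b = b * c; count++;   /* executed once per iteration */
    }
    (void)b;
    return count;
}
```

Counting executions of the key statement, rather than timing wall-clock seconds, is exactly what makes the analysis independent of the external factors (machine speed, compiler) listed earlier.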

Time Complexity contd..


Amount of computer time required by a program to execute:

Linear complexity: O(n)
Quadratic complexity: O(n^2)
Cubic complexity: O(n^3)
Logarithmic complexity: O(log n)
Exponential complexity: O(2^n)
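To make the ordering of these classes concrete, a small sketch (function names are illustrative) that computes a rough step count for each class at a given input size n:

```c
/* Rough step counts for each complexity class at input size n. */
unsigned long steps_logarithmic(unsigned long n) {
    unsigned long s = 0;
    while (n > 1) { n /= 2; s++; }   /* number of halvings: log2(n) */
    return s;
}
unsigned long steps_linear(unsigned long n)      { return n; }
unsigned long steps_quadratic(unsigned long n)   { return n * n; }
unsigned long steps_cubic(unsigned long n)       { return n * n * n; }
unsigned long steps_exponential(unsigned long n) { return 1ul << n; }  /* 2^n */
```

At n = 16 the counts are 4, 16, 256, 4096, and 65536: even for small inputs, the gap between logarithmic and exponential growth is enormous.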

Space Complexity
The space complexity of an algorithm or program is the amount of memory that it needs to run to completion. The space needed by the algorithm is the sum of the following components: 1) Fixed part 2) Variable space requirement

Space Complexity contd


Fixed Part: The fixed part is not dependent on the characteristics of the inputs and outputs. It consists of the space for fixed-size structured variables and constants.

Variable Space Requirement: This includes the space needed by variables whose size depends on the particular problem being solved, as well as the stack space required for recursion.
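The fixed/variable distinction can be seen in a small sketch (function names and the depth counter are my own): a recursive sum needs one stack frame per element (variable space), while an iterative sum uses the same few variables regardless of n (fixed space).

```c
#include <stddef.h>

/* Records how deep the recursion goes, as a stand-in for the
   stack space consumed: it grows linearly with n. */
static size_t max_depth;

static long sum_recursive(const long *a, size_t n, size_t depth) {
    if (depth > max_depth) max_depth = depth;
    if (n == 0) return 0;
    return a[0] + sum_recursive(a + 1, n - 1, depth + 1);
}

size_t recursion_depth_for(const long *a, size_t n) {
    max_depth = 0;
    (void)sum_recursive(a, n, 1);
    return max_depth;   /* n + 1 frames: variable space requirement */
}

/* The iterative version uses the same two variables for any n:
   its space requirement is fixed. */
long sum_iterative(const long *a, size_t n) {
    long s = 0;
    for (size_t i = 0; i < n; i++) s += a[i];
    return s;
}
```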

To Calculate the Time Complexity


for(int i = 0; i < n; i++) {
    DoProcessing();
    for(int j = i; j < n; j++) {
        EnterStuffinDBMS();
        for(int p = 1; p < n; p++) {
            QuerySomeResults();
        }
        for(int l = 0; l < m; l++) { }
    }
    for(int k = 2; k < m; k++) { }
}


To Calculate the Time Complexity (continued)


O(n * (n * (n + m) + m))
= O(n * (n^2 + nm + m))
= O(n^3 + n^2m + nm)
= O(n^3), assuming m <= n so that the n^3 term dominates.
This is the time complexity of the program.

Space Complexity V/s Time Complexity


Software that executes in less time often requires more space, and vice versa. This is known as the trade-off between space and time complexity.
(Figure: trade-off curve with axes S (space) and T (time).)
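A classic instance of this trade-off is a lookup table. The sketch below (function and table names are my own) counts set bits in a byte two ways: bit by bit using no extra memory, or via a precomputed 256-entry table that spends 256 bytes of space to answer each query in a single lookup.

```c
#include <stdint.h>
#include <stddef.h>

/* Slower, no extra space: examine all 8 bits. */
unsigned popcount_loop(uint8_t x) {
    unsigned c = 0;
    for (int i = 0; i < 8; i++)
        c += (x >> i) & 1u;
    return c;
}

/* Faster, more space: the table is built once, then each
   query is a single array access. */
static uint8_t popcount_table_data[256];

void build_popcount_table(void) {
    for (size_t v = 0; v < 256; v++)
        popcount_table_data[v] = (uint8_t)popcount_loop((uint8_t)v);
}

unsigned popcount_table(uint8_t x) {
    return popcount_table_data[x];
}
```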

Best Case ,Worst Case & Average Case


In some cases, it is important to consider the best, worst, and/or average (typical) performance of an algorithm. Different types of time complexity can be analyzed for an algorithm: 1) Best case time complexity 2) Average case time complexity 3) Worst case time complexity

Best Case Time Complexity


It is a measure of the minimum time that the algorithm will require for an input of size n. For example, if the input of n items is already in sorted order, a sorting algorithm performs the fewest operations.

Worst Case Time Complexity


It is a measure of the maximum time that the algorithm will require for an input of size n. For example, if n input items are supplied in reverse order to a simple sorting algorithm such as insertion sort, the algorithm requires on the order of n^2 operations, which corresponds to its worst case time complexity.

Average Case Time Complexity


The average time that an algorithm requires to execute on typical data of size n is known as its average case time complexity. It is obtained by averaging the running time of the algorithm over all possible inputs of size n.
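The best and worst cases can be made concrete by instrumenting a simple sort. This sketch (the function name is my own) counts the key comparisons made by insertion sort: already sorted input needs n - 1 comparisons (best case), reverse sorted input needs n(n - 1)/2 (worst case).

```c
#include <stddef.h>

/* Insertion sort, instrumented to return the number of key
   comparisons performed while sorting a[0..n-1] in place. */
size_t insertion_sort_comparisons(int *a, size_t n) {
    size_t comparisons = 0;
    for (size_t i = 1; i < n; i++) {
        int key = a[i];
        size_t j = i;
        while (j > 0) {
            comparisons++;              /* compare key with a[j-1] */
            if (a[j - 1] <= key) break; /* sorted input stops here */
            a[j] = a[j - 1];            /* shift and keep scanning */
            j--;
        }
        a[j] = key;
    }
    return comparisons;
}
```

For n = 5, sorted input costs 4 comparisons while reverse input costs 10, matching the n - 1 and n(n - 1)/2 formulas.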

Example
For example, consider the deterministic sorting algorithm quicksort, which sorts a list of integers given as input. In the best case, each pivot splits the list into two nearly equal halves and the algorithm takes O(n log n) time. In the worst case, every pivot is the smallest or largest remaining element (for instance, already sorted or reverse sorted input with a first-element pivot) and the algorithm takes O(n^2) time. If we assume that all permutations of the input list are equally likely, the average time taken for sorting is O(n log n).

Asymptotic Notation
The running time of an algorithm is expressed as a function of the input size n (for large n), using only the highest-order term in the expression for the exact running time. For example, for f(n) = 1 + n + n^2, the order of the polynomial is the degree of its highest term, so O(f(n)) = O(n^2).
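Why only the highest-order term matters can be seen numerically. A minimal sketch (the function name is my own): for f(n) = 1 + n + n^2, the ratio f(n) / n^2 approaches 1 as n grows, so the lower-order terms become negligible.

```c
/* For f(n) = 1 + n + n^2, the ratio to the leading term n^2
   tends to 1 as n grows, justifying f(n) = O(n^2). */
double ratio_to_leading_term(double n) {
    double f = 1.0 + n + n * n;
    return f / (n * n);
}
```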

Asymptotic Notation
1) Big-Oh 2) Omega 3) Theta

Big-oh Notation
The time complexity of an algorithm can be expressed in terms of the order of magnitude of its frequency count using Big-Oh notation. When we have only an asymptotic upper bound, we use O-notation: O(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 <= f(n) <= c*g(n) for all n >= n0 }
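The definition can be checked mechanically for a concrete example. This is a sketch with hypothetical functions of my own choosing: f(n) = 3n + 2 and g(n) = n, with witness constants c = 4 and n0 = 2, so that 0 <= f(n) <= c*g(n) for all n >= n0 and hence f(n) = O(n).

```c
#include <stdbool.h>

/* Hypothetical example functions, not from the slides. */
long f_example(long n) { return 3 * n + 2; }
long g_example(long n) { return n; }

/* Check the Big-Oh condition 0 <= f(n) <= c*g(n) for every
   n in [n0, n_max] (a finite spot-check of the definition). */
bool witnesses_big_oh(long c, long n0, long n_max) {
    for (long n = n0; n <= n_max; n++)
        if (!(0 <= f_example(n) && f_example(n) <= c * g_example(n)))
            return false;
    return true;
}
```

Note that c = 3 fails (3n + 2 > 3n for every n), which shows why the definition only demands that some constants c and n0 exist.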


Theta Notation (Θ)
The theta notation asymptotically bounds a function from above and below. A function f(n) belongs to the set Θ(g(n)) if there exist positive constants c1 and c2 such that it can be sandwiched between c1*g(n) and c2*g(n) for sufficiently large n. Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n >= n0 }


Omega Notation
Omega notation provides an asymptotic lower bound. For a function g(n), we define Ω(g(n)), Omega of g of n, as the set: Ω(g(n)) = { f(n) : there exist positive constants c and n0 such that 0 <= c*g(n) <= f(n) for all n >= n0 }


Complexity Of Algorithms
Measure the time complexity (in terms of steps) as a function of the input size, e.g. f(n) = 12n^3 + 1, which is O(n^3).
