Pass 1
1. Compare the 1st element with the 2nd element. If the 1st element > 2nd element, exchange them.
2. Compare the 2nd element with the 3rd element. If the 2nd element > 3rd element, exchange them.
3. Compare the 3rd element with the 4th element. If the 3rd element > 4th element, exchange them.
…
n-1. Compare the (n-1)th element with the nth element. If the (n-1)th element > nth element, exchange them.
In the first pass, n-1 comparisons are made, as a result of which the largest element bubbles out to the end of the list.
Pass 2
Repeat Pass 1 with one less comparison; that is, n-2 comparisons are carried out in this pass. The next largest element bubbles out to its place in this step.
Pass 3
Repeat Pass 1 with two fewer comparisons; that is, n-3 comparisons are made in this pass.
Pass n-1
In the last pass, only one comparison is made: the 1st element is compared with the 2nd element.
Bubble-Sort Algorithm (Array: A, Size: n)
Step 1. Input list A
Step 2. Repeat Steps 3 & 4 for I = 1 to n-1
Step 3. Set J = 1
Step 4. Repeat while J <= n-I
    i. If A[J] > A[J+1] then
           Exchange A[J] and A[J+1]
       ( End if )
    ii. Set J = J + 1
    ( End of inner loop )
( End of outer loop )
Step 5. Exit
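The pseudocode above can be sketched in Python as follows (the function name bubble_sort is illustrative):

```python
def bubble_sort(a):
    """Sort the list a in ascending order, in place, using bubble sort."""
    n = len(a)
    for i in range(1, n):           # passes 1 .. n-1 (outer loop, I)
        for j in range(0, n - i):   # inner loop makes n-i comparisons (J)
            if a[j] > a[j + 1]:
                # exchange the adjacent pair that is out of order
                a[j], a[j + 1] = a[j + 1], a[j]
    return a
```

Each pass of the outer loop bubbles the next-largest remaining element to the end, exactly as described in the passes above.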
Time Complexity of Bubble Sort Algorithm
The running time/cost of the Bubble Sort algorithm depends upon the number of iterations
carried out in arranging elements in the array. Assume each iteration cost = c = O(1)
Pass      Number of iterations
1         n - 1
2         n - 2
3         n - 3
…         …
n-2       2
n-1       1
Cost = Total number of iterations
     = (n-1) + (n-2) + (n-3) + … + 2 + 1
     = n(n-1)/2
     = (n² - n)/2
     = n²/2 - n/2
Total cost = cost per iteration × total number of iterations
           = O(1) × [n²/2 - n/2]
           = n²/2 - n/2
f(n) = O(n²)
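The count n(n-1)/2 can be checked concretely with a small instrumented sketch (bubble_sort_count is a hypothetical helper that returns the number of comparisons made):

```python
def bubble_sort_count(a):
    """Sort a in place with bubble sort and return how many comparisons were made."""
    n = len(a)
    comparisons = 0
    for i in range(1, n):
        for j in range(0, n - i):
            comparisons += 1            # one comparison per inner-loop step
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return comparisons
```

Because this version has no early-exit check, the count is n(n-1)/2 regardless of the input order.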
Best Case: The best case occurs if the input list is already in sorted order. In this case, the comparisons in the inner loop of the bubble sort algorithm are still made, but the condition for exchanging the items remains false each time. So the running time/cost of the algorithm is reduced only by a constant that we neglect; hence the cost in the best case = O(n²).
Worst Case: The bubble sort algorithm runs in its worst case if the input list is in reverse-sorted order. In this case the condition for exchanging elements in the inner loop remains true each time. So the running time/cost = O(n²).
Average Case: The bubble sort algorithm runs in its average case if the input list is in unsorted (random) order. In this case, the condition inside the inner loop is true for some iterations and false for others.
For a false condition, no exchange cost is incurred, so the cost is slightly reduced compared to the worst case. The running time/cost in the average case is still O(n²).
Quick Sort Algorithm/Partition-Exchange Sort: Quick sort, or partition-exchange sort, is a sorting algorithm developed by Tony Hoare that, on average, makes O(n log n) comparisons to sort n items. Quick sort is one of the best sorting algorithms because it is remarkably efficient on average, with an average cost of O(n log n). It also has the advantage of sorting in place, and it works well even in a virtual-memory environment.
Strategy: Quicksort is a divide-and-conquer sorting algorithm in which the division of the list is carried out dynamically. The three steps of the Quicksort algorithm are as follows:
Divide: In this step, we take an element as the pivot and logically divide the array into two sub-lists such that each element in the left sub-array is less than the pivot and each element in the right sub-array is greater than the pivot.
Conquer: Repeat the process recursively for the left and right sub-arrays.
Combine: Since the sub-arrays are sorted in place, no cost is needed to combine them.
Steps
1. Select an element A[q] as a pivot from the array A[p…r]. The leftmost/rightmost
element in the array is usually used as pivot.
2. Scan the array A[r…p] from the right to left and flush to the right all the keys that are
≥ pivot.
3. Scan the array A[p…r] from the left to the right and flush to the left all the keys that
are ≤ pivot.
4. Arrange the elements around the pivot so that the array splits into two sub-arrays A[p…q-1] and A[q+1…r] such that
   A[p…q-1] ≤ pivot ≤ A[q+1…r]
5. Recursively repeat the steps for each of the sub-lists A[p…q-1] and A[q+1…r].
Algorithm: Quicksort (Array: A, p, r)
1. if p ≥ r then Return
2. q = Partition (A, p, r)
3. Quicksort (A, p, q-1)
4. Quicksort (A, q + 1, r)
Partition(A, p, r)
1. Piv = A[p]
2. j = p
3. for i = p+1 to r do
       if A[i] ≤ Piv then
           j = j + 1
           Exchange A[i] and A[j]
       ( End if )
   ( End of loop )
4. Exchange A[p] and A[j]    ( place the pivot in its final position )
5. Return j
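The Quicksort/Partition strategy can be sketched in Python; note that j is advanced before each exchange so the pivot A[p] stays in place until the final swap:

```python
def partition(a, p, r):
    """Partition a[p..r] around the pivot a[p]; return the pivot's final index."""
    piv = a[p]
    j = p                           # a[p+1..j] holds elements <= pivot
    for i in range(p + 1, r + 1):
        if a[i] <= piv:
            j += 1
            a[i], a[j] = a[j], a[i]
    a[p], a[j] = a[j], a[p]         # put the pivot between the two sub-lists
    return j

def quicksort(a, p, r):
    """Sort a[p..r] in place."""
    if p >= r:
        return
    q = partition(a, p, r)
    quicksort(a, p, q - 1)
    quicksort(a, q + 1, r)
```

A call such as quicksort(xs, 0, len(xs) - 1) sorts the whole list.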
Analysis of Quick Sort Algorithm
Worst Case
The quick sort algorithm runs in its worst case if the original list is already sorted or reverse sorted. In this case, the partitioning routine produces one sub-list of n-1 elements; the next call to the partitioning routine produces a sub-list of n-2 elements, and so on.
[Recursion tree for the worst case: each level splits a list into sub-lists of sizes 1 and n-1, so the partitioning costs per level are n, n-1, n-2, …, 1.]
The recurrence in this case is T(n) = T(n-1) + cn, which sums to c[n + (n-1) + … + 1] = O(n²). Hence in the worst case, the quick sort algorithm runs in O(n²).
Best Case
The best case occurs when each partitioning step produces two balanced sub-lists of about n/2 elements, giving the recurrence T(n) = 2T(n/2) + O(n). Here a = 2, b = 2, and f(n) = O(n), so c = 1 and k = 0.
Now log_b a = log_2 2 = 1
Here c = log_b a, so the 2nd case of the Master theorem applies, which states that for some constant k ≥ 0, if f(n) = O(n^c log^k n) where c = log_b a, then
T(n) = O(n^c log^(k+1) n)
Or T(n) = O(n^1 log^(0+1) n)
        = O(n log n)
Hence in the best case, the quick sort algorithm runs in O(n log n).
Average Case
The quick sort algorithm runs in its average case when it works on a random input array. In this case, we expect that some of the splits will be reasonably well balanced and some will be fairly unbalanced.
If the recurrence in this case is represented by a tree, the good and bad splits are distributed randomly throughout the tree, with good and bad splits occurring at alternate levels.
[Two adjacent levels of the tree: a bad split of n into sub-lists of sizes 1 and n-1, followed by a good split of n-1 into (n-1)/2 and (n-1)/2.]
Their combined cost = 1 + (n-1) + (n-1)/2
                    = n + (n-1)/2
                    = (2n + n - 1)/2
                    = (3n - 1)/2
                    = O(n)
So a bad split followed by a good split costs O(n), the same order as a single good split, and the average-case running time remains O(n log n).
Implementation issues
Choice of pivot: In very early versions of quicksort, the leftmost/rightmost element of the
partition would often be chosen as the pivot element. Unfortunately, this causes worst-case
behavior on already sorted arrays, which is a rather common use-case.
Solution: The problem can be solved by choosing either a random index for the pivot or
choosing the middle index of the partition (especially for longer partitions) or choosing
the median of the first, middle and last element of the partition for the pivot.
This "median of three" rule counters the case of sorted (or reverse-sorted) input, and gives a
better estimate of the optimal pivot (the true median) than selecting any single element,
when no information about the ordering of the input is known.
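The median-of-three rule might be sketched like this (median_of_three is an illustrative name; it returns the index of the median of the first, middle and last elements):

```python
def median_of_three(a, lo, hi):
    """Return the index of the median of a[lo], a[mid], a[hi] for use as a pivot."""
    mid = (lo + hi) // 2
    # Pair each candidate value with its index, sort by value, take the middle one.
    candidates = sorted([(a[lo], lo), (a[mid], mid), (a[hi], hi)])
    return candidates[1][1]
```

On an already-sorted array this rule picks the middle index, which yields balanced splits instead of the worst-case behavior of a fixed leftmost/rightmost pivot.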
Many repeated elements: With a two-way partitioning algorithm (even one that chooses good pivot values), quicksort exhibits poor performance for inputs that contain many repeated elements.
Solution: To solve this quicksort problem, an alternative linear-time partition routine can be
used that separates the values into three groups:
Values less than the pivot, values equal to the pivot, and values greater than the pivot
(Bentley and McIlroy call this a "fat partition").
The values equal to the pivot are already sorted, so only the less-than and greater-than partitions need to be recursively sorted, as a result of which the algorithm runs in its average case even with many duplicates.
The best case for the algorithm now occurs when all elements are equal. In the case of all equal elements, the modified quicksort performs at most two recursive calls on empty sub-arrays and thus finishes in linear time.
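The fat-partition idea can be sketched as a three-way quicksort using the classic Dutch national flag scan (quicksort_3way is an illustrative name):

```python
def quicksort_3way(a, lo=0, hi=None):
    """Quicksort with a fat (three-way) partition: < pivot, == pivot, > pivot."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return
    pivot = a[lo]
    lt, i, gt = lo, lo + 1, hi      # a[lo..lt-1] < pivot, a[gt+1..hi] > pivot
    while i <= gt:
        if a[i] < pivot:
            a[lt], a[i] = a[i], a[lt]
            lt += 1
            i += 1
        elif a[i] > pivot:
            a[i], a[gt] = a[gt], a[i]
            gt -= 1                 # a[i] is unexamined, so do not advance i
        else:
            i += 1                  # equal to pivot: leave it in the middle band
    quicksort_3way(a, lo, lt - 1)   # only the < and > groups recurse
    quicksort_3way(a, gt + 1, hi)
```

With all elements equal, the middle band covers the whole range and both recursive calls are on empty sub-arrays, giving the linear-time best case described above.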
Prim’s Algorithm
Prim's algorithm is a greedy algorithm: it builds a minimum spanning tree by repeatedly adding the cheapest edge that connects a vertex already in the tree to a vertex outside it.
Time complexity of Prim’s Algorithm
The time complexity of Prim's algorithm depends on the data structures used for the graph and for ordering the edges by weight, which can be done using a priority queue. The typical choices are:

Priority queue / graph representation        Time complexity
Adjacency matrix with linear search          O(V²)
Binary heap with adjacency list              O((V + E) log V) = O(E log V)
Fibonacci heap with adjacency list           O(E + V log V)
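As a sketch, Prim's algorithm with a binary-heap priority queue can be written using Python's heapq module; the adjacency-list format {u: [(weight, v), ...]} is an assumption for illustration:

```python
import heapq

def prim_mst_weight(adj, start=0):
    """Total weight of a minimum spanning tree of a connected undirected graph.

    adj maps each vertex u to a list of (weight, v) pairs for its incident edges.
    """
    visited = set()
    heap = [(0, start)]             # (edge weight, vertex to reach)
    total = 0
    while heap and len(visited) < len(adj):
        w, u = heapq.heappop(heap)  # cheapest edge crossing the cut
        if u in visited:
            continue                # stale entry: u was reached more cheaply
        visited.add(u)
        total += w
        for wt, v in adj[u]:
            if v not in visited:
                heapq.heappush(heap, (wt, v))
    return total
```

Each vertex is popped once and each edge pushed at most twice, matching the O(E log V) binary-heap bound.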
A graph is a non-linear data structure made up of a set of nodes and lines, where the nodes are called vertices or points and the lines are called edges or arcs.
Each edge e is identified by a unique unordered pair [u, v], i.e. e = [u, v], where u and v denote the end nodes of edge e. u and v are also called adjacent nodes or neighbors.
The degree of a node u, deg(u), is the number of edges containing u.
If deg(u) = 0, i.e. u does not belong to any edge, then u is called an isolated node.
Types of Graph
Undirected Graph: A graph whose edges have no direction is called an undirected graph. This kind of graph is also called an undigraph.
In a directed graph G, the outdegree of a node u, outdeg(u), is the number of edges beginning at u. Similarly, the indegree of u, indeg(u), is the number of edges ending at u.
A node u is called a source if it has a positive outdegree but zero indegree. Similarly, a node u is called a sink if it has zero outdegree but a positive indegree.
Complete Graph: A connected graph G is said to be complete if every node u in G is
adjacent to every other node v in G. A complete graph with n nodes will have n(n-1)/2
edges.
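The edge count n(n-1)/2 can be checked by enumerating the unordered pairs of nodes (complete_graph_edges is an illustrative helper):

```python
from itertools import combinations

def complete_graph_edges(n):
    """Edge list of the complete graph on nodes 0..n-1: one edge per unordered pair."""
    return list(combinations(range(n), 2))
```

For example, the complete graph on 5 nodes has 5*4/2 = 10 edges.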
Tree Graph: A connected graph T without any cycles is called a Tree graph or Free tree or
Tree. There is a unique simple path between any two nodes u and v in tree graph. If T is a
finite tree having m nodes then it will have m-1 edges.
Weighted Graph: A graph G is said to be weighted if each edge e is assigned a (positive numeric) weight. Each edge in a weighted graph is represented by e = [u, v, w].
The weight of a path in a graph is the sum of the weights of the edges along the path.
Regular Graph: A graph is said to be regular if each node of the graph has an equal number of indegree and outdegree.
Isomorphic Graphs: Two graphs are said to be isomorphic if they have the same behavior in terms of graph properties. The conditions for isomorphism are:
The number of nodes in the two graphs must be the same. There must be the same number of edges in the two graphs. All corresponding nodes of the two graphs must have the same indegree and outdegree.
Representation of Graph in memory
Adjacency-Matrix Representation
A directed graph G having n nodes is represented by an n x n matrix A such that
aij = 1 if there is an edge from vi to vj
aij = 0 otherwise
Suppose we have a directed graph with four vertices.
Undirected Graph
An undirected graph G having n nodes is represented by an n x n adjacency matrix A such that
aij = 1 if there is an edge between vi and vj
aij = 0 otherwise
The adjacency matrix of an undirected graph G is a symmetric matrix, i.e. aij = aji for every i and j.
Suppose we have an undirected graph with four vertices.
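As a sketch, an adjacency matrix can be built from an edge list as follows (the function name adjacency_matrix and the edge-list format are illustrative):

```python
def adjacency_matrix(n, edges, directed=False):
    """Build the n x n adjacency matrix of a graph from a list of (u, v) edges."""
    A = [[0] * n for _ in range(n)]
    for u, v in edges:
        A[u][v] = 1
        if not directed:
            A[v][u] = 1   # undirected: the matrix is symmetric
    return A
```

For a four-vertex cycle 0-1-2-3-0, for instance, the resulting matrix is symmetric with exactly two 1s in each row.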