
Faculty of Engineering and Computing
M70CDE Module (Advanced Programming Techniques)
Sorting Efficiency

2011
Coursework 2

By: Saud Aljaloud

2011 M70_CW2_ By: Saud Aljaloud


CONTENTS

Introduction
1.1. What are the three sorting algorithms?
  1.1.1. Snail-Sort
  1.1.2. Selection-Sort
  1.1.3. Quick-Sort
1.2. Comparing the three sorting algorithms
  1.2.1. Snail-Sort
  1.2.2. Selection-Sort
  1.2.3. Quick-Sort
  1.2.4. The efficiency of the three sorting algorithms

2.2. The evaluation of listInSort
2.3. Big-O notation

Figures:
-Figure1: Shows how Snail-Sort works.
-Figure2: Shows how Selection-Sort works.
-Figure3: Shows how Quick-Sort works.
-Figure4: The six tests of the snail sort.
-Figure5: Growth rate of the snail sort algorithm.
-Figure6: The six tests of the selection sort.
-Figure7: Growth rate of the selection sort algorithm.
-Figure8: The six tests of the quick sort.
-Figure9: Growth rate of the quick sort algorithm.
-Figure10: Table shows the number of comparisons.
-Figure11: Line graph for the three sorting algorithms showing the growth-rate.
-Figure12: Table shows the number of comparisons for listInSort.
-Figure13: Line graph for listInSort showing the growth-rate.
-Figure14: Table shows the algorithms in Big-O notation.
-Figure15: The second table shows the algorithms after applying the calculations.
-Figure16: The graph shows the Big-O notation growth-rate.

Introduction

This coursework evaluates the efficiency of three sorting algorithms: Snail-Sort, Selection-Sort and Quick-Sort. To evaluate them, the number of comparisons each algorithm performs is counted: six test runs are carried out for each list size (10, 20, 40, 80, 160, 320) and the average number of comparisons is taken. MS Excel is then used to produce graphs showing the differences between the three sorting algorithms in terms of efficiency, as well as to state the Big-O formula for each algorithm and check whether it is confirmed by the test results.

1.1. What are the three sorting algorithms?

1.1.1. Snail-Sort

It is a simple sorting algorithm and is similar to Bubble Sort. The average case and worst case for this algorithm are O(n²). It works by comparing the first element with the second; if the first is greater than the second, they are swapped, otherwise they are left where they are. Next, the first element is compared with each of the remaining elements, swapping whenever a smaller one is found. This is repeated for the second element, then the third, and so on until the last one.


Figure1: Shows how Snail-Sort works.
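The module's snail_sort source is not reproduced here, and the counts in Figure4 vary from test to test, so the real implementation evidently does more work than the plain procedure described above. As a rough Python sketch of the described procedure only (the function name and counting convention are assumptions):

```python
def snail_sort(data):
    """Sort by comparing each position with every later element,
    swapping whenever they are out of order; counts comparisons."""
    a = list(data)                    # work on a copy
    comparisons = 0
    for i in range(len(a)):
        for j in range(i + 1, len(a)):
            comparisons += 1
            if a[i] > a[j]:
                a[i], a[j] = a[j], a[i]
    return a, comparisons
```

Note that this sketch always performs exactly n(n-1)/2 comparisons (45 for n = 10), whereas Figure4 averages around 102 for n = 10; the coursework's snail_sort therefore appears to repeat work depending on the input order, for example by restarting after a swap.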

1.1.2. Selection-Sort

It is an in-place comparison sort and has the same O(n²) complexity in the worst case, average case and best case. It works by finding the minimum element and swapping it into the first position, then repeating the same step from the next position for the remaining elements in the list. As a result, the running time for selection sort is O(n + (n-1) + (n-2) + ... + 1) = O(n²).
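A minimal Python sketch of the selection-sort procedure described above (the comparison-counting convention is an assumption, but it reproduces the n(n-1)/2 counts seen in Figure6):

```python
def selection_sort(data):
    """Find the minimum of the unsorted tail and swap it into place.
    The comparison count is always n*(n-1)/2, whatever the input order."""
    a = list(data)
    comparisons = 0
    for i in range(len(a) - 1):
        min_idx = i
        for j in range(i + 1, len(a)):
            comparisons += 1          # one comparison per pair examined
            if a[j] < a[min_idx]:
                min_idx = j
        a[i], a[min_idx] = a[min_idx], a[i]
    return a, comparisons
```

For n = 10 this gives 10 × 9 / 2 = 45 comparisons, matching every test column of Figure6.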

Figure2: Shows how Selection-Sort works.

1.1.3. Quick-Sort

It is a divide-and-conquer algorithm whose main idea is partitioning. The list is divided into three parts: the first part is a single element in its sorted position, called the pivot; the second part is a sub-list whose elements are smaller than the pivot; the third part is a sub-list whose elements are larger than the pivot. Applying this idea recursively to the smaller and larger sub-lists sorts the whole list. In the average case, quick-sort requires O(n log n), whereas in the worst case it requires O(n²). It can be implemented as an in-place sort requiring only O(log n) additional space.

Figure3: Shows how Quick-Sort works.
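The partitioning idea can be sketched in a few lines of Python. The coursework's pivot choice is not stated (the first element is assumed here), so this sketch will not reproduce the exact counts in Figure8, only the O(n log n) trend:

```python
def quick_sort(data):
    """Partition around a pivot and recurse on the two sub-lists,
    counting element-to-pivot comparisons."""
    comparisons = 0

    def sort(a):
        nonlocal comparisons
        if len(a) <= 1:
            return a
        pivot, rest = a[0], a[1:]     # first-element pivot is an assumption
        comparisons += len(rest)      # each element is compared to the pivot
        smaller = [x for x in rest if x < pivot]
        larger = [x for x in rest if x >= pivot]
        return sort(smaller) + [pivot] + sort(larger)

    return sort(list(data)), comparisons
```

This version builds new sub-lists for clarity; an in-place partition (as mentioned above) would swap elements within the original list instead.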

1.2. Comparing the three sorting algorithms

The coursework requires testing these sorting algorithms by running six test runs on different amounts of data. The results of these tests follow.

1.2.1. Snail-Sort:

After running six tests and averaging the number of comparisons, it is noticeable that snail-sort is the worst of the three algorithms on large data: it requires O(n²) comparisons. Between list sizes 10 and 40 the counts remain comparatively small, but as the amount of data grows from 80 to 320 the number of comparisons increases dramatically. The number of comparisons therefore depends strongly on the amount of data.

N      Test1     Test2     Test3     Test4     Test5     Test6    Average
10        87       132       107        72        86       128        102
20       628       806       757       661       766       566        697
40      3178      4674      5259      4846      4950      3627       4422
80     29960     31408     32656     32139     29642     38664      32411
160   239124    232885    216207    253641    250611    203828     232716
320  1849343   1918710   1719916   1771351   1995151   1867102    1853595

Figure4: The six tests of the snail_sort.




Figure5: growth rate of snail sort algorithm

1.2.2. Selection-Sort:

As the graph shows, selection-sort performs better than snail-sort, although it still requires O(n²) comparisons in all cases (best, average and worst). It is simple and easy to implement, like bubble-sort. Notice in Figure6 that the counts are identical across all six tests: selection-sort always performs exactly n(n-1)/2 comparisons regardless of the order of the input, so the count depends only on n (the data size). On the graph the curve appears almost flat between sizes 10 and 80 and then rises steeply from 160 to 320, reflecting the quadratic growth.
N     Test1    Test2    Test3    Test4    Test5    Test6    Average
10       45       45       45       45       45       45         45
20      190      190      190      190      190      190        190
40      780      780      780      780      780      780        780
80     3160     3160     3160     3160     3160     3160       3160
160   12720    12720    12720    12720    12720    12720      12720
320   51040    51040    51040    51040    51040    51040      51040

Figure6: The six tests of the selection sort.



Figure7: growth rate of selection sort algorithm

1.2.3. Quick-Sort:

The quick-sort algorithm requires O(n log n) comparisons in the best case as well as in the average case, whereas it needs O(n²) in the worst case; in practice, however, it usually runs faster than the other sorting algorithms. Quick-sort is efficient on large lists and does not require much memory.

N     Test1    Test2    Test3    Test4    Test5    Test6    Average
10       22       29       31       36       21       26         27
20       94       64       89       87       69       61         77
40      199      185      177      181      208      183        188
80      468      516      455      508      455      434        472
160    1262     1095     1251     1184     1294     1154       1206
320    2751     3035     2625     2567     3151     2781       2818

Figure8: The six tests of the quick sort.



Figure9: growth rate of quick sort algorithm

1.2.4. The efficiency of the three sorting algorithms:

Looking at the table and the graph below, snail-sort is the least efficient of the three, while quick-sort is the most efficient. Even at a data size of only 10, snail-sort requires 102 comparisons on average, while selection-sort needs 45 and quick-sort only 27.
N     SnailSort    SelectionSort    QuickSort
10          102               45           27
20          697              190           77
40         4422              780          188
80        32411             3160          472
160      232716            12720         1206
320     1853596            51040         2818

Figure10: Table shows the number of comparisons.



Figure11: Line graph for the three sorting algorithms showing the growth-rate

2.2. The evaluation of listInSort:

listInSort, which is similar to insertion sort, works as in the example below. Assume the array starts empty and we want to add {5, 3, 2, 8, 1} to it in ascending order, so that the array is kept sorted:

1- Add the first element, 5, into the array: {5}.
2- Before adding the next element, 3, compare it with the elements already stored. Since 3 is smaller than 5 it is inserted before it; an element greater than everything already stored would instead be appended at the end. The array is now {3, 5}.
3- Repeat the step above for the rest of the elements until the array becomes {1, 2, 3, 5, 8}.

From the example above, the first insertion needs no comparisons, the second compares against at most one element, the third against at most two, and so on up to a maximum of (N-1) comparisons for the last insertion. In the worst case this gives 1 + 2 + ... + (N-1) = N*(N-1)/2 comparisons. Even so, quick-sort remains the best of the sorting algorithms compared here.
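The steps above can be sketched in Python. The actual listInSort source is not shown in this coursework, so the function name and counting convention below are assumptions:

```python
def list_in_sort(data):
    """Insert each element into its correct position in a growing
    sorted list, counting element comparisons (insertion-sort style)."""
    result = []
    comparisons = 0
    for x in data:
        i = 0
        while i < len(result):        # scan until the insertion point
            comparisons += 1
            if x < result[i]:
                break
            i += 1
        result.insert(i, x)
    return result, comparisons
```

For {5, 3, 2, 8, 1} this sketch performs 6 comparisons. Already-sorted input is its worst case at N*(N-1)/2 comparisons, since every new element is scanned past everything stored so far.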
N     ListInSort
10            32
20           117
40           430
80          1684
160         6744
320        26125

Figure12: Table shows the number of comparisons for listInSort.


Figure13: Line graph for the listInSort algorithm showing the growth-rate

2.3. Big-O notation

"Big O notation is used in Computer Science to describe the performance or complexity of an algorithm. Big O specifically describes the worst-case scenario, and can be used to describe the execution time required or the space used (e.g. in memory or on disk) by an algorithm." (A Beginner's Guide to Big O Notation, Rob Bell, 2009) Below are the Big-O formulas for the sorting algorithms used here, together with a line graph illustrating their growth rates. As the graph shows, n² grows faster than n log n, n log n grows faster than n, and n grows faster than log n.
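The calculated values in Figure15 below can be reproduced with a few lines of Python. Note that they only match when the logarithm is taken base 10 (e.g. 20 · log10(20) = 26.02); that base is inferred from the table itself rather than stated in the coursework:

```python
import math

# Recompute the growth-rate columns of Figure15 using base-10 logarithms.
for n in (10, 20, 40, 80, 160, 320):
    log_n = math.log10(n)
    print(f"{n:4d}  O(log N)={log_n:5.2f}  O(N)={n:4d}  "
          f"O(N log N)={n * log_n:7.2f}  O(N^2)={n * n:7d}")
```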

Name              Best           Average        Worst
Snail-Sort        O(n²)          O(n²)          O(n²)
Selection-Sort    O(n²)          O(n²)          O(n²)
Quick-Sort        O(n log n)     O(n log n)     O(n²)

Figure14: Table shows the algorithms in Big-O notation.

N      O(log N)    O(N)    O(N log N)    O(N²)
10         1.00      10         10.00       100
20         1.30      20         26.02       400
40         1.60      40         64.08      1600
80         1.90      80        152.25      6400
160        2.20     160        352.66     25600
320        2.51     320        801.65    102400

Figure15: The second table shows the algorithms after applying the calculations.

Figure16: The graph shows the Big-O notation growth-rate.

