Analysis of Sorting Algorithms

betulkaraman
6 min read · May 16, 2023


The ability to analyze algorithms is what distinguishes an engineer from a programmer. It builds an understanding of the discipline of programming and helps you develop projects that are fast and have small memory requirements, which in turn leads to high-performance software.

In this article, we will analyze the most commonly used sorting algorithms, drawing on the academic articles and other research I have reviewed.

Let’s start…

Before we start: you can find the Java code for all of the sorting algorithms in this article in the relevant repository on my GitHub account.

Click here for code.

First of all, this is the first sentence we engineers are told in algorithms lessons: “There is no good or bad algorithm; there is only an algorithm that performs well or poorly in the context where it is used.”

The sorting algorithms we will examine are listed below.

1. Quick Sort

Quick Sort works with the divide and conquer paradigm. Divide and conquer involves splitting a complex problem into smaller, simpler subproblems that can be solved independently. Each subproblem is solved recursively using the same approach, and the solutions are then combined to solve the original problem. This approach is often used in algorithm design to increase efficiency and reduce the time complexity of solving various computational problems.

In this sorting algorithm, a pivot is selected from the given array.
Several methods of selecting pivots are indicated below.

  • Always pick the first element as the pivot.
  • Always pick the last element as the pivot (implemented below).
  • Pick a random element as the pivot.
  • Pick the middle element as the pivot.

All elements are compared against this pivot: smaller elements are moved to its left and larger elements to its right. The same process is then repeated on the subarrays that form on the left and right, and it continues until each subarray contains at most one element.
At that point the subarrays are already in order, and combining them yields the fully sorted sequence.

A table with the time complexities of all of these sorting algorithms appears at the end of the article.

Quick Sort
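
To make the partitioning step concrete, here is a minimal Java sketch of Quick Sort using the last element as the pivot (the Lomuto partition scheme). It is an illustrative example, not the exact code from the linked repository.

```java
// Minimal Quick Sort sketch: last element as pivot (Lomuto partition).
public class QuickSortExample {

    static void quickSort(int[] arr, int low, int high) {
        if (low < high) {
            int p = partition(arr, low, high); // pivot lands at its final position
            quickSort(arr, low, p - 1);        // sort the elements smaller than the pivot
            quickSort(arr, p + 1, high);       // sort the elements larger than the pivot
        }
    }

    // Moves every element smaller than the pivot to its left and returns the pivot's index.
    static int partition(int[] arr, int low, int high) {
        int pivot = arr[high];
        int i = low - 1;
        for (int j = low; j < high; j++) {
            if (arr[j] < pivot) {
                i++;
                int tmp = arr[i]; arr[i] = arr[j]; arr[j] = tmp;
            }
        }
        int tmp = arr[i + 1]; arr[i + 1] = arr[high]; arr[high] = tmp;
        return i + 1;
    }

    public static void main(String[] args) {
        int[] data = {7, 2, 9, 4, 1};
        quickSort(data, 0, data.length - 1);
        System.out.println(java.util.Arrays.toString(data)); // [1, 2, 4, 7, 9]
    }
}
```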

2. Merge Sort

Merge Sort also works with the divide and conquer approach. First, it divides the array into two halves. This continues recursively until each piece contains only one or two elements. After this stage, the work moves on to the sorting and merging step: an ordered array is built by comparing the elements of two subarrays and taking the smaller element first each time, until a single combined, sorted sequence is obtained.

Merge Sort
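
Here is a minimal Java sketch of the split-and-merge logic described above; again, this is illustrative rather than the repository code.

```java
// Minimal Merge Sort sketch: split, recurse, then merge the two sorted halves.
public class MergeSortExample {

    static void mergeSort(int[] arr, int left, int right) {
        if (left >= right) return;             // one element: already sorted
        int mid = (left + right) / 2;
        mergeSort(arr, left, mid);              // sort the left half
        mergeSort(arr, mid + 1, right);         // sort the right half
        merge(arr, left, mid, right);           // combine the sorted halves
    }

    static void merge(int[] arr, int left, int mid, int right) {
        int[] tmp = new int[right - left + 1];
        int i = left, j = mid + 1, k = 0;
        while (i <= mid && j <= right) {        // take the smaller head element each time
            tmp[k++] = (arr[i] <= arr[j]) ? arr[i++] : arr[j++];
        }
        while (i <= mid) tmp[k++] = arr[i++];   // copy any leftovers from the left half
        while (j <= right) tmp[k++] = arr[j++]; // copy any leftovers from the right half
        System.arraycopy(tmp, 0, arr, left, tmp.length);
    }
}
```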

3. Heap Sort

Heapsort is an in-place algorithm. Mmm… what is an in-place algorithm? An in-place algorithm does not require significant additional memory; it makes its changes directly on the original data structure.
In-place algorithms aim to improve performance by reducing memory usage and making better use of time and processor power.

Depending on the sorting direction, the heap keeps either the largest element at the root node (a max heap) or the smallest element at the root node (a min heap).

Suppose we use a max heap in the scenario below. First, the array is treated as a heap tree, filling the root and then the children level by level.

The tree is then reordered so that the largest element sits at the root. Once this step is complete, the root is swapped into the node at the far end, so it settles into the last position of the array, and it is then removed from the heap.

Because this swap disturbs the heap property, the tree is restructured to satisfy the max heap rules again, and the process repeats. The last remaining element ends up as the first element of the array.
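
The steps above can be sketched in Java as follows: build a max heap inside the array, then repeatedly swap the root with the last element of the shrinking heap and restore the heap property. This is a minimal illustration, not the repository code.

```java
// Minimal Heap Sort sketch using a max heap built inside the array (in place).
public class HeapSortExample {

    static void heapSort(int[] arr) {
        int n = arr.length;
        // Build a max heap: the largest element rises to the root (index 0).
        for (int i = n / 2 - 1; i >= 0; i--) heapify(arr, n, i);
        // Repeatedly move the root (current maximum) to the end and shrink the heap.
        for (int end = n - 1; end > 0; end--) {
            int tmp = arr[0]; arr[0] = arr[end]; arr[end] = tmp;
            heapify(arr, end, 0);               // restore the max-heap property
        }
    }

    // Sifts the element at index i down until the subtree rooted at i is a max heap.
    static void heapify(int[] arr, int heapSize, int i) {
        int largest = i, left = 2 * i + 1, right = 2 * i + 2;
        if (left < heapSize && arr[left] > arr[largest]) largest = left;
        if (right < heapSize && arr[right] > arr[largest]) largest = right;
        if (largest != i) {
            int tmp = arr[i]; arr[i] = arr[largest]; arr[largest] = tmp;
            heapify(arr, heapSize, largest);
        }
    }
}
```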

4. Bubble Sort

The bubble sort algorithm sorts a given sequence by repeatedly comparing neighboring elements. If the previous element is larger than the next element, the two elements swap places. Then the second and third elements are compared; if the second is greater than the third, they switch places, and the comparison moves on to the next pair. These passes over the list continue until the entire list is sorted.

Bubble Sort
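
A minimal Java sketch of these adjacent-element comparisons and swaps (illustrative only) looks like this:

```java
// Minimal Bubble Sort sketch: repeatedly swap adjacent elements that are out of order.
public class BubbleSortExample {

    static void bubbleSort(int[] arr) {
        for (int pass = 0; pass < arr.length - 1; pass++) {
            boolean swapped = false;
            // After each pass the largest remaining element "bubbles" to the end.
            for (int i = 0; i < arr.length - 1 - pass; i++) {
                if (arr[i] > arr[i + 1]) {
                    int tmp = arr[i]; arr[i] = arr[i + 1]; arr[i + 1] = tmp;
                    swapped = true;
                }
            }
            if (!swapped) break;                // no swaps means the list is already sorted
        }
    }
}
```

The `swapped` flag is a small optimization: if a full pass makes no swap, the list is already sorted and the algorithm can stop early.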

5. Selection Sort

The working logic of the algorithm is to find the smallest element of the list in each iteration and place it at the beginning of the list.

Conceptually, the sequence is split into a sorted part at the beginning and an unsorted part at the end. In each step, the algorithm scans all the elements in the unsorted part, finds the smallest one, and moves it to the front of the unsorted part, extending the sorted part by one element. The same process is repeated until every element has been moved into the sorted part.

It is an in-place algorithm, but because it always performs O(n²) comparisons it does not work effectively on large datasets.

Selection Sort
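
The sorted/unsorted split described above can be sketched in Java like this (illustrative only):

```java
// Minimal Selection Sort sketch: grow the sorted prefix by one element per pass.
public class SelectionSortExample {

    static void selectionSort(int[] arr) {
        for (int i = 0; i < arr.length - 1; i++) {
            int min = i;
            // Find the smallest element in the unsorted part arr[i..n-1].
            for (int j = i + 1; j < arr.length; j++) {
                if (arr[j] < arr[min]) min = j;
            }
            // Move it to the front of the unsorted part, extending the sorted prefix.
            int tmp = arr[i]; arr[i] = arr[min]; arr[min] = tmp;
        }
    }
}
```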

6. Insertion Sort

Insertion sort is one of the most basic sorting algorithms, and its code is easy to understand. It is an in-place algorithm. The working logic is as follows: starting from the second element, each element is compared with the elements before it, and larger elements are shifted one position to the right until the current element can be inserted into its correct place.

Insertion Sort
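
A minimal Java sketch of this shift-and-insert behavior (illustrative only):

```java
// Minimal Insertion Sort sketch: insert each element into the sorted prefix on its left.
public class InsertionSortExample {

    static void insertionSort(int[] arr) {
        for (int i = 1; i < arr.length; i++) {
            int key = arr[i];
            int j = i - 1;
            // Shift larger elements one position to the right to open a slot for key.
            while (j >= 0 && arr[j] > key) {
                arr[j + 1] = arr[j];
                j--;
            }
            arr[j + 1] = key;
        }
    }
}
```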

7. Radix Sort

Unlike comparison-based sorting algorithms, Radix Sort is effective for sorting strings and keys made up of many digits, such as large numbers.

The basic idea behind Radix Sort is to sort the input data by digit position, starting from the least significant digit and moving toward the most significant digit. Each digit position is handled in a separate pass, and the passes together yield the final sorted array. In each pass, the algorithm groups the input data according to the value of the current digit, and the next pass then refines the order using the next digit. This is repeated until all digit positions have been processed.

Radix Sort can be implemented using two different approaches: LSD (Least Significant Digit) and MSD (Most Significant Digit). In a decimal number, the LSD is the rightmost digit, while the MSD is the leftmost nonzero digit. For example, in the number 549.223, the LSD is 3 and the MSD is 5. If the number is a whole number, the LSD is the digit immediately to the left of the radix point.

Radix Sort
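
Here is a minimal Java sketch of the LSD approach for non-negative integers, using a stable counting sort on each decimal digit; it is illustrative only, not the repository code.

```java
// Minimal LSD Radix Sort sketch for non-negative integers:
// a stable counting sort is applied to each decimal digit, least significant first.
public class RadixSortExample {

    static void radixSort(int[] arr) {
        int max = 0;
        for (int v : arr) max = Math.max(max, v);
        // One counting-sort pass per digit position (1s, 10s, 100s, ...).
        for (int exp = 1; max / exp > 0; exp *= 10) {
            countingSortByDigit(arr, exp);
        }
    }

    static void countingSortByDigit(int[] arr, int exp) {
        int[] output = new int[arr.length];
        int[] count = new int[10];
        for (int v : arr) count[(v / exp) % 10]++;              // histogram of the current digit
        for (int d = 1; d < 10; d++) count[d] += count[d - 1];  // prefix sums give end positions
        for (int i = arr.length - 1; i >= 0; i--) {             // iterate backwards to keep stability
            int digit = (arr[i] / exp) % 10;
            output[--count[digit]] = arr[i];
        }
        System.arraycopy(output, 0, arr, 0, arr.length);
    }
}
```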

Now let’s examine the time complexity and space complexity of all the algorithms we have covered.
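
For reference, the commonly cited complexities are summarized below (n is the number of elements; for Radix Sort, d is the number of digit positions and k is the base, e.g. 10 for decimal digits):

| Algorithm | Best | Average | Worst | Space |
| --- | --- | --- | --- | --- |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) |
| Bubble Sort | O(n) | O(n²) | O(n²) | O(1) |
| Selection Sort | O(n²) | O(n²) | O(n²) | O(1) |
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) |
| Radix Sort | O(d·(n + k)) | O(d·(n + k)) | O(d·(n + k)) | O(n + k) |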

This article was written to spark your interest in sorting algorithms, expand your knowledge, and help you gain new skills. Understanding the basics of sorting algorithms also lays the groundwork for understanding more complex algorithms. Mastering them will put you a step ahead in the world of computer science and enable you to produce more efficient and creative solutions to problems.
