Friday, August 16, 2024

SORTING (Merge Sort, Quick Sort, Heap Sort)

 Merge Sort:

Merge sort is a sorting technique that follows the divide-and-conquer paradigm: divide, conquer, and combine.

Divide means partitioning the n-element array to be sorted into two sub-arrays of n/2 elements. If A is an array containing zero or one element, then it is already sorted. However, if there are more elements in the array, divide A into two sub-arrays, A1 and A2, each containing about half of the elements of A.

Conquer means sorting the two sub-arrays recursively using merge sort.

Combine means merging the two sorted sub-arrays of size n/2 to produce the sorted array of n elements.

The basic steps of a merge sort algorithm are as follows:

  • If the array is of length 0 or 1, then it is already sorted.
  • Otherwise, divide the unsorted array into two sub-arrays of about half the size.
  • Use merge sort algorithm recursively to sort each sub-array.
  • Merge the two sub-arrays to form a single sorted list.

 It is one of the most popular and efficient sorting algorithms. It divides the given list into two halves, calls itself recursively for each half, and then merges the two sorted halves. We have to define a merge() function to perform the merging.

Algorithm

In the following algorithm, arr is the given array, beg is the index of the first element, and end is the index of the last element of the array.
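The listing that belongs here can be sketched in pseudocode, following the parameter names defined above (a reconstruction, not the author's original listing):

```
MERGE_SORT(arr, beg, end)
    if beg < end
        set mid = (beg + end) / 2
        MERGE_SORT(arr, beg, mid)       // sort the left half
        MERGE_SORT(arr, mid + 1, end)   // sort the right half
        MERGE(arr, beg, mid, end)       // combine the two halves
    end if
END MERGE_SORT
```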


The important part of the merge sort is the MERGE function. This function performs the merging of two sorted sub-arrays that are A[beg…mid] and A[mid+1…end], to build one sorted array A[beg…end]. So, the inputs of the MERGE function are A[], beg, mid, and end.

The merge sort algorithm uses a function merge which combines two sorted sub-arrays into one. While merge sort recursively divides the list into smaller lists, the merge step sorts the elements across those lists. Finally, the smaller lists are merged to form one sorted list.

The implementation of the MERGE function is given as follows:
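The listing that belongs here can be reconstructed from the complete program given later in this section; the size constant is assumed to be defined as in that program:

```c
#define size 100   /* assumed, as in the complete program later on */

/* Merge the sorted runs arr[beg..mid] and arr[mid+1..end]
   into one sorted run arr[beg..end]. */
void merge(int arr[], int beg, int mid, int end)
{
    int i = beg, j = mid + 1, index = beg, temp[size], k;

    while ((i <= mid) && (j <= end))        /* pick the smaller head */
    {
        if (arr[i] < arr[j])
            temp[index++] = arr[i++];
        else
            temp[index++] = arr[j++];
    }
    if (i > mid)                            /* left run exhausted */
    {
        while (j <= end)
            temp[index++] = arr[j++];
    }
    else                                    /* right run exhausted */
    {
        while (i <= mid)
            temp[index++] = arr[i++];
    }
    for (k = beg; k < index; k++)           /* copy merged run back */
        arr[k] = temp[k];
}
```

For example, calling merge on the array 3, 8, 12, 1, 5, 9 with beg = 0, mid = 2, end = 5 yields 1, 3, 5, 8, 9, 12.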


To understand the merge algorithm, consider the figure below, which shows how we merge two lists to form one. For ease of understanding, we take two sub-lists of four elements each. The same idea applies when merging four sub-lists of two elements, or eight sub-lists of one element each.




Compare ARR[I] and ARR[J]; the smaller of the two is placed in TEMP at the location specified by INDEX, and then I or J (whichever supplied the smaller value) is incremented, along with INDEX.



When I becomes greater than MID, the left sub-array is exhausted, so copy the remaining elements of the right sub-array into TEMP. Symmetrically, when J exceeds END, copy the remaining elements of the left sub-array into TEMP.


Write a program to implement merge sort.

#include <stdio.h>
#define size 100

void merge(int a[], int beg, int mid, int end);
void merge_sort(int a[], int beg, int end);

int main()
{
    int arr[size], i, n;

    printf("\n Enter the number of elements in the array : ");
    scanf("%d", &n);
    printf("\n Enter the elements of the array: ");
    for(i = 0; i < n; i++)
    {
        scanf("%d", &arr[i]);
    }
    merge_sort(arr, 0, n-1);
    printf("\n The sorted array is: \n");
    for(i = 0; i < n; i++)
        printf(" %d\t", arr[i]);
    return 0;
}

void merge(int arr[], int beg, int mid, int end)
{
    int i = beg, j = mid+1, index = beg, temp[size], k;

    while((i <= mid) && (j <= end))
    {
        if(arr[i] < arr[j])
        {
            temp[index] = arr[i];
            i++;
        }
        else
        {
            temp[index] = arr[j];
            j++;
        }
        index++;
    }
    if(i > mid)      /* left half exhausted: copy the rest of the right half */
    {
        while(j <= end)
        {
            temp[index] = arr[j];
            j++;
            index++;
        }
    }
    else             /* right half exhausted: copy the rest of the left half */
    {
        while(i <= mid)
        {
            temp[index] = arr[i];
            i++;
            index++;
        }
    }
    for(k = beg; k < index; k++)   /* copy the merged run back into arr */
        arr[k] = temp[k];
}

void merge_sort(int arr[], int beg, int end)
{
    int mid;
    if(beg < end)
    {
        mid = (beg + end) / 2;
        merge_sort(arr, beg, mid);
        merge_sort(arr, mid+1, end);
        merge(arr, beg, mid, end);
    }
}

Merge sort complexity

  • Best Case Complexity - It occurs when no sorting is required, i.e. the array is already sorted. The best-case time complexity of merge sort is O(n log n).
  • Average Case Complexity - It occurs when the array elements are in jumbled order, neither properly ascending nor properly descending. The average-case time complexity of merge sort is O(n log n).
  • Worst Case Complexity - It occurs when the array elements have to be sorted in reverse order, e.g. you have to sort the array in ascending order but its elements are in descending order. Even then, the worst-case time complexity of merge sort is O(n log n).
*************************

QUICK SORT:

Quicksort is a widely used sorting algorithm that makes n log n comparisons on average when sorting an array of n elements. It is a fast and highly efficient algorithm that follows the divide-and-conquer approach: break the problem down into subproblems, solve the subproblems, and combine the results to solve the original problem.

Divide: First pick a pivot element. Then partition (rearrange) the array into two sub-arrays such that each element in the left sub-array is less than or equal to the pivot and each element in the right sub-array is greater than the pivot.

Conquer: Recursively sort the two sub-arrays with quicksort.

Combine: Since the sub-arrays are sorted in place, no extra work is needed to combine them; the whole array is already sorted.

The quick sort algorithm works as follows:

1. Select an element pivot from the array elements.

2. Rearrange the elements in the array in such a way that all elements that are less than the pivot appear before the pivot and all elements greater than the pivot element come after it (equal values can go either way). After such a partitioning, the pivot is placed in its final position. This is called the partition operation.

3. Recursively sort the two sub-arrays thus obtained. (One with sub-list of values smaller than that of the pivot element and the other having higher value elements.)
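The three steps can be sketched with the common Lomuto partition scheme, which takes the last element as pivot. This is a standard alternative, not the scheme described in the Technique section that follows (which takes the first element and scans from both ends), but the recursive structure is the same:

```c
/* Lomuto partition: pivot is the last element; returns the
   pivot's final index after smaller values are moved left. */
static int lomuto_partition(int a[], int low, int high)
{
    int pivot = a[high], i = low, j, t;
    for (j = low; j < high; j++)
    {
        if (a[j] < pivot)            /* smaller values go left */
        {
            t = a[i]; a[i] = a[j]; a[j] = t;
            i++;
        }
    }
    t = a[i]; a[i] = a[high]; a[high] = t;   /* place the pivot */
    return i;
}

void quick_sort_sketch(int a[], int low, int high)
{
    if (low < high)
    {
        int p = lomuto_partition(a, low, high);
        quick_sort_sketch(a, low, p - 1);    /* left sub-array  */
        quick_sort_sketch(a, p + 1, high);   /* right sub-array */
    }
}
```

Calling quick_sort_sketch on the array 24, 9, 29, 14, 19, 27 (used in the worked example below) yields 9, 14, 19, 24, 27, 29.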


Technique

Quick sort works as follows:

step 1. Set the index of the first element of the array in the variables loc and left, and the index of the last element in the variable right.

That is, loc = 0, left = 0, and right = n–1 (where n is the number of elements in the array).

step 2. Start from the element pointed to by right and scan the array from right to left, comparing each element on the way with the element pointed to by loc.

That is, a[loc] should be less than or equal to a[right].

(a) If that is the case, simply continue comparing until right becomes equal to loc. Once right = loc, the pivot has been placed in its correct position.

(b) However, if at any point a[loc] > a[right], interchange the two values, set loc = right, and jump to Step 3.

step 3. Start from the element pointed to by left and scan the array from left to right, comparing each element on the way with the element pointed to by loc.

That is, a[loc] should be greater than or equal to a[left].

(a) If that is the case, simply continue comparing until left becomes equal to loc. Once left = loc, the pivot has been placed in its correct position.

(b) However, if at any point a[loc] < a[left], interchange the two values, set loc = left, and jump to Step 2.



Working of Quick Sort Algorithm

To understand the working of the quicksort algorithm, let's trace it on an unsorted array.

Let the elements of the array be: 24, 9, 29, 14, 19, 27.

In the given array, we take the leftmost element as the pivot. So, in this case, a[left] = 24, a[right] = 27, and a[pivot] = 24.

Since the pivot is at the left, the algorithm starts from the right and moves towards the left.

Now, a[pivot] < a[right], so the algorithm moves one position to the left, making a[right] = 19.

Now, a[left] = 24, a[right] = 19, and a[pivot] = 24. Because a[pivot] > a[right], the algorithm swaps a[pivot] with a[right], and the pivot moves to the right. The array becomes: 19, 9, 29, 14, 24, 27.

Now, a[left] = 19, a[right] = 24, and a[pivot] = 24. Since the pivot is at the right, the algorithm starts from the left and moves to the right.

As a[pivot] > a[left], the algorithm moves one position to the right, making a[left] = 9.

Now, a[left] = 9, a[right] = 24, and a[pivot] = 24. As a[pivot] > a[left], the algorithm again moves one position to the right, making a[left] = 29.

Now, a[left] = 29, a[right] = 24, and a[pivot] = 24. As a[pivot] < a[left], swap a[pivot] and a[left]; the pivot is now at the left. The array becomes: 19, 9, 24, 14, 29, 27.

Since the pivot is at the left, the algorithm starts from the right and moves to the left. Now, a[left] = 24, a[right] = 29, and a[pivot] = 24. As a[pivot] < a[right], the algorithm moves one position to the left, making a[right] = 14.

Now, a[pivot] = 24, a[left] = 24, and a[right] = 14. As a[pivot] > a[right], swap a[pivot] and a[right]; the pivot is now at the right. The array becomes: 19, 9, 14, 24, 29, 27.

Now, a[pivot] = 24, a[left] = 14, and a[right] = 24. The pivot is at the right, so the algorithm starts from the left and moves to the right.

Now, a[pivot] = 24, a[left] = 24, and a[right] = 24. Pivot, left, and right all point to the same element, which terminates the procedure.

Element 24, the pivot element, is now placed in its exact position: every element to its right is greater than it, and every element to its left is smaller than it.

Now, in a similar manner, the quicksort algorithm is applied separately to the left sub-array (19, 9, 14) and the right sub-array (29, 27). When sorting is done, the array will be: 9, 14, 19, 24, 27, 29.
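The trace above can be checked directly with the partition routine from the program that follows, reproduced here as a standalone sketch:

```c
/* Two-scan partition as described in the Technique section:
   the pivot starts at the leftmost element, and loc tracks it. */
int partition(int a[], int beg, int end)
{
    int left, right, temp, loc, flag = 0;

    loc = left = beg;
    right = end;
    while (flag != 1)
    {
        while ((a[loc] <= a[right]) && (loc != right))  /* scan from right */
            right--;
        if (loc == right)
            flag = 1;
        else if (a[loc] > a[right])
        {
            temp = a[loc]; a[loc] = a[right]; a[right] = temp;
            loc = right;
        }
        if (flag != 1)
        {
            while ((a[loc] >= a[left]) && (loc != left))  /* scan from left */
                left++;
            if (loc == left)
                flag = 1;
            else if (a[loc] < a[left])
            {
                temp = a[loc]; a[loc] = a[left]; a[left] = temp;
                loc = left;
            }
        }
    }
    return loc;   /* final position of the pivot */
}
```

One call on 24, 9, 29, 14, 19, 27 returns 3 and leaves the array as 19, 9, 14, 24, 29, 27, matching the trace above.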


Quicksort complexity

  • Best Case Complexity - In quicksort, the best case occurs when the pivot element is the middle element, or close to the middle element. The best-case time complexity of quicksort is O(n log n).
  • Average Case Complexity - It occurs when the array elements are in jumbled order, neither properly ascending nor properly descending. The average-case time complexity of quicksort is O(n log n).
  • Worst Case Complexity - In quicksort, the worst case occurs when the pivot element is always the greatest or smallest element of its sub-array. For example, if the pivot is always taken as the first or last element, the worst case occurs when the given array is already sorted in ascending or descending order. The worst-case time complexity of quicksort is O(n²).

Write a program to implement quick sort algorithm.


#include <stdio.h>
#define size 100

int partition(int a[], int beg, int end);
void quick_sort(int a[], int beg, int end);

int main()
{
    int arr[size], i, n;

    printf("\n Enter the number of elements in the array: ");
    scanf("%d", &n);
    printf("\n Enter the elements of the array: ");
    for(i = 0; i < n; i++)
    {
        scanf("%d", &arr[i]);
    }
    quick_sort(arr, 0, n-1);
    printf("\n The sorted array is: \n");
    for(i = 0; i < n; i++)
        printf(" %d\t", arr[i]);
    return 0;
}

int partition(int a[], int beg, int end)
{
    int left, right, temp, loc, flag;

    loc = left = beg;
    right = end;
    flag = 0;
    while(flag != 1)
    {
        while((a[loc] <= a[right]) && (loc != right))   /* scan from the right */
            right--;
        if(loc == right)
            flag = 1;
        else if(a[loc] > a[right])
        {
            temp = a[loc];
            a[loc] = a[right];
            a[right] = temp;
            loc = right;
        }
        if(flag != 1)
        {
            while((a[loc] >= a[left]) && (loc != left))  /* scan from the left */
                left++;
            if(loc == left)
                flag = 1;
            else if(a[loc] < a[left])
            {
                temp = a[loc];
                a[loc] = a[left];
                a[left] = temp;
                loc = left;
            }
        }
    }
    return loc;   /* final position of the pivot */
}

void quick_sort(int a[], int beg, int end)
{
    int loc;
    if(beg < end)
    {
        loc = partition(a, beg, end);
        quick_sort(a, beg, loc-1);
        quick_sort(a, loc+1, end);
    }
}

***********

HEAP SORT:



