# Algorithms Summary Points – Part 5: QuickSort, QuickSelect

This note has some summary points from my study of algorithms and algorithmic analysis. If you have little or no idea of the concepts I will talk about, this note may not be so helpful to you. I only post these here to serve as a quick note or as a refresher about these concepts should the need arise. In that case, I suggest you take an advanced course on algorithms and data structures first.  Nevertheless, if you are interested in discussing any of the concepts here I will be glad to be of help.

## Quick Sort, Quick Select

### Quick Sort

• Another Divide and Conquer Algorithm
• Like Merge Sort except it does not split into halves.
• It picks a pivot element from the array
• Then partitions the array into 3 parts
• L – The elements less than the pivot
• E – The elements equal to the pivot
• G – The elements greater than the pivot
• Then L and G are recursively partitioned the same way until each has 1 or 0 elements – this is the base case
• Then the parts are joined back together in the order L E G
• Running Times
• Worst Case – O(n^2). This happens when the pivot is always the unique smallest or greatest element. That makes one of L and G empty while the other has n-1 elements, so the subproblem sizes form the sequence n-1, n-2, …, 1, 0. The total work is therefore proportional to the sum of the numbers from 1 to n, which is n(n+1)/2, giving an asymptotic running time of O(n^2).
• However, when the pivot is chosen at random, Chernoff bounds show that the probability of this occurring is extremely low.
• Average Case – O(n log n) – We arrive at this by looking at the recursion tree and the expected number of self-calls. Each good self-call reduces the input size to at most 3n/4, so any root-to-leaf path contains at most log4/3 n good calls; since a call is good with probability at least ½, the expected depth is O(log n), and with O(n) work per level this gives O(n log n).
• Best case – O(n log n)
• When using quick sort, a self-call is good if L and G each have fewer than 3n/4 elements
• Is unstable
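The L/E/G scheme above can be sketched in Python (a minimal illustration rather than a tuned implementation; the function name is mine):

```python
import random

def quick_sort(arr):
    # Base case: a list of 0 or 1 elements is already sorted
    if len(arr) <= 1:
        return arr
    pivot = random.choice(arr)            # pick a pivot element at random
    L = [x for x in arr if x < pivot]     # elements less than the pivot
    E = [x for x in arr if x == pivot]    # elements equal to the pivot
    G = [x for x in arr if x > pivot]     # elements greater than the pivot
    # Recursively sort L and G, then join in the order L E G
    return quick_sort(L) + E + quick_sort(G)
```

Note that building three new lists costs extra memory; the in-place variant below avoids that.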

### In-Place Quick Sort

• Still quick sort, but we do not build 3 separate lists.
• We still pick a pivot, but partition within the array itself
• Select the pivot
• Swap with the element at the right.
• Then use two pointers, i and j: i moves to the right past all elements less than the pivot, and j moves to the left past all elements greater than the pivot.
• When both pointers are stuck, we swap the elements at their positions and continue.
• When they cross, we swap the pivot with the element at i and return the position of the pivot.
• Then we recursively do the same thing on the left and right of the pivot – excluding the position of the pivot because the pivot now sits in its rightful place.
• It is not stable
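The in-place variant described above can be sketched like this (a Python illustration under my own naming; the pivot is picked at random and swapped to the rightmost slot, as in the steps above):

```python
import random

def partition(arr, lo, hi):
    # Pick a pivot at random and swap it to the rightmost position
    p = random.randint(lo, hi)
    arr[p], arr[hi] = arr[hi], arr[p]
    pivot = arr[hi]
    i, j = lo, hi - 1
    while True:
        while i <= j and arr[i] < pivot:   # i moves right past elements < pivot
            i += 1
        while i <= j and arr[j] > pivot:   # j moves left past elements > pivot
            j -= 1
        if i >= j:                         # pointers crossed
            break
        arr[i], arr[j] = arr[j], arr[i]    # both stuck: swap and keep going
        i, j = i + 1, j - 1
    arr[i], arr[hi] = arr[hi], arr[i]      # put the pivot in its final place
    return i

def in_place_quick_sort(arr, lo=0, hi=None):
    if hi is None:
        hi = len(arr) - 1
    if lo < hi:
        p = partition(arr, lo, hi)
        in_place_quick_sort(arr, lo, p - 1)  # left of pivot (pivot excluded)
        in_place_quick_sort(arr, p + 1, hi)  # right of pivot (pivot excluded)
```

Because the recursion excludes the pivot's position, each call works on a strictly smaller range and the recursion terminates.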

### Quick Select

• Used to find the k-th smallest value in a sequence.
• Works like quick sort because it
• Gets a pivot value
• Creates L E G based on the pivot.
• If the length of L is at least k, the position we need, we search recursively in L with the same value of k
• If k is greater than the length of L but at most the length of L plus the length of E, then we know the k-th smallest is in E, so we return the pivot (every value in E equals it)
• Otherwise, we know it lies in G. We subtract the lengths of L and E from k, this becomes our new k, and we recursively search G.
• The running times
• Worst Case is Θ(n^2) – Happens when the array is already sorted, our k = 1, and our pivot is always the rightmost element. The input is then reduced by only 1 each time, so we recurse n times, and inside each partition we loop over the input again to place elements into L, E, or G
• Average Case is O(n) – In expectation the self-calls are good, leaving L and G with fewer than 3n/4 elements each, so the input sizes shrink geometrically and their sum is O(n).
• Best Case is O(n)
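The selection steps above can be sketched as follows (a Python illustration assuming a 1-indexed k; the function name is mine):

```python
import random

def quick_select(arr, k):
    """Return the k-th smallest element of arr (k is 1-indexed)."""
    pivot = random.choice(arr)            # pick a pivot element at random
    L = [x for x in arr if x < pivot]     # elements less than the pivot
    E = [x for x in arr if x == pivot]    # elements equal to the pivot
    G = [x for x in arr if x > pivot]     # elements greater than the pivot
    if k <= len(L):                       # answer lies in L; k is unchanged
        return quick_select(L, k)
    elif k <= len(L) + len(E):            # answer is the pivot itself
        return pivot
    else:                                 # answer lies in G; shift k down
        return quick_select(G, k - len(L) - len(E))
```

Unlike quick sort, only one of L and G is recursed into, which is why the expected cost drops from O(n log n) to O(n).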