# Algorithms Summary Points – Part 3: BubbleSort, SelectionSort, InsertionSort, LibrarySort

This note contains summary points from my study of algorithms and algorithmic analysis. If you have little or no idea of the concepts I will talk about, this note may not be very helpful to you; in that case, I suggest you take an advanced course on algorithms and data structures first. I only post these here to serve as a quick note or refresher about these concepts should the need arise. Nevertheless, if you are interested in discussing any of the concepts here, I will be glad to help.

## Bubble Sort, Selection Sort, Insertion Sort, Library Sort

### Bubble Sort

• Works with two nested loops
• The outer (i) loop runs n times and does nothing itself; it is just there to make the inner loop run repeatedly.
• The inner (j) loop runs n-1 times; on each iteration it checks whether the element at position j is greater than the one at j+1.
• If true, it swaps both.
• This makes the largest remaining number always move to the right after every pass of the inner loop.
• Because we have two nested loops that both depend on n, the running times are:
• Worst Case – Θ(n^2)
• Average Case – Θ(n^2)
• Best Case – Θ(n^2), even when the array is already sorted.
• Without an early-exit check, it always runs the outer loop n times and the inner loop n-1 times no matter what, so the Best, Average and Worst-Case running times are all Θ(n^2).
• The order of duplicates is maintained, so it is stable.
• It is inversion bound.
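As a minimal sketch of the steps above (in Python, sorting a list in place; the function name is my own), the two nested loops look like this:

```python
def bubble_sort(arr):
    """Naive bubble sort with no early-exit check."""
    n = len(arr)
    for i in range(n):            # outer loop: runs n times, does nothing itself
        for j in range(n - 1):    # inner loop: runs n-1 times per pass
            if arr[j] > arr[j + 1]:
                # swap; the largest remaining value bubbles to the right
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return arr
```

Because equal elements are never swapped past each other, the relative order of duplicates is preserved, which is why the algorithm is stable.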

### Selection Sort

• Also known as Min-sort
• Works with two nested loops. The second loop lives in a method – findMin() – which finds the smallest item between a lower and an upper bound.
• The idea is to get the smallest element in each unsorted section each time and bring it to the left.
• Hence, the inner loop finds the smallest item in each new unsorted section of the array and returns it to the outer loop. The outer loop then swaps the item at the ith position with that smallest element.
• Because we have two nested loops that both depend on n, the running times are:
• Worst Case – Θ(n^2)
• Average Case – Θ(n^2)
• Best Case – Θ(n^2), even when the array is already sorted.
• It does not check if the ith element is the smallest, it swaps anyway.
• The order of duplicates may change, so it is unstable.
• It is inversion bound.
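A sketch of the idea, assuming (as is common) that the findMin() helper returns the index of the smallest item so the outer loop can swap:

```python
def find_min(arr, lo, hi):
    """Inner loop: return the index of the smallest item in arr[lo:hi]."""
    smallest = lo
    for j in range(lo + 1, hi):
        if arr[j] < arr[smallest]:
            smallest = j
    return smallest

def selection_sort(arr):
    n = len(arr)
    for i in range(n - 1):
        m = find_min(arr, i, n)          # smallest item in the unsorted section
        arr[i], arr[m] = arr[m], arr[i]  # swap unconditionally, even if m == i
    return arr
```

The long-distance swap is what breaks stability: it can jump an element over an equal one.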

### Insertion Sort

• Like inserting a new book to a shelf in a library. The book here is the element placed in temp.
• It takes each successive value x in the input list and compares it against the already sorted left side of the array to find the position where x belongs.
• Uses two loops
• It pulls out the item in the ith iteration into a temp variable and uses a variable j to keep track of the position the temp could possibly fit in.
• Then it uses the inner while loop to check if the temp is less than the value before it.
• If yes, it shifts that item one position to the right (it does not swap them).
• Then it decrements j, since temp may fit into that position, and keeps checking whether temp would still be less than the item before it in the new position.
• If yes, it continues the process.
• At the end of the while loop, temp is placed wherever j stops, since j holds the candidate position for temp.
• Both nested loops may run fully, so the running times are:
• Worst Case – Θ(n^2) – Occurs when the array is reverse sorted.
• Average Case – Θ(n^2): the inversion theorem gives a lower bound of Ω(n^2), and the worst case supplies the matching O(n^2) upper bound.
• Best Case – Θ(n) – Occurs when the array is already sorted; it still runs the outer loop and checks the while-loop condition, which fails immediately.
• It is inversion bound.
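The temp/j mechanism described above can be sketched like this (Python, in place):

```python
def insertion_sort(arr):
    for i in range(1, len(arr)):
        temp = arr[i]  # pull out the item at the ith position
        j = i          # j tracks the candidate position for temp
        while j > 0 and arr[j - 1] > temp:
            arr[j] = arr[j - 1]  # shift the larger item one slot right (no swap)
            j -= 1
        arr[j] = temp  # place temp wherever j stopped
    return arr
```

Note that each shift in the while loop removes exactly one inversion, which is why the total work is tied to the inversion count.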

The Number of Inversions Theorem states that in an array without duplicates whose items are arranged in random order,

• the expected (average) number of inversions is n(n-1)/4

Hence, if an algorithm carries out at least as many comparisons as there are inversions, it carries out at least n(n-1)/4 comparisons, so the average case running time is Ω(n^2). Such an algorithm is known as inversion bound.
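To make the expected count concrete, here is a brute-force inversion counter (my own illustration) checked against all permutations of n = 4 distinct items, where n(n-1)/4 = 3:

```python
from itertools import combinations, permutations

def count_inversions(arr):
    """Brute-force O(n^2) count of pairs (i, j), i < j, with arr[i] > arr[j]."""
    return sum(1 for x, y in combinations(arr, 2) if x > y)

# Averaging over every arrangement of 4 distinct items should give
# n(n-1)/4 = 3: each of the C(4,2) = 6 pairs is inverted in exactly
# half of the permutations.
perms = list(permutations(range(4)))
average = sum(count_inversions(p) for p in perms) / len(perms)
```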

### Library Sort

• Is a refinement of insertion sort
• It also takes successive values x, but instead of doing a linear scan to find where to place x in the already sorted array, it uses binary search to find the position.
• It also leaves gaps in the sorted section in each pass, so it does not always have to shift values to make room for x.
• It has an average-case running time of O(n log n): each binary search takes O(log n) time, and the gaps keep the expected amount of shifting per insertion small.
• Best Case is O(n)
• It is not inversion bound.
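As a simplified sketch of just the binary-search half (using Python's standard bisect module; a real library sort also maintains and periodically rebalances the gaps, which this version omits, so the shifts here are still O(n)):

```python
import bisect

def binary_insertion_sort(arr):
    """Binary-search insertion without gaps: finds each position in
    O(log n), but insertion into a plain list still shifts elements."""
    result = []
    for x in arr:
        bisect.insort(result, x)  # binary search for the slot, then insert
    return result
```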