Types of sorts
Bubble sort
1. Start from index 0
2. Each time, the element at the current index is compared with the next element
3. If the next is bigger than the current, do nothing and move to the next element
If the next is smaller than the current, swap them and then increment the index counter
4. Repeat from step 2
The back is always sorted first
Sorted right to left
O(n^2)
S(1)
Swaps: n^2
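The steps above can be sketched in Python (a minimal sketch; the function name and ascending order are my assumptions):

```python
def bubble_sort(a):
    """In-place bubble sort: O(n^2) time, O(1) extra space."""
    n = len(a)
    for i in range(n):
        # After each pass, the largest remaining element settles at the back,
        # so the sorted region grows from right to left.
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:                  # next is smaller than current
                a[j], a[j + 1] = a[j + 1], a[j]  # swap with next
    return a
```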
Selection sort
1. Start from index 0; pointer 1 will point at this element
2. As we iterate up each index, compare each of the remaining values with pointer 1
3. If a value is smaller, pointer 2 will point to it; every time a smaller value is encountered, reassign pointer 2 to it
4. When we reach the end, swap the element at pointer 2 with the element at pointer 1
5. Move on to the next index, with pointer 1 pointing at that element
6. Repeat from step 2
The front is sorted first
Sorted Left to Right
O(n^2)
S(1)
Swaps: n
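A Python sketch of the two-pointer description above (index `i` plays the role of pointer 1, `smallest` the role of pointer 2; names are my own):

```python
def selection_sort(a):
    """In-place selection sort: O(n^2) comparisons but only about n swaps."""
    n = len(a)
    for i in range(n):                # pointer 1: the position being filled
        smallest = i                  # pointer 2: smallest value seen so far
        for j in range(i + 1, n):
            if a[j] < a[smallest]:
                smallest = j          # reassign pointer 2 to the smaller value
        a[i], a[smallest] = a[smallest], a[i]  # one swap per pass
    return a
```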
Insertion sort
1. Start from index 1
2. Iterate backwards while the previous element is larger than the current
3. Insert when the previous element is smaller than the current
4. Stop iterating backwards
5. Iterate forwards once to the next element
6. Repeat from step 2
The front is sorted first
Sorts from left to right
The left portion is kept sorted throughout; fast when the input is almost sorted
O(n^2)
S(1)
Swaps: n^2
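The backward-iteration steps can be sketched as follows (a minimal sketch; shifting elements rather than repeated swaps is an implementation choice I've assumed):

```python
def insertion_sort(a):
    """In-place insertion sort: O(n^2) worst case, near O(n) on almost-sorted input."""
    for i in range(1, len(a)):        # start from index 1
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:  # iterate backwards while prev is larger
            a[j + 1] = a[j]           # shift the larger element right
            j -= 1
        a[j + 1] = key                # insert once prev is smaller (or none left)
    return a
```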
Merge sort:
Using recursion, we do wishful thinking: assume we already have a merge function. We sort the left and right halves,
then combine them together.
Visually
1. Split the ADT into half
2. Continue splitting it into half until there's only one element in each half
3. Combine by comparison, sorting each half each time
Algo:
1. Find mid by computing (low + high) / 2
2. In the sort function, check that low < high, sort the right and left halves, then merge them afterwards
3. In the merge function, create two temp copies of the array, left and right, with mid as the border
4. 3 while loops:
- While the left counter and right counter haven't exited their index ranges,
put the smaller element into the final array,
increase the respective counter, ie the left or right index counter
- If the left counter hasn't exited its range and the right has finished,
put all remaining left elements into the array
- If the right counter hasn't exited its range and the left has finished,
put all remaining right elements into the array
O(nlogn)
S(n)
Divide and conquer
No swaps
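The algo above, with its low/mid/high bounds and three while loops, can be sketched in Python (a sketch; the inclusive-bounds convention matches the notes, but the function names are my own):

```python
def merge(a, low, mid, high):
    """Merge the sorted halves a[low..mid] and a[mid+1..high] in place."""
    left = a[low:mid + 1]             # temp copies, with mid as the border
    right = a[mid + 1:high + 1]
    i = j = 0
    k = low
    # While neither counter has exited its range, take the smaller element.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            a[k] = left[i]; i += 1
        else:
            a[k] = right[j]; j += 1
        k += 1
    # Right has finished: copy all remaining left elements.
    while i < len(left):
        a[k] = left[i]; i += 1; k += 1
    # Left has finished: copy all remaining right elements.
    while j < len(right):
        a[k] = right[j]; j += 1; k += 1

def merge_sort(a, low=0, high=None):
    """Recursive merge sort: O(n log n) time, O(n) extra space."""
    if high is None:
        high = len(a) - 1
    if low < high:                    # more than one element in this half
        mid = (low + high) // 2
        merge_sort(a, low, mid)       # sort left half
        merge_sort(a, mid + 1, high)  # sort right half
        merge(a, low, mid, high)      # combine the two sorted halves
    return a
```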
Order of Growth
If the function is f(t) = t^2 + t
we can just take it as t^2, since t is insignificant compared to t^2 for large t
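A quick numeric check of why the lower-order term can be dropped (my own illustration):

```python
# Compare t^2 + t against t^2 alone: the ratio is 1 + 1/t,
# which approaches 1 as t grows, so the t term is insignificant.
ratios = [(t**2 + t) / t**2 for t in (10, 100, 1000)]
print(ratios)
```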
3 ways to express:
Big O
O(g(n)) is a set that contains non-negative functions that are smaller than cg(n) for large n.
If I can find a constant c such that c multiplied by g(n) is bigger than f(n), our original function, for large n, then cg(n) is an upper bound
"Upper bound" for some function fn
f(n) = O(g(n))
where 0<= f(n) <= cg(n)
Question:
Is 2^(n+1) = O(2^n)?
-> Yes
because O(2^n) contains functions bounded by some constant * 2^n
thus if we let the constant = 2, it holds true, as 2^(n+1) is 2^n * 2
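The c = 2 argument above can be checked numerically (a trivial sketch of the bound, my own code):

```python
# Check the claim 2^(n+1) <= c * 2^n with c = 2, over a range of n.
c = 2
ok = all(2 ** (n + 1) <= c * 2 ** n for n in range(64))
print(ok)
```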
This is for worst case scenario
If we say an algo is O(n^2), the worst running time is O(n^2) and the running time will not exceed cn^2 for large n
Worst Case running time for statement x=1
O(1)
Big Omega (EXTRA)
Is a set that contains non-negative functions that are larger than or equal to cg(n) for large n and some constant c.
"Lower bound"
Big theta (EXTRA)
Is a set that contains non-negative functions that are larger than c1*g(n) and smaller than c2*g(n) for large n and some constants c1 and c2.