Computational complexity
The computational complexity of an algorithm is the amount of resources (time and memory) required to run it.
Big-O notation
Big-O notation gives an upper bound of the complexity in the worst case, helping to quantify performance as the input becomes arbitrarily large.
Here n is the size of the input; complexities are ordered from smallest to largest:

| Notation | Name |
| --- | --- |
| O(1) | Constant time |
| O(log(n)) | Logarithmic time |
| O(n) | Linear time |
| O(n log(n)) | Linearithmic time |
| O(n²) | Quadratic time |
| O(n³) | Cubic time |
| O(bⁿ), b > 1 | Exponential time |
| O(n!) | Factorial time |
Popular algorithms:
- O(log(n)) - binary search (see the sketch after this list)
- O(n) - linear search
- O(n log(n)) - merge sort
- O(n²) - selection sort, bubble sort
- O(n!) - finding all permutations of a string
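To make the difference concrete, here is a minimal Python sketch of the linear search and binary search mentioned above (the function names and sample data are illustrative, not from the source):

```python
def linear_search(items, target):
    """O(n): scan every element until the target is found."""
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

def binary_search(sorted_items, target):
    """O(log(n)): halve the search range on every comparison (input must be sorted)."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(linear_search([4, 2, 7, 1], 7))   # 2
print(binary_search([1, 2, 4, 7], 7))   # 3
```

Doubling the input roughly doubles the work of the linear search, while the binary search only needs one extra comparison.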
Big-Omega notation
Big-Omega notation gives a lower bound on the complexity, i.e. how the algorithm behaves in the best case.
Popular algorithms:
- Ω(1) - linear search, binary search
- Ω(n) - bubble sort (with an early-exit check; sketched below)
- Ω(n log(n)) - merge sort
- Ω(n²) - selection sort
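Bubble sort reaches its Ω(n) best case only when it can stop after a pass with no swaps. A minimal Python sketch of that optimised variant (the function name is illustrative, not from the source):

```python
def bubble_sort(items):
    """Worst case O(n²); best case Ω(n) on already-sorted input,
    because the early-exit flag stops after one swap-free pass."""
    items = list(items)              # work on a copy
    n = len(items)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:              # no swaps => list is sorted, stop early
            break
    return items

print(bubble_sort([3, 1, 2]))        # [1, 2, 3]
```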
Big-Theta notation
Big-Theta notation gives a tight asymptotic bound (both an upper and a lower bound) on the complexity, and is commonly used to describe the average case.
Popular algorithms:
- Θ(n log(n)) - merge sort
- Θ(n²) - selection sort (sketched below)
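Selection sort is Θ(n²) because its nested loops always perform roughly n²/2 comparisons, regardless of the input order. A minimal Python sketch (the function name is illustrative, not from the source):

```python
def selection_sort(items):
    """Θ(n²): the nested loops compare every remaining pair,
    no matter how the input is initially ordered."""
    items = list(items)              # work on a copy
    n = len(items)
    for i in range(n - 1):
        smallest = i
        for j in range(i + 1, n):    # find the minimum of the unsorted tail
            if items[j] < items[smallest]:
                smallest = j
        items[i], items[smallest] = items[smallest], items[i]
    return items

print(selection_sort([5, 3, 4, 1]))  # [1, 3, 4, 5]
```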