Computational complexity

The computational complexity of an algorithm is the amount of resources (time and memory) required to run it.

Big-O notation

Big-O notation gives an upper bound of the complexity in the worst case, helping to quantify performance as the input becomes arbitrarily large.

Where n is the size of the input, the common complexity classes, ordered from smallest to largest, are:

Notation        Name
O(1)            Constant time
O(log(n))       Logarithmic time
O(n)            Linear time
O(n log(n))     Linearithmic time
O(n^2)          Quadratic time
O(n^3)          Cubic time
O(b^n), b > 1   Exponential time
O(n!)           Factorial time
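To get a feel for how quickly these classes diverge, a minimal sketch (the function names and chosen input sizes are illustrative, not from the original):

```python
import math

def growth_table(ns):
    """Evaluate each complexity class at the given input sizes n."""
    rows = []
    for n in ns:
        rows.append({
            "n": n,
            "log n": math.log2(n),
            "n log n": n * math.log2(n),
            "n^2": n ** 2,
            "2^n": 2 ** n,
        })
    return rows

# Even at modest n, exponential growth dwarfs the polynomial classes.
for row in growth_table([8, 16, 32]):
    print(row)
```

At n = 32, n^2 is 1,024 while 2^n is already over four billion, which is why exponential algorithms are considered intractable for all but tiny inputs.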

Popular algorithms:

  • O(log(n)) - binary search
  • O(n) - linear search
  • O(n log(n)) - merge sort
  • O(n^2) - selection sort, bubble sort
  • O(n!) - finding all permutations of a string
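The gap between the first two entries can be sketched directly; a minimal comparison of linear search, which scans every element, and binary search, which halves a sorted range each step (names and sample data are illustrative):

```python
def linear_search(items, target):
    """O(n): scan elements one by one until the target is found."""
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1

def binary_search(items, target):
    """O(log(n)): halve the sorted search range on each iteration."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = list(range(0, 100, 2))   # sorted input: binary search requires it
print(linear_search(data, 42))  # index 21, found after 22 comparisons
print(binary_search(data, 42))  # index 21, found in at most ~log2(50) steps
```

Note the precondition: binary search is only O(log(n)) because the input is sorted, which lets it discard half the remaining range per comparison.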

Big-Omega notation

Big-Omega notation gives a lower bound of the complexity in the best case.

Popular algorithms:

  • Ω(1) - linear search, binary search
  • Ω(n) - bubble sort
  • Ω(n^2) - selection sort
  • Ω(n log(n)) - merge sort
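The Ω(n) entry for bubble sort assumes the early-exit variant: if a full pass makes no swaps, the array is already sorted and the algorithm stops after one linear pass. A minimal sketch (the comparison counter is added here for illustration):

```python
def bubble_sort(items):
    """Bubble sort with early exit: Omega(n) on already-sorted input
    (one pass, no swaps), O(n^2) in the worst case."""
    items = list(items)
    n = len(items)
    comparisons = 0
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            comparisons += 1
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:
            break  # no swaps in this pass: the array is sorted
    return items, comparisons

print(bubble_sort([1, 2, 3, 4, 5]))  # best case: 4 comparisons, one pass
print(bubble_sort([5, 4, 3, 2, 1]))  # worst case: 10 comparisons (n(n-1)/2)
```

Selection sort, by contrast, always scans the entire unsorted remainder to find each minimum, so its best case is still Ω(n^2).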

Big-Theta notation

Big-Theta notation gives a tight asymptotic bound (both an upper and a lower bound) of the complexity; it is commonly quoted for the average case.

Popular algorithms:

  • Θ(n log(n)) - merge sort
  • Θ(n^2) - selection sort
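Merge sort earns a Θ bound because its work is the same in every case: the array is always split in half (log n levels) and merging each level is linear. A minimal sketch of the recursive version:

```python
def merge_sort(items):
    """Merge sort: Theta(n log(n)) in best, average, and worst case,
    since the split and merge pattern never depends on the data."""
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge the two sorted halves in linear time.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```

Because upper and lower bounds coincide, O, Ω, and Θ all give the same n log(n) answer for merge sort, which is exactly when Θ notation applies.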