# Computational complexity

The computational complexity of an algorithm is the amount of resources (time and memory) required to run it.

## Arithmetic complexity of computations

### Big-O notation

Big-O notation gives an upper bound on the complexity in the *worst* case, helping to quantify performance as the input becomes arbitrarily large.

With *n* the size of the input, common complexities ordered from smallest to largest:

| Notation | Name |
| --- | --- |
| $O(1)$ | Constant time |
| $O(\log(n))$ | Logarithmic time |
| $O(n)$ | Linear time |
| $O(n \log(n))$ | Linearithmic time |
| $O(n^2)$ | Quadratic time |
| $O(n^3)$ | Cubic time |
| $O(b^n)$, $b > 1$ | Exponential time |
| $O(n!)$ | Factorial time |

Popular algorithms:

- $O(\log(n))$ - binary search
- $O(n)$ - linear search
- $O(n \log(n))$ - merge sort
- $O(n^2)$ - selection sort, bubble sort
- $O(n!)$ - finding all permutations of a string
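As a minimal sketch of the logarithmic entry above, binary search runs in $O(\log(n))$ because each comparison halves the remaining search interval (the function name and signature here are illustrative):

```python
def binary_search(items, target):
    """Return the index of target in the sorted list items, or -1 if absent.

    Each iteration halves the search interval, so the loop runs
    at most O(log n) times for an input of size n.
    """
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```

Note that the input must already be sorted; sorting first would dominate the cost at $O(n \log(n))$.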

### Big-Omega notation

Big-Omega notation gives a lower bound on the complexity, describing the *best* case.

Popular algorithms:

- $\Omega(1)$ - linear search, binary search
- $\Omega(n)$ - bubble sort
- $\Omega(n \log(n))$ - merge sort
- $\Omega(n^2)$ - selection sort
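The bubble sort entry above assumes the common early-exit variant: one swap-free pass over already-sorted input stops the algorithm after $\Omega(n)$ work, while reverse-sorted input still costs $O(n^2)$. A minimal sketch (the function name is illustrative):

```python
def bubble_sort(items):
    """Sort the list items in place and return it.

    Best case Omega(n): a single pass over already-sorted input makes
    no swaps, so the early exit fires after one scan.
    Worst case O(n^2): reverse-sorted input needs a full pass per element.
    """
    n = len(items)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:  # no swaps means the list is sorted; stop early
            break
    return items
```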

### Big-Theta notation

Big-Theta notation gives a tight asymptotic bound (both upper and lower) on the complexity, often used to describe the *average* case.

Popular algorithms:

- $\Theta(n \log(n))$ - merge sort
- $\Theta(n^2)$ - selection sort
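Selection sort illustrates why a $\Theta$ bound can hold for every input: it always scans the entire unsorted suffix to find the minimum, so it performs roughly $n^2/2$ comparisons whether the input is sorted, reversed, or random. A minimal sketch (the function name is illustrative):

```python
def selection_sort(items):
    """Sort the list items in place and return it.

    The inner loop always scans the whole unsorted suffix, so the number
    of comparisons is the same for every input of size n: best, average
    and worst case are all Theta(n^2).
    """
    n = len(items)
    for i in range(n - 1):
        min_idx = i
        for j in range(i + 1, n):
            if items[j] < items[min_idx]:
                min_idx = j
        items[i], items[min_idx] = items[min_idx], items[i]
    return items
```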