Common Running Times

  • Rank asymptotic complexities from smallest to largest.

Here are the common running times in the order of increasing growth rate.

  • O(1) — Constant functions
    • Running time does not depend on the input size.
    • It does not mean it only takes one step!
  • O(log n) — Logarithmic functions
    • Also called sub-linear functions.
    • Runtime grows slower than the size of the problem.
    • Example: Binary Search.
  • O(n) — Linear functions
    • Running time is proportional to input size.
    • Example: Linear search.
  • O(n log n) — Superlinear functions
    • Runtime grows faster than the input size, but only by a logarithmic factor.
    • Example: Merge sort (we'll cover later).
  • O(n^2) — Quadratic functions
    • Running time is proportional to the square of the input size.
    • Example: Bubble/Insertion/Selection sort (we'll cover later).
  • O(n^k) — Polynomial functions
    • k is a constant > 1.
    • O(n^2) is a special case of this.
    • Example: Enumerating all subsets of k nodes among n nodes.
  • O(c^n) — Exponential functions
    • c is a constant > 1.
    • Example: Enumerating all subsets of n items.
  • O(n!) — Factorial functions
    • Example: Generating all permutations or orderings of nn items.
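To make the O(n) vs. O(log n) contrast concrete, here is a minimal sketch of the two search examples named above (function names are illustrative, not from the course):

```python
def linear_search(items, target):
    """Scan every element in order: O(n) comparisons in the worst case."""
    for i, item in enumerate(items):
        if item == target:
            return i
    return -1


def binary_search(sorted_items, target):
    """Halve the search range each step: O(log n) comparisons.

    Requires the input to already be sorted.
    """
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1
```

Note that binary search only pays off because each comparison discards half of the remaining candidates, which is why its runtime is sub-linear.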
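To see where the exponential and factorial entries come from, a quick sketch using Python's itertools (the counting here is standard combinatorics; the code itself is not part of the notes):

```python
from itertools import combinations, permutations

items = ["a", "b", "c", "d"]
n = len(items)

# All subsets of n items: summing C(n, k) over k = 0..n gives 2^n subsets,
# so any algorithm that enumerates them all is Omega(2^n).
subsets = [combo for k in range(n + 1) for combo in combinations(items, k)]
print(len(subsets))  # 2^4 = 16

# All orderings of n items: there are n! permutations,
# so enumerating them all is Omega(n!).
orderings = list(permutations(items))
print(len(orderings))  # 4! = 24
```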

All you need to appreciate is the order of increasing growth rate:

n! \gg 2^n \gg n^3 \gg n^2 \gg n \log n \gg n \gg \log n \gg 1
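One way to convince yourself of this ordering is to plug in a moderate input size and compare the values directly. A small sketch (n = 20 is an arbitrary choice):

```python
import math

n = 20

# Evaluate each common running time at n, listed from slowest- to
# fastest-growing, matching the ordering above read right to left.
growth = {
    "1": 1,
    "log n": math.log2(n),
    "n": n,
    "n log n": n * math.log2(n),
    "n^2": n ** 2,
    "n^3": n ** 3,
    "2^n": 2 ** n,
    "n!": math.factorial(n),
}

for name, value in growth.items():
    print(f"{name:>7}: {value:,.0f}")
```

Even at n = 20 the gap is dramatic: 2^20 is about a million, while 20! already exceeds 10^18.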

There are other (more esoteric) functions that arise in the study of algorithms. However, the functions above suffice to describe the runtime of the basic algorithms used in this course, as well as those you may come across in a coding interview.

The following illustration is taken from bigocheatsheet.com:

Resources

This article contains an explanation of Big-Oh notation, common running times, and a good-looking plot of those runtimes!