• Measures how runtime scales with input size
  • Important to note that O() notation is not the whole story
    • It is also important to consider how an algorithm uses memory and interacts with the rest of the system

O(1)

  • Constant runtime

  • Accessing an array by index

  • Accessing a hash table (average case)
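  • A minimal C++ sketch of both constant-time lookups (the container contents are just illustrative):

    #include <iostream>
    #include <string>
    #include <unordered_map>
    #include <vector>

    int main() {
        std::vector<int> arr = {10, 20, 30, 40};
        std::unordered_map<std::string, int> ages = {{"alice", 30}, {"bob", 25}};

        // Index lookup: one address computation, O(1)
        std::cout << arr[2] << "\n";         // prints 30

        // Hash-table lookup: hash the key, probe its bucket, O(1) on average
        std::cout << ages["alice"] << "\n";  // prints 30
    }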

O( log(n) )

  • Logarithmic time

  • Runtime increases slowly as input grows

  • Binary search on a sorted array

  • Performing insert, search, and delete on a balanced binary search tree
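  • A sketch of binary search in C++ (the function name and signature are illustrative); each iteration halves the remaining range, so at most about log2(n) iterations run:

    #include <vector>

    // Returns the index of target in a sorted vector, or -1 if absent
    int binary_search(const std::vector<int>& sorted, int target) {
        int lo = 0, hi = static_cast<int>(sorted.size()) - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;   // halve the search range each pass
            if (sorted[mid] == target) return mid;
            if (sorted[mid] < target)  lo = mid + 1;
            else                       hi = mid - 1;
        }
        return -1;                          // not found
    }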

O(n)

  • Linear time

  • Runtime grows with input as each element is touched once

  • Finding the minimum or maximum element in an unsorted array

  • Checking if an element exists in an unsorted array
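  • A sketch of both linear scans in C++ (the helper names are illustrative); each touches every element at most once:

    #include <vector>

    // Minimum of an unsorted vector; assumes v is non-empty
    int find_min(const std::vector<int>& v) {
        int best = v.front();
        for (int x : v)
            if (x < best) best = x;        // one comparison per element: O(n)
        return best;
    }

    // Membership test; may exit early, but still O(n) in the worst case
    bool contains(const std::vector<int>& v, int target) {
        for (int x : v)
            if (x == target) return true;
        return false;
    }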

O(n log n)

  • Linearithmic time

  • Runtime of the efficient general-purpose sorting algorithms (merge sort, heap sort, and quick sort on average); see the sketch below
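  • A merge sort sketch in C++ as one concrete example: the input is halved about log n times, and each level of recursion does O(n) merge work, giving O(n log n) overall:

    #include <cstddef>
    #include <vector>

    std::vector<int> merge_sort(const std::vector<int>& v) {
        if (v.size() <= 1) return v;                 // base case: already sorted
        auto mid = v.begin() + v.size() / 2;
        std::vector<int> left  = merge_sort(std::vector<int>(v.begin(), mid));
        std::vector<int> right = merge_sort(std::vector<int>(mid, v.end()));

        // Merge the two sorted halves in linear time
        std::vector<int> out;
        out.reserve(v.size());
        std::size_t i = 0, j = 0;
        while (i < left.size() && j < right.size())
            out.push_back(left[i] <= right[j] ? left[i++] : right[j++]);
        while (i < left.size())  out.push_back(left[i++]);
        while (j < right.size()) out.push_back(right[j++]);
        return out;
    }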

O(n²)

  • Quadratic runtime

  • Bubble sort

  • Nested loops
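  • A bubble sort sketch in C++; the two nested loops over the input give roughly n * n comparisons in the worst case:

    #include <cstddef>
    #include <utility>
    #include <vector>

    void bubble_sort(std::vector<int>& v) {
        for (std::size_t i = 0; i + 1 < v.size(); ++i)          // outer pass: O(n)
            for (std::size_t j = 0; j + 1 < v.size() - i; ++j)  // inner pass: O(n)
                if (v[j] > v[j + 1])
                    std::swap(v[j], v[j + 1]);                  // bubble larger values right
    }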

O(n³)

  • Cubic runtime

  • Naïve matrix multiplication

  • Triple nested loops
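  • A sketch of naïve n x n matrix multiplication in C++ (the Matrix alias is illustrative); three nested loops of n iterations each give O(n³) multiply-adds:

    #include <cstddef>
    #include <vector>

    using Matrix = std::vector<std::vector<double>>;

    Matrix multiply(const Matrix& a, const Matrix& b) {
        std::size_t n = a.size();                    // assumes square n x n inputs
        Matrix c(n, std::vector<double>(n, 0.0));
        for (std::size_t i = 0; i < n; ++i)
            for (std::size_t j = 0; j < n; ++j)
                for (std::size_t k = 0; k < n; ++k)
                    c[i][j] += a[i][k] * b[k][j];    // n * n * n multiply-adds
        return c;
    }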

O(2ⁿ)

  • Exponential runtime

  • Naïve recursive algorithms that branch into multiple subcalls per step (sketch below)
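  • One common illustration is naïve recursive Fibonacci: every call spawns two more calls, so the call tree, and therefore the runtime, grows exponentially without memoization:

    // Exponential-time Fibonacci; roughly O(2^n) calls for input n
    long long fib(int n) {
        if (n < 2) return n;               // base cases
        return fib(n - 1) + fib(n - 2);    // two recursive calls per step
    }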

O(n!)

  • Factorial runtime

  • Permutation-generation problems

    • Becomes infeasible even for small input sizes
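  • A C++ sketch that enumerates every permutation of a string (the helper name is illustrative); n distinct characters yield n! orderings, so just producing the output is O(n!):

    #include <algorithm>
    #include <string>
    #include <vector>

    std::vector<std::string> all_permutations(std::string s) {
        std::vector<std::string> out;
        std::sort(s.begin(), s.end());     // start from the smallest ordering
        do {
            out.push_back(s);
        } while (std::next_permutation(s.begin(), s.end()));
        return out;                        // n! strings for n distinct characters
    }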

Array Access

  • Although traversing every element of a 2D array is O(n) in either order (n = total number of elements)
    • Accessing by rows is faster than accessing by columns!
  • Reading row-wise maximises sequential access
    • More cache friendly
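  • A C++ sketch of the two traversal orders (the function names are illustrative); with a vector of vectors each row is stored contiguously, so the row-wise loop walks memory sequentially while the column-wise loop jumps between rows on every step:

    #include <cstddef>
    #include <vector>

    long long sum_row_wise(const std::vector<std::vector<int>>& grid) {
        long long total = 0;
        for (const auto& row : grid)            // sequential, cache-friendly access
            for (int x : row)
                total += x;
        return total;
    }

    long long sum_col_wise(const std::vector<std::vector<int>>& grid) {
        long long total = 0;
        std::size_t cols = grid.empty() ? 0 : grid[0].size();
        for (std::size_t c = 0; c < cols; ++c)            // strided access: one element
            for (std::size_t r = 0; r < grid.size(); ++r) // per row, poor cache reuse
                total += grid[r][c];
        return total;
    }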

Linked list vs Array

  • Traversing either one is linear time
  • Accessing an array is faster
    • Array elements are closer together
    • Linked-list nodes are often scattered in memory
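  • A C++ sketch comparing the two traversals (both O(n)); the vector walks contiguous memory while the list follows a pointer from node to node:

    #include <list>
    #include <vector>

    long long sum_array(const std::vector<int>& v) {
        long long total = 0;
        for (int x : v) total += x;        // contiguous elements, cache friendly
        return total;
    }

    long long sum_linked_list(const std::list<int>& l) {
        long long total = 0;
        for (int x : l) total += x;        // nodes may be scattered across the heap
        return total;
    }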