Analysis Tools

The two Goodrich et al. textbooks diverge here, so the reading differs depending on your language track.

The “Seven” Functions

We measure an algorithm by the number of primitive steps it takes to execute on an input of size \(n\), rather than by wall-clock time. This isolates the quality of the algorithm from the speed of the computer and the skill of the programmer.

One of the universal truisms of computer science is that polynomials are “good” and exponentials are “bad.”
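To make the truism concrete, here is a quick numeric sketch. The functions \(n^3\) and \(2^n\) and the search window are illustrative choices, not from the text; the point is only that the exponential eventually overtakes the polynomial for good.

```python
def crossover(poly, expo, limit=100):
    """First n < limit from which expo(m) > poly(m) for every m up to limit."""
    for n in range(1, limit):
        if all(expo(m) > poly(m) for m in range(n, limit)):
            return n
    return None

# Within this window, 2**n permanently overtakes n**3 at n = 10
# (2**10 = 1024 > 1000 = 10**3, and the gap only widens).
print(crossover(lambda n: n**3, lambda n: 2**n))  # → 10
```

The same experiment with any polynomial and any exponential base greater than 1 gives a finite crossover point, which is the content of the truism.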

The “Big-O” Notation

Asymptotic notation


  1. We can justify that the function \(8n - 2\) is \(O(n)\) by finding \(c\) and \(n_0\) such that the definition of \(O(n)\) holds: since \(8n - 2 \le 8n\) for all \(n \ge 1\), the choices \(c = 8\) and \(n_0 = 1\) suffice.

  2. Note that \(5n^4 + 3n^3 + 2n^2 + 4n + 1\) is \(O(n^4)\) because \[5n^4 + 3n^3 + 2n^2 + 4n + 1 \le (5 + 3 + 2 + 4 + 1) n^4 = c\,n^4\] for \(c=15\), when \(n\ge n_0 = 1\). This illustrates the general fact that the degree of a polynomial gives the growth rate of the function, and that terms of lesser degree can be ignored.

  3. \(3\log n + 2\) is \(O(\log n)\) because \(3\log n + 2 \le 5\log n\) for \(n \ge 2\).

  4. This example is fairly sophisticated: \((n + a)^5\) is \(\Theta(n^5)\) for any real constant \(a\), since the binomial expansion \((n+a)^5 = n^5 + 5an^4 + \cdots + a^5\) has leading term \(n^5\).

Note, if \(d(n)\) is \(O(e(n))\) and \(f(n)\) is \(O(g(n))\), then \(d(n) + f(n)\) is \(O(e(n) + g(n))\) and \(d(n)\,f(n)\) is \(O(e(n)\,g(n))\).

Algorithm analysis

Now we make a leap forward in the textbook to analyze a few algorithms familiar from Intro:

Selection sort

  for (\(j\gets 1\); \(j\leq n - 1\); \(j\gets j + 1\))
   \(m\gets j\)
   for (\(i\gets j + 1\); \(i\leq n\); \(i\gets i + 1\))
    if (\(A[i] < A[m]\))
     \(m\gets i\)
   if (\(m\neq j\))
    swap(\(A[j]\), \(A[m]\))
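The pseudocode above uses 1-based indexing; a direct Python translation (a sketch, 0-indexed) is:

```python
def selection_sort(A):
    """In-place selection sort, mirroring the pseudocode (0-indexed)."""
    n = len(A)
    for j in range(n - 1):
        m = j                        # index of smallest item in A[j:] so far
        for i in range(j + 1, n):
            if A[i] < A[m]:
                m = i
        if m != j:
            A[j], A[m] = A[m], A[j]  # swap the minimum into position j
    return A
```

The inner loop runs \(n-1, n-2, \dots, 1\) times, so selection sort performs \(\sum_{k=1}^{n-1} k = n(n-1)/2\) comparisons, which is \(O(n^2)\) regardless of the input order.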


Merge sort

To merge-sort a sequence of items: cut the unsorted sequence in half, recursively sort each half, and merge the two sorted halves into a sorted sequence. The base case is a sequence of zero or one items, which is by definition sorted.
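The steps above can be sketched in Python as follows (a non-destructive version that returns a new list):

```python
def merge_sort(seq):
    """Merge-sort: split in half, sort each half recursively, then merge."""
    if len(seq) <= 1:                 # base case: zero or one items is sorted
        return list(seq)
    mid = len(seq) // 2
    left = merge_sort(seq[:mid])      # recursively sort each half
    right = merge_sort(seq[mid:])
    # merge the two sorted halves into one sorted sequence
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])           # at most one of these two is nonempty
    merged.extend(right[j:])
    return merged
```

The merge step examines each item once, so it costs time proportional to the length of the sequence at that level, which is what the \(b\,n\) term in the recurrence below accounts for.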

Analyze the running time of merge-sort by expanding the recurrence relation:

\[f(n)=\begin{cases} a & n\le 1 \\ 2\,f(n\,/\,2)+b\,n & \text{otherwise}\end{cases}\]
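Expanding the recurrence level by level (assuming, for simplicity, that \(n\) is a power of two):

\[\begin{align} f(n)&=2\,f(n/2)+b\,n\\ &=4\,f(n/4)+2\,b\,n\\ &=8\,f(n/8)+3\,b\,n\\ &\ \ \vdots\\ &=2^i\,f(n/2^i)+i\,b\,n \end{align}\]

The expansion stops when \(n/2^i = 1\), i.e. \(i=\log_2 n\), at which point \(f(n)=a\,n+b\,n\log_2 n\), which is \(O(n\log n)\).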


Quick sort

To quick-sort a sequence of items: pick an item \(x\) from the unsorted sequence. Copy all items less than \(x\) to a sequence \(L\), all items equal to \(x\) to a sequence \(E\), and all items greater than \(x\) to a sequence \(G\). Recursively sort \(L\) and \(G\), then concatenate the sorted \(L\), \(E\), and \(G\), giving a sorted sequence. The base case is a sequence of zero or one items.
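A Python sketch of the \(L\)/\(E\)/\(G\) partition scheme described above. The text does not specify how \(x\) is chosen; taking the first item is an illustrative assumption here, and it is exactly the choice that makes already-sorted input the worst case.

```python
def quick_sort(seq):
    """Three-way quick-sort using the L/E/G partition described above."""
    if len(seq) <= 1:                  # base case: zero or one items
        return list(seq)
    x = seq[0]                         # pivot choice (assumption: first item)
    L = [v for v in seq if v < x]      # items less than the pivot
    E = [v for v in seq if v == x]     # items equal to the pivot
    G = [v for v in seq if v > x]      # items greater than the pivot
    return quick_sort(L) + E + quick_sort(G)
```

Each level of recursion examines every remaining item once during partitioning, which is the fact the level-by-level count below relies on.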

Analyze the best-case running time of quick-sort by noting the number of steps required at each level of the recursion:

\[\begin{align} s(0)&=n\\ s(1)&=n-1\\ s(2)&=n-1-2\\ s(3)&=n-1-2-4\\ s(i)&=n-(2^i-1) \end{align}\]

This can proceed only until \(2^i-1=n\), i.e. \(i=\log_2(n+1)\). The total work is then \[\sum_{i=0}^{\log_2(n+1)} \bigl(n-2^i+1\bigr),\] which is \(O(n\log n)\), since the sum has \(O(\log n)\) terms, each at most \(n\).