More recurrence examples

Give asymptotic upper and lower bounds for \(T(n)\) in each of the following recurrences. Assume that \(T(n)\) is constant for sufficiently small \(n\). Make your bounds as tight as possible, and justify your answers.

  1. \(T(n) = 4T(n/3) + n \lg n\).
  2. \(T(n) = 3T(n/3) + n/\lg n\).
  3. \(T(n) = 4T(n/2) + n^2 \sqrt{n}\).
  4. \(T(n) = 3T(n/3 - 2) + n/2\).
  5. \(T(n) = 2T(n/2) + n/\lg n\).
  6. \(T(n) = T(n/2) + T(n/4) + T(n/8) + n\).
  7. \(T(n) = T(n-1) + 1/n\).
  8. \(T(n) = T(n-2) + \lg n\).
  9. \(T(n) = T(n-2) + 1/ \lg n\).
  10. \(T(n) = \sqrt{n} T(\sqrt{n}) + n\).

This problem is a workout for solving recurrences. We have ten different recurrence relations, each requiring a different technique or careful application of the master theorem. Some fit neatly into one of the three cases of the master theorem, others fall into the “gap” where the theorem doesn’t apply, and some don’t even have the divide-and-conquer structure the master theorem requires.

The key to solving recurrences is pattern recognition. For divide-and-conquer recurrences (those that split the problem into smaller pieces), we apply the master theorem by comparing the work done at each level (\(f(n)\)) against the natural growth rate of the recursion tree (\(n^{\log_b a}\)). For subtractive recurrences (those that decrease by a constant), we often need to expand and sum the series. Let’s work through each one.

A

\[T(n) = 4T(n/3) + n \lg n = \Theta(n^{\log_3 4})\]

Here is how to think about it: we have 4 subproblems, each of size \(n/3\), with \(n \lg n\) work done at each level. Let’s identify the parameters for the master theorem:

\[a=4, b=3, f(n)=n \lg n\]

The “natural” growth rate of the recursion tree is \(n^{\log_b a} = n^{\log_3 4} \approx n^{1.262}\). This tells us how fast the number of subproblems grows as we go down the tree.

Now we need to compare \(f(n) = n \lg n\) against \(n^{1.262}\). Although \(n \lg n\) includes a logarithmic factor, logarithms grow so slowly that they don’t affect the polynomial comparison. The function \(n \lg n\) grows slower than \(n^{1.262}\), which means the work is dominated by the leaves of the recursion tree, not the internal nodes.

More precisely, \(f(n) = O(n^{\log_3 4 - \epsilon})\) for some small \(\epsilon > 0\) (say \(\epsilon = 0.1\)). This is Case 1 of the master theorem.
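As a quick numeric illustration (a sketch, with an assumed base case \(T(1) = 1\)), we can iterate the recurrence at \(n = 3^k\) and watch successive ratios \(T(3^{k+1})/T(3^k)\) settle toward \(3^{\log_3 4} = 4\), the growth rate Case 1 predicts:

```python
import math

# Bottom-up evaluation of T(n) = 4T(n/3) + n lg n at n = 3^k,
# assuming the base case T(1) = 1 (any constant works asymptotically).
def T_values(kmax):
    vals = [1.0]                        # vals[k] = T(3^k)
    for k in range(1, kmax + 1):
        n = 3 ** k
        vals.append(4 * vals[-1] + n * math.log2(n))
    return vals

vals = T_values(20)
ratios = [vals[k + 1] / vals[k] for k in range(10, 20)]
# Case 1 says T(n) = Theta(n^(log_3 4)), so tripling n should roughly
# quadruple T; the ratios approach 4 from above as the root-level work fades.
assert all(4 < r < 4.5 for r in ratios)
assert ratios[-1] < ratios[0]
```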

B

\[T(n) = 3T(n/3) + n/\lg n = \Theta(n \lg \lg n)\]

The master theorem doesn’t apply here because \(f(n) = n/\lg n\) is only logarithmically smaller than \(n^{\log_b a} = n\), not polynomially smaller. We need a recursion tree analysis.

Let \(f(n) = n/\lg n\). Here \(a=3, b=3\) so \(n^{\log_b a} = n\). At level \(i\) (root is \(i=0\)) there are \(3^i\) nodes of size \(n/3^i\). Each node contributes:

\[f\left(\frac{n}{3^i}\right) = \frac{n/3^i}{\lg(n/3^i)} = \frac{n/3^i}{\lg n - i\lg 3}\]

Total work at level \(i\) is:

\[3^i \cdot \frac{n/3^i}{\lg n - i\lg 3} = \frac{n}{\lg n - i\lg 3}\]

Levels go from \(i=0\) to \(L-1\) with \(L = \log_3 n = \lg n / \lg 3\). The total work is:

\[n\sum_{i=0}^{L-1} \frac{1}{\lg n - i\lg 3}\]

Let \(m = \lg n\) and \(c = \lg 3\). We can simplify this sum using a substitution. Note that \(L = m/c\) (since \(L = \lg n/\lg 3\)), which means \(m = Lc\). Now substitute \(j = L - 1 - i\), so as \(i\) ranges from \(0\) to \(L-1\), \(j\) ranges from \(L-1\) down to \(0\). The denominator becomes:

\[m - ic = m - (L-1-j)c = m - Lc + c + jc = Lc - Lc + c + jc = c(j+1)\]

Therefore:

\[\sum_{i=0}^{L-1} \frac{1}{m - ic} = \sum_{j=0}^{L-1} \frac{1}{c(j+1)} = \frac{1}{c}\sum_{k=1}^{L} \frac{1}{k} = \frac{1}{c}H_L\]

where \(H_L = \sum_{k=1}^{L} 1/k\) is the \(L\)-th harmonic number. From equation (A.7) in Appendix A, we know that \(H_L = \Theta(\ln L) = \Theta(\lg L)\). Since \(L = \log_3 n = \lg n/\lg 3\), we have:

\[H_L = \Theta(\lg L) = \Theta(\lg \lg n)\]

Multiplying by the factor of \(n\), the internal levels contribute \(\Theta(n \lg \lg n)\). The leaves contribute \(\Theta(n)\), which is lower order.
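The telescoping step above can be sanity-checked numerically (a sketch, taking \(n = 3^k\) so that \(L = k\)): the level sum collapses to \(H_L/\lg 3\) essentially exactly.

```python
import math

# With m = lg n and c = lg 3, the per-level sum 1/(m - i*c) over
# i = 0..L-1 should equal H_L / c, as derived above.
def level_sum(k):                    # n = 3^k, so L = k levels
    m, c = math.log2(3 ** k), math.log2(3)
    return sum(1 / (m - i * c) for i in range(k))

def harmonic(L):
    return sum(1 / j for j in range(1, L + 1))

k = 20
assert abs(level_sum(k) - harmonic(k) / math.log2(3)) < 1e-9
```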

C

\[T(n) = 4T(n/2) + n^2\sqrt{n} = \Theta(n^2\sqrt{n})\]

We have 4 subproblems of size \(n/2\) each, with \(n^{2.5}\) work at each level. Using the master theorem:

\[a=4, b=2, f(n)=n^{2.5}, n^{\log_b a} = n^2\]

Since \(f(n) = n^{2.5} = \Omega(n^{2+\epsilon})\) for \(\epsilon = 0.5\), the work done at each level grows polynomially faster than the natural growth rate of the tree. This suggests Case 3.

We need to verify the regularity condition: \(af(n/b) \le cf(n)\) for some \(c < 1\). Indeed:

\[4f(n/2) = 4(n/2)^{2.5} = 4 \cdot \frac{n^{2.5}}{2^{2.5}} = \frac{n^{2.5}}{2^{0.5}} = \frac{1}{\sqrt{2}} \cdot n^{2.5} \le cf(n)\]

for \(c = 1/\sqrt{2} \approx 0.71 < 1\). Case 3 applies.
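A quick spot-check (illustrative only): since \(4/2^{2.5} = 2^{-1/2}\), the ratio \(4f(n/2)/f(n)\) is the same constant \(1/\sqrt{2} \approx 0.71\) for every \(n\), so any \(c\) with \(1/\sqrt{2} \le c < 1\) witnesses regularity.

```python
import math

# The regularity ratio 4 f(n/2) / f(n) for f(n) = n^2.5 is independent
# of n: it equals 4 / 2^2.5 = 2^(-1/2).
f = lambda n: n ** 2.5
for n in (8.0, 1024.0, 1e6):
    c = 4 * f(n / 2) / f(n)
    assert abs(c - 1 / math.sqrt(2)) < 1e-12
```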

D

\[T(n) = 3T(n/3-2) + n/2 = \Theta(n \lg n)\]

The \(-2\) term is a low-order perturbation that doesn’t affect the asymptotic behavior. For large \(n\), the recurrence behaves like \(T(n) = 3T(n/3) + n/2\).

Using the master theorem with \(a=3, b=3, f(n)=n/2\):

\[n^{\log_b a} = n^{\log_3 3} = n\]

Since \(f(n) = n/2 = \Theta(n) = \Theta(n^{\log_b a})\), we have Case 2 of the master theorem.
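A rough numeric check (a sketch: floors added, with an assumed base case \(T(n) = 1\) for \(n < 3\)) shows \(T(n)/(n \lg n)\) hovering near \(1/(2\lg 3) \approx 0.32\), consistent with Case 2:

```python
import math
from functools import lru_cache

# T(n) = 3T(n//3 - 2) + n/2 with floors; the -2 perturbation and the
# base-case constant only shift lower-order terms.
@lru_cache(maxsize=None)
def T(n):
    if n < 3:                      # assumed base case (also catches n <= 0)
        return 1
    return 3 * T(n // 3 - 2) + n / 2

for n in (10**4, 10**6, 10**8):
    ratio = T(n) / (n * math.log2(n))
    assert 0.2 < ratio < 0.45      # slowly approaching 1/(2 lg 3) ~ 0.315
```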

E

\[T(n) = 2T(n/2) + n/\lg n = \Theta(n \lg \lg n)\]

This looks similar to merge sort but with less work at each level. The master theorem doesn’t apply because \(f(n) = n/\lg n\) is only logarithmically smaller than \(n^{\log_b a} = n\).

At level \(i\) of the recursion tree, we have \(2^i\) nodes of size \(n/2^i\). Each node does \((n/2^i)/\lg(n/2^i)\) work. Total work at level \(i\) is:

\[2^i \cdot \frac{n/2^i}{\lg(n/2^i)} = \frac{n}{\lg n - i}\]

Summing over all \(\lg n\) levels:

\[\sum_{i=0}^{\lg n - 1} \frac{n}{\lg n - i} = n \sum_{j=1}^{\lg n} \frac{1}{j} = n \cdot H_{\lg n} = \Theta(n \lg \lg n)\]

where \(H_k\) is the \(k\)-th harmonic number.
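In fact, with an assumed base case \(T(2) = 2\), dividing the recurrence by \(2^m\) at \(n = 2^m\) gives \(t_m = t_{m-1} + 1/m\), so \(T(2^m) = 2^m H_m\) exactly, which a short loop confirms (a sketch):

```python
# Evaluate T(n) = 2T(n/2) + n / lg n bottom-up at n = 2^m, with the
# assumed base case T(2) = 2; unrolling predicts T(2^m) = 2^m * H_m.
def T(m):
    t = 2.0                          # T(2^1)
    for j in range(2, m + 1):        # n = 2^j
        t = 2 * t + 2 ** j / j
    return t

def harmonic(m):
    return sum(1 / k for k in range(1, m + 1))

m = 20
assert abs(T(m) - 2 ** m * harmonic(m)) < 1e-6 * 2 ** m
```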

F

\[T(n) = T(n/2) + T(n/4) + T(n/8) + n = \Theta(n)\]

This doesn’t fit the standard master theorem form because we have unequal subproblems. We use a recursion tree analysis.

At the root, we do \(n\) work and spawn three subproblems of sizes \(n/2\), \(n/4\), and \(n/8\). The key insight is that at each level, the total work done decreases by a constant factor. Let’s track the work at each level:

  • Level 0: \(n\)
  • Level 1: \(n/2 + n/4 + n/8 = 7n/8\)

At level 2, each of the three nodes from level 1 spawns its own three children following the same pattern (each splits into halves, quarters, and eighths of itself). The total work at level 2 is \((7/8)\) times the work at level 1, giving us \((7/8)^2 n\). In general, level \(i\) contributes \((7/8)^i n\) work.

The total work across all levels is:

\[\sum_{i=0}^{\infty} \left(\frac{7}{8}\right)^i n = n \sum_{i=0}^{\infty} \left(\frac{7}{8}\right)^i\]

This is a geometric series with ratio \(r = 7/8 < 1\). From equation (A.6) in Appendix A, we know that \(\sum_{k=0}^{\infty} x^k = 1/(1-x)\) for \(|x| < 1\). Therefore:

\[n \sum_{i=0}^{\infty} \left(\frac{7}{8}\right)^i = n \cdot \frac{1}{1 - 7/8} = n \cdot \frac{1}{1/8} = 8n = \Theta(n)\]

We can verify this by substitution. Guess \(T(n) = cn\) for some constant \(c\):

\[cn = n + c(n/2) + c(n/4) + c(n/8) = n + cn(1/2 + 1/4 + 1/8) = n + \frac{7cn}{8}\]

Solving for \(c\): \(cn - \frac{7cn}{8} = n\), which gives \(\frac{cn}{8} = n\), so \(c = 8\).
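With floors and an assumed base case \(T(n) = n\) for \(n \le 1\), a memoized evaluation confirms the sandwich \(n \le T(n) \le 8n\) (a sketch):

```python
from functools import lru_cache

# T(n) = T(n/2) + T(n/4) + T(n/8) + n with integer (floor) divisions.
@lru_cache(maxsize=None)
def T(n):
    if n <= 1:
        return n                   # assumed base case
    return T(n // 2) + T(n // 4) + T(n // 8) + n

# Floors only shrink subproblems, so the c = 8 bound still holds,
# and the root's n work gives the matching lower bound.
for n in (10, 1000, 10**6):
    assert n <= T(n) <= 8 * n
```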

G

\[T(n) = T(n-1) + 1/n = \Theta(\lg n)\]

This is a subtractive recurrence (decreases by 1, not by a fraction), so the master theorem doesn’t apply. We expand directly:

\[T(n) = T(n-1) + \frac{1}{n} = T(n-2) + \frac{1}{n-1} + \frac{1}{n} = \cdots = T(1) + \sum_{i=2}^{n} \frac{1}{i} = \Theta(H_n)\]

where \(H_n\) is the \(n\)-th harmonic number. This famous series grows logarithmically: \(H_n = \Theta(\lg n)\).

Think of it this way: we’re adding smaller and smaller fractions (\(1, 1/2, 1/3, \ldots, 1/n\)); even though there are \(n\) terms, the terms shrink quickly enough that the sum only reaches about \(\ln n\). You can verify this by comparing to the integral \(\int_1^n \frac{1}{x}dx = \ln n\).
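That comparison is easy to check numerically (illustrative, not part of the proof): the gap \(H_n - \ln n\) converges to the Euler–Mascheroni constant \(\gamma \approx 0.5772\).

```python
import math

# H_n stays within a constant of ln n; the gap tends to gamma ~ 0.5772.
def harmonic(n):
    return sum(1 / k for k in range(1, n + 1))

for n in (10**3, 10**5):
    gap = harmonic(n) - math.log(n)
    assert 0.5 < gap < 0.6
```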

H

\[T(n) = T(n-2) + \lg n = \Theta(n \lg n)\]

This is a subtractive recurrence that decreases by 2. Expanding for even \(n\):

\[T(n) = \lg n + \lg(n-2) + \lg(n-4) + \cdots + \lg 2\]

This sum has approximately \(n/2\) terms. We can approximate:

\[\sum_{i=1}^{n/2} \lg(2i) = \sum_{i=1}^{n/2} (\lg 2 + \lg i) = \frac{n}{2}\lg 2 + \lg((n/2)!) = \Theta(n \lg n)\]

Alternatively, the sum equals \(\lg(n \cdot (n-2) \cdot (n-4) \cdots 2) = \lg(n!!)\), where \(n!!\) is the double factorial, and a Stirling-type bound gives \(\lg(n!!) = \Theta(n \lg n)\).
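Numerically (a sketch, ignoring the base-case constant), the sum of \(\lg\) over even values up to \(n\) approaches \(\tfrac{1}{2} n \lg n\):

```python
import math

# Sum lg i over even i in [2, n]; by the Stirling-type estimate above
# this is (1/2) n lg n - Theta(n).
def S(n):                              # n even
    return sum(math.log2(i) for i in range(2, n + 1, 2))

n = 2 ** 20
ratio = S(n) / (n * math.log2(n))
assert 0.4 < ratio < 0.5               # tends to 1/2 from below
```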

I

\[T(n) = T(n-2) + 1/\lg n = \Theta(n/\lg n)\]

This recurrence decreases by 2 each time, so (taking \(n\) even) we sum over roughly \(n/2\) terms:

\[T(n) = \Theta(1) + \sum_{i=4,6,\ldots,n} \frac{1}{\lg i}\]

We can approximate this sum by the integral:

\[\int_2^n \frac{1}{\lg x}dx = \Theta(n/\lg n)\]

This integral has no elementary closed form (it is essentially the logarithmic integral), but applying L’Hôpital’s rule to its ratio with \(n/\lg n\) shows the ratio tends to a constant, which gives the \(\Theta(n/\lg n)\) bound.
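A numeric illustration (a sketch, with the constant base-case term dropped): the even-index sum divided by \(n/\lg n\) stays bounded between constants, as the integral comparison predicts.

```python
import math

# Sum 1/lg i over even i in [4, n]; the integral comparison says this
# is Theta(n / lg n).
def T_approx(n):
    return sum(1 / math.log2(i) for i in range(4, n + 1, 2))

n = 2 ** 20
ratio = T_approx(n) / (n / math.log2(n))
assert 0.5 < ratio < 0.7        # slowly decreasing toward 1/2
```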

J

\[T(n) = \sqrt{n} T(\sqrt{n}) + n = \Theta(n \lg \lg n)\]

This recurrence is unusual because the number of subproblems (\(\sqrt{n}\)) depends on \(n\) itself! The master theorem doesn’t apply. We use a substitution.

Let \(n = 2^m\), so \(m = \lg n\). Then:

\[T(2^m) = 2^{m/2}T(2^{m/2}) + 2^m\]

Define \(S(m) = T(2^m)\):

\[S(m) = 2^{m/2}S(m/2) + 2^m\]

Divide both sides by \(2^m\) and let \(R(m) = S(m)/2^m\):

\[R(m) = R(m/2) + 1\]

This gives \(R(m) = \Theta(\lg m)\), so \(S(m) = \Theta(2^m \lg m)\).

Substituting back \(m = \lg n\):

\[T(n) = T(2^m) = S(m) = \Theta(2^m \lg m) = \Theta(n \lg \lg n)\]

Note the double logarithm! The \(\lg \lg n\) factor grows very slowly, so \(T(n)\) is barely superlinear: faster than \(n\) but much slower than \(n \lg n\).
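The substitution chain can be confirmed exactly (a sketch, with an assumed base case \(T(2) = 2\)): at \(n = 2^{2^k}\), where every square root is an integer, \(R(m) = R(m/2) + 1\) unrolls to \(R = k + 1\), i.e. \(T(n) = n(\lg \lg n + 1)\).

```python
import math

# T(n) = sqrt(n) * T(sqrt(n)) + n evaluated in exact integer arithmetic
# at n = 2^(2^k), where every square root is exact.
def T(n):
    if n <= 2:
        return n                 # assumed base case T(2) = 2
    r = math.isqrt(n)
    return r * T(r) + n

for k in range(1, 5):            # n = 4, 16, 256, 65536
    n = 2 ** (2 ** k)
    assert T(n) == n * (k + 1)   # k = lg lg n
```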