Show that case 3 of the master theorem is overstated, in the sense that the regularity condition \(af(n/b) \leq cf(n)\) for some constant \(c < 1\) implies that there exists a constant \(\epsilon > 0\) such that \(f(n) = \Omega(n^{\log_b a + \epsilon})\).

Case 3 of the master theorem requires two conditions to hold:

  1. Polynomial-larger condition: \(f(n) = \Omega(n^{\log_b a + \epsilon})\) for some constant \(\epsilon > 0\)
  2. Regularity condition: \(af(n/b) \leq cf(n)\) for some constant \(c < 1\) and all sufficiently large \(n\)

The exercise asks us to prove that condition 2 actually implies condition 1, showing that the regularity condition is the stronger requirement. In other words, if you verify the regularity condition, you automatically get the polynomial-larger condition for free.

The regularity condition says that when we divide the problem by \(b\), create \(a\) subproblems, and sum their costs, the total is at most a constant fraction \(c < 1\) of the original cost \(f(n)\). Think of it like compound interest working in reverse: if each time we shrink the problem by a factor of \(b\), the combined cost of \(a\) subproblems drops by at least a factor of \(1/c > 1\), then \(f(n)\) must be growing very rapidly. This rapid growth is precisely what forces \(f(n)\) to be polynomially larger than \(n^{\log_b a}\).

To see a concrete example, consider \(f(n) = n^2\) with \(a = 2\) and \(b = 2\) (so \(n^{\log_b a} = n\)). The regularity condition becomes: \(2f(n/2) = 2(n/2)^2 = n^2/2 \leq cf(n) = cn^2\). This holds with \(c = 1/2 < 1\). Indeed, \(f(n) = n^2 = \Omega(n^{1+\epsilon})\) for \(\epsilon = 1 > 0\), confirming the polynomial-larger condition.
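This example is easy to spot-check numerically. The sketch below simply plugs the values \(f(n) = n^2\), \(a = 2\), \(b = 2\), \(c = 1/2\) from the paragraph above into the regularity inequality:

```python
# Check the regularity condition a*f(n/b) <= c*f(n) for f(n) = n^2,
# with a = 2, b = 2, c = 1/2, at several problem sizes.

def f(n):
    return n * n

a, b, c = 2, 2, 0.5
for n in [4.0, 16.0, 256.0, 4096.0]:
    # 2 * (n/2)^2 = n^2 / 2, so the inequality holds with equality at c = 1/2
    assert a * f(n / b) <= c * f(n)
```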

Given the regularity condition \(af(n/b) \leq cf(n)\) for some \(c < 1\) and all sufficiently large \(n\), we can rewrite this as:

\[\frac{f(n/b)}{f(n)} \leq \frac{c}{a}\]

Let \(d = c/a < 1/a\) (since \(c < 1\)). Then:

\[f(n/b) \leq d \cdot f(n)\]

Applying this inequality repeatedly (for \(n, n/b, n/b^2, \ldots\)), we get:

\[\begin{align*} f(n/b) &\leq d \cdot f(n) \\ f(n/b^2) &\leq d \cdot f(n/b) \leq d^2 \cdot f(n) \\ f(n/b^3) &\leq d \cdot f(n/b^2) \leq d^3 \cdot f(n) \\ &\vdots \\ f(n/b^k) &\leq d^k \cdot f(n) \end{align*}\]
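The telescoped bound \(f(n/b^k) \leq d^k f(n)\) can also be confirmed with concrete numbers; this sketch reuses the earlier example \(f(n) = n^2\), \(a = b = 2\), \(c = 1/2\), so that \(d = c/a = 1/4\):

```python
# Verify f(n/b^k) <= d^k * f(n) for f(n) = n^2 with a = 2, b = 2, c = 1/2.

def f(n):
    return n * n

a, b, c = 2, 2, 0.5
d = c / a  # d = 1/4, and indeed d < 1/a

n = 1024.0
for k in range(1, 11):
    # for this f the bound is an equality: n^2 / 4^k = (1/4)^k * n^2
    assert f(n / b**k) <= d**k * f(n) + 1e-9
```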

For \(k = \log_b n\) (assuming \(n\) is a power of \(b\)), we have \(n/b^k = 1\). Strictly speaking, the regularity condition is only guaranteed for sufficiently large \(n\), so the iteration should stop once \(n/b^k\) falls below that threshold; doing so merely replaces the constant \(f(1)\) below with another positive constant and does not change the asymptotic conclusion. Iterating all the way down, we get:

\[f(1) \leq d^{\log_b n} \cdot f(n)\]

Therefore:

\[f(n) \geq f(1) \cdot d^{-\log_b n} = f(1) \cdot \left(\frac{1}{d}\right)^{\log_b n}\]

Since \(d = c/a < 1/a\), we have \(1/d > a\). Assuming \(a > 1\), we can write \(1/d = a^{1 + \delta}\) for some \(\delta > 0\) (possible precisely because \(1/d > a\)). (If \(a = 1\), then \(\log_b a = 0\) and the argument is even shorter: \(f(n) \geq f(1)\,(1/d)^{\log_b n} = f(1)\, n^{\log_b(1/d)}\), so \(\epsilon = \log_b(1/d) > 0\) works directly.)

Then:

\[\begin{align*} f(n) &\geq f(1) \cdot (a^{1 + \delta})^{\log_b n} \\ &= f(1) \cdot a^{(1 + \delta) \log_b n} \\ &= f(1) \cdot a^{\log_b n} \cdot a^{\delta \log_b n} \\ &= f(1) \cdot (b^{\log_b a})^{\log_b n} \cdot a^{\delta \log_b n} \\ &= f(1) \cdot n^{\log_b a} \cdot a^{\delta \log_b n} \end{align*}\]

Now, \(a^{\delta \log_b n} = (b^{\log_b a})^{\delta \log_b n} = b^{\delta \log_b a \cdot \log_b n} = n^{\delta \log_b a}\).
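The exponent identity \(a^{\delta \log_b n} = n^{\delta \log_b a}\) used here is easy to confirm numerically; the specific values of \(a\), \(b\), and \(\delta\) below are arbitrary choices for illustration:

```python
import math

# Check a^(delta * log_b n) == n^(delta * log_b a) for arbitrary values.
a, b, delta = 3.0, 2.0, 0.7
for n in [10.0, 100.0, 1e6]:
    lhs = a ** (delta * math.log(n, b))
    rhs = n ** (delta * math.log(a, b))
    assert abs(lhs - rhs) <= 1e-9 * lhs  # equal up to floating-point error
```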

Therefore:

\[f(n) \geq f(1) \cdot n^{\log_b a} \cdot n^{\delta \log_b a} = f(1) \cdot n^{(1 + \delta)\log_b a}\]

Setting \(\epsilon = \delta \log_b a\), which is positive because \(\delta > 0\) and \(\log_b a > 0\) when \(a > 1\), we have:

\[f(n) \geq f(1) \cdot n^{\log_b a + \epsilon}\]

Since \(f(1)\) is a positive constant, this shows that:

\[f(n) = \Omega(n^{\log_b a + \epsilon})\]

for some \(\epsilon > 0\) that depends on \(a, b,\) and \(c\).
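To tie the whole derivation together, \(\epsilon\) can be computed from concrete values of \(a\), \(b\), and \(c\) and the final bound checked directly. This sketch again uses the earlier example \(f(n) = n^2\), \(a = b = 2\), \(c = 1/2\):

```python
import math

# Recover epsilon from a, b, c as in the proof, then check
# f(n) >= f(1) * n^(log_b a + epsilon) for f(n) = n^2, a = 2, b = 2, c = 1/2.
a, b, c = 2, 2, 0.5
d = c / a                            # d = 1/4
delta = math.log(1.0 / d, a) - 1.0   # from 1/d = a^(1 + delta)
eps = delta * math.log(a, b)         # epsilon = delta * log_b a
assert eps > 0

def f(n):
    return n * n

for n in [2.0, 32.0, 1024.0]:
    assert f(n) >= f(1) * n ** (math.log(a, b) + eps) - 1e-6
```

For these values \(\delta = 1\) and \(\epsilon = 1\), matching the observation in the worked example that \(f(n) = n^2 = \Omega(n^{1+\epsilon})\) with \(\epsilon = 1\).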

Why Case 3 is “Overstated”

This proof shows that if a function \(f(n)\) satisfies the regularity condition, it automatically satisfies the polynomial-larger condition. Therefore, stating both conditions in case 3 is somewhat redundant—the regularity condition alone would suffice.

However, the master theorem states both conditions for clarity and pedagogical reasons:

  1. The polynomial-larger condition is easier to check in practice
  2. It makes the structure parallel to cases 1 and 2
  3. It provides intuition about when case 3 applies