Prove that the running time of an algorithm is $$\Theta(g(n))$$ if and only if its worst-case running time is $$O(g(n))$$ and its best-case running time is $$\Omega(g(n))$$.

Let $$T(n)$$ denote the running time of the algorithm. If $$T(n) = \Theta(g(n))$$, then by definition there exist positive constants $$c_1$$, $$c_2$$, and $$n_0$$ such that for all $$n \ge n_0$$,

$$0 \le c_1 g(n) \le T(n) \le c_2 g(n)$$

Since $$0 \le T(n) \le c_2 g(n)$$ for all $$n \ge n_0$$, we have $$T(n) = O(g(n))$$, i.e. $$T(n)$$ is upper bounded by $$c_2 g(n)$$. In other words, the worst-case running time of the algorithm is $$O(g(n))$$.

And since $$0 \le c_1 g(n) \le T(n)$$ for all $$n \ge n_0$$, we have $$T(n) = \Omega(g(n))$$, i.e. $$T(n)$$ is lower bounded by $$c_1 g(n)$$. In other words, the best-case running time of the algorithm is $$\Omega(g(n))$$.
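As a concrete sanity check of the sandwich inequality above, the snippet below verifies $$0 \le c_1 g(n) \le T(n) \le c_2 g(n)$$ numerically for a hypothetical running time (the functions and constants $$T(n) = 3n^2 + 5n$$, $$g(n) = n^2$$, $$c_1 = 3$$, $$c_2 = 4$$, $$n_0 = 5$$ are illustrative choices, not part of the exercise):

```python
# Hypothetical example: T(n) = 3n^2 + 5n is Theta(n^2).
def T(n):
    return 3 * n * n + 5 * n

def g(n):
    return n * n

# Witness constants: 3n^2 <= 3n^2 + 5n always, and
# 3n^2 + 5n <= 4n^2 whenever 5n <= n^2, i.e. n >= 5.
c1, c2, n0 = 3, 4, 5

# Check the sandwich inequality on a range of inputs n >= n0.
for n in range(n0, 10_000):
    assert 0 <= c1 * g(n) <= T(n) <= c2 * g(n)
```

Any choice of constants that makes both inequalities hold for all $$n \ge n_0$$ suffices; the particular values are not unique.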

Conversely, if the worst-case running time is $$O(g(n))$$ and the best-case running time is $$\Omega(g(n))$$, then combining the two bounds as in Exercise 3.1-5 shows that the running time is $$\Theta(g(n))$$.
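A brief sketch of the converse direction (the constants $$c_1, c_2, n_1, n_2$$ here are the witnesses supplied by the two asymptotic bounds):

$$
c_1 g(n) \;\le\; T_{\text{best}}(n) \;\le\; T(n) \;\le\; T_{\text{worst}}(n) \;\le\; c_2 g(n)
\qquad \text{for all } n \ge \max(n_1, n_2),
$$

where $$T_{\text{worst}}(n) \le c_2 g(n)$$ holds for $$n \ge n_1$$ by the worst-case bound $$O(g(n))$$, and $$c_1 g(n) \le T_{\text{best}}(n)$$ holds for $$n \ge n_2$$ by the best-case bound $$\Omega(g(n))$$. Since the running time on any input of size $$n$$ lies between the best case and the worst case, taking $$n_0 = \max(n_1, n_2)$$ gives exactly the definition of $$T(n) = \Theta(g(n))$$.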