With a \(b\)-bit counter, we can ordinarily only count up to \(2^b - 1\). With R. Morris’s probabilistic counting, we can count up to a much larger value at the expense of some loss of precision.

We let a counter value of \(i\) represent a count of \(n_i\) for \(i = 0, 1, \ldots, 2^b - 1\), where the \(n_i\) form an increasing sequence of nonnegative values. We assume that the initial value of the counter is 0, representing a count of \(n_0 = 0\). The \(\textsc{Increment}\) operation works on a counter containing the value \(i\) in a probabilistic manner. If \(i = 2^b - 1\), then the operation reports an overflow error. Otherwise, the \(\textsc{Increment}\) operation increases the counter by 1 with probability \(1/(n_{i+1} - n_i)\), and it leaves the counter unchanged with probability \(1 - 1/(n_{i+1} - n_i)\).

If we select \(n_i = i\) for all \(i \geq 0\), then the counter is an ordinary one. More interesting situations arise if we select, say, \(n_i = 2^{i-1}\) for \(i > 0\) or \(n_i = F_i\) (the \(i\)th Fibonacci number; see Section 3.2).
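To make the mechanism concrete, here is a minimal Python sketch of the \(\textsc{Increment}\) operation for an arbitrary sequence \(n_i\). The names `increment`, `value_of`, and `doubling` are illustrative, not part of the problem statement; the doubling sequence \(n_i = 2^{i-1}\) is used only as an example.

```python
import random

def increment(counter, b, value_of):
    """One probabilistic INCREMENT step for a b-bit counter.

    counter  -- current counter value i
    b        -- number of bits in the counter
    value_of -- function mapping i to the represented count n_i
    Returns the new counter value, or raises on overflow.
    """
    if counter == 2**b - 1:
        raise OverflowError("counter overflow")
    gap = value_of(counter + 1) - value_of(counter)
    # Increase the counter with probability 1/(n_{i+1} - n_i).
    if random.random() < 1.0 / gap:
        return counter + 1
    return counter

# Example sequence: n_0 = 0 and n_i = 2^(i-1) for i > 0.
def doubling(i):
    return 0 if i == 0 else 2 ** (i - 1)
```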

For this problem, assume that \(n_{2^b-1}\) is large enough that the probability of an overflow error is negligible.

  1. Show that the expected value represented by the counter after \(n\) \(\textsc{Increment}\) operations have been performed is exactly \(n\).

  2. The analysis of the variance of the count represented by the counter depends on the sequence of the \(n_i\). Let us consider a simple case: \(n_i = 100i\) for all \(i \geq 0\). Estimate the variance in the value represented by the register after \(n\) \(\textsc{Increment}\) operations have been performed.

A

The key insight is that each \(\textsc{Increment}\) operation increases the expected represented count by exactly 1, regardless of the current counter state. Think of it like a probabilistic odometer where each “click” covers a different distance depending on the current reading, but the expected distance traveled per operation remains constant.

Let \(C_k\) denote the counter value after \(k\) \(\textsc{Increment}\) operations, and let \(V_k = n_{C_k}\) denote the value represented by the counter. We want to show that \(\mathrm{E}[V_n] = n\).

We use indicator random variables. For the \(k\)th \(\textsc{Increment}\) operation (where the counter value before the operation is \(C_{k-1}\)), let \(X_k\) be the indicator random variable for whether the counter increments:

\[X_k = \begin{cases} 1 & \text{if counter increments on operation } k \\ 0 & \text{otherwise} \end{cases}\]

Given that the counter value before operation \(k\) is \(i\), we have:

\[\Pr\{X_k = 1 \mid C_{k-1} = i\} = \frac{1}{n_{i+1} - n_i}\]

The change in represented value when the counter increments from \(i\) to \(i+1\) is \(n_{i+1} - n_i\). Therefore, letting \(\Delta V_k = V_k - V_{k-1}\) denote the change in represented value caused by operation \(k\), the expected change given \(C_{k-1} = i\) is:

\[\begin{align*} \mathrm{E}[\Delta V_k \mid C_{k-1} = i] &= (n_{i+1} - n_i) \cdot \Pr\{X_k = 1 \mid C_{k-1} = i\} \\ &= (n_{i+1} - n_i) \cdot \frac{1}{n_{i+1} - n_i} \\ &= 1 \end{align*}\]

This shows that regardless of the current counter state, each \(\textsc{Increment}\) operation increases the expected represented value by exactly 1. Using the law of total expectation:

\[\mathrm{E}[\Delta V_k] = \sum_{i=0}^{2^b-2} \mathrm{E}[\Delta V_k \mid C_{k-1} = i] \cdot \Pr\{C_{k-1} = i\} = \sum_{i=0}^{2^b-2} 1 \cdot \Pr\{C_{k-1} = i\} = 1\]

Since \(V_0 = n_0 = 0\) and each operation increases the expected value by 1:

\[\mathrm{E}[V_n] = V_0 + \sum_{k=1}^{n} \mathrm{E}[\Delta V_k] = 0 + n \cdot 1 = n\]

Therefore, the expected value represented by the counter after \(n\) \(\textsc{Increment}\) operations is exactly \(n\).
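As a sanity check (not part of the proof), a short Monte Carlo simulation can estimate \(\mathrm{E}[V_n]\); it should come out close to \(n\). This sketch assumes the hypothetical `increment` and `doubling` helpers from the earlier example.

```python
def simulate_mean(n, b=8, value_of=doubling, trials=10_000):
    """Estimate E[V_n] after n INCREMENT operations by simulation."""
    total = 0
    for _ in range(trials):
        c = 0  # counter starts at 0, representing n_0 = 0
        for _ in range(n):
            c = increment(c, b, value_of)
        total += value_of(c)
    return total / trials

# With n = 50, the estimate should be close to 50.
print(simulate_mean(50))
```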

B

For the specific case where \(n_i = 100i\), we can estimate the variance by analyzing how the squared value behaves. The difference between consecutive values is constant: \(n_{i+1} - n_i = 100(i+1) - 100i = 100\) for all \(i\).

By the definition of \(\textsc{Increment}\), when the counter is at value \(i\), the probability of incrementing is:

\[p_i = \frac{1}{n_{i+1} - n_i} = \frac{1}{100}\]

For this particular sequence, the probability of incrementing does not depend on \(i\): each \(\textsc{Increment}\) operation is a Bernoulli trial with success probability \(1/100\).

Let’s track \(\mathrm{E}[V_k^2]\) to compute the variance. After \(k\) operations, if the counter value is \(i\) (so that \(V_k = 100i\)), then after one more \(\textsc{Increment}\):

\[V_{k+1} = \begin{cases} 100(i+1) = V_k + 100 & \text{with probability } 1/100 \\ 100i = V_k & \text{with probability } 99/100 \end{cases}\]

The expected squared value after the \((k+1)\)st operation, given \(V_k\), is:

\[\begin{align*} \mathrm{E}[V_{k+1}^2 \mid V_k] &= \frac{1}{100}(V_k + 100)^2 + \frac{99}{100}V_k^2 \\ &= \frac{1}{100}(V_k^2 + 200V_k + 10000) + \frac{99}{100}V_k^2 \\ &= V_k^2 + 2V_k + 100 \end{align*}\]

Taking expectations and using \(\mathrm{E}[V_k] = k\) from part (a):

\[\mathrm{E}[V_{k+1}^2] = \mathrm{E}[V_k^2] + 2\mathrm{E}[V_k] + 100 = \mathrm{E}[V_k^2] + 2k + 100\]

Starting with \(V_0 = 0\), we have \(\mathrm{E}[V_0^2] = 0\). Summing the recurrence:

\[\begin{align*} \mathrm{E}[V_n^2] &= \sum_{k=0}^{n-1} (2k + 100) \\ &= 2\sum_{k=0}^{n-1} k + 100n \\ &= 2 \cdot \frac{(n-1)n}{2} + 100n \\ &= n^2 - n + 100n \\ &= n^2 + 99n \end{align*}\]

The variance is:

\[\begin{align*} \mathrm{Var}[V_n] &= \mathrm{E}[V_n^2] - (\mathrm{E}[V_n])^2 \\ &= (n^2 + 99n) - n^2 \\ &= 99n \end{align*}\]
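As a cross-check: for \(n_i = 100i\), the counter value \(C_n\) is the number of successes in \(n\) independent Bernoulli trials with success probability \(1/100\), so (ignoring the negligible chance of overflow) \(C_n\) has a binomial distribution and \(V_n = 100\,C_n\). The binomial variance formula gives the same answer:

\[\mathrm{Var}[V_n] = 100^2 \cdot \mathrm{Var}[C_n] = 100^2 \cdot n \cdot \frac{1}{100} \cdot \frac{99}{100} = 99n\]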

Therefore, the variance in the value represented by the counter after \(n\) \(\textsc{Increment}\) operations is \(99n\), and the standard deviation is approximately \(\sqrt{99n} \approx 10\sqrt{n}\).
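A quick simulation, again relying on the hypothetical `increment` helper from the earlier sketch and assuming overflow never occurs, can confirm this estimate for the linear sequence \(n_i = 100i\):

```python
def simulate_variance(n, b=16, trials=20_000):
    """Estimate Var[V_n] for the sequence n_i = 100i by simulation."""
    linear = lambda i: 100 * i
    values = []
    for _ in range(trials):
        c = 0
        for _ in range(n):
            c = increment(c, b, linear)
        values.append(linear(c))
    mean = sum(values) / trials
    # Sample variance of the represented values.
    return sum((v - mean) ** 2 for v in values) / (trials - 1)

# For n = 200 the estimate should be close to 99 * 200 = 19800.
print(simulate_variance(200))
```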