This a reformulation of a practical problem I encountered.

Say we have an infinite sequence of i.i.d. random bits, where for each bit $X_i$, $P(X_i=1)=p$.

What is the expected time until we get a sequence of $n$ 1 bits?

Thanks!

There is a large literature on such questions about mean waiting times for patterns. For your particular problem, a solution can be found on page 156 of *Introduction to Probability Models* (10th edition) by Sheldon Ross. The formula is

$$E[T]=1/p+1/p^2+\cdots+1/p^n={(p^{-n}-1)/(1-p)}.$$

As expected, this is a decreasing function of $p$ for fixed $n$: it takes longer to see rarer events. As $p$ goes from 0 to 1, $E[T]$ decreases from infinity to $n$.
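As a sanity check on the closed form (a sketch of my own, not from Ross's book; the helper names are mine), one can compare it against a direct Monte Carlo simulation of the bit sequence:

```python
import random

def time_to_n_ones(p, n, rng):
    """Simulate i.i.d. bits with P(bit = 1) = p and return the first
    time at which n consecutive 1 bits have been seen."""
    run = 0  # current run of consecutive 1s
    t = 0    # number of bits generated so far
    while run < n:
        t += 1
        run = run + 1 if rng.random() < p else 0
    return t

def expected_time(p, n):
    """Closed form E[T] = (p^{-n} - 1) / (1 - p)."""
    return (p**-n - 1) / (1 - p)

rng = random.Random(0)
p, n, trials = 0.5, 3, 200_000
avg = sum(time_to_n_ones(p, n, rng) for _ in range(trials)) / trials
# For p = 0.5, n = 3 the closed form gives 2 + 4 + 8 = 14.
```

With 200,000 trials the sample mean typically lands within about a tenth of the closed-form value 14.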

**Added:** Here is a derivation of the formula in my answer.

Let $T$ be the random variable that records the first time we see $n$ ones in a row, and let the random variable $L$ be the position of the first zero bit in the sequence. Looking at the first $n$ bits there are, roughly speaking, two possibilities: either we get the desired pattern of $n$ ones, or we see a zero bit at some time $k\le n$ and the whole problem starts over.

More formally, conditioning on the value of $L$ we get

\begin{eqnarray*}
E[T] &=& \sum_{k=1}^{n} E[T \ |\ L=k]\ P(L=k) + E[T\ |\ L> n]\ P(L>n)\cr
&=& \sum_{k=1}^{n} (k+E[T])\ P(L=k) + n P(L > n)\cr
&=& \sum_{k=1}^{n} (k+E[T])\ p^{k-1}(1-p) + n p^n.
\end{eqnarray*}

Solving this equation for $E[T]$ gives the formula.
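The back-substitution that solving this equation amounts to can be carried out mechanically. Equivalently (a sketch of my own; the function name is mine), let $e_k$ be the expected additional time starting from a current run of $k$ ones, so that $e_k = 1 + p\,e_{k+1} + (1-p)\,e_0$ with $e_n=0$, and solve for $e_0=E[T]$:

```python
def expected_time_markov(p, n):
    """Solve e_k = 1 + p*e_{k+1} + (1-p)*e_0 with e_n = 0, where e_k is
    the expected additional time from a current run of k ones.
    Write e_k = a_k + b_k*e_0 and back-substitute from k = n down to 0."""
    a, b = 0.0, 0.0          # e_n = 0 + 0*e_0
    for _ in range(n):       # step from e_{k+1} down to e_k
        a, b = 1 + p * a, p * b + (1 - p)
    return a / (1 - b)       # e_0 = a + b*e_0  =>  e_0 = a/(1 - b)

# Agrees with (p^{-n} - 1)/(1 - p); e.g. expected_time_markov(0.5, 3) gives 14.0.
```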

There are many generalizations of this problem and variations on the above proof that use, for instance, Markov chains, martingales, or generating functions. In addition to Ross's book mentioned above, you may like to look at

- Section 8.4 of *Concrete Mathematics* by Graham, Knuth, and Patashnik
- Chapter 14 of *Problems and Snapshots from the World of Probability* by Blom, Holst, and Sandell
- Section XIII.7 of *An Introduction to Probability Theory and Its Applications* by Feller

Here is a generating function approach to this problem.

**Preliminaries**

For $0\le k<n$, let an atom $a_k$ be a sequence of $k$ $1$ bits followed by a $0$ bit, and let the terminal atom $a_n$ be a sequence of $n$ $1$ bits. Let a complete $n$ sequence be a binary sequence which ends at the first subsequence of $n$ consecutive $1$ bits. Any complete $n$ sequence can be written as a unique sequence of atoms followed by a terminal atom. The probability of atom $a_k$ occurring is $p^k(1-p)$ and its length is $k+1$; the probability of the terminal atom occurring is $p^n$ and its length is $n$.

For each atom $a_k$ (including the terminal atom $a_n$), define the monomial $\phi_x(a_k)$ as

$$
\phi_x(a_k)=\left\{\begin{array}{ll}p^k(1-p)x^{k+1}&\text{if }0\le k< n\\p^nx^n&\text{if }k=n\end{array}\right.\tag{1}
$$

and for the concatenation of two binary sequences, $s_1$ and $s_2$

$$
\phi_x(s_1s_2)=\phi_x(s_1)\phi_x(s_2)\tag{2}
$$

The probability of any sequence of atoms $s$ occurring is the coefficient of the monomial $\phi_x(s)$ and the length of $s$ is the exponent of $x$ in $\phi_x(s)$.

**The Generating Function**

Thus, the probability of a complete $n$ sequence with length $k$ coming from the concatenation of $j$ atoms and a terminal atom is the coefficient of $x^k$ in

$$
\begin{align}
&(\phi_x(a_0)+\phi_x(a_1)+\phi_x(a_2)+\dots+\phi_x(a_{n-1}))^j\phi_x(a_n)\\
&=p^nx^n((1-p)x)^j\left(1+px+p^2x^2+\dots+p^{n-1}x^{n-1}\right)^j\\
&=p^nx^n\left((1-p)x\frac{1-p^nx^n}{1-px}\right)^j\tag{3}
\end{align}
$$

Summing over all $j$, the probability of a complete $n$ sequence with length $k$ is the coefficient of $x^k$ in

$$
\begin{align}
f_n(x)
&=\sum_{j=0}^\infty p^nx^n\left((1-p)x\frac{1-p^nx^n}{1-px}\right)^j\\
&=\frac{p^nx^n}{1-(1-p)x\frac{1-p^nx^n}{1-px}}\tag{4}
\end{align}
$$

That is, $f_n(x)$ is the generating function for the probability of a complete $n$ sequence with a given length.

Since $f_n(1)=1$, the probability that a complete $n$ sequence occurs eventually is $1$.
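Formula $(4)$ can also be expanded numerically. Multiplying numerator and denominator by $(1-px)$ gives $f_n(x)=p^nx^n(1-px)/\left(1-x+(1-p)p^nx^{n+1}\right)$, so the coefficients satisfy a short linear recurrence. A sketch (the function name is mine):

```python
def complete_seq_probs(p, n, kmax):
    """Coefficients c_0..c_kmax of f_n(x) = P(x)/Q(x), where
    P(x) = p^n x^n (1 - p x) and Q(x) = 1 - x + (1-p) p^n x^{n+1}.
    c_k is the probability that a complete n sequence has length k."""
    c = (1 - p) * p**n
    coeffs = [0.0] * (kmax + 1)
    for k in range(kmax + 1):
        # numerator contributes p^n at x^n and -p^{n+1} at x^{n+1}
        s = p**n if k == n else (-p**(n + 1) if k == n + 1 else 0.0)
        if k >= 1:
            s += coeffs[k - 1]          # from the -x term of Q
        if k >= n + 1:
            s -= c * coeffs[k - n - 1]  # from the (1-p)p^n x^{n+1} term of Q
        coeffs[k] = s
    return coeffs

# For p = 0.5, n = 2: c_2 = 1/4, c_3 = 1/8, and the c_k sum to 1.
```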

**Expected Length**

$$
f_n(x)=\sum_{k=0}^\infty c_kx^k\tag{5}
$$

where $c_k$ is the probability of a complete $n$ sequence with length $k$. Therefore, $xf_n^\prime(x)$ evaluated at $x=1$ is

$$
\sum_{k=0}^\infty kc_k\tag{6}
$$

which is the expected length of a complete $n$ sequence.

$$
xf_n^\prime(x)=p^nx^n\frac{n\frac{1-x}{1-px}+x\frac{1-p}{1-px}\frac{1-p^nx^n}{1-px}}{\left(1-(1-p)x\frac{1-p^nx^n}{1-px}\right)^2}\tag{7}
$$

Evaluating $(7)$ at $x=1$ yields that the expected length of a complete $n$ sequence is

$$
\mathrm{E}(k)=\frac{1-p^n}{p^n(1-p)}\tag{8}
$$

**Variance**

Using the same idea that was used to get $(6)$, we “take the derivative and multiply by $x$” twice to see that $xf_n^\prime(x)+x^2f_n^{\prime\prime}(x)$ evaluated at $x=1$ is

$$
\sum_{k=0}^\infty k^2c_k\tag{9}
$$

which is the expected square of the length of a complete $n$ sequence.

$$
\begin{align}
xf_n^\prime(x)+x^2f_n^{\prime\prime}(x)
&=\frac{p^nx^n}{(1-px)^3\left(1-(1-p)x\left(\frac{1-p^nx^n}{1-px}\right)\right)^3}\\
&\times\left(
\begin{aligned}
&n^2(1-x)(1-px)\left(1-x-(1-p)xp^nx^n\right)\\
&+2n(1-p)x\left(1-x-(2-x-px)p^nx^n\right)\\
&+(1-p)x(1-p^nx^n)\left(1+x-(1-p)xp^nx^n\right)\\
\end{aligned}
\right)\tag{10}
\end{align}
$$

Evaluating $(10)$ at $x=1$ yields that the expected square of the length of a complete $n$ sequence is

$$
\mathrm{E}(k^2)=\frac{2(1-p^n)}{p^{2n}(1-p)^2}-\frac{1+2n-p^n}{p^n(1-p)}\tag{11}
$$

Combining $(8)$ and $(11)$, we get that the variance of the length of a complete $n$ sequence is

$$
\begin{align}
\mathrm{Var}(k)
&=\mathrm{E}(k^2)-\mathrm{E}(k)^2\\
&=\frac{1-p^{2n+1}}{p^{2n}(1-p)^2}-\frac{2n+1}{p^n(1-p)}\tag{12}
\end{align}
$$
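As a numerical check on $(8)$, $(11)$, and $(12)$ (a sketch of my own; the function name is mine), the mean and variance can also be computed directly from the distribution of the length by dynamic programming over the current run of $1$ bits:

```python
def length_stats(p, n, kmax=2000):
    """Mean and variance of the length of a complete n sequence,
    computed by dynamic programming over the current run of 1 bits.
    kmax truncates the exponentially light tail of the distribution."""
    state = [0.0] * n      # state[r] = P(run == r and no n-run of 1s yet)
    state[0] = 1.0
    mean = second = 0.0
    for k in range(1, kmax + 1):
        hit = p * state[n - 1]         # n-th consecutive 1 arrives at time k
        new = [0.0] * n
        new[0] = (1 - p) * sum(state)  # a 0 bit resets the run
        for r in range(1, n):
            new[r] = p * state[r - 1]  # a 1 bit extends the run
        state = new
        mean += k * hit
        second += k * k * hit
    return mean, second - mean**2

# For p = 0.5, n = 2, formulas (8) and (12) give mean 6 and variance 22.
```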
