Expected value of two successive heads or tails (stuck on computation)

Problem:
This is Problem 33 of Section 2.6 (conditioning) in the Bertsekas and Tsitsiklis introductory probability text.

We are given a coin with probability of heads equal to $p$ and probability of tails equal to $q$; it is tossed successively and independently until either a head comes up twice in a row or a tail comes up twice in a row. What is the expected value of the number of tosses?

Solution:

Let $X$ be a random variable that tracks the number of tosses. In addition, let $H_k$ and $T_k$ be the events that a head or a tail, respectively, comes up on the $k$th flip. The solution I am looking at uses the total expectation theorem.

The solution proceeds in the following steps.

We can partition the sample space into $H_1$ and $T_1$, allowing us to invoke the total expectation theorem. Hence,

$E[X] = pE[X|H_1] + qE[X|T_1]$

Again by the total expectation theorem, conditioning on the second toss gives

$E[X|H_1] = pE[X|H_1,H_2] + qE[X|H_1,T_2] = 2p + q(1 + E[X|T_1])$

$E[X|T_1] = qE[X|T_1,T_2] + pE[X|T_1,H_2] = 2q + p(1+E[X|H_1])$

Stuck:
The solution then says that we can combine the two relations above, using the fact that $p+q = 1$, to obtain

$E[X|T_1] = \frac{2+p^2}{1-pq} , E[X|H_1] = \frac{2+q^2}{1-pq}$

I have been staring at this for a while now, but I cannot seem to see the steps of the computation. I’d be very grateful if someone could fill in the details regarding the computation. Thank you.

Answers:

Let $x=E(X|H_1)$ and $y=E(X|T_1)$. Then we have the two equations
$$x=2p+q+qy\quad\text{ and}\quad y=2q+p+px.$$
Two linear equations in two unknowns.

Solve in any of the familiar ways, for example by substituting $2q+p+px$ for $y$ in the first equation.
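Explicitly, using $p+q=1$ we can rewrite $y = 2q+p+px = 1+q+px$, and the substitution works out as
$$x = 2p+q+qy = 1+p+q(1+q+px) = 1+p+q+q^2+pq\,x = 2+q^2+pq\,x,$$
so $(1-pq)\,x = 2+q^2$, i.e. $x = \dfrac{2+q^2}{1-pq}$, and by the symmetric computation (swap $p \leftrightarrow q$) $y = \dfrac{2+p^2}{1-pq}$.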

We can also solve it using an absorbing Markov chain:

\begin{align*}
A = \left(\begin{array}{cccccc}
& \mathrm{I} & \mathrm{H} & \mathrm{T} & \mathrm{HH} & \mathrm{TT}\\
\mathrm{I} & 0 & p & q & 0 & 0 \\
\mathrm{H} & 0 & 0 & q & p & 0 \\
\mathrm{T} & 0 & p & 0 & 0 & q \\
\mathrm{HH} & 0 & 0 & 0 & 1 & 0 \\
\mathrm{TT} & 0 & 0 & 0 & 0 & 1
\end{array}\right)
\end{align*}

‘I’ is the initial state; HH and TT are the absorbing states.

To get the expected number of steps until absorption, take the transient part of the transition matrix:

\begin{align*}
Q &= \left(\begin{array}{rrr}
0 & p & q \\
0 & 0 & q \\
0 & p & 0
\end{array}\right)
\end{align*}
Then compute the fundamental matrix $(I-Q)^{-1}$, where $I$ is the identity matrix, and sum its first row (the row for state I), giving:
\begin{align*}
\mathbb{E}(X) &= \frac{(1+p)(1+q)}{1-p\, q}
\end{align*}
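The fundamental-matrix computation can be checked numerically. A minimal sketch (the value $p = 0.3$ is an arbitrary choice for illustration):

```python
import numpy as np

p = 0.3
q = 1.0 - p

# Transient part Q of the transition matrix, states ordered I, H, T.
Q = np.array([[0.0, p,   q],
              [0.0, 0.0, q],
              [0.0, p,   0.0]])

# Fundamental matrix N = (I - Q)^{-1}; row i sums to the expected number
# of steps to absorption starting from transient state i.
N = np.linalg.inv(np.eye(3) - Q)
expected_tosses = N[0].sum()  # start in state I

print(expected_tosses)
print((1 + p) * (1 + q) / (1 - p * q))  # closed form above
```

Both printed values agree, since the row sum of the fundamental matrix starting from I is exactly $\frac{(1+p)(1+q)}{1-pq}$.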

An alternative solution that avoids recursion and simultaneous equations is to split all possible terminating sequences into four blocks.

Block 1 (starts Heads, ends with two Heads): pp, pqpp, pqpqpp, …

Block 2 (starts Heads, ends with two Tails): pqq, pqpqq, pqpqpqq, …

Block 3 (starts Tails, ends with two Tails) and Block 4 (starts Tails, ends with two Heads) are the same except they start with q/tails: qq, qpqq, … and qpp, qpqpp, …

The expectation is the sum of the contributions of the four blocks, so $E[X]$ is given by the following sums:
$$ E[X] = \sum_{x=1}^{\infty}2x\ p^{x+1}q^{x-1} + \sum_{x=1}^{\infty} (2x+1)p^{x}q^{x+1} +\sum_{x=1}^{\infty}2x\ q^{x+1}p^{x-1} + \sum_{x=1}^{\infty} (2x+1)q^{x}p^{x+1}
$$

The last two sums are just the first two with $p$ and $q$ interchanged. You can compute the infinite sums using the following two formulas, valid for $|x|<1$:
$$\sum_{k=1}^\infty x^k=\frac{x}{1-x}$$
$$\sum_{k=1}^\infty k\,x^k=\frac{x}{(1-x)^2}$$
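As a sanity check, the four series can be truncated numerically and compared against the closed form $(1+p)(1+q)/(1-pq)$ from the Markov-chain answer. A sketch, with $p = 0.3$ chosen arbitrarily; 200 terms is far more than enough since the terms decay like $(pq)^x$:

```python
p = 0.3
q = 1.0 - p

total = 0.0
for x in range(1, 201):
    total += 2 * x * p ** (x + 1) * q ** (x - 1)  # block 1: H...HH, length 2x
    total += (2 * x + 1) * p ** x * q ** (x + 1)  # block 2: H...TT, length 2x+1
    total += 2 * x * q ** (x + 1) * p ** (x - 1)  # block 3: T...TT, length 2x
    total += (2 * x + 1) * q ** x * p ** (x + 1)  # block 4: T...HH, length 2x+1

print(total)                             # expected number of tosses
print((1 + p) * (1 + q) / (1 - p * q))  # closed form from the Markov chain
```

The two printed values agree to machine precision.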

Honestly, though, I cheated and used Mathematica to get the following formula, given $p + q = 1$:
$$E[X]=-1 + \frac{3}{1 + (-1 + p) p}$$
Since $1+(-1+p)p = 1-p(1-p) = 1-pq$, this simplifies to $\frac{2+pq}{1-pq} = \frac{(1+p)(1+q)}{1-pq}$, agreeing with the Markov-chain answer.
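A quick Monte Carlo simulation also supports the formula. A sketch, with $p = 0.3$ and the sample count chosen arbitrarily:

```python
import random

def tosses_until_double(p, rng):
    """Toss a p-biased coin until the same face appears twice in a row."""
    prev = None
    count = 0
    while True:
        face = 'H' if rng.random() < p else 'T'
        count += 1
        if face == prev:
            return count
        prev = face

p = 0.3
rng = random.Random(0)  # fixed seed for reproducibility
n = 200_000
estimate = sum(tosses_until_double(p, rng) for _ in range(n)) / n

print(estimate)                      # Monte Carlo estimate, roughly 2.80
print(-1 + 3 / (1 + (-1 + p) * p))  # closed form above
```

With 200,000 trials the estimate lands within a few thousandths of the exact value $-1 + 3/(1 - p + p^2) \approx 2.797$ for $p = 0.3$.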