**Problem:**

This is Problem 33 of Section 2.6 (Conditioning) in Bertsekas and Tsitsiklis's *Introduction to Probability*.

We are given a coin with probability of heads equal to $p$ and of tails equal to $q$. It is tossed successively and independently until either a head comes up twice in a row or a tail comes up twice in a row. What is the expected number of tosses?

**Solution:**

Let $X$ be a random variable that tracks the number of tosses. In addition, let $H_k, T_k$ be the events that a head or tail, respectively, comes up on the $k$th flip. The solution I am looking at uses the total expectation theorem.

I outline its steps below.

We can partition the sample space into $H_1$ and $T_1$, allowing us to invoke the total expectation theorem. Hence,

$E[X] = pE[X|H_1] + qE[X|T_1]$

Again by the total expectation theorem, we can use another partition to obtain

$E[X|H_1] = pE[X|H_1,H_2] + qE[X|H_1,T_2] = 2p + q(1 + E[X|T_1])$

$E[X|T_1] = qE[X|T_1,T_2] + pE[X|T_1,H_2] = 2q + p(1+E[X|H_1])$

**Stuck:**

The solution then says that combining the two relations above with the fact that $p+q = 1$ gives

$E[X|T_1] = \frac{2+p^2}{1-pq} , E[X|H_1] = \frac{2+q^2}{1-pq}$

I have been staring at this for a while now, but I cannot see the steps of the computation. I'd be very grateful if someone could fill in the details. Thank you.
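For what it's worth, a quick Monte Carlo check (the helper name and the choice $p=0.3$ are arbitrary picks of mine) does agree with the claimed values, so it really is only the algebra I am missing:

```python
import random

def tosses_until_repeat(p, rng, first):
    """Toss a coin with P(heads) = p until two equal outcomes occur in a
    row; the first outcome is forced, so this samples X given H_1 or T_1."""
    prev, n = first, 1
    while True:
        cur = 'H' if rng.random() < p else 'T'
        n += 1
        if cur == prev:
            return n
        prev = cur

p, q = 0.3, 0.7
rng = random.Random(0)
N = 200_000
est_H = sum(tosses_until_repeat(p, rng, 'H') for _ in range(N)) / N
est_T = sum(tosses_until_repeat(p, rng, 'T') for _ in range(N)) / N
print(est_H, (2 + q * q) / (1 - p * q))  # estimate vs. (2+q^2)/(1-pq)
print(est_T, (2 + p * p) / (1 - p * q))  # estimate vs. (2+p^2)/(1-pq)
```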

Let $x=E(X|H_1)$ and $y=E(X|T_1)$. Then we have the two equations

$$x=2p+q+qy\quad\text{ and}\quad y=2q+p+px.$$

Two linear equations in two unknowns.

Solve in any of the familiar ways, for example by substituting $2q+p+px$ for $y$ in the first equation.
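Explicitly: since $p+q=1$ we have $2q+p = q+1$, so

$$x = 2p+q+q(2q+p+px) = 2p+q+q(1+q)+pqx,$$

hence $x(1-pq) = 2p+2q+q^2 = 2+q^2$, i.e. $x = \dfrac{2+q^2}{1-pq}$; substituting back gives $y = \dfrac{2+p^2}{1-pq}$.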

We can also solve it using an absorbing Markov chain, with transient states $\mathrm{I}$ (the initial state), $\mathrm{H}$, $\mathrm{T}$ and absorbing states $\mathrm{HH}$, $\mathrm{TT}$:

\begin{align*}
A = \left(\begin{array}{cccccc}
 & \mathrm{I} & \mathrm{H} & \mathrm{T} & \mathrm{HH} & \mathrm{TT}\\
\mathrm{I} & 0 & p & q & 0 & 0 \\
\mathrm{H} & 0 & 0 & q & p & 0 \\
\mathrm{T} & 0 & p & 0 & 0 & q \\
\mathrm{HH} & 0 & 0 & 0 & 1 & 0 \\
\mathrm{TT} & 0 & 0 & 0 & 0 & 1
\end{array}\right)
\end{align*}

To get the expected number of steps until absorption, take the transient part

\begin{align*}
Q = \left(\begin{array}{rrr}
0 & p & q \\
0 & 0 & q \\
0 & p & 0
\end{array}\right),
\end{align*}

compute the fundamental matrix $(I-Q)^{-1}$, where $I$ is the identity matrix, and sum its first row. This gives

\begin{align*}
\mathbb{E}(X) = \frac{(1+p)(1+q)}{1-p\, q}.
\end{align*}
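A small script can confirm the chain computation by solving $(I-Q)\,t = \mathbf{1}$ exactly; the function name and the sample value $p=\tfrac13$ are my own choices (a sketch, not part of the original solution):

```python
from fractions import Fraction

def expected_tosses(p):
    """Expected tosses until HH or TT: solve (I - Q) t = 1 exactly for
    the transient states (I, H, T); t_I is the sum of the first row of
    the fundamental matrix (I - Q)^{-1}."""
    q = 1 - p
    # Augmented matrix [I - Q | 1], transient states ordered (I, H, T).
    A = [[Fraction(1), -p, -q, Fraction(1)],
         [Fraction(0), Fraction(1), -q, Fraction(1)],
         [Fraction(0), -p, Fraction(1), Fraction(1)]]
    n = 3
    # Forward elimination (the diagonal is 1, so no pivoting is needed).
    for i in range(n):
        for r in range(i + 1, n):
            f = A[r][i] / A[i][i]
            A[r] = [a - f * b for a, b in zip(A[r], A[i])]
    # Back-substitution.
    t = [Fraction(0)] * n
    for i in reversed(range(n)):
        t[i] = (A[i][n] - sum(A[i][j] * t[j] for j in range(i + 1, n))) / A[i][i]
    return t[0]

p = Fraction(1, 3)
q = 1 - p
print(expected_tosses(p), (1 + p) * (1 + q) / (1 - p * q))  # both 20/7
```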

An alternative solution to this problem that avoids using recursion and simultaneous equations is to split all possible terminating sequences into four blocks.

**Block 1:** starts heads, ends with two heads: $pp,\ pqpp,\ pqpqpp,\ \dots$

**Block 2:** starts heads, ends with two tails: $pqq,\ pqpqq,\ pqpqpqq,\ \dots$

**Block 3:** starts tails, ends with two tails: $qq,\ qpqq,\ qpqpqq,\ \dots$

**Block 4:** starts tails, ends with two heads: $qpp,\ qpqpp,\ qpqpqpp,\ \dots$

Blocks 3 and 4 are the same as Blocks 1 and 2 with the roles of $p$ (heads) and $q$ (tails) swapped.

The expectation is the sum of the contributions from the four blocks, so $E[X]$ can then be given by the following sums:

$$ E[X] = \sum_{x=1}^{\infty}2x\, p^{x+1}q^{x-1} + \sum_{x=1}^{\infty} (2x+1)p^{x}q^{x+1} +\sum_{x=1}^{\infty}2x\, q^{x+1}p^{x-1} + \sum_{x=1}^{\infty} (2x+1)q^{x}p^{x+1} $$

The last two sums are the first two with $p$ and $q$ interchanged. The infinite sums can be evaluated with the standard formulas (valid for $|x|<1$)

$$\sum_{k=1}^\infty x^k=\frac{x}{1-x}, \qquad \sum_{k=1}^\infty k x^k=\frac{x}{(1-x)^2}.$$

Honestly, though, I cheated and used Mathematica to get the following closed form. Given $p + q = 1$,

$$E[X]=-1 + \frac{3}{1 + (-1 + p) p}$$
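Since $q = 1-p$, the denominator is $1+(-1+p)p = 1-pq$, so this agrees with the earlier answers: $-1+\frac{3}{1-pq} = \frac{2+pq}{1-pq} = \frac{(1+p)(1+q)}{1-pq}$. A quick numerical cross-check that truncates the four series (plain Python; the function name and the choice $p=0.3$ are mine):

```python
def series_expectation(p, terms=200):
    """Truncate the four block series; terms=200 is far past convergence,
    since the summands decay like (pq)^x."""
    q = 1 - p
    s = 0.0
    for x in range(1, terms + 1):
        s += 2 * x * p ** (x + 1) * q ** (x - 1)   # block 1: starts H, ends HH
        s += (2 * x + 1) * p ** x * q ** (x + 1)   # block 2: starts H, ends TT
        s += 2 * x * q ** (x + 1) * p ** (x - 1)   # block 3: starts T, ends TT
        s += (2 * x + 1) * q ** x * p ** (x + 1)   # block 4: starts T, ends HH
    return s

p = 0.3
print(series_expectation(p), -1 + 3 / (1 - p + p * p))  # both ~2.7975
```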
