Articles of probability

Deriving the mean of the Geometric Distribution

I am missing something, possibly trivial, in deriving the mean of the geometric distribution using the expected-value identity $$ \sum_x x \theta (1-\theta)^{x-1}. $$
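For what it's worth, the standard trick is to recognize the sum as the derivative of a geometric series (assuming $X$ counts the trials up to and including the first success): $$\mathbb{E}[X]=\sum_{x=1}^{\infty} x\,\theta(1-\theta)^{x-1}=\theta\left.\frac{d}{dq}\sum_{x=1}^{\infty} q^{x}\right|_{q=1-\theta}=\theta\left.\frac{d}{dq}\frac{q}{1-q}\right|_{q=1-\theta}=\theta\cdot\frac{1}{\theta^{2}}=\frac{1}{\theta}.$$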

Truncated Mean Squared

Suppose that $X_{\sigma} \sim \mathcal{N}(\mu,\sigma^{2})$. I am interested in whether $f(\sigma)=\mathbb{E} (X_{\sigma}^2 1_{\{X_{\sigma}>0\}})$ is monotonic in $\sigma$ for all $\mu$. I ran a Monte Carlo simulation and $f$ does appear to be monotonic, but I cannot prove it rigorously.
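A minimal sketch of this kind of Monte Carlo check, assuming NumPy; the grid of $\mu$ and $\sigma$ values is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def truncated_second_moment(mu, sigma, n=10**6):
    """Monte Carlo estimate of f(sigma) = E[X^2 * 1{X > 0}] for X ~ N(mu, sigma^2)."""
    x = rng.normal(mu, sigma, size=n)
    return np.mean(np.where(x > 0.0, x**2, 0.0))

# Sweep sigma for a few fixed mu values and eyeball monotonicity.
for mu in (-1.0, 0.0, 1.0):
    print(mu, [round(truncated_second_moment(mu, s), 4) for s in (0.5, 1.0, 2.0, 4.0)])
```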

Sum of random subsequence generated by coin tossing

Let $(\pi_1, \pi_2, \ldots)$ be an infinite sequence of real numbers such that $\pi_i > 0$ for all $i$ and $\sum_i \pi_i = 1$. This can be thought of as a probability distribution over the natural numbers. Let $(z_1, z_2, \ldots)$ be a sequence of independent and identically distributed Bernoulli random variables such that $P(z_i = 1) = […]
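The excerpt cuts off before $P(z_i = 1)$ is specified, so purely as an illustration, here is a simulation sketch that assumes a constant success probability $p$ (a hypothetical choice) and geometric weights $\pi_i = 2^{-i}$, and estimates the subsequence sum $\sum_{i:\,z_i=1}\pi_i$:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: constant P(z_i = 1) = p and weights pi_i = 2^{-i},
# which satisfy pi_i > 0 and sum to 1. N truncates the infinite sequence;
# the neglected tail mass is about 2^{-N}.
p, N = 0.5, 60
pi = 0.5 ** np.arange(1, N + 1)

# S = sum of the coin-selected subsequence: sum_{i : z_i = 1} pi_i.
z = rng.random((100_000, N)) < p
samples = (z * pi).sum(axis=1)
print(samples.mean())  # close to p * sum(pi) = 0.5 under these assumptions
```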

Since $\mathbb{E}[X]$ is defined as the integral of $S(x)$ from $0$ to $\infty$, what do you do when the support is $-1<x<1$?

I have the PDF $f(x)=\frac{3}{4}(1-x^2)\mathbf 1_{-1<x<1}$ and accordingly the CDF $$F(x)=\begin{cases}0, &\phantom{-}x\le -1\\\frac{3}{4}x-\frac{1}{4}x^3+\frac12, & -1<x<1 \\1, & \phantom{-}1\le x\end{cases}$$ Since the formula for $\mathbb{E}[X]$ integrates $S(x)=1-F(x)$ from $0$ to $\infty$, how do I account for the fact that the density is only nonzero for $-1<x<1$ when I need to calculate […]
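For reference, the identity that resolves this: for any integrable random variable, $$\mathbb{E}[X]=\int_{0}^{\infty}\bigl(1-F(x)\bigr)\,dx-\int_{-\infty}^{0}F(x)\,dx,$$ and since $F$ here equals $0$ below $-1$ and $1$ above $1$, both integrands vanish outside $(-1,1)$, leaving $\int_{0}^{1}(1-F(x))\,dx-\int_{-1}^{0}F(x)\,dx$ (which is $0$ for this symmetric density).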

Coin with unknown bias flipped N times with N heads, what is p(h)?

Given a coin with an unknown bias and the observation of $N$ heads and $0$ tails, what is the expected probability that the next flip is a head?
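One standard answer, assuming a uniform prior on the bias $\theta$ (Laplace's rule of succession): the posterior after $N$ heads and $0$ tails is proportional to $\theta^{N}$, so $$P(\text{head}\mid\text{data})=\mathbb{E}[\theta\mid\text{data}]=\frac{\int_{0}^{1}\theta\cdot\theta^{N}\,d\theta}{\int_{0}^{1}\theta^{N}\,d\theta}=\frac{N+1}{N+2}.$$ Other priors give other answers, so the question is only pinned down once a prior is chosen.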

Density/probability function of discrete and continuous random variables

I was wondering how the rules of probability apply if we have a discrete random variable $Y$ and a continuous random variable $X$. What would $P(Y=y,X=x)$ be equal to? Is the joint probability defined, or is it a joint density function? (Note: I denote probabilities by $P$ and density functions by $f$.) Is it correct to […]
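One common way to make this precise, sketched without proof: use a mixed joint density $f_{X,Y}(x,y)$ defined by $P(Y=y,\,X\in A)=\int_{A}f_{X,Y}(x,y)\,dx$, so that $P(Y=y)=\int_{-\infty}^{\infty}f_{X,Y}(x,y)\,dx$ and $f_{X}(x)=\sum_{y}f_{X,Y}(x,y)$. The pointwise quantity $P(Y=y,X=x)$ is then $0$ whenever $X$ is continuous; it is the mixed density, not a joint probability, that is the well-defined object.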

Bound on the $Q$ function related to Chernoff bound

For the function $Q(x) := \mathbb{P}(Z>x)$, where $Z \sim \mathcal{N}(0,1)$, \begin{align} Q(x) = \int_{x}^\infty \frac{1}{\sqrt{2\pi}} \exp \left(-\frac{u^2}{2} \right) \text{d}u, \end{align} the following bound is given for $x \geq 0$ in many communication systems textbooks: \begin{align} Q(x) \leq \frac{1}{2} \exp \left(-\frac{x^2}{2} \right). \end{align} The bound without the $\frac{1}{2}$ in front of the exponential can be proven […]
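One route to the $\frac{1}{2}$ factor (possibly not the one the textbooks have in mind) is Craig's formula, $$Q(x)=\frac{1}{\pi}\int_{0}^{\pi/2}\exp\left(-\frac{x^{2}}{2\sin^{2}\theta}\right)\text{d}\theta,\qquad x\ge 0.$$ Since $\sin^{2}\theta\le 1$, the integrand is at most $\exp(-x^{2}/2)$, and the interval has length $\pi/2$, so $Q(x)\le\frac{1}{\pi}\cdot\frac{\pi}{2}\exp(-x^{2}/2)=\frac{1}{2}\exp(-x^{2}/2)$.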

Conditional expectation and almost sure equality

Let $X$, $Y$ be random variables with finite second moments. Suppose $\mathbb{E}(X\mid\sigma (Y))=Y$ and $\mathbb{E}(Y\mid\sigma(X))=X$; show that $\Pr(X=Y)=1$. What I have done is this: I first consider $\mathbb{E}((X-Y)^2)$ by conditioning on $X$ and $Y$. Conditioning on $X$ gives $\mathbb{E}((X-Y)^2\mid X)=\mathbb{E}(X^2\mid X)-2\mathbb{E}[XY\mid X]+\mathbb{E}[Y^2\mid X]=X^2-2X^2+\mathbb{E}(Y^2\mid X)=-X^2+\mathbb{E}[Y^2\mid X]$, and similarly for conditioning on $Y$, but I am not sure how to subtract […]
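One way to finish, sketched: rather than subtracting the conditional identities, take unconditional expectations. By the tower property, $\mathbb{E}[XY]=\mathbb{E}[X\,\mathbb{E}(Y\mid X)]=\mathbb{E}[X^{2}]$, and symmetrically $\mathbb{E}[XY]=\mathbb{E}[Y^{2}]$, so $$\mathbb{E}[(X-Y)^{2}]=\mathbb{E}[X^{2}]-2\,\mathbb{E}[XY]+\mathbb{E}[Y^{2}]=0,$$ which forces $\Pr(X=Y)=1$.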

Prove uniform distribution

For any random variable $X$, there exists a $U(0,1)$ random variable $U_X$ such that $X=F_X^{-1}(U_X)$ almost surely. Proof: In the case that $F_X$ is continuous, using $U_X=F_X(X)$ would suffice. In the general case, the statement is proven by using $U_X=F_X(X^-)+V(F_X(X)-F_X(X^-))$, where $V$ is a $U(0,1)$ random variable independent of $X$ and $F_X(x^-)$ denotes the left […]
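For reference, the conventions this statement relies on (standard, but worth spelling out): $F_X(x^-)=\lim_{t\uparrow x}F_X(t)=P(X<x)$, and $F_X^{-1}$ is the generalized inverse $F_X^{-1}(u)=\inf\{x: F_X(x)\ge u\}$, which keeps the claim meaningful when $F_X$ has jumps or flat stretches.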

Quick way to tell if a set of dice is NOT non-transitive

Is there a quick way to tell if a set of six-sided dice cannot be non-transitive? I've been writing an algorithm, and brute force is taking too long. I had a look at http://math.ku.edu/~jschweig/dice.pdf, but it has a precondition that numbers on a die's face shouldn't repeat on other faces of that die […]
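As a baseline against which to test any quick rejection rule, here is a brute-force sketch in Python; it is illustrative only, checks for a beating cycle through all the dice, and would need to be repeated over subsets to catch cycles among fewer dice:

```python
from itertools import permutations

def beat_prob(a, b):
    """P(die a rolls strictly higher than die b), all faces equally likely."""
    wins = sum(x > y for x in a for y in b)
    return wins / (len(a) * len(b))

def has_beating_cycle(dice):
    """True if some ordering of all the dice forms a cycle in which each
    die beats the next with probability > 1/2."""
    n = len(dice)
    for order in permutations(range(n)):
        if all(beat_prob(dice[order[i]], dice[order[(i + 1) % n]]) > 0.5
               for i in range(n)):
            return True
    return False

# Classic non-transitive trio: each die beats the next with probability 20/36.
dice = [(2, 2, 4, 4, 9, 9), (1, 1, 6, 6, 8, 8), (3, 3, 5, 5, 7, 7)]
print(has_beating_cycle(dice))  # True
```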