I’ve seen some posts similar to this one, but I feel the problem is just different enough to warrant a separate post. The task at hand is the following. “Give an example of three random variables $X_1$, $X_2$, $X_3$ such that for $k=1,2,3$, $P\{X_k=1\} = P\{X_k = -1\} = \frac{1}{2}$ and […]

If I have $n$ independent, identically distributed uniform $(a,b)$ random variables, why is this true: $$ \max(X_i) \mid \min(X_i) \sim \mathrm{Uniform}(\min(X_i),b) $$ I agree that the probability density function of $\max(X_i) \mid \min(X_i)$ must be non-zero only on the range $[\min(X_i), b]$. I also find it reasonable that the conditional distribution is uniform, but I […]
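For the two-variable case the claim can at least be sanity-checked numerically: conditioned on the minimum, the other observation is uniform on $(\min, b)$. A rough Monte Carlo sketch (illustrative values $a=0$, $b=1$, conditioning window around $m=0.3$ are my own choices):

```python
import random

random.seed(0)
a, b = 0.0, 1.0          # hypothetical endpoints for the Uniform(a, b) sample
m, tol = 0.3, 0.01       # condition on min(X_1, X_2) landing near m

# Rejection sampling: keep max(X_1, X_2) whenever min(X_1, X_2) is near m.
conditioned_maxima = []
while len(conditioned_maxima) < 2000:
    x1, x2 = random.uniform(a, b), random.uniform(a, b)
    if abs(min(x1, x2) - m) < tol:
        conditioned_maxima.append(max(x1, x2))

# If max | min = m is Uniform(m, b), the sample mean should sit near
# (m + b) / 2 = 0.65.
mean = sum(conditioned_maxima) / len(conditioned_maxima)
print(round(mean, 2))
```

With the window shrunk and the sample size grown, the empirical mean settles at $(m+b)/2$, consistent with a uniform conditional law on $(m, b)$ for $n=2$.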

This is part of the proof of the Strong Markov property of Brownian motion given in Schilling’s Brownian Motion. Here $B_t$ is a $d$-dimensional Brownian motion with admissible filtration $\mathscr{F}_t$, and $\sigma$ is an a.s. finite stopping time. Let $t_0=0<t_1<\cdots<t_n$, $\xi_1,\dots,\xi_n\in \mathbb{R}^d$, and $F\in \mathscr{F}_{\sigma +}$. Then $$E\left[e^{i\sum_{j=1}^n \langle \xi_j, B_{\sigma+t_j} - B_{\sigma+t_{j-1}} \rangle} […]

I have the following question: Suppose we have two independent events whose probabilities are $P(A)=0.4$ and $P(B)=0.7$. We are asked to find $P(A \cap B)$ from probability theory. I know that $P(A \cup B)=P(A)+P(B)-P(A \cap B)$. But surely the last term equals zero, so the result should be $P(A)+P(B)$, but it […]
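For context, independence means $P(A\cap B) = P(A)\,P(B)$, not $P(A\cap B)=0$; the latter is disjointness, which is a different (and here incompatible) condition. A one-liner with the question's numbers:

```python
# Independence means P(A ∩ B) = P(A) * P(B); it does NOT mean the events
# are disjoint.  (Disjoint events with positive probabilities are never
# independent, since then P(A ∩ B) = 0 != P(A) * P(B).)
p_a, p_b = 0.4, 0.7
p_intersection = p_a * p_b                # 0.28, not 0
p_union = p_a + p_b - p_intersection      # 0.82
print(round(p_intersection, 2), round(p_union, 2))
```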

Let $(A_n)$ be a sequence of independent events with $\mathbb P(A_n)<1$ and $\mathbb P(\cup_{n=1}^\infty A_n)=1$. Show that $\mathbb P(\limsup A_n)=1$. The problem looks like it is practically asking us to apply the Borel–Cantelli lemma. Yet the suggested solution went differently: via $\prod_{n=1}^\infty \mathbb P( A_n^c)=0$. How can we apply the Borel–Cantelli lemma here? That is, how do we show […]
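The suggested solution and Borel–Cantelli meet through a standard fact about infinite products: for $0 \le p_n < 1$, $\prod_n (1-p_n) = 0$ if and only if $\sum_n p_n = \infty$. A sketch of the bridge, assuming that fact:

```latex
\prod_{n=1}^{\infty} \mathbb{P}(A_n^c)
  \overset{\text{indep.}}{=} \mathbb{P}\Big(\bigcap_{n=1}^{\infty} A_n^c\Big)
  = 1 - \mathbb{P}\Big(\bigcup_{n=1}^{\infty} A_n\Big)
  = 0
\;\Longrightarrow\;
\sum_{n=1}^{\infty} \mathbb{P}(A_n) = \infty
\;\overset{\text{BC II}}{\Longrightarrow}\;
\mathbb{P}\big(\limsup_n A_n\big) = 1.
```

The hypothesis $\mathbb P(A_n)<1$ is what rules out the product vanishing because of a single zero factor, so the divergence of $\sum_n \mathbb P(A_n)$ is forced and the second Borel–Cantelli lemma applies.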

I was wondering what the relation is between the first and second Borel–Cantelli lemmas and Kolmogorov’s zero–one law. The former is about the $\limsup$ of a sequence of events, while the latter is about tail events of a sequence of independent sub-$\sigma$-algebras or of independent random variables. Both have results for the $\limsup$/tail event to […]

For which $N \in \mathbb{N}$ is there a probability distribution such that $\frac{1}{\sum_i X_i} (X_1, \cdots, X_{N+1})$ is uniformly distributed over the $N$-simplex? (Where $X_1, \cdots, X_{N+1}$ are i.i.d. random variables with that distribution.)
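A standard answer, valid for every $N$, is the exponential distribution: if $X_1,\dots,X_{N+1}$ are i.i.d. $\mathrm{Exp}(1)$, the normalized vector is $\mathrm{Dirichlet}(1,\dots,1)$, i.e. uniform on the $N$-simplex. A quick Monte Carlo check of one consequence of that fact, $P(W_1 > t) = (1-t)^N$, for $N=2$:

```python
import random

random.seed(1)
N = 2                    # check the 2-simplex in R^3; any N works the same way
trials = 20000
t = 0.5
count = 0
for _ in range(trials):
    xs = [random.expovariate(1.0) for _ in range(N + 1)]
    total = sum(xs)
    if xs[0] / total > t:        # first coordinate of the normalized vector
        count += 1

# Uniform on the N-simplex means W_1 ~ Beta(1, N), so P(W_1 > t) = (1 - t)^N.
frac = count / trials
print(round(frac, 2))            # should be close to (1 - 0.5)^2 = 0.25
```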

We’re tossing a coin until two heads or two tails in a row occur. The game ended with a tail. What’s the probability that it started with a head? Let’s say we denote the game as a sequence of heads and tails, e.g. $(T_1, H_2, T_3, H_4, H_5)$ is a game that started with a […]
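Before working it out on paper, the conditional probability can be estimated by brute-force simulation (variable names are my own; the analytic answer, summing over the alternating sequences, works out to $1/3$):

```python
import random

random.seed(2)
ended_tail = 0           # games that finished with two tails in a row
head_start = 0           # ... of those, games whose first toss was a head
for _ in range(100000):
    seq = [random.choice("HT")]
    # Keep tossing until the last two tosses agree (HH or TT).
    while len(seq) < 2 or seq[-1] != seq[-2]:
        seq.append(random.choice("HT"))
    if seq[-1] == "T":
        ended_tail += 1
        if seq[0] == "H":
            head_start += 1

p_est = head_start / ended_tail
print(round(p_est, 2))
```

The estimate settles near $1/3$: a game ending in $TT$ that started with $H$ must be the strictly alternating run $HTHT\cdots TT$, which is half as likely as its $T$-started counterparts of each length.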

I am having trouble understanding IID random variables. I’ve tried reading http://scipp.ucsc.edu/~haber/ph116C/iid.pdf, http://www.math.ntu.edu.tw/~hchen/teaching/StatInference/notes/lecture32.pdf, and http://www-inst.eecs.berkeley.edu/%7Ecs70/sp13/notes/n17.sp13.pdf but I don’t get it. Would someone explain in simple terms what IID random variables are and give me an example?
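In simple terms: "identically distributed" means every variable follows the same distribution, and "independent" means knowing some of them tells you nothing about the others. Repeated rolls of a fair die are the canonical example; a tiny illustration:

```python
import random

random.seed(3)
# Ten rolls of a fair die form an i.i.d. sample: every roll has the same
# distribution (uniform on {1,...,6}), and no roll influences any other.
rolls = [random.randint(1, 6) for _ in range(10)]
print(rolls)

# Contrast: dealing cards WITHOUT replacement is identically distributed
# but NOT independent -- seeing one card changes the odds for the next.
```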

Let $Y_0, Y_1, \ldots$ be independent random variables with $P(Y_n = 1) = P(Y_n = -1) = 1/2$ for $n = 0, 1, 2, \ldots$ Define $X_n = Y_0Y_1Y_2\cdots Y_n = \prod_{i=0}^n Y_i$ for $n = 0, 1, 2, \ldots$ It can be shown that $X_0, X_1, X_2, \ldots$ are independent. Define $\mathscr{Y} \doteq \sigma(Y_1, […]
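The claimed independence of the $X_n$ can at least be spot-checked empirically: if $X_0, X_1, X_2$ were independent fair signs, each of the $2^3$ sign patterns would occur with frequency $\approx 1/8$. A small simulation sketch:

```python
import random
from collections import Counter

random.seed(4)
trials = 80000
counts = Counter()
for _ in range(trials):
    y0, y1, y2 = (random.choice([1, -1]) for _ in range(3))
    counts[(y0, y0 * y1, y0 * y1 * y2)] += 1   # (X_0, X_1, X_2)

# If X_0, X_1, X_2 are independent fair signs, each of the 2^3 = 8 sign
# patterns should occur with empirical frequency close to 1/8.
freqs = {pattern: c / trials for pattern, c in counts.items()}
print(all(abs(f - 0.125) < 0.01 for f in freqs.values()))
```

All eight patterns come out near $1/8$, consistent with the stated (provable) independence of the products.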
