Articles of independence

Pairwise Independent Random Variables that aren't Jointly Independent

I’ve seen some posts similar to this one, but I feel that the problem is just different enough to warrant a separate post. The task at hand is the following. “Give an example of three random variables $X_1$, $X_2$, $X_3$ such that for $k=1,2,3$, $P\{X_k=1\} = P\{X_k = -1\} = \frac{1}{2}$ and […]
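For reference, the standard construction takes $X_3 = X_1 X_2$ with $X_1, X_2$ independent signs; this is an assumption about where the excerpted question is headed, not part of the excerpt itself. A quick numerical sanity check of that construction:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
x1 = rng.choice([-1, 1], size=n)
x2 = rng.choice([-1, 1], size=n)
x3 = x1 * x2  # each X_k is +/-1 with probability 1/2, and any two are independent

# pairwise: P(X_i = 1, X_j = 1) should be close to 1/4 = 1/2 * 1/2
for a, b, name in [(x1, x2, "X1,X2"), (x1, x3, "X1,X3"), (x2, x3, "X2,X3")]:
    print(name, np.mean((a == 1) & (b == 1)))

# jointly: P(X1 = 1, X2 = 1, X3 = 1) comes out near 1/4, not the 1/8 that
# joint independence would require, since X3 is determined by X1 and X2
print("all three", np.mean((x1 == 1) & (x2 == 1) & (x3 == 1)))
```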

Distribution of Max(X_i) | Min(X_i), X_i are iid uniform random variables

If I have $n$ independent, identically distributed uniform $(a,b)$ random variables, why is this true: $$ \max(x_i) | \min(x_i) \sim \mathrm{Uniform}(\min(x_i),b) $$ I agree that the probability density function of $\max(x_i) | \min(x_i)$ must be non-zero only in the range $[\min(x_i), b]$. I also find it reasonable that the conditional distribution is uniform, but I […]
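A minimal Monte Carlo sketch to probe the claim empirically, assuming $\mathrm{Uniform}(0,1)$ variables and conditioning on the minimum landing in a narrow band; the band, the choice $n=5$, and the sample size are arbitrary choices, not from the excerpt:

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, n_vars, n_sims = 0.0, 1.0, 5, 10**6

x = rng.uniform(a, b, size=(n_sims, n_vars))
mins, maxs = x.min(axis=1), x.max(axis=1)

# condition on the minimum falling near 0.3 and inspect the maximum
m0, eps = 0.3, 0.005
cond_max = maxs[np.abs(mins - m0) < eps]

# if max | min were Uniform(min, b), these empirical quantiles would be
# roughly evenly spaced between m0 and b
print(np.quantile(cond_max, [0.25, 0.5, 0.75]))
```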

A variant of Kac's theorem for conditional expectations?

This is part of the proof of the strong Markov property of Brownian motion given in Schilling’s Brownian Motion. Here $B_t$ is a $d$-dimensional Brownian motion with admissible filtration $\mathscr{F}_t$, and $\sigma$ is an a.s. finite stopping time. Let $t_0=0<t_1<\cdots<t_n$ and $\xi_1,\dots,\xi_n\in \mathbb{R}^d$ and $F\in \mathscr{F}_{\sigma +}$. Then $$E\left[e^{i\sum_{j=1}^n \langle \xi_j, B_{\sigma+t_j} - B_{\sigma+t_{j-1}} \rangle} […]

union of two independent probabilistic events

I have the following question: suppose we have two independent events with probabilities $P(A)=0.4$ and $P(B)=0.7$, and we are asked to find $P(A \cup B)$. I know that $P(A \cup B)=P(A)+P(B)-P(A \cap B)$. But surely the last term equals zero, so the result should be $P(A)+P(B)$, but it […]
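For reference (a side note, not part of the excerpt): independence means $P(A \cap B) = P(A)P(B)$, not $P(A \cap B) = 0$; the latter is disjointness, a different property. With the given numbers,
$$P(A \cap B) = 0.4 \cdot 0.7 = 0.28, \qquad P(A \cup B) = 0.4 + 0.7 - 0.28 = 0.82.$$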

How can we apply the Borel-Cantelli lemma here?

Let $(A_n)$ be a sequence of independent events with $\mathbb P(A_n)<1$ and $\mathbb P(\cup_{n=1}^\infty A_n)=1$. Show that $\mathbb P(\limsup A_n)=1$. It looks like the problem is practically asking us to apply Borel–Cantelli. Yet the suggested solution took a different route: via $\prod_{n=1}^\infty \mathbb P( A_n^c)=0$. How can we apply the Borel–Cantelli lemma here? I.e., how to show […]
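One way the two routes meet, sketched from the hypotheses in the excerpt: since each $\mathbb P(A_n)<1$, convergence of $\sum_n \mathbb P(A_n)$ would force the infinite product $\prod_n \mathbb P(A_n^c)$ to converge to a strictly positive limit, so
$$\prod_{n=1}^\infty \mathbb P(A_n^c)=0 \;\Longrightarrow\; \sum_{n=1}^\infty \mathbb P(A_n)=\infty \;\Longrightarrow\; \mathbb P(\limsup_n A_n)=1,$$
the last implication being the second Borel–Cantelli lemma, which is where independence enters.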

Relation between Borel–Cantelli lemmas and Kolmogorov's zero-one law

I was wondering: what is the relation between the first and second Borel–Cantelli lemmas and Kolmogorov’s zero-one law? The former are about the limsup of a sequence of events, while the latter is about tail events of a sequence of independent sub-sigma-algebras or of independent random variables. Both have results for the limsup/tail event to […]
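For reference, the bridge between the two: the limsup of a sequence of events is itself a tail event, because for every $m$
$$\limsup_n A_n=\bigcap_{n\ge 1}\bigcup_{k\ge n}A_k=\bigcap_{n\ge m}\bigcup_{k\ge n}A_k\in\sigma(A_m,A_{m+1},\dots),$$
so for independent events Kolmogorov’s zero-one law already forces $\mathbb P(\limsup_n A_n)\in\{0,1\}$, while the Borel–Cantelli lemmas identify which of the two values occurs, in terms of whether $\sum_n \mathbb P(A_n)$ converges.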

Uniform distribution on a simplex via i.i.d. random variables

For which $N \in \mathbb{N}$ is there a probability distribution such that $\frac{1}{\sum_i X_i} (X_1, \cdots, X_{N+1})$ is uniformly distributed over the $N$-simplex? (Here $X_1, \cdots, X_{N+1}$ are i.i.d. random variables with that distribution.)
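One standard candidate, offered as a sketch rather than a complete answer: i.i.d. $\mathrm{Exp}(1)$ variables work for every $N$, because the normalized vector is Dirichlet$(1,\dots,1)$, i.e. uniform on the $N$-simplex. A quick empirical check (the sample size and the choice $N=3$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 3                       # dimension of the simplex (N + 1 coordinates)
n_sims = 10**6

# i.i.d. Exp(1) variables, normalized by their sum
x = rng.exponential(1.0, size=(n_sims, N + 1))
w = x / x.sum(axis=1, keepdims=True)

# under the uniform (Dirichlet(1,...,1)) law each coordinate is Beta(1, N),
# whose quantile function is q -> 1 - (1 - q)**(1/N)
qs = np.array([0.1, 0.25, 0.5, 0.75, 0.9])
print("empirical:", np.quantile(w[:, 0], qs))
print("Beta(1,N):", 1 - (1 - qs) ** (1 / N))
```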

Two tails in a row – what's the probability that the game started with a head?

We’re tossing a coin until two heads or two tails in a row occur. The game ended with a tail. What’s the probability that it started with a head? Let’s say we denote the game as a sequence of heads and tails, e.g. $(T_1, H_2, T_3, H_4, H_5)$ is a game that started with a […]
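A small simulation of the game as described, offered as a numerical sketch rather than an answer: it stops at the first two equal tosses in a row and conditions on the game ending with two tails.

```python
import random

random.seed(0)

def play():
    """Toss a fair coin until two equal outcomes in a row; return the sequence."""
    seq = [random.choice("HT")]
    while True:
        seq.append(random.choice("HT"))
        if seq[-1] == seq[-2]:
            return seq

ended_tail = started_head = 0
for _ in range(10**6):
    seq = play()
    if seq[-1] == "T":            # game ended with two tails in a row
        ended_tail += 1
        started_head += seq[0] == "H"

print(started_head / ended_tail)  # estimate of P(started with H | ended TT)
```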

independent, identically distributed (IID) random variables

I am having trouble understanding IID random variables. I’ve tried reading http://scipp.ucsc.edu/~haber/ph116C/iid.pdf, http://www.math.ntu.edu.tw/~hchen/teaching/StatInference/notes/lecture32.pdf, and http://www-inst.eecs.berkeley.edu/%7Ecs70/sp13/notes/n17.sp13.pdf but I don’t get it. Would someone explain in simple terms what IID random variables are and give me an example?
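A concrete toy example, as a rough illustration: ten rolls of a fair die are i.i.d. (every roll has the same distribution, and no roll influences any other), whereas the running totals of those rolls are neither independent nor identically distributed.

```python
import numpy as np

rng = np.random.default_rng(0)

# i.i.d.: ten rolls of a fair die -- every roll has the same Uniform{1,...,6}
# distribution, and knowing one roll tells you nothing about the others
rolls = rng.integers(1, 7, size=10)

# not i.i.d.: the running totals share the earlier rolls, so they are
# dependent, and their distributions change from step to step
running_totals = np.cumsum(rolls)

print(rolls)
print(running_totals)
```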

Prove $Y_0$ is $\mathscr{L}$-measurable.

Let $Y_0, Y_1, \ldots$ be independent random variables with $P(Y_n = 1) = P(Y_n = -1) = 1/2$ for $n = 0, 1, 2, \ldots$ Define $X_n = Y_0Y_1Y_2\cdots Y_n = \prod_{i=0}^n Y_i$ for $n = 0, 1, 2, \ldots$ It can be shown that $X_0, X_1, X_2, \ldots$ are independent. Define $\mathscr{Y} \doteq \sigma(Y_1, […]
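One observation that is often useful in this setup (a side remark drawn from the stated definitions, not the requested proof): every $Y_i$, and hence every $X_n$, takes values in $\{-1,1\}$, so squares are identically $1$ and the two families determine each other through
$$Y_0 = X_0, \qquad Y_n = X_{n-1}X_n \ \ (n \ge 1), \qquad X_n = X_{n-1}Y_n \ \ (n \ge 1).$$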