We have a random vector $X=(X_j)_{j=1,\ldots,n}$, whose components $X_j$ are mutually independent. We build a new random vector $Y=AX+b$, with $Y=(Y_k)_{k=1,\ldots,m}$, $A=(a_{k,j})_{k=1,\ldots,m;\,j=1,\ldots,n}$, and $b=(b_k)_{k=1,\ldots,m}$.

When are the components of $Y$ still mutually independent random variables? How can this be proved? I suppose it is related to the rank of $A$, but I'm not sure, and I cannot find a way to prove anything without making special assumptions about the law of $X$.

I know that $U$ and $V$ are independent iff $f(U)$ and $g(V)$ are independent for every pair of measurable functions $f$ and $g$. This can help, I think, when, for example, $Y_1=X_1+X_2$ and $Y_2=X_3+X_4$.

But what about the case $Y_1 = X_1 + X_2$ and $Y_2 = X_1 - X_2$? Or the case $Y_1 = X_1 + X_2$ and $Y_2 = X_5$?

None of the texts I have searched discusses this topic seriously; they all leave it, in a certain way, to “intuition”.

Thanks

If $A = a_1 \oplus a_2 \oplus \cdots \oplus a_m$, for $m \leq n$, where $a_i$ are *row* vectors of dimension $n_i$ such that $\sum_{i=1}^m n_i = n$ and $\oplus$ denotes the direct sum, then the random vector $Y$ has independent coordinates.

This is not hard to see since $Y_1$ is measurable with respect to $\sigma(X_1, \ldots, X_{n_1})$, $Y_2$ is measurable with respect to $\sigma(X_{n_1+1}, \ldots, X_{n_1+n_2})$, etc., and these $\sigma$-algebras are independent since the $X_i$ are independent (essentially, *by definition*).
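
For instance, with $n = 4$, $m = 2$, and $n_1 = n_2 = 2$, taking $a_1 = (1\ \ 1)$ and $a_2 = (1\ \ {-1})$ gives
$$A = a_1 \oplus a_2 = \begin{pmatrix} 1 & 1 & 0 & 0 \\ 0 & 0 & 1 & -1 \end{pmatrix}, \qquad Y_1 = X_1 + X_2, \quad Y_2 = X_3 - X_4,$$
and $Y_1, Y_2$ are independent because they are built from disjoint groups of the $X_i$.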

Obviously, this result still holds if we consider matrices that are column permutations of the matrix $A$ described above. Indeed, as we see below, in the case where the distribution of each $X_i$ is non-normal (though perhaps depending on the index $i$), this is essentially the *only* form that $A$ can take for the desired result to hold.

In the normal-distribution case, as long as $A A^T = D$ for some diagonal matrix $D$, the coordinates of $Y$ are independent. This is easily checked with the moment-generating function.
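
Here is a sketch of that check under the simplifying assumption that $X \sim N(0, I_n)$ (the general diagonal-covariance case is the same with $A \Sigma A^T$ in place of $A A^T$). Writing $D = \operatorname{diag}(d_1, \ldots, d_m)$,
$$M_Y(t) = \mathbb{E}\, e^{t^T (AX + b)} = e^{t^T b} M_X(A^T t) = \exp\!\Big(t^T b + \tfrac{1}{2}\, t^T A A^T t\Big) = \prod_{k=1}^m \exp\!\Big(b_k t_k + \tfrac{1}{2} d_k t_k^2\Big),$$
so the joint MGF factors into the product of the marginal MGFs and the $Y_k$ are independent.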

Suppose $X_1$ and $X_2$ are iid with finite variance. If $X_1 + X_2$ is independent of $X_1 - X_2$, then $X_1$ and $X_2$ are normally distributed random variables. See here. This result is known as **Bernstein’s theorem** and can be generalized (see below). A proof can be found in Feller or here (Chapter 5).
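
As a quick numerical illustration of the contrapositive (not a proof), here is a minimal sketch, assuming only NumPy, with $X_1, X_2$ iid exponential: the sum and difference are uncorrelated but visibly dependent.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6
x1, x2 = rng.exponential(size=(2, n))  # iid, non-normal
y1, y2 = x1 + x2, x1 - x2

# Equal variances make Y1 and Y2 uncorrelated...
print(np.corrcoef(y1, y2)[0, 1])     # approximately 0
# ...but they are not independent: a nonlinear function of Y2
# still carries information about Y1.
print(np.corrcoef(y1, y2**2)[0, 1])  # clearly nonzero
```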

In the case where $A$ cannot be written as a direct sum of row vectors, you can always cook up a distribution for $X$ such that $Y$ does not have independent coordinates. Indeed, we have

**Theorem (Lukacs and King, 1954):** Let $X_1, X_2, \cdots, X_n$ be $n$ independently (but not necessarily identically) distributed random variables with variances $\sigma_i^2$, and assume that the $n$th moment of each $X_i$ ($i = 1, 2, \cdots, n$) exists. The necessary and sufficient conditions for the existence of two statistically independent linear forms $Y_1 = \sum^n_{i=1} a_i X_i$ and $Y_2 = \sum^n_{i=1} b_i X_i$ are

- Each random variable which has a nonzero coefficient in both forms is normally distributed, and
- $\sum^n_{i=1} a_i b_i \sigma^2_i = 0$.
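
Applied to the example from the question, $Y_1 = X_1 + X_2$ and $Y_2 = X_1 - X_2$: both variables have nonzero coefficients in both forms, so the first condition forces $X_1$ and $X_2$ to be normal, and the second reads
$$\sum_{i=1}^{2} a_i b_i \sigma_i^2 = (1)(1)\,\sigma_1^2 + (1)(-1)\,\sigma_2^2 = \sigma_1^2 - \sigma_2^2 = 0,$$
i.e., the variances must be equal.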

I think little can be said in general, apart from the rather trivial case in which each of the resulting $Y_i$ depends on non-intersecting subsets of $X$ (which corresponds to each column of $A$ having no more than one non-null value).

A weaker result can be obtained by considering non-correlation instead of independence (or, what would be equivalent, restricting to Gaussian variables), because the covariance matrices are simply related:

$$C_Y = A \, C_X A^T$$

By assumption, $C_X$ is diagonal, and we want to find out for which matrices $A$ the matrix $C_Y$ is also diagonal. Again, little can be said in general beyond just that. If we restrict the assumptions further and assume that the $X_i$ are iid (or just have equal variances), then orthogonality of $A$ is sufficient (but not necessary).

In particular, for the case $Y_1 = X_1 + X_2$, $Y_2 = X_1 - X_2$, the coordinates of $Y$ are not guaranteed to be uncorrelated unless we assume the $X_i$ are iid (or have the same variance).
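
A minimal numerical sketch of this relation (assuming only NumPy; the unequal variances below are chosen arbitrarily for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10**6

# Independent X with unequal variances, so C_X = diag(1, 4).
sigmas = np.array([1.0, 2.0])
X = rng.normal(size=(n, 2)) * sigmas

A = np.array([[1.0,  1.0],
              [1.0, -1.0]])  # Y1 = X1 + X2, Y2 = X1 - X2
Y = X @ A.T

# Empirical covariance of Y versus the identity C_Y = A C_X A^T:
print(np.cov(Y.T))                    # off-diagonal ~ -3, not 0
print(A @ np.diag(sigmas**2) @ A.T)   # exactly [[5, -3], [-3, 5]]
```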

**The Skitovich-Darmois theorem** (Skitovich (1953), Darmois (1953); see also A. Kagan, Yu. Linnik, and C. R. Rao (1973, Ch. 3)): Let $\xi_j$, where $j = 1, 2, \dots, n$ and $n \geq 2$, be independent random variables, and let $\alpha_j, \beta_j$ be nonzero constants. If the linear statistics $L_1 = \alpha_1 \xi_1 + \cdots + \alpha_n \xi_n$ and $L_2 = \beta_1 \xi_1 + \cdots + \beta_n \xi_n$ are independent, then all the random variables $\xi_j$ are Gaussian.
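
In particular, Bernstein’s theorem quoted above is the special case $n = 2$, $\alpha = (1, 1)$, $\beta = (1, -1)$, i.e. $L_1 = \xi_1 + \xi_2$ and $L_2 = \xi_1 - \xi_2$.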
